Thursday, November 14, 2019

ResNeXt

2019/10/21

ResNeXt is the model proposed in the paper "Aggregated Residual Transformations for Deep Neural Networks". After WRN widened the network on top of ResNet, ResNeXt instead increases the network's "cardinality" — the number of parallel paths in each block. Splitting into two paths first appeared in AlexNet as a workaround for the limited GPU memory of the time; GoogLeNet raised the cardinality to four with its Inception branches; ResNeXt generalizes the idea further, with every path sharing the same topology and identical kernel sizes. With a nearly identical parameter count, it achieves higher accuracy.
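The "nearly identical parameter count" claim is easy to check by hand. Below is a small pure-Python sketch (ignoring biases and batch-norm parameters, which I've omitted for simplicity) comparing a standard ResNet bottleneck block with the equivalent ResNeXt block of cardinality 32 from Table 1 of the paper; the 32 paths are expressed as a single grouped convolution, which is how the paper's "equivalent form (c)" implements them.

```python
def conv_params(c_in, c_out, k, groups=1):
    """Weight count of a k x k convolution; with G groups, each output
    channel only sees c_in / G input channels."""
    return k * k * (c_in // groups) * c_out

# ResNet-50 bottleneck: 256 -> 64 (1x1) -> 64 (3x3) -> 256 (1x1)
resnet = (conv_params(256, 64, 1)
          + conv_params(64, 64, 3)
          + conv_params(64, 256, 1))

# ResNeXt (32x4d) block: 256 -> 128 (1x1)
#   -> 128 (3x3, 32 groups of width 4) -> 256 (1x1)
resnext = (conv_params(256, 128, 1)
           + conv_params(128, 128, 3, groups=32)
           + conv_params(128, 256, 1))

print(resnet)   # 69632
print(resnext)  # 70144
```

The grouped 3x3 convolution is 32x cheaper per channel, which is what lets ResNeXt double the bottleneck width from 64 to 128 while keeping the total within about 1% of the original block.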


References

Xie, Saining, et al. "Aggregated residual transformations for deep neural networks." Proceedings of the IEEE conference on computer vision and pattern recognition. 2017.
http://openaccess.thecvf.com/content_cvpr_2017/papers/Xie_Aggregated_Residual_Transformations_CVPR_2017_paper.pdf

Deep Learning — Classification: ResNeXt (深度学习——分类之ResNeXt) - 知乎 (Zhihu)
https://zhuanlan.zhihu.com/p/32913695
