Thursday, November 14, 2019

AI From Scratch (33): ResNet

2017/08/03

-----

Preface:

ResNet won ILSVRC'15. By adding identity mappings, residual networks pushed convolutional neural networks past one hundred layers in one stroke, bringing the top-5 error rate down to 3.57% and surpassing human-level performance. A follow-up version (ResNet v2) rearranged Batch Normalization and ReLU into a pre-activation form, taking ReLU off the identity-mapping path, which let network depth exceed one thousand layers.
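The claim that identity shortcuts make very deep networks trainable can be illustrated numerically: in a plain deep stack the signal shrinks layer by layer, while a residual stack carries it through. A minimal sketch (fully connected layers with small random weights; the depth, width, and weight scale are arbitrary illustration choices, not values from the papers):

```python
import numpy as np

rng = np.random.default_rng(0)
depth, dim = 100, 64
Ws = [rng.normal(0.0, 0.05, (dim, dim)) for _ in range(depth)]

def relu(v):
    return np.maximum(0.0, v)

x0 = rng.normal(size=dim)
plain, resid = x0.copy(), x0.copy()
for W in Ws:
    plain = relu(W @ plain)          # plain stack: signal shrinks every layer
    resid = resid + relu(W @ resid)  # residual stack: identity carries the signal

print(np.linalg.norm(plain))  # collapses toward zero
print(np.linalg.norm(resid))  # stays far from zero
```

With these weight magnitudes the plain 100-layer stack drives the activation norm to essentially zero, while the additive shortcut keeps the signal alive; this is the propagation behavior the residual formulation is designed to preserve.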

-----


# ResNet v1

-----

ResNet v1 (He et al., CVPR 2016) reformulates a group of layers to learn a residual function rather than a direct mapping: each building block computes y = F(x, {W_i}) + x, where the shortcut carries x unchanged and F is a small stack of weight layers (3×3 and 1×1 convolutions with Batch Normalization). A ReLU is applied after the addition, and when dimensions change the shortcut projects x with a 1×1 convolution.

-----
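A minimal sketch of the v1 building block y = ReLU(F(x) + x), using fully connected layers instead of the paper's convolutions and omitting Batch Normalization for brevity (the weight shapes are illustrative assumptions):

```python
import numpy as np

def relu(v):
    return np.maximum(0.0, v)

def residual_block_v1(x, W1, W2):
    """Post-activation residual block: y = ReLU(F(x) + x),
    with F(x) = W2 @ ReLU(W1 @ x)."""
    f = relu(W1 @ x)    # first weight layer + ReLU
    f = W2 @ f          # second weight layer (no ReLU yet)
    return relu(f + x)  # add the identity shortcut, then the final ReLU

# When the residual branch outputs zero, the block reduces to the
# identity for non-negative inputs: the network can "do nothing" cheaply.
x = np.array([1.0, 2.0, 3.0])
Wz = np.zeros((3, 3))
print(residual_block_v1(x, Wz, Wz))  # [1. 2. 3.]
```

This is why extra residual blocks do not hurt in principle: driving F toward zero recovers a shallower network, whereas a plain stack would have to learn an identity mapping through its nonlinear layers.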

Summary:

ResNet v1 stacks residual blocks of the form y = ReLU(F(x) + x); the identity shortcuts make networks of over one hundred layers (up to 152 on ImageNet) trainable. ResNet v2 moves Batch Normalization and ReLU in front of each weight layer, leaving the shortcut a pure identity, and extends trainable depth past one thousand layers.

-----




# ResNet v2

-----

ResNet v2 (He et al., ECCV 2016) studies how signals propagate through the shortcuts and reorders each block into a pre-activation form: BN → ReLU → weight layer, repeated, added to an untouched identity shortcut. With no operation left on the shortcut path, activations and gradients flow directly between any two blocks, and a 1001-layer network trains successfully on CIFAR-10.

-----
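The pre-activation ordering can be sketched the same way as the v1 block; Batch Normalization is omitted here to keep the example self-contained (a simplification, not the paper's full block):

```python
import numpy as np

def relu(v):
    return np.maximum(0.0, v)

def residual_block_v2(x, W1, W2):
    """Pre-activation residual block: y = x + F(x),
    with F(x) = W2 @ ReLU(W1 @ ReLU(x)) and nothing after the addition.
    (BN omitted; the paper applies it before each ReLU.)"""
    f = W1 @ relu(x)  # pre-activation: ReLU comes before the weight layer
    f = W2 @ relu(f)
    return x + f      # identity path stays untouched: no final ReLU

# Unlike v1, the identity path is exact for any input, negative values
# included, so signals and gradients pass through completely unchanged.
x = np.array([-1.0, 0.5, 2.0])
Wz = np.zeros((3, 3))
print(residual_block_v2(x, Wz, Wz))  # prints x unchanged
```

Removing the final ReLU is the key design change: in v1 the addition is followed by a nonlinearity, so the shortcut path is only approximately an identity; in v2 it is exact, which is what makes the 1000-layer regime stable.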

References

◎ Papers

# The Power of Depth
Eldan, Ronen, and Ohad Shamir. "The power of depth for feedforward neural networks." Conference on Learning Theory. 2016.
http://proceedings.mlr.press/v49/eldan16.pdf

# ResNet v1
He, Kaiming, et al. "Deep residual learning for image recognition." Proceedings of the IEEE conference on computer vision and pattern recognition. 2016.
http://openaccess.thecvf.com/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf

# ResNet v2
He, Kaiming, et al. "Identity mappings in deep residual networks." European Conference on Computer Vision. Springer, Cham, 2016.
https://arxiv.org/pdf/1603.05027.pdf 

-----

◎ English references

Understanding and Implementing Architectures of ResNet and ResNeXt for state-of-the-art Image…
https://medium.com/@14prakash/understanding-and-implementing-architectures-of-resnet-and-resnext-for-state-of-the-art-image-cf51669e1624 

Residual blocks — Building blocks of ResNet – Towards Data Science
https://towardsdatascience.com/residual-blocks-building-blocks-of-resnet-fd90ca15d6ec

An Overview of ResNet and its Variants – Towards Data Science
https://towardsdatascience.com/an-overview-of-resnet-and-its-variants-5281e2f56035 

Why is it hard to train deep neural networks? Degeneracy, not vanishing gradients, is the key | Severely Theoretical
https://severelytheoretical.wordpress.com/2018/01/01/why-is-it-hard-to-train-deep-neural-networks-degeneracy-not-vanishing-gradients-is-the-key/

# LSTM
Understanding LSTM Networks -- colah's blog
http://colah.github.io/posts/2015-08-Understanding-LSTMs/ 

-----

◎ Simplified Chinese references

# TensorFlow
The CNN model you must know: ResNet - Zhihu (知乎)
https://zhuanlan.zhihu.com/p/31852747

# 綜述
A brief analysis of four classic CNN techniques in deep learning | 硬创公开课 | Leiphone (雷锋网)
https://www.leiphone.com/news/201702/dgpHuriVJHTPqqtT.html

A concise overview of ResNet and its variants | 机器之心 (Synced)
https://www.jiqizhixin.com/articles/042201

-----

◎ Traditional Chinese references

[A Tribute to ImageNet] Six major ResNet variants: Kaiming He, Jian Sun, and Shuicheng Yan lead two years of computer vision
https://gogonews.cc/article/1576981.html

-----

◎ Code implementations
 
# Keras
Understanding and Coding a ResNet in Keras - Towards Data Science
https://towardsdatascience.com/understanding-and-coding-a-resnet-in-keras-446d7ff84d33

# Keras
Implementing a ResNet model from scratch. - Towards Data Science
https://towardsdatascience.com/implementing-a-resnet-model-from-scratch-971be7193718
