Saturday, June 19, 2021

Entropy, Cross Entropy, and Softmax


References

Understanding Categorical Cross-Entropy Loss, Binary Cross-Entropy Loss, Softmax Loss, Logistic Loss, Focal Loss and all those confusing names

https://gombru.github.io/2018/05/23/cross_entropy_loss/


What is the relationship between softmax and cross-entropy? - Zhihu

https://www.zhihu.com/question/294679135


Dissecting Deep Learning (1): Why Is the Normal Distribution So Useful? - YC Note

https://www.ycc.idv.tw/deep-dl_1.html


Dissecting Deep Learning (2): Do You Know What Cross Entropy and KL Divergence Mean? Information Theory in Machine Learning - YC Note

https://www.ycc.idv.tw/deep-dl_2.html


Dissecting Deep Learning (3): What Is the Difference Between MLE and MAP? Two Statistical Viewpoints in Machine Learning - YC Note

https://www.ycc.idv.tw/deep-dl_3.html


Dissecting Deep Learning (4): Where Do Sigmoid and Softmax Come From? Why Use MSE and Cross Entropy? On Generalized Linear Models - YC Note

https://www.ycc.idv.tw/deep-dl_4.html
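As a quick illustration of the relationship these references discuss (notably the Zhihu thread on how softmax and cross-entropy fit together), here is a minimal pure-Python sketch. It is an illustrative example, not code from any of the linked articles: it computes a numerically stable softmax, the cross-entropy loss against a one-hot target, and the well-known combined gradient `p - y`.

```python
import math

def softmax(logits):
    # Subtract the max logit before exponentiating for numerical stability;
    # this shifts all exponents without changing the resulting probabilities.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, target_index):
    # For a one-hot target y, H(y, p) reduces to -log p[target].
    return -math.log(probs[target_index])

logits = [2.0, 1.0, 0.1]
p = softmax(logits)          # probabilities summing to 1
loss = cross_entropy(p, 0)   # loss when class 0 is the true label

# The gradient of cross-entropy composed with softmax, taken with respect
# to the logits, simplifies to p - y (y one-hot). This clean form is one
# reason the two functions are almost always paired in classifiers.
y = [1.0, 0.0, 0.0]
grad = [pi - yi for pi, yi in zip(p, y)]
```

Note that the gradient entries sum to zero, since both `p` and `y` sum to one; this is a handy sanity check when implementing the backward pass by hand.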

