C=QKV
2019/09/30
-----
NTM (Neural Turing Machines)
1. MN (Memory Networks)
2. EEMN (End-to-End Memory Networks)
3. KVMN (Key-Value Memory Networks)
4. PN (Pointer Networks)
5. FSA (Frustratingly Short Attention Spans)
The title's C = QKV is the pattern these models share: a context C computed by attention over queries, keys, and values. A minimal sketch follows this list.
-----
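Before the per-model figures, here is the pattern the title names: the context C is an attention-weighted read over memory, C = softmax(QK^T / sqrt(d_k)) V, in the notation of the Transformer paper cited in the References. A minimal NumPy sketch; the variable names are illustrative, not from the original slides:

    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        """C = softmax(Q K^T / sqrt(d_k)) V, scaled dot-product attention."""
        scores = Q @ K.T / np.sqrt(Q.shape[-1])  # (n_q, n_k) query-key similarities
        return softmax(scores) @ V               # (n_q, d_v) context vectors C

    # toy example: 2 queries attend over 4 key/value memory slots
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(2, 8))   # queries
    K = rng.normal(size=(4, 8))   # keys
    V = rng.normal(size=(4, 16))  # values
    C = attention(Q, K, V)        # C.shape == (2, 16)

Each model below instantiates Q, K, and V differently.
-----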
[Figure: MN, from 論文解説 Memory Networks (MemNN) - ディープラーニングブログ]
-----
[Figure: EEMN, from Attention in NLP – Kate Loginova – Medium]
-----
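MN addresses memory with hard, supervised lookups; EEMN (Sukhbaatar et al.) makes the read differentiable. The controller state u is the query, one embedding of each memory entry serves as the key (input memory m_i) and another as the value (output memory c_i). A hedged sketch of one hop, with the sentence encoders and embedding matrices elided:

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def eemn_hop(u, M_in, M_out):
        """One EEMN hop: p_i = softmax(u . m_i); o = sum_i p_i c_i; u' = u + o."""
        p = softmax(M_in @ u)  # (n_mem,) soft address over memory slots
        o = M_out.T @ p        # (d,) weighted read from the output memories
        return u + o           # next query; stack hops for multi-step reasoning
-----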
[Figure: KVMN, from 論文解説 Attention Is All You Need (Transformer) - ディープラーニングブログ]
-----
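KVMN (Miller et al.) makes the key/value split explicit: each slot is a (k_i, v_i) pair, e.g. key = a window of a document, value = the entity at its centre. Addressing uses only keys, reading returns only values, and a per-hop matrix R re-projects the query. A hedged sketch, with the paper's feature maps Φ folded into the vectors:

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def kvmn_hop(q, keys, values, R):
        """p_i = softmax(q . k_i); o = sum_i p_i v_i; q' = R(q + o)."""
        p = softmax(keys @ q)  # address with the KEYS only
        o = values.T @ p       # read out the VALUES only
        return R @ (q + o)     # re-projected query for the next hop
-----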
[Figure: PN, from Attention Attention!]
-----
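PN (Vinyals et al.) keeps the addressing but drops the read: the attention distribution itself is the output, a pointer over input positions, which is what lets the output vocabulary grow with the input (convex hulls, TSP tours, sorting). A hedged sketch of one decoding step using the paper's additive scoring; all parameter names here are mine:

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def pointer_step(E, d, W1, W2, v):
        """u_j = v . tanh(W1 e_j + W2 d); p = softmax(u) points at an input."""
        u = np.tanh(E @ W1.T + d @ W2.T) @ v  # (n_inputs,) scores
        return softmax(u)  # distribution over INPUT POSITIONS, not over a vocab

    # np.argmax(pointer_step(...)) selects which input element to emit next
-----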
[Figure: FSA, from Attention in NLP – Kate Loginova – Medium]
-----
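FSA (Daniluk et al.) measures how much history language-model attention actually uses (frustratingly little) and fixes the overloaded hidden state by splitting each output vector into key / value / predict parts: the key does the matching, the value is what past steps hand back, and the predict part feeds the output softmax. A hedged sketch, with dot-product scoring standing in for the paper's learned scoring:

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def kvp_read(H, d):
        """H: (t, 3d) RNN outputs, each row split as [key | value | predict]."""
        K, V, P = H[:, :d], H[:, d:2*d], H[:, 2*d:]
        a = softmax(K[:-1] @ K[-1])  # current key queries the PAST keys
        r = V[:-1].T @ a             # context read from past VALUE parts
        return r, P[-1]              # combine r with the predict part downstream
-----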
[Figure: QKV attention in ConvS2S, from 論文解説 Convolutional Sequence to Sequence Learning (ConvS2S) - ディープラーニングブログ]
-----
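The ConvS2S figure fits the same reading (see the # QKVC tag in the References): its attention already separates the three roles. The query is the projected decoder state plus the previous target embedding, the keys are the encoder outputs z_j, and the values are z_j + e_j, encoder outputs plus input embeddings. A hedged sketch of one attention step from Gehring et al.:

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def convs2s_attention(h, g, Z, E, W, b):
        """Query d = W h + b + g; keys z_j; values z_j + e_j."""
        d = W @ h + b         # project the decoder state ...
        d = d + g             # ... and condition on the previous target embedding
        a = softmax(Z @ d)    # dot-product match against encoder outputs (keys)
        return (Z + E).T @ a  # read from values = encoder output + input embedding
-----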
References
◎ Papers
// MN
Weston, Jason, Sumit Chopra, and Antoine Bordes. "Memory networks." ICLR, 2015.
https://arxiv.org/abs/1410.3916
// EEMN
Sukhbaatar, Sainbayar, Jason Weston, and Rob Fergus. "End-to-end memory networks." Advances in neural information processing systems. 2015.
https://papers.nips.cc/paper/5846-end-to-end-memory-networks.pdf
// KVMN
Miller, Alexander, et al. "Key-value memory networks for directly reading documents." arXiv preprint arXiv:1606.03126 (2016).
https://arxiv.org/pdf/1606.03126.pdf
// PN
Vinyals, Oriol, Meire Fortunato, and Navdeep Jaitly. "Pointer networks." Advances in Neural Information Processing Systems. 2015.
http://papers.nips.cc/paper/5866-pointer-networks.pdf
// FSA
Daniluk, Michał, et al. "Frustratingly short attention spans in neural language modeling." arXiv preprint arXiv:1702.04521 (2017).
https://arxiv.org/pdf/1702.04521.pdf
-----
◎ English references
# EEMN
# FSA
Attention in NLP – Kate Loginova – Medium
https://medium.com/@joealato/attention-in-nlp-734c6fa9d983
# KVMN
Summary of paper 'Key-Value Memory Networks for Directly Reading Documents' · GitHub
https://gist.github.com/shagunsodhani/a5e0baa075b4a917c0a69edc575772a8
# PN
Attention Attention!
https://lilianweng.github.io/lil-log/2018/06/24/attention-attention.html
-----
◎ Japanese references
# MN
論文解説 Memory Networks (MemNN) - ディープラーニングブログ
http://deeplearning.hatenablog.com/entry/memory_networks
# KVMN
論文解説 Attention Is All You Need (Transformer) - ディープラーニングブログ
http://deeplearning.hatenablog.com/entry/transformer
# QKVC
論文解説 Convolutional Sequence to Sequence Learning (ConvS2S) - ディープラーニングブログ
http://deeplearning.hatenablog.com/entry/convs2s