Overview
Over the past two years, NLP has progressed rapidly across a wide range of tasks and applications. This progress stems from a shift in the classic paradigm for building NLP systems: for a long time, researchers initialized neural networks with pretrained word embeddings (such as word2vec or GloVe) and then trained a task-specific architecture on a single dataset in a supervised fashion.
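As a minimal sketch of that older paradigm, the snippet below builds an embedding matrix from a GloVe-style text file (one word followed by its vector per line) to initialize a network's embedding layer. The file path and vocabulary here are hypothetical placeholders, not from the article:

```python
import numpy as np

def load_glove_embeddings(path, vocab, dim=100):
    """Build an embedding matrix for `vocab` from a GloVe-format text file.

    Words missing from the file keep a small random initialization,
    which is the usual practice for out-of-vocabulary entries.
    """
    rng = np.random.default_rng(0)
    matrix = rng.normal(scale=0.1, size=(len(vocab), dim))
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, vec = parts[0], parts[1:]
            if word in vocab and len(vec) == dim:
                matrix[vocab[word]] = np.asarray(vec, dtype=np.float32)
    return matrix
```

A matrix like this would then be frozen or fine-tuned as the first layer of a task-specific model — the part of the pipeline that the newer pretrained-language-model paradigm replaces wholesale.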
Disclaimer: this list is not exhaustive and cannot cover every topic in NLP (e.g., it does not cover semantic parsing, adversarial learning, reinforcement learning for NLP, etc.). The selected papers are mainly the most influential work of the past few years/months.
The following references cover the basic ideas of transfer learning in NLP:
Deep contextualized word representations (NAACL 2018) Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer
Universal Language Model Fine-tuning for Text Classification (ACL 2018) Jeremy Howard, Sebastian Ruder
Improving Language Understanding by Generative Pre-Training Alec Radford, Karthik Narasimhan, Tim Salimans, Ilya Sutskever
Language Models are Unsupervised Multitask Learners Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (NAACL 2019) Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
Cloze-driven Pretraining of Self-attention Networks (arXiv 2019) Alexei Baevski, Sergey Edunov, Yinhan Liu, Luke Zettlemoyer, Michael Auli
Unified Language Model Pre-training for Natural Language Understanding and Generation (arXiv 2019) Li Dong, Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, Ming Zhou, Hsiao-Wuen Hon
MASS: Masked Sequence to Sequence Pre-training for Language Generation (ICML 2019) Kaitao Song, Xu Tan, Tao Qin, Jianfeng Lu, Tie-Yan Liu
The Transformer architecture has become the go-to architecture for sequence-modeling tasks. Source: Attention Is All You Need
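The core operation of the Transformer is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V, from Attention Is All You Need. A minimal NumPy sketch (the shapes and function name are my own illustration, not from the paper):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """softmax(QK^T / sqrt(d_k)) V for single-head, unbatched inputs.

    q, k: arrays of shape (seq_len, d_k); v: shape (seq_len, d_v).
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                # pairwise query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the key axis
    return weights @ v                             # attention-weighted sum of values
```

Real implementations add multiple heads, learned projections of Q/K/V, masking, and batching on top of this single operation.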
Representation Learning
What you can cram into a single vector: Probing sentence embeddings for linguistic properties (ACL 2018) Alexis Conneau, German Kruszewski, Guillaume Lample, Loïc Barrault, Marco Baroni
No Training Required: Exploring Random Encoders for Sentence Classification (ICLR 2019) John Wieting, Douwe Kiela
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding (ICLR 2019) Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems (arXiv 2019) Alex Wang, Yada Pruksachatkun, Nikita Nangia, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
Linguistic Knowledge and Transferability of Contextual Representations (NAACL 2019) Nelson F. Liu, Matt Gardner, Yonatan Belinkov, Matthew E. Peters, Noah A. Smith
To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks (arXiv 2019) Matthew Peters, Sebastian Ruder, Noah A. Smith
Neural Dialogue
A Neural Conversational Model (ICML Deep Learning Workshop 2015) Oriol Vinyals, Quoc Le
A Persona-Based Neural Conversation Model (ACL 2016) Jiwei Li, Michel Galley, Chris Brockett, Georgios P. Spithourakis, Jianfeng Gao, Bill Dolan
A Simple, Fast Diverse Decoding Algorithm for Neural Generation (arXiv 2017) Jiwei Li, Will Monroe, Dan Jurafsky
Neural Approaches to Conversational AI (arXiv 2018) Jianfeng Gao, Michel Galley, Lihong Li
TransferTransfo: A Transfer Learning Approach for Neural Network Based Conversational Agents (NeurIPS 2018 CAI Workshop) Thomas Wolf, Victor Sanh, Julien Chaumond, Clement Delangue
Wizard of Wikipedia: Knowledge-Powered Conversational agents (ICLR 2019) Emily Dinan, Stephen Roller, Kurt Shuster, Angela Fan, Michael Auli, Jason Weston
Learning to Speak and Act in a Fantasy Text Adventure Game (arXiv 2019) Jack Urbanek, Angela Fan, Siddharth Karamcheti, Saachi Jain, Samuel Humeau, Emily Dinan, Tim Rocktäschel, Douwe Kiela, Arthur Szlam, Jason Weston
Miscellaneous
Pointer Networks (NIPS 2015) Oriol Vinyals, Meire Fortunato, Navdeep Jaitly
End-To-End Memory Networks (NIPS 2015) Sainbayar Sukhbaatar, Arthur Szlam, Jason Weston, Rob Fergus
Get To The Point: Summarization with Pointer-Generator Networks (ACL 2017) Abigail See, Peter J. Liu, Christopher D. Manning
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data (EMNLP 2017) Alexis Conneau, Douwe Kiela, Holger Schwenk, Loic Barrault, Antoine Bordes
End-to-end Neural Coreference Resolution (EMNLP 2017) Kenton Lee, Luheng He, Mike Lewis, Luke Zettlemoyer
StarSpace: Embed All The Things! (AAAI 2018) Ledell Wu, Adam Fisch, Sumit Chopra, Keith Adams, Antoine Bordes, Jason Weston
The Natural Language Decathlon: Multitask Learning as Question Answering (arXiv 2018) Bryan McCann, Nitish Shirish Keskar, Caiming Xiong, Richard Socher
Character-Level Language Modeling with Deeper Self-Attention (arXiv 2018) Rami Al-Rfou, Dokook Choe, Noah Constant, Mandy Guo, Llion Jones
Linguistically-Informed Self-Attention for Semantic Role Labeling (EMNLP 2018) Emma Strubell, Patrick Verga, Daniel Andor, David Weiss, Andrew McCallum
Phrase-Based & Neural Unsupervised Machine Translation (EMNLP 2018) Guillaume Lample, Myle Ott, Alexis Conneau, Ludovic Denoyer, Marc’Aurelio Ranzato
Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning (ICLR 2018) Sandeep Subramanian, Adam Trischler, Yoshua Bengio, Christopher J Pal
Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context (arXiv 2019) Zihang Dai, Zhilin Yang, Yiming Yang, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov
Universal Transformers (ICLR 2019) Mostafa Dehghani, Stephan Gouws, Oriol Vinyals, Jakob Uszkoreit, Łukasz Kaiser
An Embarrassingly Simple Approach for Transfer Learning from Pretrained Language Models (NAACL 2019) Alexandra Chronopoulou, Christos Baziotis, Alexandros Potamianos
For older papers, citation count is usually a reasonable proxy when deciding what to read.
In my experience, you should read whatever you find interesting and enjoyable!
Other Resources
There are many great resources beyond papers. Here are a few:
Books:
Speech and Language Processing (3rd ed. draft) Dan Jurafsky and James H. Martin
Neural Network Methods for Natural Language Processing Yoav Goldberg
Course materials:
Natural Language Understanding and Computational Semantics
with Katharina Kann and Sam Bowman at NYU
CS224n: Natural Language Processing with Deep Learning
with Chris Manning and Abigail See at Stanford
Contextual Word Representations: A Contextual Introduction
from Noah A. Smith’s teaching material at UW
Blogs/podcasts:
Sebastian Ruder’s blog
http://ruder.io/
Jay Alammar’s illustrated blog
http://jalammar.github.io/
NLP Highlights hosted by Matt Gardner and Waleed Ammar
https://podcasts.apple.com/us/podcast/nlp-highlights/id1235937471
Others:
Papers With Code
https://paperswithcode.com/
Twitter
arXiv daily newsletter
Survey papers
…
Final Advice
These are the resources we recommend! Reading even a subset of them will give you a solid grasp of the latest trends in modern NLP and help you build your own NLP systems.
One final piece of advice that I find very important (and sometimes overlooked): reading is good, but practicing is better! You will learn much more by digging into the (sometimes) accompanying code, or by trying to implement some of it yourself.
Original article:
https://medium.com/huggingface/the-best-and-most-current-of-modern-natural-language-processing-5055f409a1d1