
ACL 2020 Accepted Papers List Released: With an Acceptance Rate of 25.2%, Did Your Paper Make It?

机器之心
Published 2020-05-27 15:33:01

Report by 机器之心

Contributor: 魔王

ACL 2020, the top conference in natural language processing, will be held online from July 5 to 10. Acceptance decisions were announced a while ago, but the organizers did not release the complete list of papers at the time. The full list of accepted papers has now been published, so let's take a look at which papers made the cut.

ACL 2020 received 3,088 submissions, a slight increase over last year's 2,906. A total of 779 papers were accepted, comprising 571 long papers and 208 short papers, for an acceptance rate of 25.2%.
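
As a quick sanity check (this arithmetic is ours, not part of the official announcement), the reported figures are self-consistent:

$$571 + 208 = 779, \qquad \frac{779}{3088} \approx 0.2523 \approx 25.2\%$$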

Among the accepted papers, many familiar names appear:

Christopher D. Manning (Professor at Stanford University and Director of the Stanford AI Lab):

  • Finding Universal Grammatical Relations in Multilingual BERT
  • Optimizing the Factual Correctness of a Summary: A Study of Summarizing Radiology Reports
  • Syn-QG: Syntactic and Shallow Semantic Rules for Question Generation

Yoshua Bengio (Canadian computer scientist; Professor at the Université de Montréal):

  • Exploiting Syntactic Structure for Better Language Modeling: A Syntactic Distance Approach

Yoav Goldberg (Senior Lecturer in the Department of Computer Science at Bar-Ilan University, Israel):

  • A Formal Hierarchy of RNN Architectures
  • Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection
  • Simple, Interpretable and Stable Method for Detecting Words with Usage Change across Corpora
  • Unsupervised Domain Clusters in Pretrained Language Models
  • A Two-Stage Masked LM Method for Term Set Expansion
  • Towards Faithfully Interpretable NLP Systems: How should we define and evaluate faithfulness?

Noah A. Smith (Professor of Computer Science and Engineering at the University of Washington):

  • A Formal Hierarchy of RNN Architectures
  • A Mixture of h − 1 Heads is Better than h Heads
  • Don't Stop Pretraining: Adapt Language Models to Domains and Tasks
  • Improving Transformer Models by Reordering their Sublayers
  • Social Bias Frames: Reasoning about Social and Power Implications of Language
  • The Right Tool for the Job: Matching Model and Instance Complexities
  • Recollection versus Imagination: Exploring Human Memory and Cognition via Neural Language Models

Percy Liang (Associate Professor of Computer Science at Stanford University and member of the Stanford AI Lab):

  • Robust Encodings: A Framework for Combating Adversarial Typos
  • Selective Question Answering under Domain Shift
  • Enabling Language Models to Fill in the Blanks
  • ExpBERT: Representation Engineering with Natural Language Explanations
  • Shaping Visual Representations with Language for Few-Shot Classification

Sebastian Ruder (Research Scientist at DeepMind):

  • A Call for More Rigor in Unsupervised Cross-lingual Learning
  • On the Cross-lingual Transferability of Monolingual Representations

周明 (Ming Zhou; Vice President of Microsoft Research Asia and President of the Association for Computational Linguistics (ACL)):

  • A Graph-based Coarse-to-fine Method for Unsupervised Bilingual Lexicon Induction
  • Curriculum Pre-training for End-to-End Speech Translation
  • Document Modeling with Graph Attention Networks for Multi-grained Machine Reading Comprehension
  • Evidence-Aware Inferential Text Generation with Vector Quantised Variational AutoEncoder
  • Graph Neural News Recommendation with Unsupervised Preference Disentanglement
  • Improving Neural Machine Translation with Soft Template Prediction
  • LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network
  • MIND: A Large-scale Dataset for News Recommendation
  • MuTual: A Dataset for Multi-Turn Dialogue Reasoning
  • Reasoning Over Semantic-Level Graph for Fact Checking
  • A Retrieve-and-Rewrite Initialization Method for Unsupervised Machine Translation
  • A Simple and Effective Unified Encoder for Document-Level Machine Translation

刘铁岩 (Tie-Yan Liu; Vice President of Microsoft Research Asia):

  • A Study of Non-autoregressive Model for Sequence Generation
  • SEEK: Segmented Embedding of Knowledge Graphs
  • SimulSpeech: End-to-End Simultaneous Speech to Text Translation

刘群 (Qun Liu; Chief Scientist of Speech and Language Computing at Huawei Noah's Ark Lab):

  • Perturbed Masking: Parameter-free Probing for Analyzing and Interpreting BERT
  • Probabilistically Masked Language Model Capable of Autoregressive Generation in Arbitrary Word Order
  • Word-level Textual Adversarial Attacking as Combinatorial Optimization

宗成庆 (Chengqing Zong; Professor at the Institute of Automation, Chinese Academy of Sciences):

  • Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization

孙茂松 (Maosong Sun; Professor in the Department of Computer Science and Technology at Tsinghua University):

  • Continual Relation Learning via Episodic Memory Activation and Reconsolidation
  • Fine-grained Fact Verification with Kernel Graph Attention Network
  • How Does NLP Benefit Legal System: A Summary of Legal Artificial Intelligence
  • Word-level Textual Adversarial Attacking as Combinatorial Optimization

刘知远 (Zhiyuan Liu; Associate Professor in the Department of Computer Science and Technology at Tsinghua University):

  • Continual Relation Learning via Episodic Memory Activation and Reconsolidation
  • Expertise Style Transfer: A New Task Towards Better Communication between Experts and Laymen
  • Fine-grained Fact Verification with Kernel Graph Attention Network
  • Grounded Conversation Generation as Guided Traverses in Commonsense Knowledge Graphs
  • How Does NLP Benefit Legal System: A Summary of Legal Artificial Intelligence
  • Word-level Textual Adversarial Attacking as Combinatorial Optimization
  • MOOCCube: A Large-scale Data Repository for NLP Applications in MOOCs

黄民烈 (Minlie Huang; Associate Professor in the Department of Computer Science and Technology at Tsinghua University):

  • A Self-Training Method for Machine Reading Comprehension with Soft Evidence Extraction
  • KdConv: A Chinese Multi-domain Dialogue Dataset Towards Multi-turn Knowledge-driven Conversation
  • Multi-Agent Task-Oriented Dialog Policy Learning with Role-Aware Reward Decomposition

万小军 (Xiaojun Wan; Professor at the Institute of Computer Science and Technology, Peking University):

  • Automatic Generation of Citation Texts in Scholarly Papers: A Pilot Study
  • Heterogeneous Graph Transformer for Graph-to-Sequence Learning
  • Jointly Learning to Align and Summarize for Neural Cross-Lingual Summarization
  • Learning to Ask More: Semi-Autoregressive Sequential Question Generation under Dual-Graph Interaction
  • Multi-Granularity Interaction Network for Extractive and Abstractive Multi-Document Summarization
  • Semantic Parsing for English as a Second Language
  • Multimodal Transformer for Multimodal Machine Translation

邱锡鹏 (Xipeng Qiu; Professor in the School of Computer Science at Fudan University):

  • Extractive Summarization as Text Matching
  • Heterogeneous Graph Neural Networks for Extractive Document Summarization
  • Improving Image Captioning with Better Use of Caption
  • FLAT: Chinese NER Using Flat-Lattice Transformer

韩松 (Song Han; Assistant Professor in the Department of Electrical Engineering and Computer Science at MIT):

  • HAT: Hardware-Aware Transformers for Efficient Natural Language Processing

Readers whose papers were accepted to ACL 2020 are welcome to let us know in the comments. 机器之心 will continue to recommend more high-quality papers.

The full list of ACL 2020 accepted papers is available at: https://acl2020.org/program/accepted/#long-papers

This article was shared from the 机器之心 WeChat official account; it was originally published on 2020-05-19.
