
A Painstaking Roundup of NLP Papers at AAAI 2022: Information Extraction, Machine Translation, NER, Multimodal, Data Augmentation, QA, Multilingual, Knowledge Distillation, Text Error Correction, and More

zenRRan
Published 2022-03-14 11:11:14
Filed under the column: 深度学习自然语言处理 (Deep Learning & NLP)

Fresh NLP know-how delivered to you every day!


Author: zenRRan

WeChat official account: 深度学习自然语言处理 (Deep Learning & NLP)

AAAI 2022 results have been out for a while, but no one seems to have compiled the NLP-related papers yet. So, with the last day of the weekend not quite over, I painstakingly put this list together; if you need it, just bookmark it.

It covers: information extraction, relation extraction, machine translation, named entity recognition, multimodal, data augmentation, question answering, multilingual, knowledge distillation, text error correction, and more.

Information Extraction

OneRel: Joint Entity and Relation Extraction with One Module in One Step 

Yu-Ming Shang, Heyan Huang, Xian-Ling Mao

BROS: A Pre-Trained Language Model Focusing on Text and Layout for Better Key Information Extraction from Documents 

Teakgyu Hong, Donghyun Kim, Mingi Ji, Wonseok Hwang, Daehyun Nam, Sungrae Park

Selecting Optimal Context Sentences for Event-Event Relation Extraction

Hieu Man Duc Trong, Nghia Ngo Trung, Linh Van Ngo, Thien Huu Nguyen

Hyperbolic Disentangled Representation for Fine-Grained Aspect Extraction 

Chang-You Tai, Ming-Yao Li, Lun-Wei Ku

Language Model Priming for Cross-Lingual Event Extraction 

Steven Fincke, Shantanu Agarwal, Scott Miller, Elizabeth Boschee

Knowledge Distillation

Content-Variant Reference Image Quality Assessment via Knowledge Distillation 

Guanghao Yin, Wei Wang, Zehuan Yuan, Chuchu Han, Wei Ji, Shouqian Sun, Changhu Wang

Adversarial Data Augmentation for Task-Specific Knowledge Distillation of Pre-Trained Transformers 

Minjia Zhang, Niranjan Uma Naresh, Yuxiong He

Boosting Contrastive Learning with Relation Knowledge Distillation 

Kai Zheng, Yuanjiang Wang, Ye Yuan

UniMS: A Unified Framework for Multimodal Summarization with Knowledge Distillation 

Zhengkun Zhang, Xiaojun Meng, Yasheng Wang, Xin Jiang, Qun Liu, Zhenglu Yang

Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay 

Kuluhan Binici, Shivam Aggarwal, Nam Trung Pham, Karianto Leman, Tulika Mitra

Cross-Task Knowledge Distillation in Multi-Task Recommendation 

Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen

Improving Neural Cross-Lingual Abstractive Summarization via Employing Optimal Transport Distance for Knowledge Distillation 

Thong Nguyen, Luu Anh Tuan

Knowledge Distillation via Constrained Variational Inference 

Ardavan Saeedi, Yuria Utsumi, Li Sun, Kayhan Batmanghelich, Li-wei H. Lehman

Up to 100× Faster Data-Free Knowledge Distillation 

Gongfan Fang, Kanya Mo, Xinchao Wang, Jie Song, Shitao Bei, Haofei Zhang, Mingli Song

Multilingual

Multilingual Code Snippets Training for Program Translation 

Ming Zhu, Karthik Suresh, Chandan K. Reddy

Improving Neural Cross-Lingual Abstractive Summarization via Employing Optimal Transport Distance for Knowledge Distillation 

Thong Nguyen, Luu Anh Tuan

DetIE: Multilingual Open Information Extraction Inspired by Object Detection 

Michael Vasilkovsky, Anton Alekseev, Valentin Malykh, Ilya Shenbin, Elena Tutubalina, Dmitriy Salikhov, Mikhail Stepnov, Andrey Chertok, Sergey Nikolenko

Zero-Shot Cross-Lingual Machine Reading Comprehension via Inter-Sentence Dependency Graph 

Liyan Xu, Xuchao Zhang, Bo Zong, Yanchi Liu, Wei Cheng, Jingchao Ni, Haifeng Chen, Liang Zhao, Jinho D. Choi

Language Model Priming for Cross-Lingual Event Extraction 

Steven Fincke, Shantanu Agarwal, Scott Miller, Elizabeth Boschee

Cross-Lingual Adversarial Domain Adaptation for Novice Programming 

Ye Mao, Farzaneh Khoshnevisan, Thomas Price, Tiffany Barnes, Min Chi

Interpreting Gender Bias in Neural Machine Translation: Multilingual Architecture Matters 

Marta R. Costa-jussà, Carlos Escolano, Christine Basta, Javier Ferrando, Roser Batlle, Ksenia Kharitonova

From Good to Best: Two-Stage Training for Cross-Lingual Machine Reading Comprehension 

Nuo Chen, Linjun Shou, Ming Gong, Jian Pei

Few-Shot Cross-Lingual Stance Detection with Sentiment-Based Pre-Training 

Momchil Hardalov, Arnav Arora, Preslav Nakov, Isabelle Augenstein

Parameter Differentiation Based Multilingual Neural Machine Translation 

Qian Wang, Jiajun Zhang

XLM-K: Improving Cross-Lingual Language Model Pre-Training with Multilingual Knowledge 

Xiaoze Jiang, Yaobo Liang, Weizhu Chen, Nan Duan

Mind the Gap: Cross-Lingual Information Retrieval with Hierarchical Knowledge Enhancement 

Fuwei Zhang, Zhao Zhang, Xiang Ao, Dehong Gao, Fuzhen Zhuang, Yi Wei, Qing He

BiRdQA: A Bilingual Dataset for Question Answering on Tricky Riddles 

Yunxiang Zhang, Xiaojun Wan

UNISON: Unpaired Cross-Lingual Image Captioning 

Jiahui Gao, Yi Zhou, Philip L. H. Yu, Shafiq Joty, Jiuxiang Gu

Question Answering

Video as Conditional Graph Hierarchy for Multi-Granular Question Answering 

Junbin Xiao, Angela Yao, Zhiyuan Liu, Yicong Li, Wei Ji, Tat-Seng Chua

Block-Skim: Efficient Question Answering for Transformer 

Yue Guan, Zhengyi Li, Zhouhan Lin, Yuhao Zhu, Jingwen Leng, Minyi Guo

BiRdQA: A Bilingual Dataset for Question Answering on Tricky Riddles 

Yunxiang Zhang, Xiaojun Wan

(2.5+1)D Spatio-Temporal Scene Graphs for Video Question Answering 

Anoop Cherian, Chiori Hori, Tim K. Marks, Jonathan Le Roux

Zero-Shot Commonsense Question Answering with Cloze Translation and Consistency Optimization 

Zi-Yi Dou, Nanyun (Violet) Peng

Dynamic Key-Value Memory Enhanced Multi-Step Graph Reasoning for Knowledge-Based Visual Question Answering 

Mingxiao Li, Marie-Francine Moens

Multimodal

Show Your Faith: Cross-Modal Confidence-Aware Network for Image-Text Matching 

Huatian Zhang, Zhendong Mao, Kun Zhang, Yongdong Zhang

Event-Image Fusion Stereo Using Cross-Modality Feature Propagation 

Hoonhee Cho, Kuk-Jin Yoon

MAGIC: Multimodal relAtional Graph adversarIal inferenCe for Diverse and Unpaired Text-Based Image Captioning 

Wenqiao Zhang, Haochen Shi, Jiannan Guo, Shengyu Zhang, Qingpeng Cai, Juncheng Li, Sihui Luo, Yueting Zhuang

Hierarchical Cross-Modality Semantic Correlation Learning Model for Multimodal Summarization 

Litian Zhang, Junshu Pan, Xiaoming Zhang, Feiran Huang

UniMS: A Unified Framework for Multimodal Summarization with Knowledge Distillation 

Zhengkun Zhang, Xiaojun Meng, Yasheng Wang, Xin Jiang, Qun Liu, Zhenglu Yang

Cross-Modal Mutual Learning for Audio-Visual Speech Recognition and Manipulation 

Chih-Chun Yang, Wan-Cyuan Fan, Cheng-Fu Yang, Yu-Chiang Frank Wang

Cross-Modal Coherence for Text-to-Image Retrieval 

Malihe Alikhani, Fangda Han, Hareesh Ravi, Mubbasir Kapadia, Vladimir Pavlovic, Matthew Stone

Are Vision-Language Transformers Learning Multimodal Representations? A Probing Perspective. 

Emmanuelle Salin, Badreddine Farah, Stéphane Ayache, Benoit Favre

D-Vlog: Multimodal Vlog Dataset for Depression Detection 

Jeewoo Yoon, Chaewon Kang, Seungbae Kim, Jinyoung Han

Sentiment and Emotion-Aware Multi-Modal Complaint Identification 

Apoorva Singh, Soumyodeep Dey, Anamitra Singha, Sriparna Saha

Machine Translation

Parameter Differentiation Based Multilingual Neural Machine Translation 

Qian Wang, Jiajun Zhang

Deep Fusing Pre-Trained Models into Neural Machine Translation 

Rongxiang Weng, Heng Yu, Weihua Luo, Min Zhang

Non-Parametric Online Learning from Human Feedback for Neural Machine Translation 

Dongqi Wang, Haoran Wei, Zhirui Zhang, Shujian Huang, Jun Xie, Jiajun Chen

Frequency-Aware Contrastive Learning for Neural Machine Translation 

Tong Zhang, Wei Ye, Baosong Yang, Long Zhang, Xingzhang Ren, Dayiheng Liu, Jinan Sun, Shikun Zhang, Haibo Zhang, Wen Zhao

From Fully Trained to Fully Random Embeddings: Improving Neural Machine Translation with Compact Word Embedding Tables 

Krtin Kumar, Peyman Passban, Mehdi Rezagholizadeh, Yiusing Lau, Qun Liu

Interpreting Gender Bias in Neural Machine Translation: Multilingual Architecture Matters 

Marta R. Costa-jussà, Carlos Escolano, Christine Basta, Javier Ferrando, Roser Batlle, Ksenia Kharitonova

Named Entity Recognition

Unified Named Entity Recognition as Word-Word Relation Classification 

Jingye Li, Donghong Ji, Jiang Liu, Hao Fei, Meishan Zhang, Shengqiong Wu, Chong Teng, Fei Li

Model Compression

BATUDE: Budget-Aware Neural Network Compression Based on Tucker Decomposition 

Miao Yin, Huy Phan, Xiao Zang, Siyu Liao, Bo Yuan

From Dense to Sparse: Contrastive Pruning for Better Pre-Trained Language Model Compression 

Runxin Xu, Fuli Luo, Chengyu Wang, Baobao Chang, Jun Huang, Songfang Huang, Fei Huang

Convolutional Neural Network Compression Through Generalized Kronecker Product Decomposition 

Marawan Gamal Abdel Hameed, Marzieh Tahaei, Ali Mosleh, Vahid Partovi Nia

Data Augmentation

Adversarial Data Augmentation for Task-Specific Knowledge Distillation of Pre-Trained Transformers 

Minjia Zhang, Niranjan Uma Naresh, Yuxiong He

ALP: Data Augmentation Using Lexicalized PCFGs for Few-Shot Text Classification 

Hazel Kim, Daecheol Woo, Seong Joon Oh, Jeong-Won Cha, Yo-Sub Han

Reading Comprehension

Zero-Shot Cross-Lingual Machine Reading Comprehension via Inter-Sentence Dependency Graph 

Liyan Xu, Xuchao Zhang, Bo Zong, Yanchi Liu, Wei Cheng, Jingchao Ni, Haifeng Chen, Liang Zhao, Jinho D. Choi

From Good to Best: Two-Stage Training for Cross-Lingual Machine Reading Comprehension 

Nuo Chen, Linjun Shou, Ming Gong, Jian Pei

Text Error Correction

Sequence-to-Action: Grammatical Error Correction with Action Guided Sequence Generation 

Jiquan Li, Junliang Guo, Yongxin Zhu, Xin Sheng, Deqiang Jiang, Bo Ren, Linli Xu

Download 1: the Chinese-language five-piece set covering TensorFlow, PyTorch, machine learning, deep learning, and data structures! Reply 【五件套】 in the account backend.
Download 2: NJU pattern recognition slides. Reply 【南大模式识别】 in the account backend.

To submit an article or to chat and learn, add a note in the form nickname-school (company)-research direction to join the DL&NLP discussion group.

There are many directions: machine learning, deep learning, Python, sentiment analysis, opinion mining, syntactic parsing, machine translation, human-machine dialogue, knowledge graphs, speech recognition, and more.

Remember to include the note!

Compiling this wasn't easy; please give it a "Looking" tap!
Originally published 2022-03-11. This article is shared from the WeChat official account 深度学习自然语言处理 as part of the Tencent Cloud self-media syncing program. For any infringement concerns, contact cloudcommunity@tencent.com for removal.
