
Connecting language and knowledge with heterogeneous representations for neural relation extraction

JNJYan
Published 2019-04-18 17:18:09
Published in the column: 算法工程师的养成之路

Copyright notice: this is an original article by the author and may not be reproduced without permission. https://blog.csdn.net/JN_rainbow/article/details/88972193

This is a note on a NAACL 2019 paper about relation extraction: Connecting language and knowledge with heterogeneous representations for neural relation extraction.

Problem

In the process of building a knowledge base, we usually extract the relationship between entities from sentences. If the entities already exist in the knowledge base, we can use the knowledge they carry to improve the results of relation extraction.

The usual practice is to train two separate models: an RE model and a KBE (Knowledge Base Embedding) model. However, little research has systematically unified the two.

Contribution

In this paper, a framework of Heterogeneous REpresentations for neural Relation Extraction (HRERE) is proposed. The framework unifies the RE model and the KBE model so that each can effectively enhance the other. The gap between the language representations and the knowledge representations is reduced as much as possible, leading to significant improvements over the state of the art in RE.

Solution

The RE model uses a Bi-LSTM with a multi-level attention mechanism to predict the relationship between entity pairs. The KBE model borrows from ComplEx, proposed by Trouillon et al. in 2016, which can nudge the language model to agree with facts in the KB.
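To make the KBE side concrete, below is a minimal sketch of the ComplEx scoring function (Trouillon et al., 2016): a triple (h, r, t) is scored as the real part of the trilinear product of complex embeddings. The embedding dimension and the random vectors here are illustrative, not from the paper.

```python
import numpy as np

def complex_score(h, r, t):
    """Score a triple (h, r, t) as Re(<h, r, conj(t)>).

    h, r, t: complex-valued embedding vectors of equal length.
    Higher scores indicate a more plausible triple.
    """
    return np.real(np.sum(h * r * np.conj(t)))

rng = np.random.default_rng(0)
dim = 8  # illustrative embedding size
h = rng.normal(size=dim) + 1j * rng.normal(size=dim)  # head entity
r = rng.normal(size=dim) + 1j * rng.normal(size=dim)  # relation
t = rng.normal(size=dim) + 1j * rng.normal(size=dim)  # tail entity

score = complex_score(h, r, t)
```

Because the product is not symmetric in h and t, ComplEx can model asymmetric relations, which purely real bilinear models like DistMult cannot.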

The framework introduces three loss functions: the language representation loss of the RE model, the knowledge representation loss of the KBE model, and a cross-entropy loss between the two distributions.

$$J_L = -\frac{1}{N}\sum_{i=1}^{N}\log p(r_i \mid S_i;\Theta^{(L)})$$

$$J_G = -\frac{1}{N}\sum_{i=1}^{N}\log p(r_i \mid (h_i,t_i);\Theta^{(G)})$$

$$J_D = -\frac{1}{N}\sum_{i=1}^{N}\log p(r_i^* \mid S_i;\Theta^{(L)})$$

where $r_i^* = \arg\max_{r\in R\cup\{NA\}} p(r \mid (h_i,t_i);\Theta^{(G)})$, and the overall objective is

$$\min_{\Theta} J = J_L + J_G + J_D + \lambda\lVert\Theta\rVert_2^2$$

Understanding

The essence of this paper is to improve relation extraction (RE) results through an existing knowledge base. A KBE model is trained on the knowledge base to form the knowledge representation, while the RE model predicts the relationship between entity pairs through the language model, so that the predictions stay as close as possible to existing knowledge.

This paper describes and evaluates a novel neural framework for jointly learning representations for RE and KBE tasks that uses a cross-entropy loss function to ensure both representations are learned together, resulting in significant improvements over the current state-of-the-art for the RE task.

Limitation

In real-life scenarios, we often want to extract entity pairs and their relationships from a sentence, rather than extracting the relationship for a given entity pair.

But this paper gives me ideas on how to extract relationships in such scenarios: we could construct a joint loss function covering both entity extraction and relation extraction, thus reducing the error propagation of the pipeline model.

Reference

Connecting language and knowledge with heterogeneous representations for neural relation extraction

Originally published on the author's personal blog on April 2, 2019.