
Heterogeneous Supervision for Relation Extraction: A Representation Learning Approach

Liyuan Liu, Xiang Ren, Qi Zhu, Shi Zhi, Huan Gui, Heng Ji, Jiawei Han


Abstract
Relation extraction is a fundamental task in information extraction. Most existing methods rely heavily on annotations labeled by human experts, which are costly and time-consuming. To overcome this drawback, we propose a novel framework, REHession, to conduct relation extractor learning using annotations from heterogeneous information sources, e.g., knowledge bases and domain heuristics. These annotations, referred to as heterogeneous supervision, often conflict with each other, which brings a new challenge to the original relation extraction task: how to infer the true label from noisy labels for a given instance. Identifying context information as the backbone of both relation extraction and true label discovery, we adopt embedding techniques to learn distributed representations of context, which bridge all components with mutual enhancement in an iterative fashion. Extensive experimental results demonstrate the superiority of REHession over the state-of-the-art.
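The core challenge the abstract describes, resolving conflicting labels from heterogeneous sources, can be illustrated with a toy reliability-weighted vote. This is only a minimal sketch of the problem setting, not REHession itself (which infers true labels jointly with learned context representations); the source names, labels, and reliability weights below are hypothetical.

```python
from collections import Counter

def infer_true_labels(annotations, source_reliability):
    """Toy true-label inference: for each instance, pick the label whose
    supporting sources carry the highest total reliability weight.
    (REHession instead learns source reliability and labels jointly
    from context embeddings; this is just an illustrative baseline.)"""
    resolved = {}
    for instance, labels in annotations.items():
        scores = Counter()
        for source, label in labels:
            # Unknown sources get a neutral default weight.
            scores[label] += source_reliability.get(source, 0.5)
        resolved[instance] = scores.most_common(1)[0][0]
    return resolved

# Hypothetical conflicting annotations from a knowledge base and two heuristics.
annotations = {
    "s1": [("kb", "born_in"), ("pattern_1", "born_in"), ("pattern_2", "lives_in")],
    "s2": [("pattern_1", "works_for"), ("pattern_2", "founded")],
}
reliability = {"kb": 0.9, "pattern_1": 0.6, "pattern_2": 0.4}
print(infer_true_labels(annotations, reliability))
# {'s1': 'born_in', 's2': 'works_for'}
```

Here "s1" resolves to "born_in" because the knowledge base and one heuristic outweigh the dissenting heuristic; the paper's contribution is to learn such reliability signals from context rather than fix them by hand.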
Anthology ID:
D17-1005
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
46–56
URL:
https://aclanthology.org/D17-1005
DOI:
10.18653/v1/D17-1005
Cite (ACL):
Liyuan Liu, Xiang Ren, Qi Zhu, Shi Zhi, Huan Gui, Heng Ji, and Jiawei Han. 2017. Heterogeneous Supervision for Relation Extraction: A Representation Learning Approach. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 46–56, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Heterogeneous Supervision for Relation Extraction: A Representation Learning Approach (Liu et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1005.pdf
Video:
https://aclanthology.org/D17-1005.mp4
Code:
LiyuanLucasLiu/ReHession
Data:
FIGER