Domain-Adapted Dependency Parsing for Cross-Domain Named Entity Recognition
DOI: https://doi.org/10.1609/aaai.v37i11.26498
Keywords: SNLP: Syntax -- Tagging, Chunking & Parsing; SNLP: Lexical & Frame Semantics, Semantic Parsing; SNLP: Other Foundations of Speech & Natural Language Processing; SNLP: Phonology, Morphology, Word Segmentation; SNLP: Text Mining
Abstract
In recent years, many researchers have leveraged structural information from dependency trees to improve Named Entity Recognition (NER). Most of their methods take dependency-tree labels as input features for NER model training. However, such dependency information is not inherently provided in most NER corpora, which limits the practical usability of these methods. To effectively exploit the potential of word-dependency knowledge, and motivated by the success of Multi-Task Learning on cross-domain NER, we investigate a novel NER learning method that incorporates cross-domain Dependency Parsing (DP) as its auxiliary learning task. Then, considering the high consistency of word-dependency relations across domains, we present an unsupervised domain-adapted method that transfers word-dependency knowledge from high-resource domains to low-resource ones. With cross-domain DP bridging different domains, our model can learn both useful cross-domain and cross-task knowledge that considerably benefits cross-domain NER. To make better use of the cross-task knowledge between NER and DP, we unify both tasks in a shared network architecture for joint learning, using Maximum Mean Discrepancy (MMD). Finally, through extensive experiments, we show that our proposed method not only effectively takes advantage of word-dependency knowledge, but also significantly outperforms other Multi-Task Learning methods on cross-domain NER. Our code is open-source and available at https://github.com/xianghuisun/DADP.
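To make the joint-learning idea concrete, the following is a minimal PyTorch sketch, not the authors' released implementation (see the linked repository for that): a shared encoder feeds both a token-level NER head and a simplified bilinear arc-scoring head for DP, and the combined objective adds a Gaussian-kernel MMD term between source-domain and target-domain sentence representations. The encoder choice, head designs, label counts, and loss weighting below are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def gaussian_mmd(x, y, sigma=1.0):
    """Biased estimate of squared Maximum Mean Discrepancy between two
    feature batches under a Gaussian (RBF) kernel."""
    def k(a, b):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

class SharedNERDPModel(nn.Module):
    """Shared BiLSTM encoder with a per-token NER tagging head and a
    simplified bilinear head that scores head-dependent arcs for DP."""
    def __init__(self, vocab_size, hidden=256, num_ner_labels=9):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden // 2, batch_first=True,
                               bidirectional=True)
        self.ner_head = nn.Linear(hidden, num_ner_labels)   # per-token NER tags
        self.arc_head = nn.Bilinear(hidden, hidden, 1)      # arc scores

    def forward(self, token_ids):
        h, _ = self.encoder(self.embed(token_ids))          # (B, T, hidden)
        B, T, H = h.shape
        # score every (dependent, candidate head) pair
        dep = h.unsqueeze(2).expand(B, T, T, H).reshape(-1, H)
        head = h.unsqueeze(1).expand(B, T, T, H).reshape(-1, H)
        arc_scores = self.arc_head(dep, head).view(B, T, T)  # (B, dep, head)
        return self.ner_head(h), arc_scores, h

def joint_loss(model, src_tokens, src_heads, tgt_tokens, tgt_tags, lam=0.1):
    """Illustrative joint objective: DP loss on source-domain trees, NER loss
    on target-domain tags, plus an MMD term that aligns the shared
    representations of the two domains."""
    _, src_arcs, src_feat = model(src_tokens)
    tgt_ner, _, tgt_feat = model(tgt_tokens)
    dp_loss = F.cross_entropy(src_arcs.reshape(-1, src_arcs.size(-1)),
                              src_heads.reshape(-1))
    ner_loss = F.cross_entropy(tgt_ner.reshape(-1, tgt_ner.size(-1)),
                               tgt_tags.reshape(-1))
    mmd = gaussian_mmd(src_feat.mean(dim=1), tgt_feat.mean(dim=1))
    return ner_loss + dp_loss + lam * mmd
```

In this sketch the MMD term is computed on mean-pooled sentence representations from the shared encoder, so minimizing it encourages domain-invariant features that both the NER and DP heads can consume.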
Published
2023-06-26
How to Cite
Dou, C., Sun, X., Wang, Y., Ji, Y., Ma, B., & Li, X. (2023). Domain-Adapted Dependency Parsing for Cross-Domain Named Entity Recognition. Proceedings of the AAAI Conference on Artificial Intelligence, 37(11), 12737-12744. https://doi.org/10.1609/aaai.v37i11.26498
Section
AAAI Technical Track on Speech & Natural Language Processing