
Deconstructing Supertagging into Multi-Task Sequence Prediction

Zhenqi Zhu, Anoop Sarkar


Abstract
Supertagging is a sequence prediction task where each word is assigned a piece of complex syntactic structure called a supertag. We provide a novel approach to multi-task learning for Tree Adjoining Grammar (TAG) supertagging by deconstructing these complex supertags in order to define a set of related but auxiliary sequence prediction tasks. Our multi-task prediction framework is trained over exactly the same training data used to train the original supertagger, where each auxiliary task provides an alternative view on the original prediction task. Our experimental results show that our multi-task approach significantly improves TAG supertagging with a new state-of-the-art accuracy score of 91.39% on the Penn Treebank supertagging dataset.
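
To make the multi-task setup concrete, the sketch below shows one plausible realization: a shared sentence encoder with a separate classification head per task, one head for the full supertag and one for each auxiliary view obtained by deconstructing it. The BiLSTM encoder, the summed cross-entropy objective, and all names and sizes here are illustrative assumptions; the abstract does not specify the authors' exact architecture.

import torch.nn as nn

class MultiTaskSupertagger(nn.Module):
    # Hypothetical shared-encoder model: every task reads the same
    # contextual word representations but has its own output layer.
    def __init__(self, vocab_size, task_label_sizes, emb_dim=100, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        self.heads = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, n_labels)
            for task, n_labels in task_label_sizes.items()
        })

    def forward(self, token_ids):
        hidden, _ = self.encoder(self.embed(token_ids))
        # Per-task logits for every token position.
        return {task: head(hidden) for task, head in self.heads.items()}

def multi_task_loss(logits, gold):
    # Sum per-task cross-entropy losses computed on the same sentences,
    # so the auxiliary views add supervision without adding training data.
    criterion = nn.CrossEntropyLoss()
    return sum(criterion(logits[t].flatten(0, 1), gold[t].flatten())
               for t in logits)

In such a setup, task_label_sizes might map a main "supertag" task to the full supertag inventory and each auxiliary task (one deconstructed attribute of the supertag) to its own smaller label set; evaluation would then use only the main supertag head's predictions.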
Anthology ID:
K19-1002
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Mohit Bansal, Aline Villavicencio
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
12–21
URL:
https://aclanthology.org/K19-1002
DOI:
10.18653/v1/K19-1002
Cite (ACL):
Zhenqi Zhu and Anoop Sarkar. 2019. Deconstructing Supertagging into Multi-Task Sequence Prediction. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 12–21, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Deconstructing Supertagging into Multi-Task Sequence Prediction (Zhu & Sarkar, CoNLL 2019)
PDF:
https://aclanthology.org/K19-1002.pdf