
Exploiting Document Knowledge for Aspect-level Sentiment Classification

Ruidan He, Wee Sun Lee, Hwee Tou Ng, Daniel Dahlmeier


Abstract
Attention-based long short-term memory (LSTM) networks have proven to be useful in aspect-level sentiment classification. However, due to the difficulties in annotating aspect-level data, existing public datasets for this task are all relatively small, which largely limits the effectiveness of those neural models. In this paper, we explore two approaches that transfer knowledge from document-level data, which is much less expensive to obtain, to improve the performance of aspect-level sentiment classification. We demonstrate the effectiveness of our approaches on 4 public datasets from SemEval 2014, 2015, and 2016, and we show that attention-based LSTM benefits from document-level knowledge in multiple ways.
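The abstract's central idea is to transfer knowledge from abundant document-level sentiment labels into a data-scarce aspect-level model. The paper's actual models are attention-based LSTMs; the sketch below is only a toy illustration of the pretrain-then-fine-tune transfer pattern, using a bag-of-words logistic classifier and synthetic data (all tokens, labels, and hyperparameters here are invented for the example, not taken from the paper).

```python
import numpy as np

# Illustrative sketch only: the paper uses attention-based LSTMs; a tiny
# bag-of-words logistic classifier stands in here to show the pattern of
# "pretrain on plentiful document-level labels, then fine-tune on scarce
# aspect-level data". All data below is synthetic.

rng = np.random.default_rng(0)
VOCAB = 20

def featurize(token_ids):
    """Bag-of-words vector over a toy vocabulary."""
    x = np.zeros(VOCAB)
    for t in token_ids:
        x[t] += 1.0
    return x

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, w, lr=0.5, epochs=200):
    """Plain logistic-regression gradient descent, starting from w."""
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w = w - lr * X.T @ (p - y) / len(y)
    return w

# Abundant document-level data: positive docs use tokens 0-4, negative 5-9.
docs = [([0, 1, 2], 1), ([3, 4, 0], 1), ([5, 6, 7], 0), ([8, 9, 5], 0)] * 10
Xd = np.array([featurize(t) for t, _ in docs])
yd = np.array([label for _, label in docs], dtype=float)

# Scarce aspect-level data drawn from the same sentiment vocabulary.
aspects = [([1, 2], 1), ([6, 9], 0)]
Xa = np.array([featurize(t) for t, _ in aspects])
ya = np.array([label for _, label in aspects], dtype=float)

# Pretraining-style transfer: fit on documents first, then fine-tune the
# same weight vector on the handful of aspect-level examples.
w = train(Xd, yd, np.zeros(VOCAB))   # document-level pretraining
w = train(Xa, ya, w, epochs=50)      # aspect-level fine-tuning

# The transferred weights generalize to aspect examples whose tokens were
# never seen at the aspect level, only at the document level.
print(sigmoid(featurize([0, 3]) @ w) > 0.5)  # positive-vocabulary tokens
print(sigmoid(featurize([7, 8]) @ w) > 0.5)  # negative-vocabulary tokens
```

The design point this illustrates is the one the abstract makes: the fine-tuned model benefits from document-level supervision because sentiment-bearing features learned on cheap document labels carry over to the aspect-level task.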
Anthology ID: P18-2092
Volume: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: July
Year: 2018
Address: Melbourne, Australia
Editors: Iryna Gurevych, Yusuke Miyao
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 579–585
URL: https://aclanthology.org/P18-2092
DOI: 10.18653/v1/P18-2092
Cite (ACL): Ruidan He, Wee Sun Lee, Hwee Tou Ng, and Daniel Dahlmeier. 2018. Exploiting Document Knowledge for Aspect-level Sentiment Classification. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 579–585, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal): Exploiting Document Knowledge for Aspect-level Sentiment Classification (He et al., ACL 2018)
PDF: https://aclanthology.org/P18-2092.pdf
Poster: P18-2092.Poster.pdf
Code: ruidan/Aspect-level-sentiment
Data: SemEval-2014 Task-4