
HW-TSC 2024 Submission for the Quality Estimation Shared Task

Weiqiao Shan, Ming Zhu, Yuang Li, Mengyao Piao, Xiaofeng Zhao, Chang Su, Min Zhang, Hao Yang, Yanfei Jiang


Abstract
Quality estimation (QE) is a crucial technique for evaluating the quality of machine translations without the need for reference translations. This paper describes Huawei Translation Services Center’s (HW-TSC’s) submission to the sentence-level QE shared task, named LLMs-enhanced-CrossQE. Our system builds upon the CrossQE architecture from our submission last year, which consists of a multilingual base model and a task-specific downstream layer. The model input is the concatenation of the source sentence and its translation. To enhance performance, we fine-tuned and ensembled multiple base models, including XLM-R, InfoXLM, RemBERT, and CometKiwi. Specifically, we employed two pseudo-data generation methods: 1) a diverse pseudo-data generation method based on the corruption-based data augmentation technique introduced last year, and 2) a pseudo-data generation method that simulates machine translation errors using large language models (LLMs). Our results demonstrate that the system achieves outstanding performance on the sentence-level QE test sets.
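The ensembling step described in the abstract, where sentence-level quality scores from several fine-tuned base models (e.g. XLM-R, InfoXLM, RemBERT) are combined into one prediction per sentence pair, can be sketched as follows. The simple mean used here is an illustrative assumption, not necessarily the combination rule the paper uses, and the score values are made up.

```python
def ensemble_scores(per_model_scores):
    """Average per-sentence QE scores across base models.

    per_model_scores: one list of sentence-level scores per base model,
    aligned by sentence index (a hedged sketch of the ensembling idea).
    """
    n_models = len(per_model_scores)
    # zip(*...) groups the scores for each sentence across all models.
    return [sum(scores) / n_models for scores in zip(*per_model_scores)]

# Hypothetical scores for three sentence pairs from three base models.
xlm_r   = [0.80, 0.40, 0.90]
infoxlm = [0.70, 0.50, 0.95]
rembert = [0.75, 0.45, 0.85]
print(ensemble_scores([xlm_r, infoxlm, rembert]))  # one averaged score per pair
```

Averaging is the simplest combination; a weighted mean tuned on a development set would be a natural variant under the same interface.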
Anthology ID:
2024.wmt-1.39
Volume:
Proceedings of the Ninth Conference on Machine Translation
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue:
WMT
Publisher:
Association for Computational Linguistics
Pages:
535–540
URL:
https://aclanthology.org/2024.wmt-1.39
Cite (ACL):
Weiqiao Shan, Ming Zhu, Yuang Li, Mengyao Piao, Xiaofeng Zhao, Chang Su, Min Zhang, Hao Yang, and Yanfei Jiang. 2024. HW-TSC 2024 Submission for the Quality Estimation Shared Task. In Proceedings of the Ninth Conference on Machine Translation, pages 535–540, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
HW-TSC 2024 Submission for the Quality Estimation Shared Task (Shan et al., WMT 2024)
PDF:
https://aclanthology.org/2024.wmt-1.39.pdf