
Improving Prediction Backward-Compatibility in NLP Model Upgrade with Gated Fusion

Yi-An Lai, Elman Mansimov, Yuqing Xie, Yi Zhang


Abstract
When upgrading neural models to a newer version, new errors that the legacy version did not make can be introduced, known as regression errors. This inconsistent behavior during model upgrade often outweighs the benefit of accuracy gains and hinders the adoption of new models. To mitigate regression errors from model upgrade, distillation and ensemble have proven to be viable solutions without significant compromise in performance. Despite this progress, these approaches attain only an incremental reduction in regression, which is still far from a backward-compatible model upgrade. In this work, we propose a novel method, Gated Fusion, that promotes backward compatibility by learning to mix predictions between old and new models. Empirical results on two distinct model upgrade scenarios show that our method reduces the number of regression errors by 62% on average, outperforming the strongest baseline by an average of 25%.
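The abstract describes Gated Fusion only at a high level: a learned gate mixes the output distributions of the legacy and upgraded models, so the combined system can fall back to the old predictions where the new model would otherwise regress. The sketch below illustrates that idea and is not the authors' implementation; the `GatedFusion` class, the `(logits, pooled_hidden)` model interface, and the single sigmoid gate conditioned on the new model's hidden state are all assumptions made for this example.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Hypothetical sketch of gated fusion between a frozen legacy model
    and a trainable upgraded model (not the paper's exact architecture)."""

    def __init__(self, old_model: nn.Module, new_model: nn.Module, hidden_size: int):
        super().__init__()
        self.old_model = old_model
        for p in self.old_model.parameters():
            p.requires_grad = False  # legacy model stays fixed during training
        self.new_model = new_model
        # Gate producing one mixing weight per example (assumed design choice).
        self.gate = nn.Sequential(nn.Linear(hidden_size, 1), nn.Sigmoid())

    def forward(self, inputs):
        # Assume each model returns (logits, pooled_hidden_state).
        with torch.no_grad():
            old_logits, _ = self.old_model(inputs)
        new_logits, new_hidden = self.new_model(inputs)
        g = self.gate(new_hidden)  # shape (batch, 1), values in (0, 1)
        # Convex combination of the two class distributions: a gate near 0
        # reproduces the legacy prediction, suppressing regression errors.
        probs = g * new_logits.softmax(-1) + (1 - g) * old_logits.softmax(-1)
        return probs.log()  # log-probabilities for NLL-style training
```

Under this formulation, a gate value near 0 preserves the old model's behavior while a value near 1 trusts the upgraded model; training the gate jointly with the task loss lets the system make that trade-off per example.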
Anthology ID:
2023.findings-eacl.74
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1010–1022
URL:
https://aclanthology.org/2023.findings-eacl.74
DOI:
10.18653/v1/2023.findings-eacl.74
Cite (ACL):
Yi-An Lai, Elman Mansimov, Yuqing Xie, and Yi Zhang. 2023. Improving Prediction Backward-Compatibility in NLP Model Upgrade with Gated Fusion. In Findings of the Association for Computational Linguistics: EACL 2023, pages 1010–1022, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Improving Prediction Backward-Compatibility in NLP Model Upgrade with Gated Fusion (Lai et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.74.pdf
Video:
https://aclanthology.org/2023.findings-eacl.74.mp4