Bridging Language Gaps in Multilingual Embeddings via Contrastive Learning
Multilingual models often exhibit a "language gap": semantically equivalent phrases in different languages map to distant regions of the embedding space. We show how contrastive learning can close this gap, improving cross-lingual performance.
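As a concrete illustration, below is a minimal PyTorch sketch of the kind of contrastive objective commonly used for cross-lingual alignment: an InfoNCE loss over parallel sentence pairs, where each translation pair is a positive and the other sentences in the batch serve as negatives. The function name, temperature value, and embedding dimensions are illustrative assumptions, not details from this work.

```python
import torch
import torch.nn.functional as F

def contrastive_alignment_loss(src_emb, tgt_emb, temperature=0.05):
    """InfoNCE-style loss over a batch of translation pairs (hypothetical helper).

    src_emb, tgt_emb: (batch, dim) embeddings of parallel sentences;
    row i of src_emb is assumed to be a translation of row i of tgt_emb.
    """
    src = F.normalize(src_emb, dim=-1)
    tgt = F.normalize(tgt_emb, dim=-1)
    # Cosine similarity of every source sentence against every target sentence.
    logits = src @ tgt.T / temperature
    # The matching translation sits on the diagonal; all other sentences
    # in the batch act as in-batch negatives.
    labels = torch.arange(src.size(0), device=src.device)
    # Symmetric loss: retrieve the target given the source, and vice versa.
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2

# Usage with random stand-ins for encoder outputs:
src = torch.randn(32, 768)   # e.g. English sentence embeddings
tgt = torch.randn(32, 768)   # e.g. parallel French sentence embeddings
loss = contrastive_alignment_loss(src, tgt)
```

Minimizing this loss pulls each sentence toward its translation and pushes it away from unrelated sentences, which is the mechanism by which contrastive training aligns the per-language embedding distributions.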