Zero-shot Cross-lingual Dialogue Systems with Transferable Latent Variables

Published in EMNLP-2019, 2019

Abstract:

Despite the surging demand for multilingual task-oriented dialog systems (e.g., Alexa, Google Home), relatively little research has addressed multilingual or cross-lingual scenarios. Hence, we propose a zero-shot adaptation of task-oriented dialogue systems to low-resource languages. To tackle this challenge, we first use a very small set of parallel word pairs to refine the aligned cross-lingual word-level representations. We then employ a latent variable model to cope with the variance of similar sentences across different languages, which is induced by imperfect cross-lingual alignments and inherent differences between languages. The experimental results show that, even though we utilize far fewer external resources, our model achieves better adaptation performance on natural language understanding tasks (i.e., intent detection and slot filling) than the current state-of-the-art model in the zero-shot scenario.
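To make the latent variable idea concrete, below is a minimal PyTorch sketch (not the authors' code) of the core mechanism the abstract describes: encode a sentence over cross-lingually aligned word embeddings, parameterize a Gaussian over a latent vector z, and predict the intent from a sample of z, with a KL term pulling z toward a shared prior so that similar sentences in different languages map to a common latent space. The class name, layer sizes, and the single BiLSTM encoder are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn


class LatentIntentClassifier(nn.Module):
    def __init__(self, emb_dim=300, hidden_dim=128, latent_dim=64, num_intents=12):
        super().__init__()
        # BiLSTM encoder over (cross-lingually aligned) word embeddings
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Gaussian posterior parameters for the latent variable z
        self.to_mu = nn.Linear(2 * hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(2 * hidden_dim, latent_dim)
        self.intent_head = nn.Linear(latent_dim, num_intents)

    def forward(self, word_embs):
        # word_embs: (batch, seq_len, emb_dim) aligned cross-lingual embeddings
        hidden, _ = self.encoder(word_embs)
        sent = hidden.mean(dim=1)  # mean-pooled sentence representation
        mu, logvar = self.to_mu(sent), self.to_logvar(sent)
        # Reparameterization trick: sample z while keeping gradients
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        # KL term regularizes z toward a standard normal prior, which is
        # intended to absorb variance between languages
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return self.intent_head(z), kl


# Usage: random tensors stand in for aligned multilingual word embeddings
model = LatentIntentClassifier()
logits, kl = model(torch.randn(4, 10, 300))
print(logits.shape, kl.item())
```

In training, the classification loss would be combined with the KL term; slot filling can reuse the same latent variable per token. This is only a sketch under the assumptions stated above; see the paper for the actual model.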

Paper PDF

Recommended citation: Liu, Z., Shin, J., Xu, Y., Winata, G. I., Xu, P., Madotto, A., & Fung, P. (2019). Zero-shot Cross-lingual Dialogue Systems with Transferable Latent Variables. arXiv preprint arXiv:1911.04081.