Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization

Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu, Maosong Sun

Although neural machine translation has made significant progress recently, how to integrate multiple overlapping, arbitrary prior knowledge sources remains a challenge. In this work, we propose to use posterior regularization to provide a general framework for integrating prior knowledge into neural machine translation. We represent prior knowledge sources as features in a log-linear model, which guides the learning process of the neural translation model. Experiments on Chinese-English translation show that our approach leads to significant improvements.

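To make the framework more concrete, a posterior-regularized training objective of the kind described above can be sketched as follows; the notation ($\phi$, $\gamma$, $\lambda$) is illustrative and not taken from the paper. The prior knowledge sources are encoded as feature functions $\phi(\mathbf{x}, \mathbf{y})$ of a log-linear "desired" distribution, and the neural model $P(\mathbf{y} \mid \mathbf{x}; \theta)$ is trained to both fit the data and stay close to that distribution:

$$Q(\mathbf{y} \mid \mathbf{x}; \gamma) \propto \exp\big(\gamma \cdot \phi(\mathbf{x}, \mathbf{y})\big),$$

$$J(\theta) = \sum_{(\mathbf{x}, \mathbf{y})} \Big[ \log P(\mathbf{y} \mid \mathbf{x}; \theta) \;-\; \lambda\, \mathrm{KL}\big(Q(\mathbf{y} \mid \mathbf{x}; \gamma) \,\big\|\, P(\mathbf{y} \mid \mathbf{x}; \theta)\big) \Big],$$

where $\gamma$ are the feature weights of the log-linear model and $\lambda$ balances the likelihood term against the regularizer. This is a generic sketch of posterior regularization under the stated assumptions, not the paper's exact formulation.
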