Towards Empathetic Dialogue Generation over Multi-type Knowledge

Qintong Li, Piji Li, Zhumin Chen, ZhaoChun Ren

Enabling machines with empathetic abilities to provide context-consistent responses is crucial on both the semantic and emotional levels. The task of empathetic dialogue generation has been proposed to address this problem. However, the lack of external knowledge makes it difficult to perceive implicit emotions from a limited dialogue history. To address these challenges, we propose to leverage multi-type knowledge, i.e., commonsense knowledge and an emotional lexicon, to explicitly understand and express emotions in empathetic dialogue generation. We first enrich the dialogue history by jointly interacting with the two types of knowledge and construct an emotional context graph. We then introduce a multi-type knowledge-aware context encoder to learn emotional context representations and distill emotional signals, which are the prerequisites for predicting the emotions expressed in responses. Finally, we propose an emotional cross-attention mechanism to exploit the emotional dependencies between the emotional context graph and the target empathetic response. Extensive experiments on a benchmark dataset show that our proposed framework outperforms state-of-the-art baselines in terms of both automatic metrics and human evaluations.
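The emotional cross-attention described above can be pictured roughly as attending from decoder-side response states to knowledge-enriched graph node representations, biased by a distilled emotional signal. Below is a minimal PyTorch sketch under that reading; the class name EmotionalCrossAttention, the additive emotion-signal bias, and all dimensions are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class EmotionalCrossAttention(nn.Module):
    """Hypothetical sketch: cross-attention from response decoder states to
    emotional context graph node representations (not the paper's code)."""

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, response_states, graph_nodes, emotion_signal):
        # Bias the graph node representations with the distilled emotion
        # signal (broadcast over the node dimension) before attending.
        keys_values = graph_nodes + emotion_signal.unsqueeze(1)
        out, weights = self.attn(
            query=response_states, key=keys_values, value=keys_values
        )
        return out, weights

# Toy usage: batch of 2, response length 6, 10 graph nodes, model size 512.
layer = EmotionalCrossAttention(d_model=512)
resp = torch.randn(2, 6, 512)    # decoder-side response states
nodes = torch.randn(2, 10, 512)  # encoded emotional context graph nodes
emo = torch.randn(2, 512)        # distilled emotional signal per dialogue
out, attn_weights = layer(resp, nodes, emo)
print(out.shape, attn_weights.shape)  # (2, 6, 512) and (2, 6, 10)
```

In this reading, the attention weights expose which knowledge-enriched graph nodes the response attends to at each decoding step, which is one plausible way to realize the "emotional dependencies" the abstract refers to.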

