DIALOGPT: Large-Scale Generative Pre-training for Conversational Response Generation
Yizhe Zhang | Siqi Sun | Michel Galley | Yen-Chun Chen | Chris Brockett | Xiang Gao | Jianfeng Gao | Jingjing Liu | Bill Dolan
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, 2020
We present a large, tunable neural conversational response generation model, DialoGPT (dialogue generative pre-trained transformer). Trained on 147M conversation-like exchanges extracted from Reddit comment chains spanning 2005 through 2017, DialoGPT extends the Hugging Face PyTorch transformer to attain performance close to human, in terms of both automatic and human evaluation, in single-turn dialogue settings. We show that conversational systems that leverage DialoGPT generate more relevant, contentful, and context-consistent responses than strong baseline systems. The pre-trained model and training pipeline are publicly released to facilitate research into neural response generation and the development of more intelligent open-domain dialogue systems.
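Since the released checkpoints are distributed through the Hugging Face model hub, a minimal single-turn generation sketch might look like the following. The checkpoint name (`microsoft/DialoGPT-medium`) and the decoding parameters (top-k sampling, `max_length=200`) are illustrative assumptions, not the paper's exact evaluation configuration.

```python
# Minimal sketch: single-turn response generation with a released
# DialoGPT checkpoint. Checkpoint name and decoding settings are
# illustrative choices, not the paper's evaluation setup.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

# DialoGPT treats a dialogue as one long token sequence, with the EOS
# token separating turns; the model continues the sequence as its reply.
prompt = "Does money buy happiness?" + tokenizer.eos_token
input_ids = tokenizer.encode(prompt, return_tensors="pt")

reply_ids = model.generate(
    input_ids,
    max_length=200,
    pad_token_id=tokenizer.eos_token_id,
    do_sample=True,  # sample rather than decode greedily
    top_k=50,
)

# Drop the prompt tokens and decode only the generated response.
response = tokenizer.decode(
    reply_ids[0, input_ids.shape[-1]:], skip_special_tokens=True
)
print(response)
```

Appending further `eos_token`-separated turns to the prompt extends the same sketch to multi-turn dialogue, though the paper's evaluation reported above is for the single-turn setting.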