Jialong Han
2020
Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-Encoders
Yu Duan | Canwen Xu | Jiaxin Pei | Jialong Han | Chenliang Li
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Conditional text generation has drawn much attention as a topic of Natural Language Generation (NLG), as it allows humans to control the properties of generated content. Current conditional generation models cannot handle emerging conditions due to their joint end-to-end learning fashion. When a new condition is added, these techniques require full retraining. In this paper, we present a new framework named Pre-train and Plug-in Variational Auto-Encoder (PPVAE) for flexible conditional text generation. PPVAE decouples the text generation module from the condition representation module, allowing “one-to-many” conditional generation. When a fresh condition emerges, only a lightweight network needs to be trained; it then works as a plug-in for PPVAE, which is efficient and desirable for real-world applications. Extensive experiments demonstrate the superiority of PPVAE over existing alternatives, with better conditionality and diversity at less training cost.
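The decoupling described in the abstract can be made concrete with a small sketch. Below is an illustrative PyTorch rendering of the plug-in idea, assuming a frozen, pre-trained text VAE with a 128-dimensional global latent space; the class name `PluginVAE`, the dimensions, and the training loop are hypothetical stand-ins, not the paper's actual code.

```python
# Minimal sketch of the plug-in idea: a frozen, pre-trained unconditional VAE
# provides a global latent space, and a small "plug-in" VAE learns to map a
# condition-specific latent space into it. Names and dimensions here are
# illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn

class PluginVAE(nn.Module):
    """Lightweight VAE trained per condition; its decoder emits vectors
    in the global latent space of the frozen pre-trained text VAE."""

    def __init__(self, global_dim=128, plugin_dim=20):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(global_dim, 64), nn.ReLU(), nn.Linear(64, 2 * plugin_dim)
        )
        self.decoder = nn.Sequential(
            nn.Linear(plugin_dim, 64), nn.ReLU(), nn.Linear(64, global_dim)
        )

    def forward(self, z_global):
        mu, logvar = self.encoder(z_global).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return self.decoder(z), mu, logvar

# Training sketch: only the plug-in's parameters are updated; the pre-trained
# text VAE (not shown) stays frozen, so adding a new condition never requires
# retraining the generator itself.
plugin = PluginVAE()
opt = torch.optim.Adam(plugin.parameters(), lr=1e-3)
z_batch = torch.randn(32, 128)  # stand-in for global latents of on-condition texts
recon, mu, logvar = plugin(z_batch)
kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
loss = nn.functional.mse_loss(recon, z_batch) + kl
loss.backward()
opt.step()
```

At generation time, one would sample from the plug-in's prior, decode into the global latent space, and let the frozen text VAE's decoder produce the conditioned text; this is what makes the scheme "one-to-many" across plug-ins.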
Hypernymy Detection for Low-Resource Languages via Meta Learning
Changlong Yu | Jialong Han | Haisong Zhang | Wilfred Ng
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Hypernymy detection, a.k.a. lexical entailment, is a fundamental sub-task of many natural language understanding tasks. Previous explorations mostly focus on monolingual hypernymy detection in high-resource languages, e.g., English, and few investigate low-resource scenarios. This paper addresses the problem of low-resource hypernymy detection by leveraging high-resource languages. We extensively compare three joint training paradigms and, for the first time, propose applying meta learning to relieve the low-resource issue. Experiments demonstrate the superiority of our method across the three settings; it substantially improves performance on extremely low-resource languages by preventing over-fitting on small datasets.
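For intuition, here is a hedged MAML-style sketch of meta learning across high-resource language tasks, with the resulting initialization later fine-tuned on the low-resource language's small dataset; the scorer architecture, dimensions, and the `forward`/`maml_step` helpers are illustrative assumptions rather than the paper's implementation.

```python
# MAML-style meta learning sketch for hypernymy detection: inner-loop adapt
# on each high-resource language's support set, then update the shared
# initialization on the query-set losses. All shapes and data are stand-ins.
import torch
import torch.nn.functional as F

def forward(params, x):
    # Toy hypernymy scorer over concatenated (hyponym, hypernym) embeddings.
    w1, b1, w2, b2 = params
    return F.linear(torch.relu(F.linear(x, w1, b1)), w2, b2)

def maml_step(params, tasks, inner_lr=1e-2):
    """One meta-update over a batch of language tasks, each a
    (support, query) pair of (inputs, binary labels)."""
    meta_loss = 0.0
    for (x_s, y_s), (x_q, y_q) in tasks:
        grads = torch.autograd.grad(
            F.binary_cross_entropy_with_logits(forward(params, x_s).squeeze(-1), y_s),
            params, create_graph=True)  # keep graph for the second-order update
        adapted = [p - inner_lr * g for p, g in zip(params, grads)]
        meta_loss = meta_loss + F.binary_cross_entropy_with_logits(
            forward(adapted, x_q).squeeze(-1), y_q)
    return meta_loss / len(tasks)

# Shared initialization; dim assumes two concatenated 300-d word embeddings.
dim = 600
params = [torch.randn(64, dim, requires_grad=True) * 0.01,
          torch.zeros(64, requires_grad=True),
          torch.randn(1, 64, requires_grad=True) * 0.01,
          torch.zeros(1, requires_grad=True)]
params = [p.detach().requires_grad_(True) for p in params]
opt = torch.optim.Adam(params, lr=1e-3)

# Stand-in (support, query) tasks for three high-resource languages.
tasks = [((torch.randn(8, dim), torch.randint(0, 2, (8,)).float()),
          (torch.randn(8, dim), torch.randint(0, 2, (8,)).float()))
         for _ in range(3)]
loss = maml_step(params, tasks)
opt.zero_grad(); loss.backward(); opt.step()
```

The point of the meta-objective is that the shared initialization is rewarded for adapting well from a handful of examples, which is exactly the regime of an extremely low-resource language.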
Co-authors
- Yu Duan 1
- Canwen Xu 1
- Jiaxin Pei 1
- Chenliang Li 1
- Changlong Yu 1
- Haisong Zhang 1
- Wilfred Ng 1
Venues
- ACL 2