Changyou Chen
2020
Generative Semantic Hashing Enhanced via Boltzmann Machines
Lin Zheng | Qinliang Su | Dinghan Shen | Changyou Chen
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Generative semantic hashing is a promising technique for large-scale information retrieval thanks to its fast retrieval speed and small memory footprint. For tractability of training, existing generative-hashing methods mostly assume a factorized form for the posterior distribution, enforcing independence among the bits of hash codes. From the perspectives of both model representation and code-space size, independence is not always the best assumption. In this paper, to introduce correlations among the bits of hash codes, we propose to employ the distribution of a Boltzmann machine as the variational posterior. To address the intractability of training, we first develop an approximate method to reparameterize the distribution of a Boltzmann machine by augmenting it as a hierarchical concatenation of a Gaussian-like distribution and a Bernoulli distribution. Based on that, an asymptotically exact lower bound is further derived for the evidence lower bound (ELBO). With these novel techniques, the entire model can be optimized efficiently. Extensive experimental results demonstrate that by effectively modeling correlations among different bits within a hash code, our model can achieve significant performance gains.
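The Gaussian-then-Bernoulli hierarchy described above can be sketched as follows. This is a minimal illustration, not the authors' exact augmentation: it draws a correlated Gaussian variable (the correlation structure playing the role of the Boltzmann machine's pairwise couplings) and then samples each hash bit from a Bernoulli whose probability is a sigmoid of the Gaussian draw. The function and parameter names are assumptions for illustration.

```python
import numpy as np

def sample_correlated_bits(mu, L, rng=None):
    """Illustrative sketch only (not the paper's exact derivation):
    sample hash bits with correlations by drawing a Gaussian variable
    z ~ N(mu, L @ L.T), then drawing each bit from
    Bernoulli(sigmoid(z_i)).  Off-diagonal entries of L induce
    correlations among the resulting bits."""
    rng = np.random.default_rng(rng)
    eps = rng.standard_normal(mu.shape[0])
    z = mu + L @ eps                        # correlated Gaussian draw
    p = 1.0 / (1.0 + np.exp(-z))            # per-bit Bernoulli probability
    return (rng.random(mu.shape[0]) < p).astype(np.int8)
```

Because the Gaussian step is a simple location-scale transform, it is reparameterizable, which is what makes gradient-based training of the otherwise intractable Boltzmann posterior feasible.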
Towards Faithful Neural Table-to-Text Generation with Content-Matching Constraints
Zhenyi Wang | Xiaoyang Wang | Bang An | Dong Yu | Changyou Chen
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Text generation from a knowledge base aims to translate knowledge triples into natural language descriptions. Most existing methods ignore the faithfulness between a generated text description and the original table, producing information that goes beyond the content of the table. In this paper, for the first time, we propose a novel Transformer-based generation framework to achieve this goal. The core techniques in our method to enforce faithfulness include a new table-text optimal-transport matching loss and a table-text embedding similarity loss based on the Transformer model. Furthermore, to evaluate faithfulness, we propose a new automatic metric specialized to the table-to-text generation problem. We also provide detailed analysis of each component of our model in our experiments. Automatic and human evaluations show that our framework outperforms the state of the art by a large margin.
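A table-text optimal-transport matching loss of the kind mentioned above can be sketched with entropy-regularized Sinkhorn iterations: given a pairwise cost matrix between table-cell embeddings and generated-token embeddings, the loss is the cost of the cheapest (soft) transport plan aligning the two sides. This is a generic Sinkhorn sketch under uniform marginals, not the authors' exact formulation; the function name and hyperparameters are assumptions.

```python
import numpy as np

def sinkhorn_ot_loss(cost, n_iters=100, eps=0.1):
    """Illustrative entropy-regularized OT loss (Sinkhorn iterations)
    between uniform marginals, given a pairwise cost matrix, e.g.
    cosine distances between table-cell and text-token embeddings.
    A large loss means no cheap alignment exists between table
    content and generated text."""
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / eps)                 # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iters):                # alternate marginal scalings
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = np.diag(u) @ K @ np.diag(v)         # soft transport plan
    return float((P * cost).sum())
```

Minimizing such a loss during training encourages every generated token to be "explainable" by some table cell at low transport cost, which is one way to penalize hallucinated content.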
Improving Adversarial Text Generation by Modeling the Distant Future
Ruiyi Zhang | Changyou Chen | Zhe Gan | Wenlin Wang | Dinghan Shen | Guoyin Wang | Zheng Wen | Lawrence Carin
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Auto-regressive text generation models usually focus on local fluency and may produce semantically inconsistent long texts. Further, automatically generating words with similar semantics is challenging, and hand-crafted linguistic rules are difficult to apply. We consider a text-planning scheme and present a model-based imitation-learning approach to alleviate these issues. Specifically, we propose a novel guider network that focuses on the generative process over a longer horizon, which can assist next-word prediction and provide intermediate rewards for generator optimization. Extensive experiments demonstrate that the proposed method leads to improved performance.
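The role of the guider's intermediate rewards can be illustrated with a minimal sketch: at each generation step, reward the generator by how close the feature state it actually reached is to the future feature the guider predicted. The cosine-similarity reward form and all names here are assumptions for illustration, not the paper's exact reward.

```python
import numpy as np

def intermediate_rewards(pred_future, actual_future):
    """Illustrative per-step reward: cosine similarity between the
    guider network's predicted future feature (one row per generation
    step) and the feature actually reached.  High similarity means
    the generator is following the long-horizon plan."""
    num = (pred_future * actual_future).sum(axis=1)
    den = (np.linalg.norm(pred_future, axis=1)
           * np.linalg.norm(actual_future, axis=1) + 1e-8)
    return num / den
```

Dense per-step rewards of this kind are what let policy-gradient training of the generator escape the sparse end-of-sequence signal that makes long-text RL training unstable.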
Co-authors
- Dinghan Shen 2
- Lin Zheng 1
- Qinliang Su 1
- Zhenyi Wang 1
- Xiaoyang Wang 1
Venues
- ACL 3