Sujith Ravi
2020
GoEmotions: A Dataset of Fine-Grained Emotions
Dorottya Demszky | Dana Movshovitz-Attias | Jeongwoo Ko | Alan Cowen | Gaurav Nemade | Sujith Ravi
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Understanding emotion expressed in language has a wide range of applications, from building empathetic chatbots to detecting harmful online behavior. Progress in this area can be accelerated by large-scale datasets with a fine-grained typology, adaptable to multiple downstream tasks. We introduce GoEmotions, the largest manually annotated dataset of 58k English Reddit comments, labeled for 27 emotion categories or Neutral. We demonstrate the high quality of the annotations via Principal Preserved Component Analysis. We conduct transfer learning experiments with existing emotion benchmarks to show that our dataset generalizes well to other domains and different emotion taxonomies. Our BERT-based model achieves an average F1-score of .46 across our proposed taxonomy, leaving much room for improvement.
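The reported .46 average F1 is a macro average over a multi-label taxonomy. As an illustrative sketch (not the authors' evaluation code), the metric can be computed as below; the tiny label set and predictions are hypothetical stand-ins for the 27-emotions-plus-Neutral scheme.

```python
# Hedged sketch: macro-averaged F1 for multi-label emotion classification.
# Labels, gold sets, and predictions here are invented for illustration.

def f1(tp, fp, fn):
    """F1 = 2PR / (P + R); defined as 0 when there are no true positives."""
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def macro_f1(gold, pred, labels):
    """Average the per-label F1 over all labels. Each example may carry
    several emotion labels, so gold/pred entries are sets of labels."""
    scores = []
    for label in labels:
        tp = sum(1 for g, p in zip(gold, pred) if label in g and label in p)
        fp = sum(1 for g, p in zip(gold, pred) if label not in g and label in p)
        fn = sum(1 for g, p in zip(gold, pred) if label in g and label not in p)
        scores.append(f1(tp, fp, fn))
    return sum(scores) / len(scores)

labels = ["joy", "anger", "neutral"]          # tiny stand-in taxonomy
gold = [{"joy"}, {"anger"}, {"joy", "neutral"}]
pred = [{"joy"}, {"neutral"}, {"joy"}]
print(round(macro_f1(gold, pred, labels), 3))  # → 0.333
```

Because every label contributes equally to the macro average, rare emotion categories pull the score down as much as frequent ones, which is one reason a fine-grained taxonomy leaves "much room for improvement."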
Low-Dimensional Hyperbolic Knowledge Graph Embeddings
Ines Chami | Adva Wolf | Da-Cheng Juan | Frederic Sala | Sujith Ravi | Christopher Ré
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Knowledge graph (KG) embeddings learn low-dimensional representations of entities and relations to predict missing facts. KGs often exhibit hierarchical and logical patterns which must be preserved in the embedding space. For hierarchical data, hyperbolic embedding methods have shown promise for high-fidelity and parsimonious representations. However, existing hyperbolic embedding methods do not account for the rich logical patterns in KGs. In this work, we introduce a class of hyperbolic KG embedding models that simultaneously capture hierarchical and logical patterns. Our approach combines hyperbolic reflections and rotations with attention to model complex relational patterns. Experimental results on standard KG benchmarks show that our method improves over previous Euclidean- and hyperbolic-based efforts by up to 6.1% in mean reciprocal rank (MRR) in low dimensions. Furthermore, we observe that different geometric transformations capture different types of relations while attention-based transformations generalize to multiple relations. In high dimensions, our approach yields new state-of-the-art MRRs of 49.6% on WN18RR and 57.7% on YAGO3-10.
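A minimal sketch of the geometry involved, under the assumption that embeddings live in the Poincaré ball (a standard model of hyperbolic space used for hierarchical embeddings; this is not the paper's code). Scoring a triple would additionally apply relation-specific transformations such as rotations or reflections before measuring distance.

```python
# Hedged sketch: distance in the Poincaré ball model of hyperbolic space.
# d(x, y) = arcosh(1 + 2*|x - y|^2 / ((1 - |x|^2) * (1 - |y|^2)))
import math

def poincare_distance(x, y):
    """Hyperbolic distance between two points strictly inside the unit ball."""
    sq_norm = lambda v: sum(vi * vi for vi in v)
    diff = sq_norm([xi - yi for xi, yi in zip(x, y)])
    denom = (1.0 - sq_norm(x)) * (1.0 - sq_norm(y))
    return math.acosh(1.0 + 2.0 * diff / denom)

# Distances grow rapidly toward the boundary, so trees embed with low
# distortion in few dimensions, which is the appeal for hierarchical KGs.
print(poincare_distance([0.0, 0.0], [0.5, 0.0]))  # ≈ 1.0986 (= ln 3)
```

The denominator shrinks as points approach the boundary, so points that are Euclidean-close near the rim can still be hyperbolically far apart; this is what makes low-dimensional hyperbolic embeddings parsimonious for hierarchies.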