A BERT-based One-Pass Multi-Task Model for Clinical Temporal Relation Extraction
Chen Lin | Timothy Miller | Dmitriy Dligach | Farig Sadeque | Steven Bethard | Guergana Savova
2020
Proceedings of the 19th SIGBioMed Workshop on Biomedical Language Processing
Recently, BERT has achieved state-of-the-art performance in temporal relation extraction from clinical Electronic Medical Record text. However, the current approach is inefficient, as it requires multiple passes through each input sequence. We extend a recently proposed one-pass model for relation classification to a one-pass model for relation extraction. We augment this framework by introducing global embeddings to help with long-distance relation inference, and by multi-task learning to increase model performance and generalizability. Our proposed model produces results on par with the state-of-the-art in temporal relation extraction on the THYME corpus and is much “greener” in computational cost.
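The efficiency argument can be illustrated with a minimal sketch (all names and sizes here are illustrative assumptions, not the authors' implementation): a multi-pass approach re-encodes the input sequence once per candidate entity pair, whereas a one-pass approach encodes the sequence a single time and scores every pair from the resulting token embeddings, e.g. with a bilinear scorer.

```python
import numpy as np

rng = np.random.default_rng(0)

SEQ_LEN, HIDDEN, N_LABELS = 32, 64, 4  # toy sizes (assumed)

def encode(tokens):
    """Stand-in for a BERT forward pass: one vector per token."""
    return rng.standard_normal((len(tokens), HIDDEN))

tokens = list(range(SEQ_LEN))
entities = [2, 7, 15, 20]  # token indices of entity mentions (toy)

# One-pass: encode the whole sequence exactly once.
H = encode(tokens)

# Bilinear relation scorer: score(a, b)[l] = h_a^T W[l] h_b.
W = rng.standard_normal((N_LABELS, HIDDEN, HIDDEN))

def score_pair(h_a, h_b):
    return np.einsum('h,lhk,k->l', h_a, W, h_b)

# Score every ordered entity pair from the single encoding.
scores = {(a, b): score_pair(H[a], H[b])
          for a in entities for b in entities if a != b}

# A multi-pass model would instead call encode() once per pair,
# i.e. len(scores) forward passes instead of 1.
print(len(scores), "pairs scored from a single encode() call")
```

With 4 entity mentions this scores all 12 ordered pairs from one encoding pass; a pairwise multi-pass model would need 12 separate encoder runs, which is the computational cost the one-pass formulation avoids.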