What does this word mean? Explaining contextualized embeddings with natural language definition
Journal
EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference
Date Issued
2020
Author(s)
Chang, T.-Y.; Chen, Y.-N.
Abstract
Contextualized word embeddings have improved performance on many NLP tasks compared with traditional static word embeddings. However, a word used in a specific sense may still receive different contextualized embeddings because it appears in varying contexts. To further investigate what contextualized word embeddings capture, this paper analyzes whether they can indicate the corresponding sense definitions and proposes a general framework for explaining word meanings given contextualized word embeddings, enabling better interpretation. The experiments show that both ELMo and BERT embeddings can be interpreted well via a readable textual form, and the findings may help the research community better understand what the embeddings capture. © 2019 Association for Computational Linguistics
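As a rough illustration of the phenomenon the abstract describes (not the authors' framework), the following minimal sketch extracts BERT embeddings for the same word in two different contexts and compares them; it assumes the Hugging Face transformers library and the bert-base-uncased checkpoint, neither of which is named in the paper record.

# Minimal sketch, assuming the Hugging Face `transformers` library: the same
# surface word receives different contextualized embeddings in different contexts.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def word_embedding(sentence: str, word: str) -> torch.Tensor:
    """Return the last-layer BERT vector for the first occurrence of `word` in `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]        # (seq_len, hidden_size)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    idx = tokens.index(word)                              # index of the word's token
    return hidden[idx]

v_river = word_embedding("He sat on the bank of the river.", "bank")
v_money = word_embedding("She deposited cash at the bank.", "bank")
cos = torch.nn.functional.cosine_similarity(v_river, v_money, dim=0)
print(f"cosine similarity between the two 'bank' embeddings: {cos.item():.3f}")

The paper's framework goes further than this sketch by mapping such context-specific vectors to readable sense definitions rather than only comparing them numerically.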
Other Subjects
Embeddings; Natural languages; Research communities; Word meaning; Natural language processing systems
Type
conference paper
