Authors
Wei Zhou
J. Bloem
Date (dd-mm-yyyy)
11-2021
Title
Comparing Contextual and Static Word Embeddings with Small Philosophical Data
Publication Year
2021
Number of pages
7
Publisher
Düsseldorf, Germany: KONVENS 2021 Organizers
Document type
Conference contribution
Abstract
For domain-specific NLP tasks, applying word embeddings trained on general corpora is not optimal. Meanwhile, training domain-specific word representations poses challenges for dataset construction and embedding evaluation. In this paper, we present and compare ELMo and Word2Vec models trained or fine-tuned on philosophical data. For evaluation, a conceptual network was used. Results show that contextualized models provide better word embeddings than static models, and that merging embeddings from different models boosts task performance.
Permalink
https://hdl.handle.net/11245.1/c3d35f60-44b1-4fe7-ad40-e6de7866c3fe