Lecture: Semantic Change in Computational Linguistics

Language is inherently dynamic and changes over time. However, mainstream computational models like BERT are mostly trained on static corpora, and retraining them is costly in both time and resources (Lazaridou et al., 2021; Brown et al., 2020; Vaswani et al., 2017). It is therefore necessary to find ways to update language models and factor in language change. This is also important from an ethical perspective, since models trained on outdated data may misrepresent how words are used today.
Lexical semantic change detection addresses the dynamic nature of semantic representations. The field focuses on the development of systems that measure the change of a word's meaning between two periods of time (Tahmasebi et al., 2021). It has seen increased interest in recent years, leading to the introduction of the first large-scale semantic change detection task (Schlechtweg et al., 2020).
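
A common baseline for this task compares a word's vector representations trained on two time-specific corpora and scores change as the cosine distance between them. The sketch below illustrates the idea with toy vectors (the vectors and variable names are hypothetical, not output from any real system discussed in the talk):

```python
import math

def cosine_distance(u, v):
    """1 minus cosine similarity; a higher value suggests more semantic change."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

# Toy embeddings for the same word, trained on two time periods
vec_1900 = [0.9, 0.1, 0.0]
vec_2000 = [0.2, 0.7, 0.6]

change_score = cosine_distance(vec_1900, vec_2000)
print(round(change_score, 3))  # → 0.707
```

Real systems differ mainly in how the time-specific representations are obtained (e.g., aligned static embeddings or aggregated contextualized embeddings) and in how the distance is turned into a ranking or binary decision.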

In this talk, I explain why 'static' language modelling can be problematic. I show how insights from historical linguistics and typology can contribute to computational linguistics in general and to lexical semantic change detection in particular. Finally, I present the work of my ongoing master's thesis, in which I investigate the role of change types such as metaphor and metonymy in the performance of current state-of-the-art models. To this end, I construct a test corpus and compare the performance of a representative subset of change detection systems.

Brown, Tom, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared D. Kaplan, Prafulla Dhariwal, Arvind Neelakantan, et al. “Language Models Are Few-Shot Learners.” In Advances in Neural Information Processing Systems, 33:1877–1901. Curran Associates, Inc., 2020.
Lazaridou, Angeliki, Adhi Kuncoro, Elena Gribovskaya, Devang Agrawal, Adam Liska, Tayfun Terzi, Mai Gimenez, et al. “Mind the Gap: Assessing Temporal Generalization in Neural Language Models.” In Advances in Neural Information Processing Systems, 34:29348–63. Curran Associates, Inc., 2021.
Schlechtweg, Dominik, Barbara McGillivray, Simon Hengchen, Haim Dubossarsky, and Nina Tahmasebi. “SemEval-2020 Task 1: Unsupervised Lexical Semantic Change Detection.” In Proceedings of the Fourteenth Workshop on Semantic Evaluation, 1–23. Barcelona (online): International Committee for Computational Linguistics, 2020.
Tahmasebi, Nina, Lars Borin, and Adam Jatowt. “Survey of Computational Approaches to Lexical Semantic Change Detection.” In Computational Approaches to Semantic Change, 1–91. Berlin: Language Science Press, 2021.
Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. “Attention Is All You Need.” In Advances in Neural Information Processing Systems, Vol. 30. Curran Associates, Inc., 2017.

Info

Day: 2022-11-03
Start time: 14:45
Duration: 00:30
Room: Wiwi-Bunker, Room 5050
Track: Computational Linguistics
Language: en
