Authors
Konstantin Todorov
Giovanni Colavizza
Date (dd-mm-yyyy)
2022
Title
An Assessment of the Impact of OCR Noise on Language Models
Publication Year
2022
Document type
Paper
Abstract
Neural language models are the backbone of modern-day natural language processing applications. Their use on textual heritage collections that have undergone Optical Character Recognition (OCR) is therefore also increasing. Nevertheless, our understanding of the impact OCR noise can have on language models is still limited. We assess the impact of OCR noise on a variety of language models, using data in Dutch, English, French and German. We find that OCR noise poses a significant obstacle to language modelling, with language models diverging increasingly from their noiseless targets as OCR quality decreases. For small corpora, simpler models, including PPMI and Word2Vec, consistently outperform transformer-based models in this respect.
Permalink
https://hdl.handle.net/11245.1/b407055c-8b82-4c4d-87fa-aef76b0ba02e
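Note
The following is an illustrative sketch, not the protocol used in the paper. It shows one way to gauge how far a Word2Vec model trained on OCR-noised text drifts from a model trained on the clean text, using nearest-neighbour overlap as a rough divergence measure. The toy corpus, the character-confusion table, and the add_ocr_noise and neighbour_overlap helpers are all hypothetical.

```python
import random
from gensim.models import Word2Vec

def add_ocr_noise(tokens, char_error_rate=0.2, seed=0):
    """Randomly substitute characters to mimic OCR errors (toy confusion set)."""
    rng = random.Random(seed)
    confusions = {"e": "c", "l": "1", "o": "0", "i": "1", "s": "5"}
    noisy = []
    for tok in tokens:
        chars = [confusions.get(c, c) if rng.random() < char_error_rate else c
                 for c in tok]
        noisy.append("".join(chars))
    return noisy

def neighbour_overlap(model_a, model_b, words, topn=5):
    """Mean overlap of the top-n nearest neighbours between two models."""
    scores = []
    for w in words:
        if w in model_a.wv and w in model_b.wv:
            na = {x for x, _ in model_a.wv.most_similar(w, topn=topn)}
            nb = {x for x, _ in model_b.wv.most_similar(w, topn=topn)}
            scores.append(len(na & nb) / topn)
    return sum(scores) / len(scores) if scores else 0.0

# Toy corpus standing in for a digitised heritage collection.
clean_sentences = [
    ["the", "old", "newspaper", "reports", "the", "local", "news"],
    ["the", "archive", "holds", "many", "old", "newspaper", "pages"],
] * 100
noisy_sentences = [add_ocr_noise(s, seed=i) for i, s in enumerate(clean_sentences)]

clean_model = Word2Vec(clean_sentences, vector_size=50, min_count=1, seed=1, workers=1)
noisy_model = Word2Vec(noisy_sentences, vector_size=50, min_count=1, seed=1, workers=1)

shared = set(clean_model.wv.index_to_key) & set(noisy_model.wv.index_to_key)
print("nearest-neighbour overlap (1.0 = identical):",
      neighbour_overlap(clean_model, noisy_model, shared))
```

A lower overlap score indicates that the embedding space learned from the noisy text has drifted further from the noiseless target, which is the kind of divergence the abstract describes as growing when OCR quality decreases.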