More Efficient Manual Review of Automatically Transcribed Tabular Data
DOI: https://doi.org/10.51964/hlcs15456

Keywords: Population census data, Machine Learning, Historical data, Manual review, Occupation codes, Norway 1950, Norwegian population data, Efficient manual review, Automatically transcribed, Tabular data, Manual review and correction, Interviews, Norwegian occupation data, Historical occupation data, Norwegian Historical Data Centre, UiT The Arctic University of Norway

Abstract
Any machine learning method for transcribing historical text requires manual verification and correction, which is often time-consuming and expensive. Our aim is to make this review more efficient. Previously, we developed a machine learning model to transcribe 2.3 million handwritten occupation codes from the Norwegian 1950 census. Here, we manually review the 90,000 codes (3%) for which our model had the lowest confidence scores. We allocated these codes to human reviewers, who inspected them using our custom annotation tool. The reviewers agreed with the model's labels 31.9% of the time and corrected 62.8% of them; for the remaining 5.1% of the images, the reviewers were uncertain or assigned invalid labels. A subset of 9,000 images was reviewed by multiple reviewers, who agreed on 86.4% of the images and disagreed on 9%. These results suggest that one reviewer per image is sufficient. We recommend that reviewers flag any label they are uncertain about. Our interviews show that the reviewers performed internal quality control and found our custom tool useful and easy to operate. We provide guidelines for efficient and accurate transcription of historical text by combining machine learning and manual review. We have open-sourced our custom annotation tool and made the reviewed images open access.
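The abstract describes a two-step pipeline: route the lowest-confidence model predictions to human reviewers, then measure agreement where several reviewers saw the same image. Below is a minimal sketch of that selection and agreement logic; the function names, the NumPy implementation, and the simulated confidence scores are our own illustration, not the authors' released code.

```python
import numpy as np

def select_for_review(confidences, review_fraction=0.03):
    """Return indices of the lowest-confidence predictions.

    `confidences` holds one model confidence score per transcribed
    image; the lowest `review_fraction` of them are routed to
    human reviewers (3% in the paper's setup).
    """
    n_review = int(len(confidences) * review_fraction)
    # argsort ascending: the first n_review indices belong to the
    # least confident predictions.
    return np.argsort(confidences)[:n_review]

def reviewer_agreement(labels_a, labels_b):
    """Fraction of images on which two reviewers assigned the same label."""
    labels_a = np.asarray(labels_a)
    labels_b = np.asarray(labels_b)
    return float(np.mean(labels_a == labels_b))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical confidence scores for 1,000 transcribed codes.
    conf = rng.uniform(0.5, 1.0, size=1_000)
    to_review = select_for_review(conf, review_fraction=0.03)
    print(f"{len(to_review)} codes routed to manual review")
```

A fixed review fraction is only one possible policy; a confidence threshold calibrated on a labelled validation set would serve the same purpose.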
License
Copyright (c) 2024 Bjørn-Richard Pedersen, Rigmor Katrine Johansen, Einar Holsbø, Hilde Sommerseth, Lars Ailo Bongo
This work is licensed under a Creative Commons Attribution 4.0 International License.