Measuring the impact of automated evaluation tools on alternative text quality
Author(s) - Silvia Rodríguez Vázquez
Publication year - 2016
Publication title - Archive ouverte UNIGE (University of Geneva)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.1145/2899475.2899484
Subject(s) - computer science, task (project management), quality (philosophy), world wide web, the internet, empirical research, screen reader, information retrieval, multimedia, data science, human–computer interaction, engineering, philosophy, systems engineering, epistemology, visually impaired
The number of Internet users has increased tenfold since the beginning of the century, thanks in particular to improvements in web accessibility and to the growing number of languages in which online content is available. While translation professionals make a considerable contribution to this digital information richness, little evidence exists regarding their involvement in achieving a more accessible web for all. In this paper, we present the main results of the first empirical study on web accessibility conceived around a translation task. The experiment investigated, in particular, the quality of image text alternatives produced by French translators with the help of two evaluation tools: aDesigner and Acrolinx. The assessment of their alt text proposals, carried out by seven screen reader users, suggests that using both tools helps translators create more appropriate text alternatives than using only one tool or no automated support at all. A more in-depth analysis of the data gathered shows that Acrolinx offers better guidance than aDesigner in helping translators render images accessible.