Open Access
BrailleEnter: A Touch Screen Braille Text Entry Method for the Blind
Author(s) -
Mrim M. Alnfiai,
Srinivas Sampalli
Publication year - 2017
Publication title - Procedia Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.334
H-Index - 76
ISSN - 1877-0509
DOI - 10.1016/j.procs.2017.05.349
Subject(s) - touchscreen , braille , computer science , text entry , human–computer interaction , screen reader , gesture , mobile device , touchpad , smartwatch , multimedia , visually impaired , wearable computer , artificial intelligence , computer hardware , world wide web , embedded system , operating system
Text entry methods on touchscreen devices are often developed without taking into account the needs of people with no or low vision. Nevertheless, touchscreen accessibility for blind people has improved significantly; for example, screen readers are now integrated into smartphones. Even so, the QWERTY keyboard with VoiceOver, like many other proposed touchscreen keyboards, remains largely inaccessible to blind people, because it requires users to locate keys on the touchscreen and to use both hands to interact with the keyboard interface. To enable blind people to perform text entry easily, this research aims to develop a tool that overcomes the limitations of existing keyboards. In this paper, we describe BrailleEnter, a text input method that supports nonvisual interaction on touchscreen devices. Based on Braille coding, BrailleEnter allows users to type letters, numbers, and punctuation anywhere on the touchscreen with one finger, by pressing the screen to represent raised Braille dots and briefly tapping to represent unraised dots. This paper presents our keyboard design rationale and a preliminary evaluation of BrailleEnter conducted with two blind users. The study shows that the BrailleEnter keyboard has the potential to make smartphone devices fully accessible to blind people, and it highlights that using only one finger to interact with the screen is the most accessible method for blind users.
