Improved SingleTapBraille: Developing a single tap text entry method based on Grade 1 and 2 braille encoding
Author(s) -
Mrim Alnfiai,
Srinivas Sampalli
Publication year - 2017
Publication title -
Journal of Ubiquitous Systems and Pervasive Networks
Language(s) - English
Resource type - Journals
ISSN - 1923-7332
DOI - 10.5383/juspn.09.01.003
Subject(s) - touchscreen , braille , computer science , text entry , visually impaired , human–computer interaction , thumb , gesture
Touchscreen technology has brought significant improvements for both sighted and visually impaired people. Visually impaired people tend to use touchscreen devices because these devices support a built-in screen reader, providing a cheaper, smaller alternative to dedicated screen reader machines. However, most of the available touchscreen keyboards remain largely inaccessible to blind and visually impaired people because they require the user to locate a specific target on the touchscreen in order to interact with an application. In this paper, we describe SingleTapBraille, a novel nonvisual text input approach for touchscreen devices. With SingleTapBraille, a user enters characters, including letters, numbers, and punctuation, by tapping anywhere on the screen with one finger or a thumb several times, following braille dot patterns. This paper presents our initial keyboard design for entering Grade 1 braille and our exploratory evaluation of SingleTapBraille conducted with braille instructors and visually impaired users. It also presents the implementation of Grade 2 braille and an initial evaluation of the improved SingleTapBraille keyboard with a blind user.
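The abstract does not detail how taps are grouped into braille cells, but any braille-based keyboard ultimately maps a six-dot cell to a character. The sketch below, in Python, illustrates only that final lookup step for a few standard Grade 1 letters; the dot numbering follows the standard braille convention, while the function name and the idea of feeding it a collection of tapped dot positions are illustrative assumptions, not the authors' implementation.

```python
# Standard Grade 1 braille: each letter is a set of raised dots,
# numbered 1-3 down the left column and 4-6 down the right.
# Only a small sample of the alphabet is shown here.
BRAILLE_GRADE1 = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
    frozenset({1, 2, 4}): "f",
    frozenset({1, 2, 5}): "h",
    frozenset({1, 2, 3}): "l",
}

def decode_cell(dots):
    """Map a collection of dot numbers (however the taps were
    captured) to a character; '?' marks an unknown pattern."""
    return BRAILLE_GRADE1.get(frozenset(dots), "?")
```

For example, `decode_cell([1, 2])` yields `"b"` and an unrecognized pattern such as `[2, 6]` falls through to `"?"`. A full keyboard would extend the table to the whole alphabet, digits, and punctuation, and Grade 2 support would add entries for multi-letter contractions.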
Address:
John Eccles House, Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom