A Gesture-based Tool for Sterile Browsing of Radiology Images
Author(s) -
Juan Wachs,
Helman I. Stern,
Yael Edan,
M P Gillam,
Jonathan A. Handler,
Craig F. Feied,
Michael J. Smith
Publication year - 2008
Publication title -
Journal of the American Medical Informatics Association
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.614
H-Index - 150
eISSN - 1527-974X
pISSN - 1067-5027
DOI - 10.1197/jamia.M2410
Subject(s) - gesture , modalities , computer science , usability , human–computer interaction , user interface , artificial intelligence , computer vision , multimedia
The use of doctor-computer interaction devices in the operating room (OR) requires new modalities that support medical imaging manipulation while allowing doctors' hands to remain sterile, supporting their focus of attention, and providing fast response times. This paper presents "Gestix," a vision-based hand gesture capture and recognition system that interprets the user's gestures in real time for navigation and manipulation of images in an electronic medical record (EMR) database. Navigation and other gestures are captured on video and translated to commands based on their temporal trajectories. "Gestix" was tested during a brain biopsy procedure. In the in vivo experiment, this interface prevented the surgeon from shifting focus or changing location while providing rapid, intuitive, and easy interaction. Data from two usability tests provide insights and implications regarding human-computer interaction based on nonverbal conversational modalities.
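To illustrate the idea of translating gestures to commands from their temporal trajectories, the following is a minimal, hypothetical sketch (not the authors' implementation): a tracked hand-centroid trajectory, as might be obtained from video capture, is mapped to a navigation command by its dominant direction of motion. The function name, the input format, and the pixel threshold are all assumptions made for illustration.

```python
def classify_navigation(trajectory, min_motion=20):
    """Map a hand trajectory to a navigation command.

    trajectory: list of (x, y) hand-centroid positions, one per video frame,
                in image coordinates (y grows downward).
    min_motion: minimum net displacement in pixels to count as a gesture
                (an assumed threshold, not from the paper).
    Returns one of 'left', 'right', 'up', 'down', or 'none'.
    """
    if len(trajectory) < 2:
        return "none"
    # Net displacement between the first and last tracked positions.
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    if max(abs(dx), abs(dy)) < min_motion:
        return "none"  # motion too small: treat as no gesture
    # The axis with the larger displacement decides the command.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

A real system would of course first segment and track the hand in each frame and smooth the trajectory before classification; this sketch covers only the final trajectory-to-command step.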