Open Access
Evaluation of Gesture Based Interfaces for Medical Volume Visualization Tasks
Author(s) -
Can Kirmizibayrak,
Nadezhda Radeva,
Mike Wakid,
John W. Philbeck,
John L. Sibert,
James K. Hahn
Publication year - 2012
Publication title -
International Journal of Virtual Reality
Language(s) - English
Resource type - Journals
eISSN - 2727-9979
pISSN - 1081-1451
DOI - 10.20870/ijvr.2012.11.2.2839
Subject(s) - gesture , computer science , visualization , human–computer interaction , interface , user interface , artificial intelligence , computer vision
Interactive systems are increasingly used in medical applications with the widespread availability of various imaging modalities. Gesture-based interfaces can be beneficial for interacting with such systems in a variety of settings, as they can be easier to learn and can eliminate several shortcomings of traditional tactile systems, especially in surgical applications. We conducted two user studies that explore different gesture-based interfaces for interaction with volume visualizations. The first experiment focused on rotation tasks, comparing the performance of a gesture-based interface (using Microsoft Kinect) to that of the mouse. The second experiment studied localization of internal structures, comparing slice-based visualizations controlled via gestures and the mouse, in addition to a 3D Magic Lens visualization. The results showed that the gesture-based interface outperformed the traditional mouse in both time and accuracy in the orientation matching task. The traditional mouse was the superior interface in the second experiment in terms of accuracy; however, the gesture-based Magic Lens interface had the fastest target localization time. We discuss these findings and their implications for the use of gesture-based interfaces in medical volume visualization, along with possible underlying psychological mechanisms that explain why these methods can outperform traditional interaction methods.
