Open Access
Multimodal Interface Based on Novel HMI UI/UX for In‐Vehicle Infotainment System
Author(s) - Kim Jinwoo, Ryu Jae Hong, Han Tae Man
Publication year - 2015
Publication title - ETRI Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.295
H-Index - 46
eISSN - 2233-7326
pISSN - 1225-6463
DOI - 10.4218/etrij.15.0114.0076
Subject(s) - touchscreen , computer science , human–computer interaction , gesture , user interface , simulation , artificial intelligence , operating system
We propose a novel HMI UI/UX for an in-vehicle infotainment system. The proposed HMI UI comprises multimodal interfaces that allow a driver to manipulate an infotainment system safely and intuitively while driving. Our analysis of a touchscreen-based HMI UI/UX reveals that using such an interface while driving can seriously distract the driver. The proposed HMI UI/UX is therefore a new manipulation mechanism for vehicle infotainment services, consisting of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulation device, and hand-gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated through a simple method based on four directional motions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism for manipulating an infotainment system while driving.
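
The paper does not publish an implementation, but the "four directions and one selection motion" model it describes can be sketched as a small input-normalization layer. The following Kotlin sketch is purely illustrative: every name (FiveWayEvent, InputModality, EventNormalizer, MenuController), the speech and gesture vocabularies, and the key-code mapping are assumptions, not the authors' design. It shows one plausible way inputs from the three modalities could be funneled into five shared primitives that drive a focus-based menu.

```kotlin
// Hypothetical sketch of a five-primitive multimodal input layer.
// All identifiers and vocabularies below are illustrative assumptions.

enum class FiveWayEvent { UP, DOWN, LEFT, RIGHT, SELECT }

enum class InputModality { SPEECH, CONTROL_DEVICE, HAND_GESTURE }

/** Maps modality-specific raw inputs onto the five shared primitives. */
object EventNormalizer {
    // Assumed command vocabulary; a real recognizer would emit richer hypotheses.
    fun fromSpeech(utterance: String): FiveWayEvent? = when (utterance.lowercase()) {
        "up", "previous" -> FiveWayEvent.UP
        "down", "next"   -> FiveWayEvent.DOWN
        "left", "back"   -> FiveWayEvent.LEFT
        "right"          -> FiveWayEvent.RIGHT
        "select", "ok"   -> FiveWayEvent.SELECT
        else             -> null // unrecognized command: drop rather than guess
    }

    // A 5-way control device already matches the primitives one-to-one
    // (assumed key-code ordering: 0..4 = UP, DOWN, LEFT, RIGHT, SELECT).
    fun fromControlDevice(keyCode: Int): FiveWayEvent? =
        FiveWayEvent.values().getOrNull(keyCode)

    // Gesture labels as a hand-gesture recognizer might emit them (assumed names).
    fun fromGesture(label: String): FiveWayEvent? = when (label) {
        "swipe_up"    -> FiveWayEvent.UP
        "swipe_down"  -> FiveWayEvent.DOWN
        "swipe_left"  -> FiveWayEvent.LEFT
        "swipe_right" -> FiveWayEvent.RIGHT
        "grab"        -> FiveWayEvent.SELECT
        else          -> null
    }
}

/** Minimal focus-based menu: UP/DOWN move focus, SELECT activates the item. */
class MenuController(private val items: List<String>) {
    private var focused = 0

    fun handle(event: FiveWayEvent, source: InputModality) {
        when (event) {
            FiveWayEvent.UP     -> focused = (focused - 1 + items.size) % items.size
            FiveWayEvent.DOWN   -> focused = (focused + 1) % items.size
            FiveWayEvent.LEFT,
            FiveWayEvent.RIGHT  -> Unit // reserved for horizontal navigation
            FiveWayEvent.SELECT -> println("[$source] activated '${items[focused]}'")
        }
        println("[$source] $event -> focus on '${items[focused]}'")
    }
}

fun main() {
    val menu = MenuController(listOf("Navigation", "Media", "Phone", "Settings"))

    // Each modality funnels into the same five-event interface.
    EventNormalizer.fromSpeech("next")?.let { menu.handle(it, InputModality.SPEECH) }
    EventNormalizer.fromGesture("swipe_down")?.let { menu.handle(it, InputModality.HAND_GESTURE) }
    menu.handle(FiveWayEvent.SELECT, InputModality.CONTROL_DEVICE)
}
```

Reducing every modality to the same five primitives is what would let a single UI framework serve speech, a control device, and hand gestures interchangeably, which matches the abstract's claim that the framework is manipulated through one simple method regardless of input channel.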
