
Controlling your contents with the breath: Interactive breath interface for VR, games, and animations
Author(s) -
JongHyun Kim,
Jung Lee
Publication year - 2020
Publication title -
PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0241498
Subject(s) - computer science , mobile device , interface (matter) , virtual reality , human–computer interaction , acceleration , animation , noise (video) , computer vision , simulation , computer graphics (images) , physics , bubble , classical mechanics , maximum bubble pressure method , parallel computing , image (mathematics) , operating system
In this paper, we propose a new interface for controlling VR (virtual reality) content, games, and animations in real time using the user’s breath and the acceleration sensor of a mobile device. Although interaction techniques are very important in VR and physically based animation, UI (user interface) methods that use different types of devices or controllers have received little attention; most proposed interaction techniques have focused on screen touch and motion recognition. The direction of the breath is calculated from the position and angle between the user and the mobile device, and the control position for handling the content is determined using the acceleration sensor built into the mobile device. Finally, to remove the noise contained in the input breath, the magnitude of the wind is filtered using a kernel that models a pattern similar to an actual breath. To demonstrate the effectiveness of this approach, we produced real-time interaction results by applying the breath as an external force to VR content, games, and animations.
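The abstract's filtering step can be illustrated with a minimal sketch. The paper does not specify its kernel here, so the Gaussian-shaped `breath_kernel` below is an assumption standing in for a kernel that models a breath-like pattern; the example simply smooths a sampled breath-magnitude signal to suppress high-frequency microphone noise.

```python
import numpy as np

def breath_kernel(radius, sigma=2.0):
    # Hypothetical Gaussian-shaped kernel, a stand-in for the paper's
    # breath-pattern kernel; normalized so the weights sum to 1
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def filter_breath(magnitude, radius=4, sigma=2.0):
    """Smooth a sampled breath-magnitude signal with the kernel,
    attenuating high-frequency noise in the input breath."""
    k = breath_kernel(radius, sigma)
    # mode="same" keeps the output aligned with the input samples
    return np.convolve(magnitude, k, mode="same")

# Simulated noisy exhalation: a smooth pulse plus random sensor noise
t = np.linspace(0.0, 1.0, 200)
clean = np.exp(-((t - 0.5) ** 2) / 0.02)  # breath-like envelope
noisy = clean + 0.2 * np.random.default_rng(0).normal(size=t.size)
filtered = filter_breath(noisy)
```

The filtered magnitude, rather than the raw signal, would then drive the external force applied to the content.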