Open Access
Multi-modal human-computer interaction system in cockpit
Author(s) - Jie Ren, Yanyan Cui, Jing Chen, Yuanyuan Qiao, Luhui Wang
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1693/1/012212
Subject(s) - cockpit , mode (computer interface) , computer science , modal , human–computer interaction , interface , control , software , gesture , simulation , engineering , artificial intelligence , aeronautics
To explore new human-machine interaction methods, a multi-modal human-machine interaction cooperative control system is proposed. It realizes basic flight control by coordinating information from multiple modes, including changes in the pilot's field of view, touch control, and voice control. Building on the existing human-computer interaction interface for flight operation, the system introduces new forms of human-computer interaction into the cockpit; research was carried out on the multi-mode cooperative control system, covering eye-movement interaction, touch interaction, gesture interaction, and voice interaction. Finally, the project produced an integrated multi-mode cooperative control software and hardware system.
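
As a rough illustration of how such a cooperative control scheme might arbitrate between modalities, the sketch below shows one possible fusion pattern in Python. All names (MultiModalArbiter, the priority ordering, the window and confidence thresholds) are hypothetical assumptions for illustration, not details taken from the paper: recognizer events from the gaze, touch, gesture, and voice channels are merged into a single command, with explicit inputs outranking implicit ones when they arrive close together in time.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List, Optional
import time


class Modality(Enum):
    GAZE = auto()     # pilot's field-of-view / eye-movement channel
    TOUCH = auto()    # touchscreen channel
    GESTURE = auto()  # mid-air gesture channel
    VOICE = auto()    # speech-recognition channel


@dataclass
class InputEvent:
    modality: Modality
    command: str       # recognized command, e.g. "select_waypoint" (hypothetical)
    confidence: float  # recognizer confidence in [0, 1]
    timestamp: float   # seconds, e.g. from time.time()


class MultiModalArbiter:
    """Fuses events from several recognizers into one control command.

    Events arriving within FUSION_WINDOW seconds of each other are
    treated as a single interaction; explicit modalities (touch, voice)
    outrank implicit ones (gaze) when they conflict. Thresholds are
    illustrative, not from the paper.
    """

    FUSION_WINDOW = 0.5   # seconds
    MIN_CONFIDENCE = 0.6  # reject low-confidence recognitions
    PRIORITY = {Modality.TOUCH: 3, Modality.VOICE: 2,
                Modality.GESTURE: 1, Modality.GAZE: 0}

    def __init__(self) -> None:
        self._pending: List[InputEvent] = []

    def feed(self, event: InputEvent) -> Optional[str]:
        """Add an event and return the currently winning command, if any."""
        self._pending.append(event)
        # Drop events that have fallen outside the fusion window.
        self._pending = [e for e in self._pending
                         if event.timestamp - e.timestamp <= self.FUSION_WINDOW]
        best = max(self._pending,
                   key=lambda e: (self.PRIORITY[e.modality], e.confidence))
        return best.command if best.confidence >= self.MIN_CONFIDENCE else None


# Gaze pre-selects a target; a voice command issued shortly afterwards
# outranks it and becomes the fused control command.
arbiter = MultiModalArbiter()
t = time.time()
arbiter.feed(InputEvent(Modality.GAZE, "select_waypoint", 0.7, t))
print(arbiter.feed(InputEvent(Modality.VOICE, "confirm", 0.9, t + 0.2)))  # confirm
```

A real cockpit system would need certified recognizers, redundancy, and safety interlocks; the point of the sketch is only the arbitration pattern, in which one modality (gaze) narrows the target and another (voice or touch) confirms the action.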
