From Touch to Immersion: A Systematic Review of Haptic and Multimodal Perception in the Metaverse
Author(s) -
Wenyu Yang,
Zihe Zhao,
Shuo Gao
Publication year - 2025
Publication title -
IEEE Open Journal on Immersive Displays
Language(s) - English
Resource type - Magazines
eISSN - 2836-211X
DOI - 10.1109/ojid.2025.3622139
Subject(s) - Computing and Processing
In the emerging metaverse, immersive experiences depend on the integration of multimodal sensory channels, with haptic feedback playing a central role in enhancing realism and interaction. Recent progress in wearable devices, force feedback, and neural interfaces has driven research on combining tactile sensations with visual, auditory, olfactory, thermal, and force-related signals. Yet current systems still face challenges in latency, energy efficiency, synchronisation, and personalisation that limit their ability to match the complexity of human perception. This paper reviews the trajectory of haptic and multimodal perception fusion in the metaverse. It first introduces the biological and psychological foundations of touch, then discusses tactile technologies and device configurations. Next, it examines multimodal fusion models and mechanisms through comparative analyses and application evaluations, focusing on spatio-temporal synchronisation, cross-modal compensation, perceptual enhancement, and attentional allocation. The review provides an overview of technical approaches, implementation strategies, and application challenges, offering both theoretical grounding and practical insights for designing synchronised, real-time, personalised, and scalable multisensory interaction systems.