A shared augmented virtual environment for real‐time mixed reality applications
Author(s) - Zhu Yu, Li Shiying, Luo Xi, Zhu Kang, Fu Qiang, Chen Xilin, Gong Huixing, Yu Jingyi
Publication year - 2018
Publication title - Computer Animation and Virtual Worlds
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.225
H-Index - 49
eISSN - 1546-427X
pISSN - 1546-4261
DOI - 10.1002/cav.1805
Subject(s) - computer science, augmented reality, virtual reality, mixed reality, headset, computer graphics (images), human–computer interaction, artificial intelligence, visualization
Headsets for virtual reality such as head-mounted displays have become ubiquitous and bring immersive experiences to individual users. People outside the virtual world may want to see the same scenes shown on the headset's screen. It is therefore important to merge the real and virtual worlds into a single environment in which physical and virtual objects coexist and interact in real time. We propose the shared augmented virtual environment (SAVE), a mixed reality (MR) system that overlays the virtual world with real objects captured by a Kinect depth camera. We refine the depth map and exploit a GPU-based natural image matting method to extract real objects from cluttered scenes. In the synthetic MR world, we render real and virtual objects in real time and resolve depth from both worlds properly. The advantage of our system is that it connects the virtual and real worlds through a bridge controller mounted on the Kinect, so the whole system needs to be calibrated only once before use. Our results demonstrate that the proposed SAVE system produces high-quality 1080p live MR footage, enabling realistic virtual experiences to be shared among many people in potential applications such as education, design, and entertainment.
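The abstract only sketches the pipeline, so the following is a minimal illustrative sketch, not the paper's actual implementation, of the per-pixel step it describes: compositing matted real pixels over the rendered virtual scene while handling depth from both worlds. The function name composite_mr_frame and all array conventions are assumptions made for illustration.

import numpy as np

def composite_mr_frame(real_rgb, real_depth, alpha, virt_rgb, virt_depth):
    """Depth-aware alpha compositing of a matted real layer over a
    rendered virtual layer (illustrative sketch, not the paper's code).

    real_rgb, virt_rgb : (H, W, 3) float arrays in [0, 1]
    real_depth, virt_depth : (H, W) float depths in a shared metric space
    alpha : (H, W) float matte in [0, 1] from the image-matting stage
    """
    # A real pixel may occlude the virtual scene only where it lies
    # closer to the camera than the virtual surface at that pixel.
    real_in_front = (real_depth < virt_depth).astype(real_rgb.dtype)
    a = (alpha * real_in_front)[..., None]          # (H, W, 1)
    return a * real_rgb + (1.0 - a) * virt_rgb

if __name__ == "__main__":
    h, w = 4, 4
    rng = np.random.default_rng(0)
    real, virt = rng.random((h, w, 3)), rng.random((h, w, 3))
    # Real object at 1 m, virtual surface at 2 m: the real layer wins
    # wherever the matte marks a foreground object.
    out = composite_mr_frame(real, np.full((h, w), 1.0),
                             np.ones((h, w)), virt, np.full((h, w), 2.0))
    assert np.allclose(out, real)

In a system like the one described, this logic would presumably run on the GPU (e.g., in an OpenGL shader) to sustain real-time 1080p rates; the NumPy version only makes the per-pixel decision explicit.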