Open Access
Performance of VR Technology in Environmental Art Design Based on Multisensor Information Fusion under Computer Vision
Author(s) - Tao Xu
Publication year - 2022
Publication title - Mobile Information Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.346
H-Index - 34
eISSN - 1875-905X
pISSN - 1574-017X
DOI - 10.1155/2022/3494535
Subject(s) - computer science, sensor fusion, process (computing), information fusion, kalman filter, computer vision, fusion, artificial intelligence, virtual reality, information filtering system, extended kalman filter, machine learning, linguistics, philosophy, operating system
Multisensor information fusion technology is a hallmark of scientific and technological progress. This paper examines the performance of virtual reality (VR) technology in environmental art design based on multisensor information fusion. After reviewing related work, it presents the relevant algorithms and models, including a multisensor information fusion model based on VR instrument technology, and illustrates the principle of information fusion and the GPID bus structure. It then describes a multisensor information fusion algorithm built on Dempster-Shafer (DS) evidence theory: in evidence-based decision making, the fusion process computes qualitative-level and/or confidence-level (belief) functions, generally from posterior distribution information. The data flow of the multisensor information fusion system is illustrated with figures. The paper then explains the design and construction of a garden art environment based on an active panoramic stereo vision sensor, presents the relationships among the four coordinate systems, and demonstrates the interactive experience of indoor and outdoor environmental art design. Finally, estimation simulation experiments based on the extended Kalman filter (EKF) show that data fused with the EKF algorithm tracks the actual target motion more closely, with an accuracy above 92%.
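The DS evidence-theory fusion the abstract describes combines each sensor's belief (mass) assignment over a frame of discernment. A minimal sketch of Dempster's rule of combination, with a hypothetical two-hypothesis frame and made-up mass values (the paper's actual sensors and assignments are not given here):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) via Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to the empty set
    norm = 1.0 - conflict        # renormalize by the non-conflicting mass
    return {s: m / norm for s, m in combined.items()}

# Hypothetical frame {A, B} and two sensors' belief assignments
A, B = frozenset({"A"}), frozenset({"B"})
AB = A | B  # ignorance: "A or B"
m1 = {A: 0.6, AB: 0.4}
m2 = {A: 0.5, B: 0.3, AB: 0.2}
fused = dempster_combine(m1, m2)  # fused belief over the frame
```

With these illustrative masses, the conflicting mass is K = 0.18, and the fused belief concentrates on hypothesis A, which is the intuitive outcome when both sensors lean the same way.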
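The EKF estimation experiment can likewise be sketched in outline. The following is a minimal single predict/update step, assuming a constant-velocity motion model and a nonlinear range measurement; the state layout, noise levels, and sensor geometry are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

def ekf_step(x, P, z, F, Q, R):
    """One EKF cycle: linear prediction, nonlinear range-measurement update."""
    # Predict with a constant-velocity model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Nonlinear measurement: range from the origin to position (px, py)
    px, py = x_pred[0], x_pred[2]
    rng = np.hypot(px, py)
    H = np.array([[px / rng, 0.0, py / rng, 0.0]])  # Jacobian of h(x)
    y = z - rng                                     # innovation
    S = H @ P_pred @ H.T + R                        # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x_pred + (K * y).ravel()
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

dt = 0.1
F = np.array([[1, dt, 0, 0],
              [0, 1,  0, 0],
              [0, 0,  1, dt],
              [0, 0,  0, 1]], dtype=float)
Q = 0.01 * np.eye(4)       # process noise (assumed)
R = np.array([[0.1]])      # range-measurement noise (assumed)
x = np.array([1.0, 0.5, 1.0, 0.0])  # state: [px, vx, py, vy]
P = np.eye(4)
z = 1.6                    # one hypothetical noisy range reading
x, P = ekf_step(x, P, z, F, Q, R)
```

Iterating this step over a measurement sequence yields the fused trajectory that the paper compares against the true target motion.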
