View-Dependent Rendering to Enhance Natural Perception for Augmented Reality Workstations
Author(s) - Rafael Radkowski, James H. Oliver
Publication year - 2012
Publication title - Volume 1: Advanced Computational Mechanics; Advanced Simulation-Based Engineering Sciences; Virtual and Augmented Reality; Applied Solid Mechanics and Material Processing; Dynamical Systems and Control
Language(s) - English
Resource type - Conference proceedings
DOI - 10.1115/esda2012-82608
Subject(s) - rendering (computer graphics), computer science, workstation, augmented reality, workspace, computer vision, computer graphics (images), perception, visualization, artificial intelligence, human–computer interaction, robot, neuroscience, biology, operating system
This paper presents a method for reconstructing 3D buildings and updating Geographic Information System (GIS) data from video. The inputs are 2D GIS data and a ground-based video sequence. The approach consists of three parts. In the first part, the data is captured and analyzed: in addition to the 2D GIS data, we capture a video of a street view, obtain thousands of 3D feature points with our extraction algorithm, and apply a noise filter to remove outliers. The second part is a generation process comprising footprint extraction and basic facade reconstruction. The last part is the correction and updating process: after correcting the footprint and computing the height of the building, the method updates the data in the GIS. In addition, we use user knowledge to make the results more accurate. In the filtering and correction processes, the method supports several interactive operations. Copyright © 2012 by ASME