Open Access
Simultaneous Mosaicing and Tracking with an Event Camera
Author(s) -
Hanme Kim,
Ankur Handa,
Ryad Benosman,
Sio H. Ieng,
Andrew Davison
Publication year - 2014
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5244/c.28.26
Subject(s) - computer vision , artificial intelligence , computer science , computer graphics (images)
© 2014. The copyright of this document resides with its authors.

An event camera is a silicon retina which outputs not a sequence of video frames like a standard camera, but a stream of asynchronous spikes, each with pixel location, sign and precise timing, indicating when individual pixels record a threshold log intensity change. By encoding only image change, it offers the potential to transmit the information in a standard video but at vastly reduced bitrate, and with the huge added advantages of very high dynamic range and temporal resolution. However, event data calls for new algorithms, and in particular we believe that algorithms which incrementally estimate global scene models are best placed to take full advantage of its properties. Here, we show for the first time that an event stream, with no additional sensing, can be used to track accurate camera rotation while building a persistent and high quality mosaic of a scene which is super-resolution accurate and has high dynamic range. Our method involves parallel camera rotation tracking and template reconstruction from estimated gradients, both operating on an event-by-event basis and based on probabilistic filtering.
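The abstract's description of the sensor can be made concrete with a minimal sketch of the event generation model it implies: a pixel emits a signed, timestamped event each time its log intensity has moved by a fixed threshold since the last event. This is an illustrative toy model for a single pixel, not the authors' implementation; the names (`Event`, `events_from_intensity`) and the threshold value are assumptions chosen for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Event:
    x: int       # pixel column
    y: int       # pixel row
    t: float     # timestamp (seconds)
    sign: int    # +1 = brighter, -1 = darker

def events_from_intensity(samples, x=0, y=0, threshold=0.2):
    """Emit an event whenever log intensity has changed by `threshold`
    relative to the reference level set at the previous event.

    `samples` is a list of (timestamp, intensity) pairs for one pixel.
    Illustrative model only: real sensors work asynchronously in
    continuous time and add noise, refractory periods, etc.
    """
    events = []
    ref = math.log(samples[0][1])          # reference log intensity
    for t, intensity in samples[1:]:
        delta = math.log(intensity) - ref
        # A large change can trigger several events at (almost) one time.
        while abs(delta) >= threshold:
            sign = 1 if delta > 0 else -1
            ref += sign * threshold        # step the reference level
            events.append(Event(x, y, t, sign))
            delta = math.log(intensity) - ref
    return events

# A pixel brightening from 100 to 200: log(200) - log(100) ≈ 0.693,
# so with threshold 0.2 we expect three positive events.
evts = events_from_intensity([(0.0, 100.0), (0.01, 200.0)], threshold=0.2)
```

Because only these threshold crossings are transmitted, a static scene under a static camera produces no data at all, which is the source of the bitrate advantage the abstract describes.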
