
TrackAER: Real-Time Event-Based Particle Tracking
Author(s) -
Alexander Rusch,
T. Rösgen
Publication year - 2021
Publication title -
International Symposium on Particle Image Velocimetry
Language(s) - English
Resource type - Journals
ISSN - 2769-7576
DOI - 10.18409/ispiv.v1i1.176
Subject(s) - computer science , computer vision , tracking , pixel , brightness , timestamp , artificial intelligence , computer graphics (images) , real time computing , optics , physics
Event-based cameras (Lichtsteiner et al., 2008; Posch et al., 2010; Gallego et al., 2020) operate fundamentally differently from frame-based cameras: each pixel of the sensor array reacts asynchronously to relative brightness changes, creating a sequential stream of events in address-event representation (AER). Each event is defined by a microsecond-accurate timestamp, its pixel position, and a binary polarity indicating a relative increase or decrease of light intensity. Event-based cameras therefore sense only changes in a scene while effectively suppressing static, redundant information, which makes the camera technology promising for flow diagnostics as well. Established approaches such as PIV or PTV generate vast amounts of data, much of which is redundant and is eliminated only in post-processing. In contrast, event-based cameras compress the data stream already at the source. To make full use of this potential, new data processing algorithms are needed, since event-based cameras do not generate conventional frame-based data. This work utilizes an event-based camera to identify and track flow tracers such as helium-filled soap bubbles (HFSBs) with real-time visual feedback in measurement volumes of the order of several cubic meters.
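
The abstract describes the AER event format (microsecond timestamp, pixel address, polarity) and the tracking task only in prose. As a rough illustration, the Python sketch below shows one way such an event stream could be represented and clustered online into tracer tracks. The Event and Track types, the update_tracks routine, the association radius, and the timeout are illustrative assumptions for this sketch and are not the algorithm used by the authors.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Event:
        """Single address-event: microsecond timestamp, pixel coordinates, polarity."""
        t_us: int      # timestamp in microseconds
        x: int         # pixel column
        y: int         # pixel row
        polarity: int  # +1 for brightness increase, -1 for decrease

    @dataclass
    class Track:
        """Running estimate of one tracer's image position."""
        x: float
        y: float
        last_t_us: int
        events: int = 1

    def update_tracks(tracks: List[Track], ev: Event,
                      radius: float = 3.0, timeout_us: int = 5000) -> None:
        """Assign an incoming event to the nearest active track or start a new one.

        A deliberately simple online scheme: tracks that received no events
        for longer than `timeout_us` are dropped, and an event within
        `radius` pixels of a track refines its position estimate with an
        exponential moving average.
        """
        # Drop stale tracks.
        tracks[:] = [tr for tr in tracks if ev.t_us - tr.last_t_us <= timeout_us]

        # Find the closest track within the association radius.
        best, best_d2 = None, radius * radius
        for tr in tracks:
            d2 = (tr.x - ev.x) ** 2 + (tr.y - ev.y) ** 2
            if d2 <= best_d2:
                best, best_d2 = tr, d2

        if best is None:
            tracks.append(Track(x=ev.x, y=ev.y, last_t_us=ev.t_us))
        else:
            alpha = 0.2  # smoothing factor for the position update
            best.x += alpha * (ev.x - best.x)
            best.y += alpha * (ev.y - best.y)
            best.last_t_us = ev.t_us
            best.events += 1

Processing each event as it arrives, rather than first accumulating frames, mirrors the data-driven character of the sensor described in the abstract.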