Open Access
Visual Analysis of Large, Time-Dependent, Multi-Dimensional Smart Sensor Tracking Data
Author(s) - Walker James
Publication year - 2017
Language(s) - English
Resource type - Dissertations/theses
DOI - 10.23889/suthesis.36342
Technological advancements over the past decade have increased our ability to collect data to previously unimaginable volumes [Kei02]. Understanding temporal patterns is key to gaining knowledge and insight. However, our capacity to store data now far exceeds the rate at which we are able to understand it [KKEM10]. This phenomenon has led to a growing need for advanced solutions to make sense of, and use, an ever-increasing data space. Abstract temporal data presents additional challenges in its representation, size, scalability, high dimensionality, and unique structure. One instance of such temporal data is acquired from smart sensor tags attached to freely roaming animals, recording multiple parameters at infra-second rates. These tags are becoming commonplace and are transforming biologists' understanding of the way wild animals behave. The excitement at the potential inherent in sophisticated tracking devices has, however, been limited by a lack of available software to advance research in the field. This thesis introduces methodologies to deal with the analysis of the large, multi-dimensional, time-dependent data acquired. Interpretation of such data is complex and currently limits the ability of biologists to realise the value of their recorded information. We present several contributions to the field of time-series visualisation, that is, the visualisation of ordered collections of real-valued data attributes at successive points in time, sampled at uniform intervals. Traditionally, time-series graphs have been used for temporal data. However, screen resolution is small in comparison to the large information spaces commonplace today, so in such cases we can only render a proportion of the data. It is widely accepted that the effective interpretation of large temporal data sets requires advanced methods and interaction techniques.
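The mismatch between screen resolution and data volume described above is commonly handled by downsampling the series before rendering. As a minimal illustration (not a method from the thesis), the sketch below keeps the minimum and maximum sample per time bin, so visually important peaks and troughs survive even when millions of samples must share a few hundred pixels; the function name and bin count are illustrative.

```python
import numpy as np

def minmax_downsample(t, y, n_bins):
    """Reduce a long series to at most 2*n_bins points for plotting.

    Keeps the minimum and maximum sample in each time bin so that
    rendered extrema are preserved despite the reduction.
    """
    edges = np.linspace(0, len(y), n_bins + 1, dtype=int)
    idx = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        if hi <= lo:
            continue
        seg = y[lo:hi]
        # keep positions of the bin's min and max (deduplicated)
        idx.extend(sorted({lo + int(np.argmin(seg)), lo + int(np.argmax(seg))}))
    idx = np.asarray(idx)
    return t[idx], y[idx]

# One week of 1 Hz data (~600k samples) squeezed into 400 bins
t = np.arange(7 * 24 * 3600, dtype=float)
y = np.sin(t / 500.0) + 0.1 * np.random.randn(t.size)
td, yd = minmax_downsample(t, y, 400)
print(len(t), "->", len(td))
```

More sophisticated schemes (e.g. perceptually motivated downsampling) exist, but the bin-wise min/max idea is the core of most of them.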
In this thesis, we address these issues to enhance the exploration, analysis, and presentation of time-series data for movement ecologists in their smart sensor data analysis. The content of this thesis is split into two parts. In the first part, we provide an overview of the relevant literature and state-of-the-art methodologies. In the second part, we introduce techniques developed through our research which address particular challenges unsolved in the literature, emphasising their application to solving challenging domain-level tasks faced by movement ecologists using smart tag data. Firstly, we comparatively evaluate existing methods for the visual inspection of time-series data, giving a graphical overview of each and classifying their ability to explore and interact with data. Analysis often involves identifying segments of a time-series where specific phenomena occur and comparing time segments for interesting patterns which can be used to form, prove, or refute a hypothesis. After analysis, findings are communicated to a wider audience. Navigating and communicating through a large data space is an important task which is not fully supported by existing techniques. We propose new visualisations and other extensions to the existing approaches, and we undertake and report an empirical study and a field study of our approach on smart sensor data. The reality for researchers faced with perhaps ten channels of data recorded at sub-second rates spanning several days is that manually decoding behaviour is time-consuming and error-prone for the domain expert. Primarily, this depends on the manual inspection of multiple time-series graphs. Machine learning algorithms have been considered, but have been difficult to introduce because of the large number of training sets required and their low discriminating precision in practice.
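The manual decoding of behaviour described above amounts to searching a long series for segments that resemble a labelled exemplar. As a hedged sketch of that general family of technique (template matching by normalised cross-correlation; this is illustrative, not the thesis's actual algorithm, and all names and the threshold are assumptions), one might write:

```python
import numpy as np

def zscore(x):
    """Standardise to zero mean, unit variance (constant windows map to zeros)."""
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()

def match_template(series, template, threshold=0.8):
    """Slide a labelled exemplar over the series and return (start, score)
    pairs where the Pearson correlation exceeds the threshold."""
    m = len(template)
    tz = zscore(np.asarray(template, dtype=float))
    hits = []
    for i in range(len(series) - m + 1):
        w = zscore(np.asarray(series[i:i + m], dtype=float))
        score = float(np.dot(w, tz)) / m  # correlation in [-1, 1]
        if score >= threshold:
            hits.append((i, score))
    return hits

# Hypothetical usage: one burst of a sinusoidal "behaviour" hidden in noise
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 2 * np.pi, 20))
series = rng.normal(0, 0.1, 200)
series[50:70] += template
hits = match_template(series, template)
```

Normalising each window makes the score invariant to offset and amplitude, which matters for accelerometer channels whose baseline drifts with posture; real systems add refinements such as multi-channel scoring and non-maximum suppression.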
We introduce TimeClassifier, a visual analytics system for the classification of time-series data, to assist in the labelling and understanding of smart sensor data. We deploy our system with biologists and report real-world case studies of its use. Next, we encapsulate TimeClassifier into an all-encompassing software suite, Framework4, which operates on smart sensor data to determine the four key elements considered pivotal for movement analysis from such tags. The software transforms smart sensor data into (i) dead-reckoned movements, (ii) template-matched behaviours, (iii) dynamic body acceleration derived energetics, and (iv) position-linked environmental data, before outputting it all into a single file. Biologists are given a software suite which enables them to link
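Two of the four elements above, dead-reckoned movement and dynamic body acceleration, have simple mathematical cores. The sketch below is a minimal, assumed illustration of those cores only (heading/speed integration and overall dynamic body acceleration via a running-mean split of the signal), not Framework4's implementation; window size, units, and function names are hypothetical.

```python
import numpy as np

def odba(acc, window):
    """Overall dynamic body acceleration: estimate the static,
    gravity-dominated component per axis with a running mean, then sum
    the absolute dynamic residuals across the three axes.

    acc: array of shape (3, n) -- x, y, z accelerometer channels.
    """
    kernel = np.ones(window) / window
    static = np.vstack([np.convolve(a, kernel, mode="same") for a in acc])
    return np.abs(acc - static).sum(axis=0)

def dead_reckon(heading_rad, speed, dt=1.0, start=(0.0, 0.0)):
    """Integrate heading (radians) and speed into a 2-D x/y track."""
    dx = np.cos(heading_rad) * speed * dt
    dy = np.sin(heading_rad) * speed * dt
    return start[0] + np.cumsum(dx), start[1] + np.cumsum(dy)

# Hypothetical usage: ten seconds due east at 1 m/s
x, y = dead_reckon(np.zeros(10), np.ones(10))
```

In practice dead-reckoned tracks accumulate drift and are periodically corrected against known positions (e.g. GPS fixes), which is part of what makes linking the four data streams in one tool valuable.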
