
Semi‐automatic annotation samples for vehicle type classification in urban environments
Author(s) - Chen Zezhi, Ellis Tim
Publication year - 2015
Publication title - IET Intelligent Transport Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.579
H-Index - 45
eISSN - 1751-9578
pISSN - 1751-956X
DOI - 10.1049/iet-its.2013.0150
Subject(s) - annotation, computer science, classifier (UML), artificial intelligence, cluster analysis, support vector machine, ground truth, pattern recognition (psychology), automatic image annotation, contextual image classification, process (computing), data mining, machine learning, image retrieval, image (mathematics), operating system
Data collection, and especially data annotation, are surprisingly time-consuming and costly tasks for vehicle classification. Annotation is used to label examples of vehicles, manually outlining their shapes and assigning their correct classification, for use in classifier training and performance evaluation. This study presents a semi-automatic approach for annotating vehicle samples recorded by roadside CCTV video cameras. Vehicles are detected by automatic image analysis and classified into four main categories: car, van, bus, and motorcycle/bicycle, using a vehicle observation vector constructed from size, shape, and appearance features. Unsupervised K-means clustering automatically computes an initial class label for each detected vehicle. Then, in an iterative process, the output scores of a linear support vector machine (SVM) classifier identify low-confidence samples, whose annotations are considered for manual correction. Experimental results are presented for both synthetic and real datasets to demonstrate the effectiveness and efficiency of the authors' approach, which significantly reduces the time required to generate an annotated dataset. The method is general enough to be used in other classification problems and domains that rely on a manually created ground truth.
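The two-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic 2-D feature vectors, the four cluster centres, and the confidence threshold of 0.5 are all assumptions chosen for the example, and scikit-learn's `KMeans` and `LinearSVC` stand in for the paper's K-means clustering and linear SVM.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Synthetic stand-in for vehicle observation vectors (size/shape/appearance
# features); four well-separated groups mimic car/van/bus/motorcycle classes.
centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
X = np.vstack([c + rng.normal(scale=0.8, size=(50, 2)) for c in centers])

# Stage 1: unsupervised K-means provides an initial class label per sample.
init_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Stage 2: train a linear SVM on the provisional labels, then use its
# decision scores to flag low-confidence samples for manual correction.
svm = LinearSVC(C=1.0).fit(X, init_labels)
scores = svm.decision_function(X)        # shape (n_samples, n_classes)
top2 = np.sort(scores, axis=1)[:, -2:]
margin = top2[:, 1] - top2[:, 0]         # gap between best and runner-up class
low_conf = np.where(margin < 0.5)[0]     # indices sent for manual review

print(f"{len(low_conf)} of {len(X)} samples flagged for manual correction")
```

In the paper's iterative process, the manually corrected labels would be fed back and the SVM retrained until the set of low-confidence samples stabilises; the threshold controls the trade-off between annotation effort and label quality.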