Open Access
Graph clustering‐based crowd counting with very limited labelled samples
Author(s) - Wang Huake, Zhang Kaibing, Su Zebin, Lu Jian, Xiong Zenggang
Publication year - 2020
Publication title - Electronics Letters
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.375
H-Index - 146
eISSN - 1350-911X
pISSN - 0013-5194
DOI - 10.1049/el.2020.0746
Subject(s) - cluster analysis, representativeness heuristic, computer science, artificial intelligence, graph, pattern recognition (psychology), benchmark (surveying), feature (linguistics), feature vector, clustering coefficient, annotation, sampling (signal processing), data mining, mathematics, statistics, computer vision, geography, linguistics, philosophy, geodesy, theoretical computer science, filter (signal processing)
In this Letter, the authors present a novel graph clustering-based method for crowd counting that uses only very limited labelled samples. Based on the intuitive observation that the low-level features of frames from a specific scene containing the same or a similar number of pedestrians lie close to each other in the feature space, the authors adopt a first neighbour propagation (FNP) based clustering method to divide all unlabelled data into groups. Next, an active sampling strategy that measures the representativeness and diversity of the training data is used to select a few informative samples for annotation. Finally, the counts of those labelled informative samples are propagated to the unlabelled samples within the clusters constructed by FNP-based clustering. Compelling results on two benchmark datasets demonstrate that the proposed method not only estimates crowd counts effectively with very few labelled samples, but is also applicable to annotating a large number of unknown video frames for scene-specific crowd counting models.
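To make the pipeline in the abstract concrete, below is a minimal sketch of the two stages that can be reconstructed from the description alone: clustering unlabelled frames by linking each sample to its first (nearest) neighbour and taking connected components, then spreading the counts of a handful of annotated frames to the rest of their clusters. This is not the authors' implementation; the function names, the use of the cluster-wise mean, and the `annotated` dictionary are all illustrative assumptions, and the active sampling step that picks which frames to annotate is omitted.

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

def fnp_cluster(features):
    """Cluster samples by first-neighbour links: connect every sample to
    its nearest neighbour and treat connected components as clusters."""
    features = np.asarray(features, dtype=float)
    # Pairwise squared Euclidean distances between feature vectors.
    d = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)          # exclude self-matches
    nn = d.argmin(axis=1)                # each sample's first neighbour
    n = len(features)
    adj = coo_matrix((np.ones(n), (np.arange(n), nn)), shape=(n, n))
    # Symmetrise so that i->j and j->i both count as one link.
    _, labels = connected_components(adj + adj.T, directed=False)
    return labels

def propagate_counts(labels, annotated):
    """Assign each unlabelled sample the mean count of the annotated
    samples in its cluster (assumption: mean is the propagation rule)."""
    est = np.full(len(labels), np.nan)
    for c in np.unique(labels):
        members = np.where(labels == c)[0]
        known = [annotated[i] for i in members if i in annotated]
        if known:
            est[members] = float(np.mean(known))
    return est

# Hypothetical usage: cluster low-level frame features, then propagate
# the counts of frames 3 and 17 (annotated with 12 and 30 pedestrians).
# labels = fnp_cluster(frame_features)
# counts = propagate_counts(labels, {3: 12, 17: 30})
```

In the Letter, the frames to annotate would be chosen by the representativeness-and-diversity criterion rather than given a priori; clusters left without any annotated member (the `nan` entries above) would then not arise.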
