
Video‐based road traffic monitoring and prediction using dynamic Bayesian networks
Author(s) -
Chaudhary Shraddha,
Indu Sreedevi,
Chaudhury Santanu
Publication year - 2018
Publication title -
IET Intelligent Transport Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.579
H-Index - 45
eISSN - 1751-9578
pISSN - 1751-956X
DOI - 10.1049/iet-its.2016.0336
Subject(s) - dynamic bayesian network , computer science , real time computing , floating car data , intelligent transportation system , bayesian network , synchro , traffic congestion reconstruction with kerner's three phase theory , chaotic , pedestrian , bayesian probability , state (computer science) , gaussian , bayesian inference , artificial intelligence , data mining , traffic congestion , engineering , transport engineering , algorithm , physics , electrical engineering , quantum mechanics
The varied road conditions, chaotic and unstructured traffic, lack of lane discipline, and wide variety of vehicles in countries such as India and Pakistan call for a novel traffic monitoring system. In this study, the authors propose a camera‐based traffic monitoring and prediction scheme that requires neither identifying nor tracking individual vehicles. Spatial interest points (SIPs) and spatio‐temporal interest points (STIPs) are extracted from the video stream of road traffic: SIPs capture the total number of vehicles, while STIPs capture the number of moving vehicles. The distributions of these features are then classified using a Gaussian mixture model. The proposed method learns the road state pattern using a dynamic Bayesian network and predicts the future road traffic state within a specific time delay; the predicted road state information can be used for traffic planning. The method is computationally light, yet powerful and efficient, and the algorithm is also tested under different weather conditions. The authors validated the algorithm using the Synchro Studio simulator, obtaining an average accuracy of 95.7%, and achieved an accuracy of 84% on real‐time video.
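The core prediction step described above — learning a road state pattern and forecasting the next state — can be illustrated with a minimal sketch. The paper uses a dynamic Bayesian network over states derived from SIP/STIP feature distributions; the sketch below substitutes the simplest special case of a DBN, a first-order Markov chain over discrete road states, with entirely illustrative state names and observation data (none of these values come from the paper).

```python
# Hedged sketch: first-order Markov chain over hypothetical road states,
# standing in for the paper's dynamic Bayesian network. State labels and
# the observation sequence are illustrative, not from the paper.
from collections import Counter, defaultdict

def learn_transitions(sequence):
    """Estimate P(next_state | state) from an observed state sequence."""
    counts = defaultdict(Counter)
    for s, t in zip(sequence, sequence[1:]):
        counts[s][t] += 1
    return {
        s: {t: c / sum(nxt.values()) for t, c in nxt.items()}
        for s, nxt in counts.items()
    }

def predict(trans, state, steps=1):
    """Greedily follow the most probable transition for `steps` steps."""
    for _ in range(steps):
        state = max(trans[state], key=trans[state].get)
    return state

# Toy sequence of per-interval road states (illustrative only).
seq = ["free", "moderate", "congested", "congested", "moderate",
       "free", "moderate", "congested", "moderate", "moderate"]
trans = learn_transitions(seq)
print(predict(trans, "free", steps=2))  # → congested
```

In the paper the states themselves would come from GMM classification of SIP/STIP counts, and a full DBN would also model the observation distributions rather than assuming the state sequence is directly observed.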