Open Access
UFSP-Net: a neural network with spatio-temporal information fusion for urban fire situation prediction
Author(s) -
Guangyin Jin,
Cunchao Zhu,
Bo Ji,
Hao Sha,
Xingchen Hu,
Jincai Huang
Publication year - 2020
Publication title -
IOP Conference Series: Materials Science and Engineering
Language(s) - English
Resource type - Journals
eISSN - 1757-899X
pISSN - 1757-8981
DOI - 10.1088/1757-899x/853/1/012050
Subject(s) - computer science , graph , convolutional neural network , artificial intelligence , recurrent neural network , process (computing) , artificial neural network , machine learning , data mining , theoretical computer science , operating system
Capturing the dynamics of the urban fire situation is a basic but challenging task, which plays an indispensable role in urban security and fire emergency decision-making. Traditional methods approach urban fire prediction via stochastic processes grounded in physics or statistics, which may be interpretable but are less practical in real applications. Recently, data-driven models such as the Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), and Graph Convolutional Neural Network (GCN) have proven fruitful in capturing spatio-temporal dynamics from massive high-dimensional data. In this paper, we process a regional urban fire dataset covering the most recent six years into fire situation awareness images (FSAIs) and extract pixel-level latent representations with CNNs, while GCNs process spatial graph-structured auxiliary information to obtain graph-level latent representations. We then formulate the Urban Fire Situation Prediction Neural Network (UFSP-Net), a novel urban fire prediction model that integrates these two kinds of spatial latent representations with an RNN structure. Compared with baseline algorithms such as Conv-RNN, UFSP-Net demonstrates superior prediction performance for multiple types of urban fire at the spatio-temporal scale.
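The following is a minimal, self-contained sketch (in PyTorch) of the fusion idea the abstract describes: a CNN encodes each FSAI frame into a pixel-level embedding, a simple GCN encodes spatial-graph auxiliary features into a graph-level embedding, the two are concatenated per time step, and a GRU models the temporal sequence before a prediction head. All layer sizes, the 32x32 grid resolution, the number of graph nodes, the fire-type count, and the class and variable names are illustrative assumptions, not the authors' actual UFSP-Net configuration.

# Sketch only: layer widths, grid size, and node counts are assumptions,
# not the configuration reported in the paper.
import torch
import torch.nn as nn


class SimpleGCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W), where A_hat is a
    normalized adjacency matrix (self-loops assumed added by the caller)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, a_hat):
        return torch.relu(a_hat @ self.linear(h))


class UFSPNetSketch(nn.Module):
    def __init__(self, n_nodes, node_feat_dim, n_fire_types, hidden=64):
        super().__init__()
        # Pixel-level branch: small CNN over each FSAI frame (one channel per fire type).
        self.cnn = nn.Sequential(
            nn.Conv2d(n_fire_types, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, hidden),
        )
        # Graph-level branch: two GCN layers over auxiliary node features,
        # mean-pooled into a single graph embedding.
        self.gcn1 = SimpleGCNLayer(node_feat_dim, hidden)
        self.gcn2 = SimpleGCNLayer(hidden, hidden)
        # Temporal branch: GRU over the fused per-step embeddings.
        self.rnn = nn.GRU(2 * hidden, hidden, batch_first=True)
        # Prediction head: next-step fire intensity per type per grid cell.
        self.head = nn.Linear(hidden, n_fire_types * 32 * 32)

    def forward(self, fsai_seq, node_feats, a_hat):
        # fsai_seq: (batch, time, n_fire_types, 32, 32)
        # node_feats: (n_nodes, node_feat_dim); a_hat: (n_nodes, n_nodes)
        b, t = fsai_seq.shape[:2]
        img_emb = self.cnn(fsai_seq.flatten(0, 1)).view(b, t, -1)
        graph_emb = self.gcn2(self.gcn1(node_feats, a_hat), a_hat).mean(0)
        graph_emb = graph_emb.expand(b, t, -1)           # static per time step
        fused = torch.cat([img_emb, graph_emb], dim=-1)  # spatial fusion
        _, last = self.rnn(fused)                        # temporal modelling
        return self.head(last[-1]).view(b, -1, 32, 32)


# Toy usage with random tensors, just to verify the shapes flow through.
model = UFSPNetSketch(n_nodes=10, node_feat_dim=8, n_fire_types=3)
fsai_seq = torch.rand(2, 6, 3, 32, 32)       # batch of 2, 6 past time steps
node_feats = torch.rand(10, 8)
adj = torch.eye(10)                          # placeholder normalized adjacency
print(model(fsai_seq, node_feats, adj).shape)  # torch.Size([2, 3, 32, 32])

In this sketch the graph embedding is treated as static context repeated at every time step; whether the auxiliary graph information in UFSP-Net is static or time-varying is not specified in the abstract, so this is one plausible design choice among several.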