
Human Movement Detection using Recurrent Convolutional Neural Networks
Publication year - 2019
Publication title - International Journal of Innovative Technology and Exploring Engineering
Language(s) - English
Resource type - Journals
ISSN - 2278-3075
DOI - 10.35940/ijitee.l1090.10812s19
Subject(s) - artificial intelligence , human skeleton , computer vision , computer science , convolutional neural network , skeleton (computer programming) , set (abstract data type) , movement (music) , image (mathematics) , convolution (computer science) , human body , artificial neural network , pattern recognition (psychology) , philosophy , programming language , aesthetics
Human movement detection is vital in telepresence robots, animation, games, and robotic motion. With traditional methods based on sensor suits, it is difficult to detect and interpret movements: they produce large volumes of sensor data that is hard to interpret, map to actions, and transmit over long distances, and the equipment is also expensive and bulky. Image processing and computer vision provide a solution for detecting and interpreting human movement based on an R-CNN approach; the algorithm is cheap, simple, and lightweight. It takes a video as input, divides it into frames, and separates the human body from the background in each frame. This paper focuses mainly on the skeleton, its major key points, and their relative positions in successive frames. A set of frames (a video) is given as input to the model, which compares the coordinates of successive frames to estimate the movement. First, the human is identified and separated from the rest of the image by drawing a bounding box around the human using a CNN (convolutional neural network); then, by applying R-CNN, the human is segmented and converted to a skeleton. From the shape of the skeleton, we can identify whether or not it is that of a human. Comparing the relative coordinates of skeletons extracted from frames captured over time yields the movement of the human and its direction.
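The final step described in the abstract, comparing skeleton keypoint coordinates across successive frames to estimate movement and its direction, can be sketched as follows. This is a minimal illustration only: the keypoint names, the averaging scheme, and the displacement threshold are assumptions for the sketch, not details taken from the paper.

```python
def estimate_movement(prev_keypoints, curr_keypoints, threshold=2.0):
    """Estimate movement direction between two frames.

    prev_keypoints / curr_keypoints: dicts mapping a keypoint name
    (e.g. "head", "hip" -- illustrative names, not from the paper)
    to its (x, y) pixel coordinate in that frame.

    Returns one of "left", "right", "up", "down", or "still".
    """
    dxs, dys = [], []
    for name, (x1, y1) in prev_keypoints.items():
        if name not in curr_keypoints:
            continue  # keypoint not detected in the current frame
        x2, y2 = curr_keypoints[name]
        dxs.append(x2 - x1)
        dys.append(y2 - y1)
    if not dxs:
        return "still"  # no shared keypoints to compare
    # Average displacement over all shared keypoints.
    dx = sum(dxs) / len(dxs)
    dy = sum(dys) / len(dys)
    # Small displacements (below an assumed pixel threshold) count as no movement.
    if max(abs(dx), abs(dy)) < threshold:
        return "still"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # image y-axis grows downward
```

In practice this comparison would run over every consecutive pair of frames in the video, after the CNN/R-CNN stages have produced a skeleton for each frame.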