Open Access
Development of an algorithm for object access system automation based on face recognition using a neural network
Author(s) -
V. S. Orlenko,
I. I. Kolosinsʹkyy
Publication year - 2020
Publication title -
zv'âzok
Language(s) - English
Resource type - Journals
ISSN - 2412-9070
DOI - 10.31673/2412-9070.2020.033437
Subject(s) - artificial neural network, computer science, face (sociological concept), artificial intelligence, identification (biology), facial recognition system, set (abstract data type), field (mathematics), point (geometry), automation, algorithm, object (grammar), pattern recognition (psychology), time delay neural network, computer vision, mathematics, engineering, mechanical engineering, social science, botany, geometry, sociology, pure mathematics, biology, programming language
The article deals with the technical side of face recognition, namely the neural network. The advantages of a neural network for personal identification are substantiated, and the stages of comparing two images are considered. The first stage is defined as locating the face in the photograph. Through a series of tests, the best neural network was identified, one that efficiently produces a normalized image of a person's face. The second stage is extracting the facial features on which the comparative analysis is performed. This stage is the main focus of the article: 16 sets of tests were carried out, each containing 12 tests. Two large datasets were used in the study so that the effectiveness of the algorithms could be evaluated not only under ideal conditions but also in the field. The results made it possible to determine the best method and neural model for finding a face and dividing it into parts. It was also determined which part of the face the algorithm recognizes best, which will allow adjustments to be made to the camera placement.
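To make the two-stage pipeline described above concrete, the sketch below uses the open-source face_recognition library (built on dlib) to locate a face, extract a 128-dimensional feature encoding, and compare two photographs by Euclidean distance. This is only an illustrative assumption about the tooling, not the specific networks or test procedure evaluated in the article; the file names and the 0.6 threshold are hypothetical.

    import face_recognition

    # Stage 1: load each photo and locate the face; detection and landmark
    # alignment (normalization) are handled internally by the library.
    image_a = face_recognition.load_image_file("person_a.jpg")  # hypothetical file
    image_b = face_recognition.load_image_file("person_b.jpg")  # hypothetical file

    locations_a = face_recognition.face_locations(image_a, model="hog")
    locations_b = face_recognition.face_locations(image_b, model="hog")

    # Stage 2: compute a 128-dimensional feature vector for each detected face.
    encodings_a = face_recognition.face_encodings(image_a, known_face_locations=locations_a)
    encodings_b = face_recognition.face_encodings(image_b, known_face_locations=locations_b)

    if encodings_a and encodings_b:
        # Compare the two faces by Euclidean distance between encodings;
        # 0.6 is the library's conventional decision threshold, used here as an example.
        distance = face_recognition.face_distance([encodings_a[0]], encodings_b[0])[0]
        print("distance:", distance, "same person:", distance < 0.6)
    else:
        print("No face found in one of the images.")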
