Open Access
COMPARISON OF FACE DETECTION TOOLS
Author(s) -
Ye. Amirgaliyev,
A. R. Sadykova,
Ch. Kenshimov
Publication year - 2021
Publication title -
Habarshy. Fizika-matematika seriyasy (Bulletin. Series of Physics and Mathematics)
Language(s) - English
Resource type - Journals
ISSN - 1728-7901
DOI - 10.51889/2021-4.1728-7901.08
Subject(s) - computer science , face detection , facial recognition system , machine learning , artificial intelligence , pattern recognition , data mining
This article evaluates the face detection tools Dlib, OpenCV, MTCNN, and FaceNet. For each tool, the execution time and the number of faces detected were measured. The results, presented as graphs, support choosing the tool best suited to subsequent research; the choice was also guided by how easily a parallel algorithm could be written for each tool. The selection is further justified by each tool's machine-resource usage, which makes it possible to pick suitable hardware without additional or large costs. A comparative analysis of each tool was performed and the results recorded accordingly. Based on the test results, we distinguish two cases and give a recommendation for each. The first case applies when only fast face detection in video is required; here the Dlib tool should be used. The second case applies when detecting as many faces as possible in the video matters more; here tools such as FaceNet or MTCNN can be chosen. The results obtained during the research are presented as graphs and tables and recorded in the conclusion section of this article.
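The two metrics the abstract names, execution time and detection count, can be gathered with a small timing harness. The sketch below is a minimal, hypothetical illustration of that kind of benchmark, not the article's actual code; the detector callables are assumptions that would, in practice, wrap Dlib, OpenCV, MTCNN, or FaceNet.

```python
import time

def benchmark(detector, frames):
    """Run one detector over a sequence of frames and return the two
    metrics compared in the article: (total faces detected, elapsed seconds).
    `detector` is any callable mapping a frame to a list of face boxes."""
    start = time.perf_counter()
    total = sum(len(detector(frame)) for frame in frames)
    elapsed = time.perf_counter() - start
    return total, elapsed

def compare(detectors, frames):
    """Tabulate detection count and runtime per tool, one row per detector,
    mirroring the per-tool comparison the article presents as graphs."""
    return {name: benchmark(fn, frames) for name, fn in detectors.items()}
```

In practice one would register the real tools, for example `{"dlib": dlib.get_frontal_face_detector(), ...}`, and feed in decoded video frames; a tool that is fast but finds fewer faces would then fall into the article's first case, and a slower but more thorough one into the second.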