Open Access
Face Detection in a Mixed-Subject Document
Author(s) -
Lhoussaine Bouhou,
Rachid El Ayachi,
Mohamed Baslam,
Mohamed Oukessou
Publication year - 2016
Publication title -
International Journal of Electrical and Computer Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.277
H-Index - 22
ISSN - 2088-8708
DOI - 10.11591/ijece.v6i6.pp2828-2835
Subject(s) - computer science , subject (documents) , face (sociological concept) , face detection , artificial intelligence , matching (statistics) , information retrieval , image (mathematics) , computer vision , pattern recognition (psychology) , facial recognition system , natural language processing , linguistics , world wide web , mathematics , statistics , philosophy
Before anyone can be recognized, it is essential to identify the characteristics that vary from one person to another; among these characteristics are those relating to the face. The detection of skin regions in an image has become an important research topic for locating a face within an image. Unlike previous studies on this topic, which have focused on images containing only faces, this study addresses face detection in mixed-subject documents (text + images). The face detection system developed is based on a hybrid method that distinguishes two categories of objects in the mixed document: the first category is any text or figure containing no skin color, and the second is any figure whose color matches that of skin. In a second phase, the system applies the Template Matching method to identify, among the figures of the second category, only those that contain faces, and detects them. To validate this study, the developed system is tested on various documents containing both text and images.
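The abstract does not give implementation details, but the two-phase pipeline it describes (skin-color filtering to isolate candidate figures, then template matching inside each candidate) can be illustrated with a minimal OpenCV sketch. The skin-color bounds in YCrCb space, the match threshold, and the function and variable names below are assumptions for illustration, not the authors' code.

```python
import cv2
import numpy as np

# Assumed parameters: common YCrCb skin-color bounds and a match threshold.
SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)    # (Y, Cr, Cb)
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)
MATCH_THRESHOLD = 0.6

def detect_faces(document_bgr, face_template_gray):
    """Two-phase sketch: skin-color filtering, then template matching."""
    # Phase 1: keep only regions whose color falls in the skin range;
    # text and non-skin figures produce no candidate regions.
    ycrcb = cv2.cvtColor(document_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    faces = []
    th, tw = face_template_gray.shape[:2]
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        if w < tw or h < th:  # region too small to hold the template
            continue
        region = cv2.cvtColor(document_bgr[y:y + h, x:x + w],
                              cv2.COLOR_BGR2GRAY)
        # Phase 2: template matching inside each skin-colored candidate,
        # retaining only candidates that actually contain a face.
        scores = cv2.matchTemplate(region, face_template_gray,
                                   cv2.TM_CCOEFF_NORMED)
        _, best_score, _, best_loc = cv2.minMaxLoc(scores)
        if best_score >= MATCH_THRESHOLD:
            faces.append((x + best_loc[0], y + best_loc[1], tw, th))
    return faces
```

In this sketch the skin filter acts as a cheap pre-selector, so the relatively expensive template matching runs only on skin-colored regions rather than on the whole mixed document.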
