
An improved age invariant face recognition using data augmentation
Author(s) -
Kennedy Okokpujie,
Samuel John,
Charles Uzoanya Ndujiuba,
Joke A. Badejo,
Etinosa Noma Osaghae
Publication year - 2021
Publication title -
Bulletin of Electrical Engineering and Informatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.251
H-Index - 12
ISSN - 2302-9285
DOI - 10.11591/eei.v10i1.2356
Subject(s) - computer science , artificial intelligence , facial recognition system , pattern recognition (psychology) , preprocessor , benchmarking , convolutional neural network , face (sociological concept) , artificial neural network , machine learning , speech recognition , social science , marketing , sociology , business
In spite of significant advances in face recognition technology, accurately recognizing the face of the same individual across different ages remains an open research question. Face aging causes intra-subject variations (such as geometric changes during childhood and adolescence, and wrinkles and sagging skin in old age) that negatively affect the accuracy of face recognition systems. Over the years, researchers have devised different techniques to improve the accuracy of age-invariant face recognition (AIFR) systems. In this paper, the face and gesture recognition network (FG-NET) aging dataset was adopted to enable benchmarking of the experimental results. The FG-NET dataset was augmented by adding four different types of noise at the preprocessing phase, in order to improve the extraction of trait aging face features and the training model used at the classification stage, thus addressing the problem of the limited availability of aging face datasets for training. The developed model was an adaptation of a pre-trained convolutional neural network architecture (Inception-ResNet-v2), which is very robust to noise. On testing, the proposed model achieved a recognition accuracy of 99.94%, a mean square error of 0.0158, and a mean absolute error of 0.0637. The results obtained are significant improvements in comparison with related works.
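The abstract describes augmenting the FG-NET dataset by injecting four types of noise at the preprocessing stage, but does not name the noise models used. A minimal sketch of such a noise-based augmentation step is shown below; the choice of Gaussian, salt-and-pepper, speckle, and Poisson noise (and all parameter values) is an illustrative assumption, not the authors' confirmed configuration.

```python
import numpy as np

# Assumption: images are float arrays normalized to [0, 1].

def add_gaussian_noise(img, sigma=0.05):
    """Additive zero-mean Gaussian noise."""
    noisy = img + np.random.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0.0, 1.0)

def add_salt_pepper_noise(img, amount=0.02):
    """Randomly flip a fraction of pixels to black (pepper) or white (salt)."""
    noisy = img.copy()
    mask = np.random.random(img.shape)
    noisy[mask < amount / 2] = 0.0          # pepper
    noisy[mask > 1 - amount / 2] = 1.0      # salt
    return noisy

def add_speckle_noise(img, sigma=0.05):
    """Multiplicative (speckle) noise: pixel value scaled by Gaussian factor."""
    noisy = img + img * np.random.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0.0, 1.0)

def add_poisson_noise(img, scale=255.0):
    """Signal-dependent Poisson (shot) noise at an assumed 8-bit intensity scale."""
    noisy = np.random.poisson(img * scale) / scale
    return np.clip(noisy, 0.0, 1.0)

def augment(img):
    """Return the original image plus four noisy variants (5x the data)."""
    return [
        img,
        add_gaussian_noise(img),
        add_salt_pepper_noise(img),
        add_speckle_noise(img),
        add_poisson_noise(img),
    ]
```

Each augmented variant keeps the original identity label, so the classifier (here, a fine-tuned Inception-ResNet-v2) sees more training examples per subject and becomes less sensitive to pixel-level corruption.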