
Robust Deep Age Estimation Method Using Artificially Generated Image Set
Author(s) - Jang Jaeyoon, Jeon SeungHyuk, Kim Jaehong, Yoon Hosub
Publication year - 2017
Publication title - ETRI Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.295
H-Index - 46
eISSN - 2233-7326
pISSN - 1225-6463
DOI - 10.4218/etrij.17.0117.0078
Subject(s) - deep learning , artificial intelligence , computer science , machine learning , image (mathematics) , estimation , pattern recognition (psychology) , computer vision
Human age estimation is a key component of human–robot and human–computer interaction (HRI/HCI). With the development of deep-learning technologies, age recognition using deep models has recently been attempted. In general, however, deep-learning techniques require a large-scale database, and conventional databases are insufficient for learning age across its many variations. For this reason, we propose an age estimation method that uses artificially generated data. Image data are artificially generated from 3D information, which alleviates the shortage of training data and aids the training of the deep-learning model. Augmentation using 3D information has an advantage over 2D augmentation because it creates new images carrying additional information. We use a deep architecture as a pre-trained model and improve its estimation capacity with the artificially augmented training images. The deep architecture outperforms traditional estimation methods, and the improved method shows increased reliability. We achieved state-of-the-art performance with the proposed method on the MORPH-II dataset, and demonstrated its effectiveness on the Adience dataset.
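The abstract does not specify how the 3D-based augmentation is implemented. As a minimal sketch of the general idea, synthesizing new 2D training views by rotating 3D facial geometry, the following uses hypothetical 3D landmark coordinates, a yaw rotation, and an orthographic projection; the actual method in the paper may differ.

```python
import numpy as np

def yaw_matrix(degrees: float) -> np.ndarray:
    """Rotation about the vertical (y) axis, i.e. turning the head left/right."""
    t = np.radians(degrees)
    return np.array([[ np.cos(t), 0.0, np.sin(t)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(t), 0.0, np.cos(t)]])

def synthesize_views(points_3d: np.ndarray, angles) -> list:
    """Rotate 3D face points by each yaw angle, then drop the depth
    coordinate (orthographic projection) to get one artificial 2D
    'view' per angle."""
    return [(yaw_matrix(a) @ points_3d.T).T[:, :2] for a in angles]

# Hypothetical 3D landmarks (x, y, z), for illustration only.
face = np.array([[0.0, 0.0, 1.0],    # nose tip (protrudes toward camera)
                 [-0.5, 0.3, 0.0],   # left eye
                 [ 0.5, 0.3, 0.0]])  # right eye

# Three artificial views of the same face at different yaw angles.
views = synthesize_views(face, angles=[-30, 0, 30])
```

In a full pipeline, each synthesized view would be rendered as an image and added to the training set, so the pre-trained deep model sees pose variation that the original 2D database lacks.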