Abstract:
Automatic age and gender prediction based on face detection currently attracts considerable attention due to its wide range of applications. In this study, we build a system consisting of age, gender, and emotion prediction models. The age and gender models are based on a deep convolutional neural network architecture and are trained on a public dataset of 14,000 instances. After implementing the primary models in Keras, we convert them with TFLiteConverter so that they can be deployed as a mobile application. The performance of the three models is as follows: using MAE as the evaluation metric, the age model yields an MAE of 0.1668, the gender model achieves an accuracy of 0.95, and the emotion prediction model achieves an accuracy of 0.62. We found that model inaccuracy was caused mainly by images with nonstandard conditions, such as skewed faces, distant faces, facial makeup, and the lighting and shadows in the image. Reducing these factors can improve the accuracy of the models.
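
The conversion step mentioned above can be illustrated with a minimal sketch, assuming a TensorFlow 2.x Keras model; the file names are illustrative and not taken from the paper.

```python
import tensorflow as tf

# Load a trained Keras model (hypothetical file name, for illustration only).
model = tf.keras.models.load_model("age_model.h5")

# Convert the Keras model to TensorFlow Lite for mobile deployment.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Write the converted flatbuffer to disk so it can be bundled with a mobile app.
with open("age_model.tflite", "wb") as f:
    f.write(tflite_model)
```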