Please use this identifier to cite or link to this item: https://ir.swu.ac.th/jspui/handle/123456789/22172
Full metadata record
DC Field: Value
dc.contributor.advisor: Sirisup Laohakiat
dc.contributor.author: Panida Jitviriyavasin
dc.contributor.author: Kannicha Khamjring
dc.contributor.author: Pacharasiri Siriyom
dc.date.accessioned: 2022-06-21T03:28:36Z
dc.date.available: 2022-06-21T03:28:36Z
dc.date.issued: 2021
dc.identifier.uri: https://ir.swu.ac.th/jspui/handle/123456789/22172
dc.description.abstract: Currently, automatic age and gender prediction based on face detection draws wide attention owing to its broad range of applications. In this study, we build a system comprising age, gender, and emotion prediction models. Based on a deep convolutional neural network architecture, the age and gender models are trained on a public dataset of 14,000 instances. After implementing the primary models in Keras, we convert them with TFLiteConverter so that they can be deployed as a mobile application. The performance of the three models is as follows: using MAE as the evaluation metric, the age model yields an MAE of 0.1668; the gender model achieves an accuracy of 0.95; and the emotion prediction model achieves an accuracy of 0.62. We found that model inaccuracy was caused in part by images with nonstandard poses, for example skewed faces, distant faces, makeup on the faces, and the lighting and shadows of the image. Reducing these factors can improve the accuracy of the models.
dc.language: en
dc.publisher: Department of Computer Science, Srinakharinwirot University
dc.subject: Age detection
dc.subject: Face detection
dc.subject: Gender detection
dc.subject: Keras
dc.subject: Tensorflow
dc.title: Face, age and gender identification system for application
dc.type: Working Paper
Appears in Collections: ComSci-Senior Projects

Files in This Item:
There are no files associated with this item.


Items in SWU repository are protected by copyright, with all rights reserved, unless otherwise indicated.
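The abstract reports MAE (mean absolute error) as the evaluation metric for the age model. As a quick illustration of how such a score is computed, here is a minimal sketch in plain Python; the prediction and label values below are made-up examples, not the project's data:

```python
def mean_absolute_error(y_true, y_pred):
    """MAE: the mean of |y_true - y_pred| over all samples."""
    if len(y_true) != len(y_pred):
        raise ValueError("y_true and y_pred must have the same length")
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical age labels and predictions, scaled to [0, 1]
# (one plausible setup under which an MAE like the reported
# 0.1668 would be interpreted as a fraction of the age range).
y_true = [0.25, 0.40, 0.31, 0.58]
y_pred = [0.20, 0.45, 0.30, 0.50]
print(mean_absolute_error(y_true, y_pred))  # mean absolute gap, here 0.0475
```

A lower MAE means predictions sit closer to the true ages on average; unlike accuracy (used for the gender and emotion models), it measures a continuous error rather than a correct/incorrect rate.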