Description

This is a repository of source code for training, evaluating and testing neural network models for image classification on the AffectNet[1][2] dataset. The neural network architectures are mainly taken from the Keras implementation[3], but in some cases from other sources[4][5][6][7][8]. Trained models are also available from this repository, both in the standard TensorFlow 2 format and as converted, optimised models in TensorFlow Lite format.
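The workflow above can be sketched as follows: build a classifier from a Keras Applications backbone, then convert it to TensorFlow Lite for mobile use. The backbone choice (MobileNetV2), head layout and class count (8 AffectNet emotion categories) are illustrative assumptions, not the repository's exact configuration.

```python
import tensorflow as tf

NUM_CLASSES = 8  # AffectNet emotion categories (assumption)

# Keras Applications backbone without the ImageNet head
backbone = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights=None)

# Classification head on top of the backbone
model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

# Convert to TensorFlow Lite with default optimisations for mobile deployment
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()  # serialized FlatBuffer, ready to ship
```

For a regression model, the final layer would instead be a `Dense(2)` (valence and arousal) with a linear activation.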

In total, 63 classification and 28 regression models were trained, to compare architectures and decide which are suitable for use on mobile devices. For testing on real devices, an Android application was developed[9] and released[10].

For each classification model, an average classification accuracy and a confusion matrix are reported. For each regression model, the arousal, valence and average RMSE values are given. All metrics are computed on a test batch of images from the AffectNet dataset.
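The reported metrics can be reproduced with a few lines of plain Python. The toy labels below are illustrative; the repository's actual evaluation scripts may differ.

```python
import math

def confusion_matrix(y_true, y_pred, num_classes):
    """Rows are true classes, columns are predicted classes."""
    m = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Toy classification results over 3 classes (3 of 4 correct)
cm = confusion_matrix([0, 1, 2, 1], [0, 2, 2, 1], num_classes=3)
acc = accuracy([0, 1, 2, 1], [0, 2, 2, 1])  # 0.75

# Toy valence/arousal predictions; AffectNet annotates both in [-1, 1]
valence_rmse = rmse([0.5, -0.2], [0.4, -0.1])
arousal_rmse = rmse([0.1, 0.3], [0.2, 0.2])
avg_rmse = (valence_rmse + arousal_rmse) / 2
```

The "average RMSE" for a regression model is simply the mean of the valence and arousal RMSE values, as computed in the last line.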

My diploma thesis[11] (in Czech only) provides more information about the influence of architectural and training parameters of neural networks on total model latency, classification accuracy and RMSE results. A scientific publication in English is also available[12]: Facial Emotion Recognition for Mobile Devices: A Practical Review.

References

[1] http://mohammadmahoor.com/affectnet/

[2] http://mohammadmahoor.com/wp-content/uploads/2017/08/AffectNet_oneColumn-2.pdf

[3] https://keras.io/api/applications/

[4] https://github.com/YeFeng1993/GhostNet-Keras

[5] https://github.com/abhoi/Keras-MnasNet

[6] https://github.com/Haikoitoh/paper-implementation/blob/main/ShuffleNet.ipynb

[7] https://github.com/opconty/keras-shufflenetV2

[8] https://github.com/cmasch/squeezenet/blob/master/squeezenet.py

[9] https://github.com/VojtaMaiwald/FaceEmotionRecognitionTest

[10] https://play.google.com/store/apps/details?id=cz.vsb.faceemotionrecognition

[11] https://dspace.vsb.cz/handle/10084/151688

[12] https://ieeexplore.ieee.org/document/10414102