XR-FaceMask-EmoClass

This is the source code accompanying our paper "Exploring the Impact of Partial Occlusion on Emotion Classification From Facial Expressions: A Comparative Study of XR Headsets and Face Masks".

Instructions

This code has been tested on Windows, but it should work on Linux and macOS with minimal modifications.

  1. Put your raw images into "assets/raw_dataset/aligned/".
  2. Put the labels file into "assets/raw_dataset/list_partition_label.txt" (see the parsing sketch after this list).
  • Format: one line per instance.
    • train_<number>.jpg <class> for train instances.
    • test_<number>.jpg <class> for test instances.
  3. Set rebuild_dataset = True if you want to generate the VR and Masked datasets.
  4. Execute main.py.
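
A minimal sketch of how the labels/partition file described in step 2 could be parsed, assuming exactly the format listed above; the actual loading logic lives in main.py and may differ:

```python
# Hypothetical helper: split list_partition_label.txt into train/test records.
from pathlib import Path

LABELS_FILE = Path("assets/raw_dataset/list_partition_label.txt")

def load_partition(labels_file: Path):
    """Return (train, test) lists of (filename, class_id) tuples."""
    train, test = [], []
    for line in labels_file.read_text().splitlines():
        if not line.strip():
            continue
        filename, class_id = line.split()
        record = (filename, int(class_id))
        if filename.startswith("train_"):
            train.append(record)
        elif filename.startswith("test_"):
            test.append(record)
    return train, test

if __name__ == "__main__":
    train, test = load_partition(LABELS_FILE)
    print(f"{len(train)} training and {len(test)} test instances")
```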

RAF-DB dataset

Access to the RAF-DB dataset can be requested here: http://www.whdeng.cn/raf/model1.html

If you are using the RAF-DB dataset, just put the aligned images directly into "assets/raw_dataset/aligned/" and the labels file into "assets/raw_dataset/list_partition_label.txt".

Data augmentation

The method used to frontalize and de-occlude the faces for data augmentation is called CFR-GAN, and can be found here: https://github.com/yeongjoonJu/CFR-GAN

We applied CFR-GAN to the training set, and then manually copied the de-occluded and frontalized instances of the minority classes into the RAF-DB dataset to augment it (a sketch of that copying step is shown below).
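
The sketch below only illustrates that copying step; it is not the script we used. It assumes the CFR-GAN outputs keep their original filenames and sit in a hypothetical cfr_gan_output/ folder, and the minority-class threshold is arbitrary:

```python
# Illustrative only: copy de-occluded/frontalized CFR-GAN outputs of the
# minority training classes back into the dataset folder. Matching entries
# must also be appended to list_partition_label.txt.
import shutil
from collections import Counter
from pathlib import Path

LABELS_FILE = Path("assets/raw_dataset/list_partition_label.txt")
ALIGNED_DIR = Path("assets/raw_dataset/aligned")
CFR_GAN_DIR = Path("cfr_gan_output")  # hypothetical CFR-GAN output folder

train = [line.split() for line in LABELS_FILE.read_text().splitlines()
         if line.startswith("train_")]
counts = Counter(cls for _, cls in train)
minority = {cls for cls, n in counts.items() if n < max(counts.values()) // 2}

for name, cls in train:
    src = CFR_GAN_DIR / name
    if cls in minority and src.exists():
        # The "train_aug_" naming is arbitrary; keep whatever convention main.py expects.
        shutil.copy(src, ALIGNED_DIR / f"train_aug_{name.removeprefix('train_')}")
```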

GPU training

If you want to train on a GPU (considerably faster) on Windows, you can refer to the following links; instructions should be similar for Linux and macOS. A quick snippet to verify that the GPU is visible follows the links:

  1. https://medium.com/@ashkan.abbasi/quick-guide-for-installing-python-tensorflow-and-pycharm-on-windows-ed99ddd9598
  2. https://discuss.tensorflow.org/t/tensorflow-gpu-not-working-on-windows/13120/3
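
Once the environment is set up (the links above assume a TensorFlow-based installation), a quick way to check that the GPU is visible:

```python
# Sanity check: list the GPUs TensorFlow can see (assumes TensorFlow is
# already installed; if the list is empty, training will run on the CPU).
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("GPUs detected:", gpus if gpus else "none")
```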

Inference and trained models

Coming soon...

Notes

The expected image size is (224, 224, 3), i.e., 224×224 pixels with 3 color channels. Please resize your images to this size; an illustrative snippet follows.
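
A minimal resizing sketch, assuming the images are JPEGs stored in "assets/raw_dataset/aligned/" and that Pillow is installed (this snippet is not part of the repository's code):

```python
# Resize every image in the aligned folder to the expected 224x224 resolution.
from pathlib import Path
from PIL import Image

SRC_DIR = Path("assets/raw_dataset/aligned")

for img_path in SRC_DIR.glob("*.jpg"):
    with Image.open(img_path) as img:
        resized = img.convert("RGB").resize((224, 224))
    resized.save(img_path)  # overwrite in place
```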

Citation

A. Casas-Ortiz, J. Echeverria, N. Jimenez-Tellez and O. C. Santos, "Exploring the Impact of Partial Occlusion on Emotion Classification From Facial Expressions: A Comparative Study of XR Headsets and Face Masks," in IEEE Access, vol. 12, pp. 44613-44627, 2024, doi: 10.1109/ACCESS.2024.3380439.

BibTeX:

@ARTICLE{10477424,
  author={Casas-Ortiz, Alberto and Echeverria, Jon and Jimenez-Tellez, Nerea and Santos, Olga C.},
  journal={IEEE Access}, 
  title={Exploring the Impact of Partial Occlusion on Emotion Classification From Facial Expressions: A Comparative Study of XR Headsets and Face Masks}, 
  year={2024},
  volume={12},
  number={},
  pages={44613-44627},
  keywords={Face recognition;Headphones;Measurement;Reviews;Emotion recognition;Transfer learning;Faces;Extended reality;Emotion classification;emotion recognition;facial expression analysis;partial occlusion;transfer learning;deep learning;extended reality;face masks;HMD;XR headset},
  doi={10.1109/ACCESS.2024.3380439}}

About

XR-FaceMask-EmoClass is a repository containing code and models for exploring the impact of partial occlusion on emotion classification from facial expressions using extended reality (XR) headsets and face masks.
