For in-depth information regarding the project, please refer to the paper.
To extract the features, first place the images in their corresponding directories (e.g. `data/unparsed/sick` for images of sick individuals), then run `make create-data` in a terminal.
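For intuition, feature extraction amounts to cropping each face image into its constituent regions. Below is a minimal sketch in plain NumPy; the fixed crop boxes and the `extract_features` helper are hypothetical stand-ins for whatever region detection the actual pipeline uses.

```python
import numpy as np

# Hypothetical fixed crop boxes (top, bottom, left, right) for a 128x128
# aligned face image; the real pipeline may locate regions differently.
REGIONS = {
    "eyes":  (30, 55, 20, 108),
    "nose":  (50, 85, 45, 83),
    "mouth": (85, 110, 35, 93),
    "skin":  (10, 30, 40, 88),
}

def extract_features(face: np.ndarray) -> dict:
    """Crop each facial region out of an aligned face image."""
    return {name: face[t:b, l:r] for name, (t, b, l, r) in REGIONS.items()}

face = np.zeros((128, 128, 3), dtype=np.uint8)  # placeholder image
crops = extract_features(face)
print({name: crop.shape for name, crop in crops.items()})
```

Each crop would then be written to the corresponding per-feature directory for training.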
Once the data is created, run `make train-individual` to train all the individual networks (eyes, nose, mouth, and skin), or `make train-stacked` to train a stacked ensemble that parses all the features at once.
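To illustrate the stacking idea, here is a minimal NumPy sketch of a stacked ensemble: synthetic per-feature probabilities stand in for the outputs of the four trained networks, and a small logistic-regression meta-learner combines them into one prediction. The data and meta-learner here are illustrative assumptions, not the project's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-feature "base network" outputs: probability of the sick class
# from each of the four feature networks (eyes, nose, mouth, skin).
# In the real project these would come from the trained CNNs.
n_samples = 200
labels = rng.integers(0, 2, size=n_samples)
# Each base model is a noisy view of the true label.
base_preds = np.clip(
    labels[:, None] * 0.6 + 0.2 + rng.normal(0, 0.15, size=(n_samples, 4)),
    0.0, 1.0,
)

# Meta-learner: logistic regression over the stacked base predictions,
# fitted with plain gradient descent.
w = np.zeros(4)
b = 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(base_preds @ w + b)))
    grad = p - labels                      # dL/dz for the logistic loss
    w -= lr * base_preds.T @ grad / n_samples
    b -= lr * grad.mean()

stacked = (1.0 / (1.0 + np.exp(-(base_preds @ w + b))) > 0.5).astype(int)
accuracy = (stacked == labels).mean()
print(round(accuracy, 2))
```

The meta-learner learns how much to trust each feature network, which is what distinguishes stacking from simply averaging the individual predictions.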
To remove the created data, run `make clean-data`. Similarly, `make clean-results` removes any saved models, training histories, and generated plots.
For Python environment details, please check `environment.py`.
- `augment`: folder containing the code for a neural style transfer network
- `categorization`: folder containing a convolutional neural network that categorizes the images
- `data`: folder containing the collected data set, drawn from:
  - SoF Dataset
  - IST-EURECOM Light Field Face Database
  - CVL Face Database
  - Chicago Faces Dataset
  - YMU and VMU
A. Sepas-Moghaddam, V. Chiesa, P.L. Correia, F. Pereira, J. Dugelay, “The IST-EURECOM Light Field Face Database”, International Workshop on Biometrics and Forensics, IWBF 2017, Coventry, UK, April 2017
Mahmoud Afifi and Abdelrahman Abdelhamed, "AFIF4: Deep gender classification based on an AdaBoost-based fusion of isolated facial features and foggy faces". Journal of Visual Communication and Image Representation, 2019.
P. Peer, Ž. Emeršič, J. Bule, J. Žganec Gros, V. Štruc, "Strategies for exploiting independent cloud implementations of biometric experts in multibiometric scenarios", Mathematical Problems in Engineering, vol. 2014, pp. 1-15, 2014.