GestureRec

Digital Systems project - Computer Engineering at Alma Mater Studiorum (University of Bologna)

What is it about?

This is a university project created for the Digital Systems course held by Professor Matteo Poggi and Professor Stefano Mattoccia at the University of Bologna. 📚

Abstract

GestureRec is an Android application designed to support people who are mute and/or deaf. It uses the device's camera to recognize sign-language gestures and transcribes them into text. The application is built around three main features:

  1. Photo mode: the user frames a gesture with the camera and takes a picture; the image is processed and the recognized meaning of the gesture is shown (a minimal inference sketch follows this list).
  2. Video mode: the user records a video in which several gestures are performed; the video is processed and the meaning attributed to the recorded word or phrase is displayed on screen.
  3. Audio mode: the user records audio and what has been said is transcribed on screen (a feature designed to let a hearing person communicate quickly with a deaf person).
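Under the hood, each recognition feature comes down to feeding a captured frame to the trained classifier described below, which runs on the device as a TFLite model. The following is a minimal Python sketch of that classification step; the model file name, the 224×224 input size, the 0-1 scaling and the label handling are illustrative assumptions, not taken from the app's code.

```python
# Minimal sketch: classify one captured frame with the TFLite model.
# ASSUMPTIONS: the model file name, the 224x224 RGB input size, the
# 0-1 scaling and the label ordering are illustrative, not taken
# from the repository.
import numpy as np
import tensorflow as tf
from PIL import Image

interpreter = tf.lite.Interpreter(model_path="NeuralNetwork.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Load and preprocess the frame captured by the camera.
img = Image.open("gesture.jpg").convert("RGB").resize((224, 224))
x = np.expand_dims(np.asarray(img, dtype=np.float32) / 255.0, axis=0)

# Run inference and pick the most likely of the 29 classes.
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
probs = interpreter.get_tensor(output_details[0]["index"])[0]
print("Predicted class index:", int(np.argmax(probs)))
```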

TensorFlow 2.7.0 with Keras was used to develop the network. The MobileNet model was chosen because it is well suited to mobile devices, being lightweight and requiring limited resources. The network was trained on a dataset of about 7,760 images for each of the 29 classes (the letters of the alphabet plus some standard gestures), for a total of 225,040 images.

Dataset: 80% training set, 20% validation set.
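As a rough illustration of how a MobileNet-based classifier with an 80/20 train/validation split could be put together in Keras, here is a minimal sketch; the dataset path, image size, frozen backbone and hyperparameters are assumptions and may differ from what NeuralNetwork.py actually does.

```python
# Minimal sketch of a MobileNet-based classifier with an 80/20 split.
# ASSUMPTIONS: dataset path, image size and hyperparameters are
# illustrative, not copied from NeuralNetwork.py.
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 29

# 80% of the images go to training, 20% to validation.
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "dataset/", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "dataset/", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32)

# MobileNet backbone, lightweight and suited to mobile devices.
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # keep the pretrained backbone frozen

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255),
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
model.save("NeuralNetwork.h5")
```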

The NeuralNetwork.h5 model was produced with the following code: NeuralNetwork.py.
The testNeuralNetwork.p file was used to test the model.
Training was carried out for 10 epochs and reached a final accuracy of 99.76%. The model was later converted to a TFLite model via the toTFlite.py file for use in the mobile app.
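The conversion step could look roughly like the snippet below; the output file name and the absence of quantization are assumptions, since the exact options used in toTFlite.py are not shown here.

```python
# Minimal sketch of converting the trained Keras model to TFLite.
# ASSUMPTIONS: the output file name and conversion options are
# illustrative; toTFlite.py may, for example, apply quantization.
import tensorflow as tf

model = tf.keras.models.load_model("NeuralNetwork.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("NeuralNetwork.tflite", "wb") as f:
    f.write(tflite_model)
```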

Neural network model: [architecture image]

Model results: [results image]

Screenshots of the app

Some screenshots of the app (more can be found in the Images directory):

[demo GIF and screenshots]

Credits 🫂

Bibliography
