React Toxicity classifier

The React Toxicity Classifier imports a pre-trained model from TensorFlow.js.

The project was bootstrapped with Create React App.
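
As a rough sketch of how such a pre-trained model can be loaded (assuming the @tensorflow/tfjs and @tensorflow-models/toxicity npm packages; the 0.9 threshold is an example value, not necessarily this project's setting):

```js
import * as toxicity from '@tensorflow-models/toxicity';

// Minimum prediction confidence before a label counts as a match.
// 0.9 is an example value, not necessarily the project's setting.
const threshold = 0.9;

export async function loadToxicityModel() {
  // Downloads the pre-trained weights on the first call
  return toxicity.load(threshold);
}
```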

Labels

The pre-trained model is a toxicity classifier that detects whether the text you write in the input field falls under any of 7 different labels:

  • toxic
  • severe_toxic
  • obscene
  • threat
  • insult
  • identity_hate

For each label, the toxicity classifier returns true (toxic) or false (not toxic) depending on the input sentence.
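
A minimal sketch of how a sentence can be checked against those labels, assuming the model loaded above; the property names follow the @tensorflow-models/toxicity API, but the helper name is illustrative:

```js
// Returns true if any label matches the sentence at the chosen threshold.
async function isToxic(model, sentence) {
  const predictions = await model.classify([sentence]);
  // Each prediction covers one label; `match` is true, false, or null
  // (null when the model is not confident either way).
  return predictions.some(p => p.results[0].match === true);
}
```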

Input field

  • The sentence in the input field is submitted with a button and sent to the pre-trained model (see the sketch after this list).
  • The input field can be cleared with a button.
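
A rough sketch of such an input component, assuming React hooks; the component and prop names are illustrative, not the project's actual code:

```jsx
import React, { useState } from 'react';

function SentenceInput({ onSubmit }) {
  const [sentence, setSentence] = useState('');

  return (
    <div>
      <input
        value={sentence}
        onChange={e => setSentence(e.target.value)}
        placeholder="Write a sentence"
      />
      {/* Sends the sentence to the pre-trained model via the parent */}
      <button onClick={() => onSubmit(sentence)}>Submit</button>
      {/* Clears the input field */}
      <button onClick={() => setSentence('')}>Clear</button>
    </div>
  );
}

export default SentenceInput;
```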

Toxic points

  • For every sentence that is labelled true on any of the 7 labels, a toxic point is added underneath the input field (see the sketch after this list).
  • The toxic points can be cleared with a button.
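
A minimal sketch of the point-keeping logic as a custom React hook, assuming the isToxic helper sketched above; the hook name and file location are illustrative:

```jsx
import { useState } from 'react';
import { isToxic } from './isToxic'; // helper sketched above (assumed location)

// Custom hook: tracks toxic points across submitted sentences.
export function useToxicPoints(model) {
  const [points, setPoints] = useState(0);

  async function scoreSentence(sentence) {
    if (await isToxic(model, sentence)) {
      setPoints(p => p + 1); // one point per toxic sentence
    }
  }

  const clearPoints = () => setPoints(0); // wired to the clear button

  return { points, scoreSentence, clearPoints };
}
```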

Table

Each submitted sentence is displayed in the table with the following information (a row sketch follows the list):

  • How much toxicity (as a percentage) the sentence contains
  • An emoji that switches between a happy/love-eyes face (not toxic) and a skull (toxic)
  • Toxic points that increase every time the input sentence is labelled true (toxic)
  • A button to delete one sentence at a time
  • A button to delete all sentences from the table
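
A rough sketch of one table row, assuming each sentence is stored together with its toxicity probability (0–1) from the model; the component and prop names are illustrative:

```jsx
import React from 'react';

function SentenceRow({ sentence, probability, toxic, onDelete }) {
  return (
    <tr>
      <td>{sentence}</td>
      {/* Toxicity shown as a percentage */}
      <td>{Math.round(probability * 100)}%</td>
      {/* Emoji switches between love eyes (not toxic) and a skull (toxic) */}
      <td>{toxic ? '💀' : '😍'}</td>
      {/* Deletes this sentence from the table */}
      <td><button onClick={onDelete}>Delete</button></td>
    </tr>
  );
}

export default SentenceRow;
```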

Footer

In the footer, the following links are displayed:

  • Source code
  • Dataset
  • Toxicity classifier
  • TensorFlow.js
  • Contact information: E-mail, LinkedIn and GitHub

Dataset

TensorFlow.js

To-do

Deployment

Learn More About React

You can learn more in the Create React App documentation.

To learn React, check out the React documentation.
