This is an implementation of an arbitrary style transfer algorithm running purely in the browser using TensorFlow.js. As with all neural style transfer algorithms, a neural network attempts to "draw" one picture, the Content (usually a photograph), in the style of another, the Style (usually a painting).
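At a high level, browser-side inference can look roughly like the sketch below. This is not the repository's exact code: the model paths, the two-network split (a style-prediction network that compresses the style image into a small vector, and a transformer network that redraws the content image conditioned on it), and the function names are assumptions based on the general arbitrary style transfer approach.

```js
// Minimal sketch of arbitrary style transfer in the browser with TensorFlow.js.
// NOTE: the model paths and the two-network layout below are assumptions,
// not this repository's exact files.
import * as tf from '@tensorflow/tfjs';

async function stylize(contentEl, styleEl, canvas) {
  // Hypothetical hosted GraphModels: a style-prediction network and an
  // image-transformer network.
  const styleNet = await tf.loadGraphModel('models/style_prediction/model.json');
  const transformNet = await tf.loadGraphModel('models/style_transform/model.json');

  const stylized = tf.tidy(() => {
    // Convert DOM images (or webcam frames) to float tensors in [0, 1].
    const content = tf.browser.fromPixels(contentEl).toFloat().div(255).expandDims(0);
    const style = tf.browser.fromPixels(styleEl).toFloat().div(255).expandDims(0);

    // The style network distills the style image into a compact bottleneck vector.
    const bottleneck = styleNet.predict(style);

    // The transformer network redraws the content image conditioned on that vector.
    return transformNet.predict([content, bottleneck]).squeeze();
  });

  // Draw the stylized result into a canvas on the page.
  await tf.browser.toPixels(stylized, canvas);
  stylized.dispose();
}
```

Because the transformer network is feed-forward, swapping in a new content or style image only requires another forward pass; wrapping the tensor work in tf.tidy keeps intermediate tensors from leaking GPU memory.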
- The user can select a custom image from their desktop or even capture one with the webcam.
- It works for any pair of content and style images.
- The user can choose how strongly the style is applied to the original image (see the sketch after this list).
- The user is shown how the image transitions progressively as the stylization runs.
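One common way to implement such a style-strength control (an assumption about the mechanism, not necessarily what this repository does) is to also run the style-prediction network on the content image itself and linearly interpolate the two bottleneck vectors before handing the result to the transformer network:

```js
// Hypothetical helper: blend the content image's own style vector with the
// chosen style's vector. strength = 0 preserves the original look,
// strength = 1 applies the full style.
function blendBottlenecks(contentBottleneck, styleBottleneck, strength) {
  return tf.tidy(() =>
    contentBottleneck.mul(1 - strength).add(styleBottleneck.mul(strength))
  );
}
```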
To run it locally, you must install Yarn and run the following command at the repository's root to get all the dependencies.
yarn run prep
Verify that a .babelrc.json file is present; if the config file is named .babelrc instead, rename it to .babelrc.json.
Then, you can run
yarn run start
You can then browse to localhost:9966 to view the application.
- Ankit Biswas
- Shruti Gour
- Akshit Mittal
- Reiichiro Nakano for this blog post, which I used as a reference.
- Authors of the arbitrary style transfer paper.
- The Magenta repository for arbitrary style transfer.
- Authors of the MobileNet-v2 paper.
- Authors of the paper describing neural network knowledge distillation.