
PyTorch implementation of the "Transformer"

This work is a simple implementation of the model described in the seminal paper "Attention Is All You Need".

The model is implemented in PyTorch and tested on translation tasks from English to German and from German to English.
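For reference, the core operation of that model is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. Below is a minimal PyTorch sketch of that mechanism for illustration only; it is not the code used in this repository, and the function name is purely illustrative:

import math
import torch

def scaled_dot_product_attention(query, key, value, mask=None):
    # query, key, value: tensors of shape (batch, heads, seq_len, d_k)
    d_k = query.size(-1)
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    scores = query @ key.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        # Mask out padding or future positions before the softmax
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ value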

This implementation uses Poetry as its package and dependency manager.
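Assuming a standard Poetry setup with a pyproject.toml at the repository root, the dependencies can typically be installed with:

poetry install

The commands below can then be run inside the resulting virtual environment, for example by prefixing them with poetry run.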

Usage

The model can be trained and tested using the command line.

Training the transformer

To train the transformer on a translation task, use this command:

python -m translation train --language en_de --epochs 8

Testing the transformer

The transformer can be tested in two ways. It can be evaluated on examples from the validation dataset:

python -m translation examples --language en_de --nb 10

Or it can be run on an arbitrary sentence in the source language:

python -m translation infer "The quick brown fox jumps over the lazy dog." --language en_de

This work is inspired by the Harvard NLP Annotated Transformer.
