# attn_gan_pytorch

A Python package for the self-attention GAN, implemented as an extension of PyTorch's `nn.Module`. Paper: https://arxiv.org/abs/1805.08318

It also includes generic layers for image-based attention mechanisms, including a Full-Attention layer as proposed in another project of mine here.
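For intuition, the self-attention operation from the SAGAN paper can be sketched with plain NumPy. The function and weight names below are illustrative, not this package's actual API; the computation follows the paper: attention weights from a query/key dot product, a value projection, and a learned residual gate `gamma` initialised to zero.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wf, Wg, Wh, gamma):
    # x: (N, C) flattened feature map, N = H*W spatial positions
    f = x @ Wf                          # queries, (N, C')
    g = x @ Wg                          # keys,    (N, C')
    h = x @ Wh                          # values,  (N, C)
    attn = softmax(f @ g.T, axis=-1)    # (N, N): each position attends to all others
    o = attn @ h                        # aggregate values over all positions
    return gamma * o + x                # residual blend, gated by gamma

rng = np.random.default_rng(0)
N, C, Cp = 16, 8, 4
x = rng.standard_normal((N, C))
Wf = rng.standard_normal((C, Cp))
Wg = rng.standard_normal((C, Cp))
Wh = rng.standard_normal((C, C))

# with gamma initialised to 0 (as in the paper) the layer starts as identity
y = self_attention(x, Wf, Wg, Wh, gamma=0.0)
assert np.allclose(y, x)
```

Starting `gamma` at zero lets the network rely on local convolutions early in training and gradually learn to use long-range dependencies.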

## Installation

This is a Python package available on pypi.org, so installation is straightforward. The package depends on a GPU-enabled build of torch and torchvision suitable for your system, so please install the appropriate PyTorch first; follow the instructions at pytorch.org.

Install with the following commands:

```
$ workon [your virtual environment]
$ pip install attn-gan-pytorch
```

## CelebA samples

Some CelebA samples generated using this code with the fagan architecture:


Head over to the Fagan project repo for more info!

This repo also contains code for building the SAGAN architecture from the paper using this package; please refer to the samples/ directory.
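Since the package's layers extend `nn.Module`, an attention layer slots into a generator or discriminator like any other PyTorch module. The class below is a minimal, self-contained sketch of a SAGAN-style 2-D self-attention block written directly against `torch.nn`; the class name `SelfAttention2d` and the `channels // 8` query/key width are assumptions for illustration, not this package's API.

```python
import torch
import torch.nn as nn

class SelfAttention2d(nn.Module):
    """SAGAN-style self-attention over a 2-D feature map (illustrative sketch)."""
    def __init__(self, channels):
        super().__init__()
        self.f = nn.Conv2d(channels, channels // 8, 1)  # queries
        self.g = nn.Conv2d(channels, channels // 8, 1)  # keys
        self.h = nn.Conv2d(channels, channels, 1)       # values
        self.gamma = nn.Parameter(torch.zeros(1))       # residual gate, starts at 0

    def forward(self, x):
        b, c, hgt, wdt = x.shape
        f = self.f(x).flatten(2)                             # (B, C//8, N)
        g = self.g(x).flatten(2)                             # (B, C//8, N)
        h = self.h(x).flatten(2)                             # (B, C, N)
        attn = torch.softmax(f.transpose(1, 2) @ g, dim=-1)  # (B, N, N)
        o = (h @ attn.transpose(1, 2)).view(b, c, hgt, wdt)
        return self.gamma * o + x

# compose with ordinary conv layers, as in a SAGAN-like generator block
block = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), SelfAttention2d(32))
y = block(torch.randn(2, 3, 16, 16))
assert y.shape == (2, 32, 16, 16)
```

Because `gamma` is initialised to zero, a freshly constructed layer passes its input through unchanged, which keeps early training stable.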

## Thanks

Please feel free to open PRs here if you train on other datasets using this package. Suggestions, issues, and contributions are most welcome.

Best regards,
@akanimax :)