Attention - This repository contains several implementations of self-attention and multi-head attention techniques. All implementations are structurally similar.
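As a reference point for the techniques in this repository, here is a minimal NumPy sketch of scaled dot-product self-attention (the core shared by the variants mentioned above); the function name and shapes are illustrative, not taken from this repo's code:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D (seq_len, d_k) inputs."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (seq_len, seq_len) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the key dimension
    return weights @ V                             # weighted sum of value vectors

# Self-attention: queries, keys, and values all come from the same sequence.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))   # 4 tokens, model dimension 8
out = scaled_dot_product_attention(x, x, x)
print(out.shape)                  # (4, 8): one output vector per token
```

Multi-head attention runs several such attention computations in parallel on learned projections of the input and concatenates the results.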