This repository contains an implementation of transformer models for natural language processing tasks.
The project includes unit tests covering the following components:
- Multi-head Attention Mechanisms
- Encoder Architecture
- Decoder Architecture
All tests pass:
- ✅ Multi-head Attention Output Shape
- ✅ Decoder Layer Output Shape
- ✅ Decoder Output Shape
- ✅ Encoder Layer Output Shape
- ✅ Encoder Output Shape
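For illustration, an output-shape test along these lines can be sketched as below. It uses `torch.nn.MultiheadAttention` as a stand-in, since the repository's own attention API is not shown here; the real checks live in `tests/test_models/test_attention_mechanisms.py`.

```python
# Minimal sketch of an output-shape test, using torch.nn.MultiheadAttention
# as a stand-in for the repository's own attention module (whose API is not
# shown here).
import torch
import torch.nn as nn


def test_multihead_attention_output_shape():
    batch_size, seq_len, embed_dim, num_heads = 2, 16, 64, 8
    attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
    x = torch.randn(batch_size, seq_len, embed_dim)

    # Self-attention: query, key, and value are all the same sequence.
    output, _ = attention(x, x, x)

    # Attention should preserve the (batch, sequence, embedding) shape.
    assert output.shape == (batch_size, seq_len, embed_dim)
```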
The test suite is organized as follows:

vishwamai_transformers/
├── tests/
│   └── test_models/
│       ├── test_attention_mechanisms.py
│       ├── test_decoder.py
│       └── test_encoder.py
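As a hypothetical example of what a test in `tests/test_models/test_decoder.py` might verify, the sketch below checks the decoder layer's output shape, using `torch.nn.TransformerDecoderLayer` as a stand-in for the repository's decoder (whose actual API is not shown here).

```python
# Hypothetical sketch of a decoder-layer output-shape test, using
# torch.nn.TransformerDecoderLayer as a stand-in for the repository's
# own decoder implementation.
import torch
import torch.nn as nn


def test_decoder_layer_output_shape():
    batch_size, tgt_len, src_len, d_model, num_heads = 2, 10, 16, 64, 8
    decoder_layer = nn.TransformerDecoderLayer(d_model, num_heads, batch_first=True)

    tgt = torch.randn(batch_size, tgt_len, d_model)     # target sequence
    memory = torch.randn(batch_size, src_len, d_model)  # encoder output
    # Causal mask so each target position attends only to earlier positions.
    tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt_len)

    output = decoder_layer(tgt, memory, tgt_mask=tgt_mask)

    # The decoder layer preserves the target's (batch, length, d_model) shape.
    assert output.shape == (batch_size, tgt_len, d_model)
```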
To run the tests:
pytest # For basic test execution
pytest -v # For verbose test output
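Individual test modules or specific tests can also be selected with pytest's standard options, for example:
pytest tests/test_models/test_encoder.py # Run a single test module
pytest -k "decoder" # Run only tests whose names match a keyword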
Tested with:
- Python 3.10.12
- pytest 8.3.4
- pluggy 1.5.0