Releases: Oneflow-Inc/libai
LiBai v0.3.0 Release
v0.3.0 (03/11/2024)
New Features:
- Support mock transformers (see the Mock Transformers documentation)
- Support lm-evaluation-harness for model evaluation
- User Experience Optimization
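The "mock transformers" feature works by substituting LiBai's implementations for the Hugging Face `transformers` modules at import time. LiBai's real mechanism is more involved; the following library-free sketch only illustrates the underlying import-substitution idea, with a hypothetical package name `transformers_like` and class `MockModel` used purely for demonstration:

```python
import sys
import types

# Build a stand-in module that mimics the interface of a hypothetical
# third-party package named "transformers_like".
mock_mod = types.ModuleType("transformers_like")

class MockModel:
    """Placeholder model exposing the same entry point as the original."""
    @classmethod
    def from_pretrained(cls, name):
        print(f"loading {name} with the mocked backend")
        return cls()

mock_mod.MockModel = MockModel

# Registering the module under the package name means any subsequent
# `import transformers_like` resolves to the mock instead of the original.
sys.modules["transformers_like"] = mock_mod

import transformers_like  # resolves to the mock registered above
model = transformers_like.MockModel.from_pretrained("demo-checkpoint")
print(type(model).__name__)  # -> MockModel
```

Because the substitution happens in `sys.modules`, downstream code keeps its original `import` statements unchanged, which is what makes this approach transparent to model code.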
New Supported Models:
- These models are natively supported by libai
| Models | 2D (TP+PP) Inference | 3D Parallel Training |
|---|---|---|
| BLOOM | ✔ | - |
| ChatGLM | ✔ | ✔ |
| Couplets | ✔ | ✔ |
| DALLE2 | ✔ | - |
| Llama2 | ✔ | ✔ |
| MAE | ✔ | ✔ |
| Stable_Diffusion | - | - |
New Mock Models:
- These models are extended and implemented by libai through mocking transformers.
| Models | Tensor Parallel | Pipeline Parallel |
|---|---|---|
| BLOOM | ✔ | - |
| GPT2 | ✔ | - |
| LLAMA | ✔ | - |
| LLAMA2 | ✔ | - |
| Baichuan | ✔ | - |
| OPT | ✔ | - |
LiBai v0.2.0 Release
v0.2.0 (07/07/2022)
New Features:
- Support enabling evaluation and setting `eval_iter`
- Support customized samplers in `config.py`
- Support rdma for pipeline-model-parallel
- Support multiple fused kernels:
  - `fused_scale_mask_softmax_dropout`
  - `fused_scale_tril_softmax_mask_scale`
  - `fused_self_attention` (in the `libai_bench` branch)
- User Experience Optimization
- Optimized training throughput (see the benchmark for details)
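LiBai configures options such as `eval_iter` through its LazyConfig system, where a user `config.py` overrides nested config fields with plain attribute assignment. The field names below are illustrative (consult LiBai's config files for the exact schema); this minimal, framework-free sketch only shows the override pattern:

```python
# A tiny attribute-access dict standing in for LiBai's config nodes.
class DotDict(dict):
    """dict with attribute access, mimicking LazyConfig-style nodes."""
    __getattr__ = dict.__getitem__
    __setattr__ = dict.__setitem__

# Default training config (field names are assumptions for illustration).
train = DotDict(
    evaluation=DotDict(
        enabled=True,   # toggle evaluation on or off
        eval_iter=20,   # cap the number of evaluation iterations
    ),
)

# Overriding in a user config.py amounts to plain attribute assignment:
train.evaluation.eval_iter = 50
print(train.evaluation.eval_iter)  # -> 50
```

The appeal of this style is that overrides are ordinary Python, so no predefined option registry is needed.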
New Supported Models:
LiBai v0.1.0 Release
Some major features:
- Data Parallelism
- 1D Tensor Parallelism
- Pipeline Parallelism
- Unified Distributed Layers for single-GPU and multi-GPU
- "LazyConfig" system for more flexible syntax and no predefined structures
- Easy-to-use trainer and engine
- CV & NLP Data Preprocessing
- Mixed Precision Training
- Activation Checkpointing
- Gradient Accumulation
- Gradient Clipping
- Zero Redundancy Optimizer (ZeRO)
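Of the training features above, gradient accumulation has a simple mathematical core: averaging the gradients of several micro-batches reproduces the gradient of the full batch, which lets a small-memory device emulate a large batch size. A pure-Python sketch (toy linear model `y = w * x` with squared-error loss, no framework required):

```python
def grad(w, xs, ys):
    """d/dw of mean((w*x - y)^2) over the given samples."""
    n = len(xs)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n

w = 0.5
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

# Gradient of one big batch of 4 samples.
full = grad(w, xs, ys)

# Accumulate over 2 micro-batches of size 2, then average.
micro = [grad(w, xs[i:i + 2], ys[i:i + 2]) for i in (0, 2)]
accumulated = sum(micro) / len(micro)

print(abs(full - accumulated) < 1e-12)  # -> True
```

In a real trainer the optimizer step is simply deferred until all micro-batch gradients have been summed, then the accumulated gradient is scaled before the update.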
Model Zoo:
- Bert (3D Parallelism)
- GPT-2 (3D Parallelism)
- ViT (3D Parallelism)
- Swin-Transformer (Data Parallelism)
- Add a finetune task in `projects/`
- Add text classification in `projects`