NVIDIA NeMo Framework

NeMo Framework is NVIDIA's GPU-accelerated, end-to-end training framework for large language models (LLMs), multimodal models, and speech models. It enables seamless scaling of training workloads, both pretraining and post-training, from a single GPU to thousand-node clusters for 🤗 Hugging Face, Megatron, and PyTorch models.

This GitHub organization hosts repositories for NeMo's core components and integrations.

Documentation

To learn more about NVIDIA NeMo Framework and all of its component libraries, please refer to the NeMo Framework User Guide, which includes a quick start guide, tutorials, model-specific recipes, best-practice guides, and performance benchmarks.

License

Apache 2.0 licensed with third-party attributions documented in each repository.

Popular repositories

  1. Curator

    Scalable data preprocessing and curation toolkit for LLMs

  2. RL

    Scalable toolkit for efficient model reinforcement

  3. Run

    A tool to configure, launch, and manage your machine learning experiments.

  4. Automodel

    Day-0 support for any Hugging Face model, leveraging PyTorch-native functionality to provide performance- and memory-optimized training and inference recipes.

  5. FW-CI-templates

    CI/CD templates for NeMo-FW libraries

  6. .github
