BigDL-LLM is a library that makes LLMs (large language models) run fast[^1] on your low-cost Intel PC. This repository contains tutorials to help you understand what BigDL-LLM is, what you can do with it, and how to use it.
The tutorials are organized as follows:
- Chapter 1 serves as a general introduction to BigDL-LLM.
- Chapter 2 provides a set of best practices for setting up your environment.
- Chapter 3 helps you get started quickly with BigDL-LLM by briefly introducing the essential concepts and API usage.
- Chapter 4 elaborates on how to load and accelerate Transformers models using BigDL-LLM (see the brief sketch after this list).
- Chapter 5 explains how to use BigDL-LLM with LangChain.
- Chapter 6 covers multi-language support, e.g. Chinese.
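
As a quick taste of what the tutorials cover, below is a minimal sketch of BigDL-LLM's Transformers-style loading API (the model path is a placeholder; substitute any Hugging Face causal LM you have locally, and run it inside an environment set up as described in Chapter 2):

```python
# Minimal sketch: load a Hugging Face model with BigDL-LLM low-bit optimizations.
from bigdl.llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer

model_path = "path/to/your/llm"  # placeholder path, not a real checkpoint

# load_in_4bit=True applies BigDL-LLM's INT4 optimizations while loading
model = AutoModelForCausalLM.from_pretrained(model_path, load_in_4bit=True)
tokenizer = AutoTokenizer.from_pretrained(model_path)

inputs = tokenizer("What is BigDL-LLM?", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```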
[^1]: Performance varies by use, configuration and other factors. `bigdl-llm` may not optimize to the same degree for non-Intel products. Learn more at www.Intel.com/PerformanceIndex.