The project integrates the Retrieval Augmented Generation (RAG) tool Llama-Index and Microsoft's AutoGen with ComfyUI's adaptable node interface, enhancing the functionality and user experience of the platform.
🔥 May 9, 2024: Added agents; more information can be found here.
Follow these steps to set up the environment:

1. Set up a virtual environment as needed.
2. Navigate to `ComfyUI/custom_nodes`.
3. Clone the repository: `git clone https://github.com/get-salt-AI/SaltAI_Llama-Index`
4. Change to the cloned directory: `cd SaltAI_Llama-Index`
5. Install dependencies:
   - Python venv: `pip install -r requirements.txt`
   - ComfyUI Portable: `path\to\ComfyUI\python_embeded\python.exe -m pip install -r requirements.txt`
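Put together, the manual installation amounts to the shell session below. This is a sketch rather than project-supplied tooling; the starting directory and the portable Python path are examples and will differ on your system.

```bash
# Manual installation sketch: run from the directory that contains your ComfyUI install.
cd ComfyUI/custom_nodes
git clone https://github.com/get-salt-AI/SaltAI_Llama-Index
cd SaltAI_Llama-Index

# Standard Python environment:
pip install -r requirements.txt

# ComfyUI Portable uses its bundled interpreter instead (adjust the path to your install):
# path\to\ComfyUI\python_embeded\python.exe -m pip install -r requirements.txt
```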
Alternatively, install through ComfyUI-Manager:

- Have ComfyUI-Manager installed.
- Open the Manager within ComfyUI and search for the node pack "SaltAI_LlamaIndex".
- Install it.
- Restart the server.
- Hard-refresh the browser (Ctrl+F5).
If you encounter issues due to package conflicts, ensure your virtual environment is configured correctly.
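If you suspect a broken environment, one way to start clean is to recreate the virtual environment and reinstall the requirements. This is a generic sketch, not a project-mandated procedure; the environment name `.venv` is arbitrary.

```bash
# Recreate an isolated environment inside the cloned SaltAI_Llama-Index directory
python -m venv .venv
source .venv/bin/activate        # on Windows: .venv\Scripts\activate
pip install --upgrade pip
pip install -r requirements.txt
```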
You can install and use any GGUF files loaded into your `ComfyUI/custom_nodes/models/llm` folder. Here is probably the world's largest repository of those:
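As a minimal sketch of where such a file ends up, the commands below create the folder named above and download a model into it. The URL is a placeholder for whichever GGUF file you choose, not a real download location.

```bash
# Create the model folder used by the node pack (run from the parent of your ComfyUI folder; adjust as needed)
mkdir -p ComfyUI/custom_nodes/models/llm

# Download any GGUF model into it; replace the placeholder URL with a real one
wget -P ComfyUI/custom_nodes/models/llm \
  "https://example.com/path/to/your-model.gguf"
```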
Example workflows and images can be found in the `/examples` folder.
Detailed documentation and guidelines for contributing to the project will be provided soon.
You can find the existing documentation at https://docs.getsalt.ai/
The project is open-source under the MIT license.