# Home Assistant Ollama add-on repository

Ollama makes it easy to get up and running with large language models locally.

Ollama Documentation: https://github.com/ollama/ollama/tree/main/docs

## Add-ons

This repository contains the following add-ons:

- Ollama CPU
- Ollama GPU AMD
- Ollama GPU Intel
- Ollama GPU Nvidia
- Ollama GPU Vulkan
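
Once one of the add-ons is running, the Ollama server can be queried over its REST API (see the Ollama documentation linked above). The sketch below is a minimal example, assuming the add-on exposes Ollama's default port 11434 on localhost and that a model (here the placeholder name `llama3.2`) has already been pulled; adjust the host, port, and model name to match your setup.

```python
# Minimal sketch: send a single prompt to an Ollama server via its REST API.
# Assumptions: the add-on listens on localhost:11434 (Ollama's default port)
# and the model named below has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # adjust host/port to your add-on

payload = json.dumps({
    "model": "llama3.2",                      # placeholder model name
    "prompt": "Say hello in one sentence.",
    "stream": False,                          # return one JSON object instead of a stream
}).encode("utf-8")

request = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read())

print(result["response"])
```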