This Jupyter notebook demonstrates how to integrate Redis and LlamaIndex to build a customer support chatbot tailored for Chevy vehicles. The system is powered by an "agentic RAG" architecture.
- Cohere: Serves as the language model and embeddings provider, so the chatbot can understand user queries and generate human-like responses.
- Redis: A versatile tool within our architecture, Redis functions as the document store, ingestion cache, vector store, chat history repository, and semantic cache.
- LlamaIndex: Acts as the central framework that ties together the entire system, enabling seamless integration with various services and tools to enhance functionality.
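As a rough sketch of how these three components might be wired together, the function below configures Cohere for generation and embeddings and points a LlamaIndex index at Redis as its vector store. All connection details, paths, and some constructor parameters are assumptions for illustration, not values from this notebook; check them against your installed package versions.

```python
def build_support_index(redis_url="redis://localhost:6379", cohere_api_key="YOUR_KEY"):
    """Sketch: wire Cohere (LLM + embeddings) and Redis (vector store) via LlamaIndex.

    The URL, API key, and data path are placeholders.
    """
    # Imports are local so the sketch can be read without the integration
    # packages (llama-index-llms-cohere, llama-index-vector-stores-redis, etc.) installed.
    from llama_index.core import Settings, StorageContext, VectorStoreIndex, SimpleDirectoryReader
    from llama_index.llms.cohere import Cohere
    from llama_index.embeddings.cohere import CohereEmbedding
    from llama_index.vector_stores.redis import RedisVectorStore

    # Cohere provides both text generation and embeddings for the bot.
    Settings.llm = Cohere(api_key=cohere_api_key)
    Settings.embed_model = CohereEmbedding(cohere_api_key=cohere_api_key)

    # Redis backs the index as the vector store.
    vector_store = RedisVectorStore(redis_url=redis_url)
    storage_context = StorageContext.from_defaults(vector_store=vector_store)

    # Load the Chevy support documents (directory path is a placeholder).
    documents = SimpleDirectoryReader("./data").load_data()
    return VectorStoreIndex.from_documents(documents, storage_context=storage_context)
```

The resulting index can then be exposed as a query engine or handed to an agent as a tool.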
To start exploring the Agentic RAG Support Bot hands-on, launch this notebook in a Google Colab environment:
The architecture diagram below gives a clearer picture of how the components interact, covering both document ingestion and inference with the agent.
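The ingestion side of that flow can be sketched as a LlamaIndex `IngestionPipeline` that chunks and embeds documents, with Redis serving as the ingestion cache so unchanged documents are skipped on re-runs. Host, port, and chunking parameters here are assumptions, not values from this notebook.

```python
def build_ingestion_pipeline(redis_host="localhost", redis_port=6379):
    """Sketch: an ingestion pipeline that caches transformation results in Redis.

    Host and port are placeholders; chunk_size is an arbitrary example value.
    """
    # Local imports keep the sketch readable without the packages installed.
    from llama_index.core.ingestion import IngestionPipeline, IngestionCache
    from llama_index.core.node_parser import SentenceSplitter
    from llama_index.embeddings.cohere import CohereEmbedding
    from llama_index.storage.kvstore.redis import RedisKVStore

    # Redis-backed cache: re-running the pipeline reuses cached chunk/embedding results.
    cache = IngestionCache(cache=RedisKVStore.from_host_and_port(redis_host, redis_port))

    # Transformations run in order: split documents into chunks, then embed them.
    return IngestionPipeline(
        transformations=[SentenceSplitter(chunk_size=512), CohereEmbedding()],
        cache=cache,
    )
```

Calling `pipeline.run(documents=...)` would then produce embedded nodes ready to be stored in the Redis vector store.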
For further reading and resources related to the technologies and approaches used in this project, consider the following links: