From ae14ec350c14352d83747efa0bdd95d4efc9cfd4 Mon Sep 17 00:00:00 2001
From: Maximilian Winter
Date: Thu, 23 May 2024 00:40:40 +0200
Subject: [PATCH] Update ReadMe.md

---
 ReadMe.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ReadMe.md b/ReadMe.md
index ae2d1a3..8636de1 100644
--- a/ReadMe.md
+++ b/ReadMe.md
@@ -67,7 +67,7 @@ Join the Discord Community [here](https://discord.gg/6tGznupZGX)
 The llama-cpp-agent framework provides a wide range of examples demonstrating its capabilities. Here are some key examples:
 
 ### Simple Chat Example using llama.cpp server backend
-This example demonstrates how to initiate a chat with an LLM model using the llama.cpp server backend. It supports llama-cpp-python Llama class instances, OpenAI endpoints with GBNF grammar support, and the llama.cpp backend server.
+This example demonstrates how to initiate a chat with an LLM model using the llama.cpp server backend.
 
 [View Example](https://llama-cpp-agent.readthedocs.io/en/latest/simple-chat-example/)
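
For context on the README line touched by this patch, below is a minimal sketch of what a simple chat against the llama.cpp server backend might look like with llama-cpp-agent. The class and method names (LlamaCppServerProvider, LlamaCppAgent, get_chat_response), the prompt formatter choice, and the local server URL are assumptions drawn from the library's documented API rather than from this patch; the linked example page remains the authoritative version.

```python
# Minimal sketch, assuming a llama.cpp server is already running locally.
from llama_cpp_agent import LlamaCppAgent, MessagesFormatterType
from llama_cpp_agent.providers import LlamaCppServerProvider

# Point the provider at the running llama.cpp server (placeholder URL/port).
provider = LlamaCppServerProvider("http://127.0.0.1:8080")

# Create an agent with a system prompt and a message formatter
# matching the chat template of the loaded model (CHATML assumed here).
agent = LlamaCppAgent(
    provider,
    system_prompt="You are a helpful assistant.",
    predefined_messages_formatter_type=MessagesFormatterType.CHATML,
)

# Send a single user message and print the model's reply.
print(agent.get_chat_response("Hello, can you introduce yourself?"))
```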