The purpose of this sample application is to illustrate how to build a custom AI CoPilot application for chatting with the structured customer and sales data typically found in a Sales CRM application.
This project uses the Chat Copilot Sample Application as a starting point. That application uses a RAG pattern over unstructured data in Azure AI Search (or similar) to let you chat with data in documents, and consists of the following high-level components:
- WebApp - React-based web client application to host the CoPilot chat experience
- WebAPI - OpenAPI endpoint that executes the backend OpenAI orchestration, using the Semantic Kernel SDK
This project adds the following additional components:
- DataAPI - OpenAPI endpoint registered with the Semantic Kernel agent as an additional plugin that the LLM can use to answer questions about customer data. The DataAPI is modeled after a few data sets from a Customer Relationship Management (CRM) system containing customer asset and sales data, but it could easily be replaced with any OpenAPI endpoint that has a well-documented swagger definition (see the sketch after this list).
- SQL Database - deployment templates and scripts for an Azure SQL backend database with sample data exposed via the DataAPI.
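The snippet below is a minimal sketch of how an OpenAPI endpoint such as the DataAPI can be registered with the Kernel as a plugin. The deployment name, endpoint, plugin name, and swagger URL are illustrative placeholders, not the project's actual configuration.

```csharp
// Minimal sketch: registering the DataAPI's swagger definition as a Semantic Kernel plugin.
// All names and URLs below are placeholders.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Plugins.OpenApi;

var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    deploymentName: "gpt-4o",                                     // your Azure OpenAI deployment
    endpoint: "https://<your-resource>.openai.azure.com/",
    apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")!);
Kernel kernel = builder.Build();

// Import every operation in the DataAPI swagger document as a callable Kernel Function.
await kernel.ImportPluginFromOpenApiAsync(
    pluginName: "CrmData",
    uri: new Uri("https://localhost:5001/swagger/v1/swagger.json"));
```

Once imported, each operation in the swagger document becomes a Kernel Function whose name and description the model can see when deciding how to answer a question.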
This open-source sample application is intended for use as an educational tool or solution accelerator. The included deployment scripts will deploy the basic infrastructure shown below, where the CosmosDB is an optional component for persisting chat session history.
This project includes .NET web services and uses the Semantic Kernel SDK for backend orchestration. The front end is a React web application.
The basic logical flow for each chat turn in a session is as follows:
- The user asks a question of CoPilot in the chat window.
- This question along with other session information is sent to the backend WebAPI.
- The Chat controller assembles the following and sends it to the OpenAI LLM via a Semantic Kernel instance:
- User intent (question)
- All chat history from the session
- A list of all Kernel Functions available to help answer the question (including the DataAPI functions), with descriptions
- A prompt instructing the model on the expected response based on the data and logic available in the Functions
- The model then instructs Semantic Kernel which functions to execute, and in what order, to retrieve the data needed to answer the question. This uses an OpenAI feature called Function Calling.
- In this project, Automatic Function Calling has been enabled in the Kernel so that these instructions are carried out without requiring human-in-the-loop intervention (a minimal sketch of this configuration follows this list).
- The Kernel then executes the function calls, including requests to the DataAPI to retrieve the relevant customer data, and sends the results back to the model. This is an example of a RAG (retrieval augmented generation) Pattern, but with structured data query endpoints rather than a semantic search.
- The model constructs an answer - based on the provided data, prompt, and history context - and sends it back to the Kernel, where it is assembled with other response information and returned to the front-end web app, which displays the answer.
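The sketch below illustrates the heart of that loop, assuming the `kernel` from the earlier sketch. Automatic function calling is switched on via the execution settings, so the Kernel invokes the functions the model requests (including the DataAPI plugin) and feeds the results back before the final answer is produced. The system prompt and user question are illustrative only, not the project's actual prompts.

```csharp
// Minimal sketch of a single chat turn with automatic function calling enabled.
// Assumes the `kernel` built in the earlier sketch; prompt text is illustrative.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var chatService = kernel.GetRequiredService<IChatCompletionService>();

var history = new ChatHistory(
    "Answer questions about customers using only data returned by the available functions.");
history.AddUserMessage("Which customers have assets due for renewal this quarter?");

// AutoInvokeKernelFunctions lets the model's function-call instructions execute
// without human-in-the-loop intervention.
var settings = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

var reply = await chatService.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(reply.Content);
```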
- Chat CoPilot: A reference application for Semantic Kernel
- Chat CoPilot: Github
- OpenAI Platform - Function Calling
- Semantic Kernel SDK
- Semantic Kernel - Understanding AI plugins and functions
- OpenAI Articles - Retrieval Augmented Generation (RAG) and Semantic Search for GPTs
For demo or learning purposes, if you simply want to get this solution running on your local machine, follow the Run Locally steps below.
For instructions to deploy this solution template to the cloud in your Azure subscription, follow the instructions found at /scripts/deploy/README.md.
To run the Chat Copilot Sample Application that this project is built on, use the quick-start instructions found on the official Chat CoPilot Microsoft Learn getting started page.
A copy of these instructions, captured at the time of cloning, can be found here in case the official getting started page receives significant updates after the creation of this project.
In short, the instructions should help you to:
- Clone this repository
- Setup your local environment with pre-requisites
- Configure app settings for Azure OpenAI connection
- An existing Azure OpenAI or OpenAI deployment endpoint is required for this application.
- Run the backend APIs and front end webapp locally
- This project adds an additional backend .NET web service that provides the DataAPI endpoint for retrieving the sales data the LLM reasons over to answer questions (a hypothetical endpoint sketch appears after these notes).
- Requires .NET 8 SDK, which will not currently be installed by the setup scripts referenced by the getting started instructions.
- The DataAPI has been added to the Start.ps1 script referenced in the getting started instructions, so no additional steps are required to build and run it.
- The DataAPI assumes a backend SQL database for data storage. If running locally without one, the DataAPI will run and can be called successfully, but no data will be returned.
- See DataAPI README for more information
- The backend WebAPI in this project uses .NET 6 rather than .NET 7
- Certain 'educational' features and tabs have been disabled in the web app, including:
- Plugin setup
- Profile setup
- The 'Document' tab in the web app, and the ability to upload discrete documents, have not been disabled in the WebApp, but they will not be used by or visible to the model with the current configuration of Kernel Memory in the WebAPI.
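For reference, the following is a hypothetical sketch of what a well-documented DataAPI endpoint might look like; the controller, route, and types are illustrative, not the project's actual code. The XML summary, route, and response type are what surface in the swagger definition the model reads when deciding which function to call.

```csharp
// Hypothetical DataAPI endpoint sketch: the XML summary, route, and response type
// flow into the swagger document that the model uses to select functions.
// CustomersController, GetCustomerAssets, and Asset are illustrative names only.
using System.Collections.Generic;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/customers")]
public class CustomersController : ControllerBase
{
    /// <summary>
    /// Returns the assets owned by a customer, optionally filtered by status (e.g. "Active").
    /// </summary>
    [HttpGet("{customerId}/assets")]
    [ProducesResponseType(typeof(IEnumerable<Asset>), StatusCodes.Status200OK)]
    public ActionResult<IEnumerable<Asset>> GetCustomerAssets(int customerId, [FromQuery] string? status = null)
    {
        // In the sample this would query the Azure SQL database;
        // an empty list keeps the sketch self-contained.
        return Ok(new List<Asset>());
    }
}

public record Asset(int Id, string Name, string Status);
```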
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.