
📄 Technical Documentation AI

Open with Streamlit


✨ Description

A powerful AI chatbot capable of providing real-time answers to technical questions by accessing documentation from various tools and services.

📚 Supported Documentation

  1. AWS Documentation
  2. dbt Documentation
  3. dbt Project Evaluator
  4. Fivetran Documentation
  5. Looker Documentation
  6. Prefect Documentation
  7. Python (Langchain) Documentation
  8. Snowflake Documentation
  9. Streamlit Documentation

🤖 How to Use

Interact with the chatbot by sending your technical questions related to the supported tools and services. The chatbot will provide real-time responses by fetching relevant information from the specified documentation sources.

💡 See the AI in Action

Startup Screen

Example Interaction 1: A typical dialogue with the AI, fetching a relevant response from Snowflake documentation.

Example Interaction 2

🙋‍♂️ User Question Processing Workflow

To handle a user's question effectively, the system performs the following steps (a simplified code sketch follows the list):

  1. Query Generation: Utilize a Large Language Model (LLM) to generate a comprehensive set of queries based on the provided user input.
  2. Search Execution: Conduct searches for each of the generated queries.
  3. URL Storage: Collect and store the URLs obtained from the search results in self.urls.
  4. URL Check: Identify any URLs that are new and have not been processed previously, ensuring they do not exist in self.url_database.
  5. Content Transformation and Storage: Load, transform, and add these new URLs exclusively to the vectorstore.
  6. Relevant Document Retrieval: Query the vectorstore for documents that are relevant to the questions generated by the LLM.
  7. Final Result Preparation: Ensure that only unique documents are selected, compiling them to form the final result set.
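A minimal sketch of this workflow, not the app's actual code: it assumes the search, vectorstore, and llm objects configured in settings() (see Configuration below), and the prompt wording, result counts, and chunking parameters are illustrative.

from langchain.document_loaders import WebBaseLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter


class DocumentationRetriever:
    """Simplified illustration of the question-processing workflow above."""

    def __init__(self, llm, search, vectorstore):
        self.llm = llm                  # e.g. ChatOpenAI
        self.search = search            # e.g. GoogleSearchAPIWrapper
        self.vectorstore = vectorstore  # e.g. Chroma with OpenAIEmbeddings
        self.urls = []                  # URLs from the latest round of searches
        self.url_database = set()       # URLs already loaded into the vectorstore

    def get_relevant_documents(self, question):
        # 1. Query Generation: ask the LLM for a handful of search queries
        prompt = f"Write three Google search queries that would answer: {question}"
        queries = [q for q in self.llm.predict(prompt).splitlines() if q.strip()]

        # 2-3. Search Execution and URL Storage
        self.urls = [
            hit["link"]
            for q in queries
            for hit in self.search.results(q, num_results=3)
            if "link" in hit
        ]

        # 4. URL Check: keep only URLs that have not been processed before
        new_urls = [u for u in self.urls if u not in self.url_database]

        # 5. Content Transformation and Storage: load, split, and index new pages only
        if new_urls:
            docs = WebBaseLoader(new_urls).load()
            chunks = RecursiveCharacterTextSplitter(
                chunk_size=1000, chunk_overlap=100
            ).split_documents(docs)
            self.vectorstore.add_documents(chunks)
            self.url_database.update(new_urls)

        # 6. Relevant Document Retrieval: query the vectorstore per generated query
        retrieved = []
        for q in queries:
            retrieved.extend(self.vectorstore.similarity_search(q))

        # 7. Final Result Preparation: deduplicate by page content
        unique = {doc.page_content: doc for doc in retrieved}
        return list(unique.values())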

🔧 Configuration

You only need to supply a few things.

In the settings() function, supply (a sketch follows this list):

  • Search: the search tool you want to use (e.g., GoogleSearchAPIWrapper).
  • Vectorstore: the vectorstore and embeddings you want to use (e.g., Chroma, OpenAIEmbeddings).
  • LLM: the language model you want to use (e.g., ChatOpenAI).
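One possible shape for settings(), assuming the LangChain packages from requirements.txt; the model name, temperature, streaming flag, persist_directory, and return order are illustrative choices, not the project's actual values.

from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.utilities import GoogleSearchAPIWrapper
from langchain.vectorstores import Chroma


def settings():
    # Search: Google Programmable Search (reads GOOGLE_API_KEY / GOOGLE_CSE_ID)
    search = GoogleSearchAPIWrapper()

    # Vectorstore: Chroma backed by OpenAI embeddings
    embeddings = OpenAIEmbeddings()
    vectorstore = Chroma(embedding_function=embeddings, persist_directory="./chroma_db")

    # LLM: an OpenAI chat model (reads OPENAI_API_KEY / OPENAI_API_BASE)
    llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0, streaming=True)

    return search, vectorstore, llm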

To use st.secrets, set the environment variables in a .streamlit/secrets.toml file.
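For example, a minimal secrets.toml could look like this; the key names are an assumption, mirroring the environment variables shown below.

# .streamlit/secrets.toml
GOOGLE_API_KEY = "YOUR_API_KEY"
GOOGLE_CSE_ID = "YOUR_CSE_ID"
OPENAI_API_BASE = "https://api.openai.com/v1"
OPENAI_API_KEY = "YOUR_API_KEY"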

Or, simply set the environment variables directly and remove st.secrets:

import os

# Google Custom Search credentials
os.environ["GOOGLE_API_KEY"] = "YOUR_API_KEY"
os.environ["GOOGLE_CSE_ID"] = "YOUR_CSE_ID"

# OpenAI credentials
os.environ["OPENAI_API_BASE"] = "https://api.openai.com/v1"
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"

🛠️ API Key Configuration

👷 Setup & Run for macOS

python3.9 -m venv venv && source venv/bin/activate && pip3 install --upgrade pip && pip3 install -r requirements.txt && streamlit run technical-documentation-ai.py

👷‍♀️ Setup & Run for Windows

py -m venv venv; .\venv\Scripts\Activate.ps1; python -m pip install --upgrade pip; pip install -r requirements.txt; streamlit run technical-documentation-ai.py
