Web Search Notes

Modelworks Chatbot with Web Search

This wiki explains how to set up and use the Modelworks Gradio chat application, which integrates OllamaLLM with live web search via SerpAPI.


Prerequisites

  • Python 3.8+
  • Virtual environment (recommended)
  • Ollama server running locally with a model pulled (e.g., deepseek-r1:1.5b)
  • SerpAPI account and API key

Installation

  1. Clone your repo (or copy main.py into your project directory):
    git clone <your-repo-url>
    cd <your-project>
  2. Create and activate a Python virtual environment:
    python3 -m venv venv
    source venv/bin/activate    # macOS/Linux
    venv\Scripts\activate       # Windows
  3. Install dependencies (google-search-results is the client package LangChain's SerpAPI wrapper requires):
    pip install --upgrade \
    gradio \
    langchain \
    langchain-community \
    langchain-ollama \
    google-search-results
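
To confirm the install, a quick import check should run without errors. These module paths are the standard ones for the packages above:

    # Sanity check: all imports should succeed after installation.
    import gradio
    from langchain_community.utilities import SerpAPIWrapper  # needs google-search-results
    from langchain_ollama import OllamaLLM

    print("gradio", gradio.__version__)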

Configuration

  1. Start the Ollama server and pull your chosen model (ollama serve takes no model flag; the model is selected per request):
    ollama serve
    ollama pull deepseek-r1:1.5b
  2. Set your SerpAPI key in the environment:
    # macOS / Linux
    export SERPAPI_API_KEY="your_serpapi_key"
    
    # Windows PowerShell (setx persists the variable for future sessions;
    # open a new shell before running the app)
    setx SERPAPI_API_KEY "your_serpapi_key"
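
Before launching, both settings can be verified from Python. A minimal sketch, assuming the deepseek-r1:1.5b model pulled above:

    import os

    # Fail fast if the key was not exported into the current shell session.
    assert os.environ.get("SERPAPI_API_KEY"), "SERPAPI_API_KEY is not set"

    # A one-shot call confirms the Ollama server is reachable and the model is available.
    from langchain_ollama import OllamaLLM
    print(OllamaLLM(model="deepseek-r1:1.5b").invoke("Reply with one word: ready"))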
    

File Structure

project/
β”œβ”€β”€ main.py          # Gradio app with web search agent
β”œβ”€β”€ venv/            # Python virtual environment
└── requirements.txt # optionally freeze your deps
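
This wiki does not reproduce main.py itself; the sketch below shows one minimal way to wire the pieces together with LangChain's SerpAPIWrapper and a ReAct-style agent. Treat it as an illustration, not the repo's actual file: the tool name, description, and agent type here are assumptions.

    # main.py - minimal sketch (assumed structure, not the repo's actual code)
    import gradio as gr
    from langchain.agents import AgentType, Tool, initialize_agent
    from langchain_community.utilities import SerpAPIWrapper
    from langchain_ollama import OllamaLLM

    llm = OllamaLLM(model="deepseek-r1:1.5b")  # talks to the local Ollama server
    search = SerpAPIWrapper()                  # reads SERPAPI_API_KEY from the environment

    # Expose SerpAPI as a tool the agent may (or may not) decide to call.
    tools = [
        Tool(
            name="web_search",
            func=search.run,
            description="Look up current information on the web.",
        )
    ]
    agent = initialize_agent(
        tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
    )

    def respond(message, history):
        # gr.ChatInterface passes the user message plus the chat history;
        # only the current message is forwarded to the agent here.
        return agent.run(message)

    demo = gr.ChatInterface(fn=respond, title="Modelworks Chatbot with Web Search")

    if __name__ == "__main__":
        demo.launch()  # serves the UI on http://localhost:7860 by default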

Running the App

With your venv activated and Ollama running:

python main.py

By default, the Gradio UI will launch at http://localhost:7860.
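
If 7860 is already in use, launch() accepts the usual Gradio overrides (continuing from the sketch above):

    # server_name / server_port are standard gradio launch() parameters.
    demo.launch(server_name="0.0.0.0", server_port=7861)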

Usage

  1. Open your browser at http://localhost:7860.
  2. Type a query in the chatbox (e.g., "What's the latest news on renewable energy?").
  3. Send your message; the agent will:
    • Optionally call SerpAPI to fetch real-time web results.
    • Combine its reasoning with those results.
    • Return a final answer in the chat UI.

Example:

    User: What's the population of Australia?
    Bot: (🔍 Web Search) "population of Australia" → 26 million (est.)
    Final Answer: As of 2025, Australia's population is approximately 26 million.
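
The running app can also be queried programmatically. A sketch assuming Gradio's default ChatInterface endpoint name (/chat):

    from gradio_client import Client

    client = Client("http://localhost:7860")
    # ChatInterface registers its API under /chat by default.
    print(client.predict("What's the population of Australia?", api_name="/chat"))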
