This application is a simple browser-based chat interface that demonstrates how to use the `@ubc-genai-toolkit/llm` module, specifically for streaming conversations with an LLM served through Ollama.
## Architecture

This demo uses a two-part architecture:

1. **Frontend** (`index.html`, `style.css`, `app.js`):
    - A plain HTML, CSS, and JavaScript single-page application.
    - Handles the user interface elements (settings form, chat display, input).
    - Manages local chat history.
    - Communicates with the backend server via the `fetch` API to send messages and receive streamed responses (see the frontend sketch after this list).
2. **Backend** (`server.js`):
    - A Node.js server built with Express.
    - Acts as a proxy between the frontend and the `@ubc-genai-toolkit/llm` module (see the backend sketch after this list).
    - This backend is necessary because the `ollama` JavaScript library (a dependency of the toolkit's `OllamaProvider`) includes Node.js-specific code (e.g., usage of `node:fs`) that cannot run directly in a web browser environment.
    - The LLM module is initialized and used here, where it can operate correctly.
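How the streaming is consumed on the frontend: below is a minimal sketch, assuming the backend exposes `POST /chat` and writes plain-text chunks to the response. The `fetch`/`ReadableStream` usage is standard browser API, but the endpoint URL, request body shape, and `appendToChat` helper are illustrative assumptions; see `app.js` for the demo's actual logic.

```javascript
// Hypothetical UI helper; app.js updates the chat DOM instead.
const appendToChat = (text) => console.log(text);

// Send a message and render the streamed reply chunk by chunk.
// The /chat path and { message } body shape are assumptions for this sketch.
async function sendMessage(message) {
    const response = await fetch('http://localhost:3001/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ message }),
    });
    if (!response.ok || !response.body) {
        throw new Error(`Request failed: ${response.status}`);
    }
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    // Read chunks as they arrive and append them to the chat display.
    for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        appendToChat(decoder.decode(value, { stream: true }));
    }
}
```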
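And the backend side of the same exchange: a minimal sketch of the proxy pattern, in which an Express route streams chunks back to the browser as they arrive. The Express and CORS usage is standard; `streamFromLLM` is a hypothetical stand-in for the streaming call that `server.js` actually makes through `@ubc-genai-toolkit/llm`.

```javascript
const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors()); // allow the separately served frontend to call this server
app.use(express.json());

// Hypothetical stand-in: in server.js, this is where the toolkit's LLM
// module streams tokens from the Ollama server.
async function streamFromLLM(message, onChunk) {
    for (const word of ['This ', 'is ', 'a ', 'streamed ', 'reply.']) {
        onChunk(word);
    }
}

app.post('/chat', async (req, res) => {
    res.setHeader('Content-Type', 'text/plain; charset=utf-8');
    try {
        // Forward each chunk to the browser as soon as it arrives.
        await streamFromLLM(req.body.message, (chunk) => res.write(chunk));
        res.end();
    } catch (err) {
        res.status(500).end(`Error: ${err.message}`);
    }
});

app.listen(3001, () => console.log('Backend listening on http://localhost:3001'));
```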
## Prerequisites

- **Node.js and npm**: Required to install dependencies and run the servers (Node.js v16+ recommended).
- **UBC GenAI Toolkit** (`ubc-genai-toolkit-ts`):
    - You need to have the main `ubc-genai-toolkit-ts` repository cloned locally.
    - The toolkit must be built (run `npm install`, then `npm run build`, in the root of the `ubc-genai-toolkit-ts` directory).
- **Ollama Server**:
    - An Ollama server must be running and accessible.
    - The model specified in `server.js` (default: `llama3.1`) must be pulled and available on the Ollama server (e.g., `ollama pull llama3.1`).
## Setup

1. **Directory Structure**:
    - It's assumed this `toolkit-chat-demo` directory is a sibling of the main `ubc-genai-toolkit-ts` directory. For example:

        ```
        Developer/
        ├── ubc-genai-toolkit-ts/   (The main toolkit repo)
        └── toolkit-chat-demo/      (This application)
        ```

    - If your structure is different, you'll need to adjust the `file:` paths in `toolkit-chat-demo/package.json` that point to the toolkit modules (a sketch of those entries follows these steps).
2. **Install Dependencies**:
    - Navigate to the `toolkit-chat-demo` directory in your terminal.
    - Run `npm install`. This will install Express, CORS, and link the local `@ubc-genai-toolkit/core` and `@ubc-genai-toolkit/llm` modules.
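As noted in step 1, the toolkit modules are linked via `file:` dependencies. Here is a sketch of what those entries in `toolkit-chat-demo/package.json` might look like under the sibling layout above; the package directory names inside `ubc-genai-toolkit-ts` are assumptions, so defer to the actual file:

```json
{
    "dependencies": {
        "@ubc-genai-toolkit/core": "file:../ubc-genai-toolkit-ts/modules/core",
        "@ubc-genai-toolkit/llm": "file:../ubc-genai-toolkit-ts/modules/llm"
    }
}
```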
## Configuration

- **LLM Configuration**: The LLM provider, endpoint, and default model are configured directly within `toolkit-chat-demo/server.js`, in the `llmConfig` object:

    ```javascript
    const llmConfig = {
        provider: 'ollama',
        endpoint: 'http://localhost:11434', // Change if your Ollama server is elsewhere
        defaultModel: 'llama3.1', // Change to your desired (pulled) model
    };
    ```

- **Logging & Debugging**: In `server.js`, a `ConsoleLogger` is initialized for the LLM module. The `debug` property in `configWithLogger` can be set to `true` for more verbose logging from the toolkit:

    ```javascript
    const configWithLogger = {
        // ...
        logger: new ConsoleLogger('ToolkitChatDemo-Backend'),
        debug: false, // Set to true for verbose logs
    };
    ```
## Running the Application

You need to run two separate server processes:
1. **Start the Backend Server**:
    - Open a terminal in the `toolkit-chat-demo` directory.
    - Run the command: `npm run start:server`
    - This will start the Node.js Express server, typically listening on `http://localhost:3001` (see the terminal output).
2. **Start the Frontend Static File Server**:
    - Open a second terminal in the `toolkit-chat-demo` directory.
    - Run the command: `npx serve`
    - This will start a simple static file server. It will usually pick port `3000`, or the next available port if `3000` is in use. Note the URL from the terminal output (e.g., `http://localhost:3000` or `http://localhost:XXXXX`).
3. **Access the Application**:
    - Open your web browser and navigate to the URL provided by the `npx serve` command (the frontend server URL).
## Files

- `index.html`: The main HTML file for the user interface.
- `style.css`: CSS styles for the application.
- `app.js`: Frontend JavaScript logic; handles the UI, user input, and communication with the backend `/chat` API.
- `server.js`: Backend Node.js Express server; uses the UBC GenAI Toolkit to interact with the LLM and streams responses to the frontend.
- `package.json`: Defines project dependencies and scripts.
- `README.md`: This file.