This repository was archived by the owner on Aug 10, 2025. It is now read-only.

A modular AI chat application built with Java and Spring Boot, featuring PostgreSQL for persistence, Redis for caching, and WebFlux for real-time streaming. Designed to integrate multiple large language models through a core AI module, with a JavaFX-based desktop client for user interaction.

Important

THIS REPOSITORY IS NOT UNDER ACTIVE DEVELOPMENT!

LLM Chat Wrapper

This project is an AI model wrapper developed for the Bilkent University CTIS221 - Object-Oriented Programming course. It provides a modular and extensible framework for integrating and managing various AI models, including support for chat, embeddings, and other capabilities.

Features

  • Modular Architecture: Built with Spring Modulith for modularity and scalability.
  • Database Integration: Uses PostgreSQL for persistent storage.
  • Caching: Redis is used for caching frequently accessed data.
  • Streaming Support: Real-time streaming via Spring WebFlux, plus an optional Hono-based service for testing streaming.
  • AI Model Integration: Supports integration with various AI models and providers.
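The provider-agnostic integration the last bullet describes can be sketched in plain Java. This is a hypothetical illustration of the pattern, not the project's actual API: the `ChatModel` and `ModelRegistry` names are invented for this example, and the "echo" provider stands in for a real LLM backend.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal model abstraction: every provider exposes the same chat entry point.
interface ChatModel {
    String chat(String prompt);
}

// Registry mapping provider names to model factories, so new providers
// can be plugged in without touching calling code.
final class ModelRegistry {
    private final Map<String, Function<String, ChatModel>> providers = new HashMap<>();

    void register(String provider, Function<String, ChatModel> factory) {
        providers.put(provider, factory);
    }

    ChatModel create(String provider, String modelName) {
        Function<String, ChatModel> factory = providers.get(provider);
        if (factory == null) {
            throw new IllegalArgumentException("Unknown provider: " + provider);
        }
        return factory.apply(modelName);
    }
}

public class Demo {
    public static void main(String[] args) {
        ModelRegistry registry = new ModelRegistry();
        // An echo "provider" standing in for a real LLM backend.
        registry.register("echo", name -> prompt -> "[" + name + "] " + prompt);
        ChatModel model = registry.create("echo", "echo-1");
        System.out.println(model.chat("hello")); // prints "[echo-1] hello"
    }
}
```

Callers depend only on `ChatModel`, so adding a new provider is a single `register` call rather than a code change in every chat path.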

Prerequisites

  • Java 17 or higher
  • Gradle (for building the backend service)
  • Docker (for running PostgreSQL, Redis, and other services)
  • Bun (for the stream-hono service)

Setup Instructions

Backend (LLM Chat Service)

  1. Clone the repository:

    git clone https://github.com/sezrr/ctis221-project.git
    cd ctis221-project/projects/llm-chat-service
  2. Start the required services using Docker Compose:

    docker-compose -f compose.yaml up -d
  3. Build and run the Spring Boot application:

    ./gradlew bootRun
  4. Access the application at http://localhost:8080.
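The compose.yaml referenced in step 2 provisions the backing services from the Prerequisites list. A minimal sketch of what such a file typically contains; the image tags, credentials, and database name here are illustrative, not the project's actual values:

```yaml
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: app        # illustrative credentials
      POSTGRES_PASSWORD: app
      POSTGRES_DB: llm_chat
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```

With these defaults, the Spring Boot application would connect to PostgreSQL on localhost:5432 and Redis on localhost:6379.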

Stream Hono (Optional)

  1. Navigate to the stream-hono directory:

    cd ctis221-project/projects/stream-hono
  2. Install dependencies:

    bun install
  3. Start the development server:

    bun run dev
  4. Open http://localhost:3000 in your browser.
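Streaming services like these commonly emit server-sent events, i.e. text frames of the form `data: <payload>`. A small self-contained Java sketch of handling such frames on the client side; the frame format here is the generic SSE convention, not necessarily this project's exact wire format:

```java
import java.util.ArrayList;
import java.util.List;

public class SsePayloads {
    // Extract the payload of each "data:" line from a raw SSE stream chunk.
    static List<String> dataLines(String rawChunk) {
        List<String> payloads = new ArrayList<>();
        for (String line : rawChunk.split("\n")) {
            if (line.startsWith("data:")) {
                payloads.add(line.substring(5).trim());
            }
        }
        return payloads;
    }

    public static void main(String[] args) {
        String chunk = "event: token\ndata: Hello\n\ndata: world\n";
        System.out.println(dataLines(chunk)); // prints "[Hello, world]"
    }
}
```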

Frontend (JavaFX GUI)

  • TODO: add instructions for running the JavaFX GUI.

API Documentation

The API documentation is available at:

  • http://localhost:8080/swagger-ui.html for the backend services.

  • MORE DOCUMENTATION
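The /swagger-ui.html path above is the springdoc-openapi default for Spring Boot services. If the paths need to be customized, that is typically done in application.yaml; a sketch, assuming the springdoc-openapi starter is on the classpath:

```yaml
springdoc:
  swagger-ui:
    path: /swagger-ui.html   # UI entry point
  api-docs:
    path: /v3/api-docs       # raw OpenAPI JSON
```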

Project Structure

The project uses a monorepo style structure with the following projects:

  • llm-chat-service: Spring Boot modular-monolith backend service for managing AI models, users, chats, and their configurations.
    • aimodel: AI model management module.
    • chat: Chat module for interacting with AI models, and CRUD operations for chat history.
    • authentication: User authentication and authorization module.
    • instruction: Instruction module for managing instructions.
    • tiering: Tiering module for managing user tiers.
    • shared: Shared module for common utilities and configurations.
  • stream-hono: Backend service for real-time streaming and interaction.
  • modulo-ai: Core library for AI model integration and provider management.
  • javafx-gui: JavaFX GUI for interacting with the backend services.
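In a Spring Modulith monolith like llm-chat-service, modules usually stay decoupled by communicating through application events instead of direct calls. A plain-Java sketch of that pattern, with no Spring dependency; the `EventBus` stands in for Spring's event publisher, and the event and listener here are illustrative, not the project's actual types:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Event published by the chat module when a message is sent.
record MessageSent(String chatId, String content) {}

// Minimal in-process event bus standing in for Spring's
// ApplicationEventPublisher, which keeps modules decoupled.
final class EventBus {
    private final List<Consumer<MessageSent>> listeners = new ArrayList<>();

    void subscribe(Consumer<MessageSent> listener) {
        listeners.add(listener);
    }

    void publish(MessageSent event) {
        listeners.forEach(l -> l.accept(event));
    }
}

public class ModuleDemo {
    public static void main(String[] args) {
        EventBus bus = new EventBus();
        // The tiering module reacts to chat activity without the chat
        // module ever referencing it directly.
        bus.subscribe(e -> System.out.println("tiering: usage recorded for " + e.chatId()));
        bus.publish(new MessageSent("chat-42", "hello")); // prints "tiering: usage recorded for chat-42"
    }
}
```

The chat module only publishes; which modules listen (tiering, instruction, etc.) is an implementation detail of each subscriber.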

Technologies Used

  • Spring Boot: Backend framework.
  • PostgreSQL: Database.
  • Redis: Caching.
  • Hono: Real-time streaming.
  • Bun: JavaScript/TypeScript runtime for the stream-hono service.
  • Docker: Containerization.

Team Members

Contributing

Contributions are welcome! Please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Commit your changes and push them to your fork.
  4. Submit a pull request.
