HOSP-AI

Universal LLM Provider Connector for Java


Why HOSP-AI?

When I first started exploring LLMs and Neural Networks in Python, experimenting was easy. But when I switched back to Java—the language I trust for its scalability and performance—I hit a roadblock. There weren’t any simple tools to help me work seamlessly with multiple LLM providers.

This had to be fixed.

The result? Hosp-AI.

A library designed for quick prototyping with LLMs, yet fully compatible with production-ready frameworks such as Spring Boot.

Thanks to Adalflow, the inspiration for building this library.

Contributions are Welcome

  1. Fork the repo
  2. Create a branch named for the type of change (issue fix, documentation, or feature)
  3. Open a pull request
  4. Once reviewed, the PR will be merged by an admin

Features

  1. Currently supported LLM providers: OpenAI, Anthropic, Groq, Ollama
  2. PromptBuilder for composing complex prompts (see the illustrative sketch after this list)
  3. Flexibility to add custom client implementations
  4. Tools (function calls) supported
  5. Images can be included in prompts
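
The sketch below is not hosp-ai code. It is a self-contained, hypothetical illustration of the idea behind the library, with every class name (LlmClient, OpenAiClient, OllamaClient, PromptBuilder) invented for this example: a provider-agnostic client interface plus a small builder for assembling prompts. For the library's actual API, see the Wiki.

    // Hypothetical sketch only -- none of these class names come from hosp-ai.
    // It illustrates the pattern a universal provider connector relies on:
    // one client interface, interchangeable provider implementations, and a
    // builder that assembles a prompt from individual messages.
    import java.util.ArrayList;
    import java.util.List;

    public class ConnectorSketch {

        // Provider-agnostic client: each LLM vendor gets its own implementation.
        interface LlmClient {
            String complete(String prompt);
        }

        // Stub standing in for a real HTTP-backed OpenAI client.
        static class OpenAiClient implements LlmClient {
            private final String apiKey;

            OpenAiClient(String apiKey) {
                this.apiKey = apiKey;
            }

            @Override
            public String complete(String prompt) {
                // A real implementation would call the OpenAI API with apiKey.
                return "[openai reply to] " + prompt;
            }
        }

        // Stub standing in for a client that talks to a local Ollama server.
        static class OllamaClient implements LlmClient {
            @Override
            public String complete(String prompt) {
                // A real implementation would call the local Ollama HTTP endpoint.
                return "[ollama reply to] " + prompt;
            }
        }

        // Minimal prompt builder: collects messages and renders one prompt string.
        static class PromptBuilder {
            private final List<String> parts = new ArrayList<>();

            PromptBuilder system(String text) {
                parts.add("system: " + text);
                return this;
            }

            PromptBuilder user(String text) {
                parts.add("user: " + text);
                return this;
            }

            String build() {
                return String.join("\n", parts);
            }
        }

        public static void main(String[] args) {
            String prompt = new PromptBuilder()
                    .system("You are a concise assistant.")
                    .user("Name three JVM languages.")
                    .build();

            // Swapping providers only changes which client gets constructed.
            LlmClient openAi = new OpenAiClient(System.getenv("OPENAI_API_KEY"));
            System.out.println(openAi.complete(prompt));

            LlmClient local = new OllamaClient();
            System.out.println(local.complete(prompt));
        }
    }

The point of hosp-ai is that the provider implementations (OpenAI, Anthropic, Groq, Ollama) already exist behind one interface, so switching vendors stays a small, local change.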

Features in Pipeline

  1. Streaming responses
  2. Structured output

Installation

  1. Add the JitPack repository to your pom.xml
    <repositories>
     <repository>
      <id>jitpack.io</id>
      <url>https://jitpack.io</url>
     </repository>
    </repositories>
    
  2. Add the hosp-ai dependency (check for the latest version)
    <dependency>
     <groupId>com.github.r7b7</groupId>
     <artifactId>hosp-ai</artifactId>
     <version>v1.0.0-alpha.2</version>
    </dependency>
    
    
    

Working Examples

For working examples and tutorials, visit the Wiki.