
OllamaPY

OllamaPY is a Python library I wrote to use the Ollama API in my own projects. Feedback is appreciated.
It is an interface, so it requires Ollama to be running on a machine; you can download Ollama from its website.
It works with Python 3.11.4 and Ollama 0.1.10.

Documentation

Parts of the function descriptions below are copied from the Ollama API documentation.

Minimal examples

  • Install models

```python
import ollama

client = ollama.Ollama()
# Setting a model for your instance will pull it from the Ollama model library
# if it is not already installed
client.setModel("llama2")

# You can check which models are already installed
print(client.listLocalModels())
```
  • Generate text

```python
# This function will use the model you set for your instance
output = client.generate("Who are you?")
# output is the JSON response string
print(output)

# You can also specify another model
output = client.generate("Who are you?", "mistral:7b")
# output is the JSON response string
print(output)
```
  • Here is a way to extract the text from the JSON response

```python
import json

def extract(generation: str) -> str:
    text = ""
    lines = generation.split("\n")
    for line in lines[:-1]:
        text += json.loads(line)["response"]
    return text
```
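Since the response from generate is a string of newline-separated JSON objects, an extraction helper like the one above can be exercised on a hand-written sample. The sketch below uses a made-up two-chunk response for illustration; real responses also carry fields such as "model" and "done":

```python
import json

def extract(generation: str) -> str:
    # Collapse a newline-separated stream of JSON objects into plain text
    text = ""
    lines = generation.split("\n")
    for line in lines[:-1]:
        text += json.loads(line)["response"]
    return text

# Hand-written stand-in for a streaming generate() response
sample = '{"response": "Hello"}\n{"response": ", world"}\n'
print(extract(sample))  # Hello, world
```

Note that the trailing newline matters: the final empty element produced by split is dropped by the `[:-1]` slice.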

Functions

  • generate(prompt, model)

    • prompt : String
    • model (optional) : String
    • return : String or None

    Generate a response for a given prompt with a provided model.
    Returns the JSON response from Ollama. If the model is not specified, it will use the model you set for your instance.
    If the model specified or in your instance is invalid, or if the connection with Ollama fails, it will return None.

  • createModel(modelname:str, modelfile:str)

    • modelname : String
    • modelfile : String
    • return : Boolean

    Create a model from a Modelfile.

    ! Not implemented yet !

    Returns True if the model was created, False otherwise.

  • listLocalModels()

    • return : String

    Returns the JSON describing the models that are available locally. If the connection with Ollama fails, it will return None.
    You can use the extract function to get the list of models, but change ['response'] to ['models'].

  • showModelInfo(model:str=None)

    • model (optional) : String
    • return : String or None

    Show details about a model including modelfile, template, parameters, license, and system prompt.
    If the model is not specified, it will use the model you set for your instance.
    If the model specified or in your instance is invalid, or if the connection with Ollama fails, it will return None.

  • copyModel(modelInput:str, modelOutput:str)

    ! Not implemented yet !

  • deleteModel(model:str)

    • model : String
    • return : Boolean

    Delete a model and its data.
    Returns True if the model was deleted, False otherwise.

  • setModel(model:str)

    • model : String

    Set the model for your instance. It will pull the model from the Ollama model library if it is not already installed.

  • pushModel()

    ! Not implemented yet !

  • embeddings(prompt:str, model:str=None, option:dict[str,any]=None)

    • prompt : String
    • model (optional) : String
    • option (optional) : dict[str,any]
    • return : String or None

    Generate embeddings for a prompt with a provided model.
    Returns the JSON response from Ollama. If the model is not specified, it will use the model you set for your instance.
    If the model specified or in your instance is invalid, or if the connection with Ollama fails, it will return None.
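Since these functions return raw JSON strings (or None on failure), small parsing helpers can make them easier to consume. The sketch below is illustrative only: the helper names are mine, and the sample payloads assume the response shapes of the Ollama /api/tags and /api/embeddings endpoints around version 0.1.x.

```python
import json

def models_list(listing):
    # listing is assumed to be the JSON string returned by listLocalModels();
    # returns the installed model names, or None when the call itself failed.
    if listing is None:
        return None
    return [m["name"] for m in json.loads(listing)["models"]]

def embedding_vector(raw):
    # raw is assumed to be the JSON string returned by embeddings();
    # returns the embedding as a list of floats, or None on failure.
    if raw is None:
        return None
    return json.loads(raw)["embedding"]

# Hand-written sample payloads, for illustration only
print(models_list('{"models": [{"name": "llama2:latest"}]}'))  # ['llama2:latest']
print(embedding_vector('{"embedding": [0.25, -0.5]}'))         # [0.25, -0.5]
```

Forwarding the None case keeps the library's error convention intact while giving callers structured data on success.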

Installation

Copy the ollama.py file into your project and import it.
(Don't forget to star the repo if you like it; maybe one day it'll have pip support.)
