
OnnxLLM

This is an example project, built for learning purposes.

Installation

Using the Prebuilt Package

pip install onnxllm

Install From Source

pip install git+https://github.com/inisis/OnnxLLM@main

Install From Local

git clone https://github.com/inisis/OnnxLLM && cd OnnxLLM/
pip install .
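Whichever route you choose, you can confirm the package is importable before moving on. A minimal check using only the standard library; it makes no assumptions about `onnxllm` internals:

```python
import importlib.util

def is_installed(package: str) -> bool:
    """Return True if `package` can be imported from the current environment."""
    return importlib.util.find_spec(package) is not None

# After a successful `pip install onnxllm`, this prints True.
print(is_installed("onnxllm"))
```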

How to use

from transformers import AutoTokenizer
from onnxllm import AutoModelForCausalLM

# Download the ONNX models from https://huggingface.co/inisis-me first
tokenizer = AutoTokenizer.from_pretrained("/data/llm/llm-export/onnx-standard/", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("/data/llm/llm-export/onnx-standard/", trust_remote_code=True)

# Few-shot prompt (in Chinese): "The capital of Mongolia is Ulaanbaatar\nThe
# capital of Iceland is Reykjavik\nThe capital of Ethiopia is"; the model is
# expected to complete the final capital.
prompt = '蒙古国的首都是乌兰巴托(Ulaanbaatar)\n冰岛的首都是雷克雅未克(Reykjavik)\n埃塞俄比亚的首都是'

inputs = tokenizer(prompt, return_tensors='pt')

output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
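Under the hood, `generate` produces tokens one at a time: each step feeds the sequence so far through the model, picks a next token from the resulting logits, and appends it until an end-of-sequence token or the `max_new_tokens` limit is reached. A minimal sketch of that loop with greedy decoding, using a hypothetical toy logits function in place of a real ONNX session (the names `toy_logits`, `VOCAB_SIZE`, and `EOS_TOKEN` are illustrative, not part of OnnxLLM):

```python
import numpy as np

VOCAB_SIZE = 8
EOS_TOKEN = 0

def toy_logits(tokens: list[int]) -> np.ndarray:
    # Stand-in for a model forward pass: deterministic dummy logits that
    # favor (last_token + 1) mod VOCAB_SIZE. A real OnnxLLM model would run
    # an ONNX Runtime session here instead.
    logits = np.zeros(VOCAB_SIZE)
    logits[(tokens[-1] + 1) % VOCAB_SIZE] = 1.0
    return logits

def greedy_generate(prompt_tokens: list[int], max_new_tokens: int) -> list[int]:
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        next_token = int(np.argmax(toy_logits(tokens)))  # most likely token
        tokens.append(next_token)
        if next_token == EOS_TOKEN:  # stop early at end-of-sequence
            break
    return tokens

print(greedy_generate([3, 4], max_new_tokens=5))  # [3, 4, 5, 6, 7, 0]
```

With the toy logits, generation walks through the vocabulary until it wraps around to the EOS token and stops, even though `max_new_tokens` allows one more step.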

References

Referenced models

About

Large Language Model ONNX Inference Framework
