This repository has been archived by the owner on Sep 12, 2024. It is now read-only.
ONNX models are serialised versions of the current AI models. They are somewhat faster than the regular PyTorch or Hugging Face formats, so some users may want to use this type of model.
We already have a model converted and uploaded to Hugging Face:
https://huggingface.co/TextCortex/codegen-350M-optimized
In total we need to support the following model types:
Here is a script for supporting text generation for ONNX models with the optimum library from Hugging Face:
For the vanilla PyTorch models (.pt), you can directly use the AutoModel class from transformers (which also works for the Hugging Face .bin model type).
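As a sketch of that path, a loader for a regular PyTorch / Hugging Face checkpoint could look like the following (the `load_pytorch_model` helper name is my own; for generation specifically you would typically use `AutoModelForCausalLM` rather than the bare `AutoModel`):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer


def load_pytorch_model(checkpoint: str):
    """Load a vanilla PyTorch checkpoint directory (pytorch_model.bin / .pt weights).

    `checkpoint` can be a local directory or a Hub model id; transformers
    resolves the weight format automatically.
    """
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForCausalLM.from_pretrained(checkpoint)
    return model, tokenizer
```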