
When converting chatglm2's ONNX inference model, calling the onnx2paddle interface raises an error that the model is too large. #1016

Open
shanyuaa opened this issue Jun 10, 2024 · 2 comments

Comments

@shanyuaa

Thank you for participating in the X2Paddle community! This issue template helps X2Paddle iterate better, covering new feature releases, roadmaps, and bug tracking. 😸

Problem description

  • Error message
    ValueError: This protobuf of onnx model is too large (>2GB). Call check_model with model path instead.

  • Error screenshot

[Screenshot 2024-06-10 20:39:19]

Details

  • Intended use of the converted model

    • [✅] Inference with the Paddle framework / PaddleInference
    • Mobile inference with Paddle-Lite
    • Convert pretrained parameters, then continue model development with Paddle
  • Model source
    chatglm2

  • Application scenario

  • Version information
    PaddlePaddle => 2.6.1
    X2Paddle => 1.4.1
    Source framework version (ONNX) => 1.16.0

  • Contact (email / WeChat / phone)
    [email protected]

@luotao1
Collaborator

luotao1 commented Jun 11, 2024

Support for PaddlePaddle 2.6 and 3.0, as well as for large language generation models, is currently being planned.

@shanyuaa
Author

shanyuaa commented Jun 11, 2024 via email
