
[Feature] Export BevFusion model to onnx or torchscript #3026

Open
inzombakura opened this issue Aug 24, 2024 · 1 comment

Comments

inzombakura commented Aug 24, 2024

What is the feature?

I am looking into whether it is possible to export the BevFusion model to an external format such as ONNX or TorchScript.

Any other context?

I have been able to load the model checkpoint (.pth) and use it in the demo (multi_modality_demo.py), but I am unable to export the model using torch.export.export or torch.onnx.dynamo_export:

torch.export.export(model, args=(collate_data,))
onnx_program = torch.onnx.dynamo_export(model, collate_data)

I am wondering whether there is any documentation or existing support for exporting the model to common formats. My goal is to obtain a format that can be traced with JIT, which would make the model easier to reuse on other platforms.


Dontrian commented Sep 4, 2024

I have the same problem. To be more specific, the export step fails with the following error:

RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: Det3DDataSample
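The error says the JIT boundary cannot handle the Det3DDataSample object that the model's forward returns. A common workaround is to wrap the model so the traced interface exposes plain tensors only (a hedged sketch with a toy stand-in; the inner Sample class and the .scores field are hypothetical, mimicking how Det3DDataSample carries results):

```python
import torch
import torch.nn as nn


class TinyDetector(nn.Module):
    """Stand-in for a detector whose forward returns a custom object."""

    class Sample:
        """Hypothetical stand-in for Det3DDataSample."""

        def __init__(self, scores):
            self.scores = scores

    def __init__(self):
        super().__init__()
        self.head = nn.Linear(4, 2)

    def forward(self, x):
        # Returning a custom object like this is what breaks torch.jit.trace.
        return TinyDetector.Sample(self.head(x))


class TraceWrapper(nn.Module):
    """Unpacks the custom output into raw tensors so tracing succeeds."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        sample = self.model(x)
        return sample.scores  # tensors only across the JIT boundary


model = TraceWrapper(TinyDetector()).eval()
example = torch.randn(1, 4)

# Tracing the wrapper works because its inputs and outputs are tensors;
# the custom object only exists transiently inside the traced graph.
traced = torch.jit.trace(model, example)
```

For BEVFusion specifically, the wrapper's forward would need to call the model's inference path and pull the prediction tensors (boxes, scores, labels) out of the returned Det3DDataSample before returning.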
