Exporting portable models #151
Adam Moody: Haven't had a chance to look at it deeply, but ONNX seems to have good support so far: https://research.fb.com/onnx-v1-released/
davidHysom: This seems interesting ... but it seems so recent (early release: September of this year) that I would hesitate to commit much energy. If it's still a going thing six months from now, maybe.
I think most or all of the interoperability API efforts are, or will be, Python based.
Original post, Brian: "Let's add support for exporting a trained model into a portable network format."
I'm thinking we shouldn't roll our own at this point in time, or jump on any bandwagon, unless there's a motivating (political) need.
Brian C. Van Essen: At the moment all I want is a numpy array per weight matrix that can be loaded into TensorFlow or something like that.
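A minimal sketch of that request: dump each trained weight matrix as a named numpy array into one `.npz` archive, which any numpy-capable toolkit can read back. The layer names and shapes below are purely hypothetical; the TensorFlow side would simply assign the loaded arrays to the corresponding layer variables.

```python
import numpy as np

# Hypothetical trained weights, one matrix per layer (names are illustrative).
weights = {
    "fc1_weights": np.random.rand(784, 128).astype(np.float32),
    "fc2_weights": np.random.rand(128, 10).astype(np.float32),
}

# Export: a single compressed archive with one named array per weight matrix.
np.savez("model_weights.npz", **weights)

# Import side (e.g. a TensorFlow or PyTorch script): look arrays up by name.
restored = np.load("model_weights.npz")
for name in restored.files:
    print(name, restored[name].shape)
```

The `.npz` container keeps the per-matrix names, so the importing script does not need to rely on array ordering.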
Original issue description (Brian): Let's add support for exporting a trained model into a portable network format. This should enable us to pre-train a network, export it, and allow someone to fine-tune it in a different toolkit.