Asking about coral delegates #56
Comments
Coral could work for qna models, but the model needs to be converted for the Coral device.
In general, you can convert a model using the Edge TPU Compiler; you can try it out on Google Colab without installing it. In this case, though, the MobileBERT qna model that…
Note these lines:
This model will need to be quantized to uint8 and possibly re-trained in order to work on Coral. Even then, I'm not sure all of the ops will be supported. Coral seems to support mostly image-related models and some audio models, but you can definitely try it and see. Here are some more details on the model requirements for Coral. Edit: I accidentally hit close on this issue instead of posting this.
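For reference, here is a minimal sketch of what full-integer (uint8) post-training quantization looks like with the TensorFlow Lite converter before handing the result to the Edge TPU Compiler. The SavedModel path, sequence length, vocabulary size, and input signature are assumptions used for illustration, not details from this thread; as the comment above notes, conversion can still fail for MobileBERT if some ops have no int8 kernels.

```python
import numpy as np
import tensorflow as tf

# Hypothetical path: a SavedModel export of the QnA model being converted.
converter = tf.lite.TFLiteConverter.from_saved_model('mobilebert_qna_saved_model')

def representative_dataset():
    # Assumption: BERT-style inputs (input_ids, input_mask, segment_ids),
    # each int32 with sequence length 384 and a 30522-token vocabulary.
    # Real calibration data should come from actual question/passage pairs.
    for _ in range(100):
        input_ids = np.random.randint(0, 30522, size=(1, 384), dtype=np.int32)
        input_mask = np.ones((1, 384), dtype=np.int32)
        segment_ids = np.zeros((1, 384), dtype=np.int32)
        yield [input_ids, input_mask, segment_ids]

converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Require full-integer kernels with uint8 input/output, which is what the
# Edge TPU compiler expects.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

# This is where MobileBERT runs into trouble: convert() fails if any op
# cannot be fully quantized, which matches the discussion above.
tflite_model = converter.convert()
with open('mobilebert_qna_quant.tflite', 'wb') as f:
    f.write(tflite_model)

# The quantized model would then be compiled for the Edge TPU on the
# command line with: edgetpu_compiler mobilebert_qna_quant.tflite
```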
Thank you for your explanation; now I understand. However, the BERT model cannot be quantized to uint8, so the model cannot be used with the Edge TPU.
Hello,
I tried this example and ran it on a Raspberry Pi with a Coral. It works well, thank you…
I have a question that is a little bit out of context.
When I tried to implement the Coral delegate for @tensorflow/qna, it was difficult, since the MobileBERT pre-trained model is auto-loaded by the qna package.
Is it still possible to use Coral delegates for qna, or does Coral only work for object detection models like the one in this tutorial? Thank you..
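For context, this is roughly what attaching the Coral (Edge TPU) delegate to a model looks like in general, shown here with the Python tflite_runtime API as a sketch; the model path is a placeholder for a model already compiled with edgetpu_compiler, and this does not by itself solve the issue raised above that the qna package downloads its own MobileBERT model rather than accepting a custom one.

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Placeholder path: a model that has already been compiled with edgetpu_compiler.
interpreter = tflite.Interpreter(
    model_path='model_edgetpu.tflite',
    experimental_delegates=[tflite.load_delegate('libedgetpu.so.1')])
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed an input matching the model's quantized input spec, then run
# inference on the Edge TPU through the delegate.
dummy_input = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], dummy_input)
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]['index'])
```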