Is it possible to convert Gemma 7B / Llama 2 7B to run on an Android device? #338
jayliau started this conversation in Feature Requests
Replies: 0 comments
From https://developers.google.com/mediapipe/solutions/genai/llm_inference#models, we can see that Gemma 2B, Phi-2, Falcon-RW-1B, and StableLM-3B can be converted to run inference on an Android device.
Is it also possible to convert Gemma 7B or Llama 2 7B to run on an Android device?