watt-ai tool calling #948
Unanswered
bigrobinson asked this question in Q&A

I am curious about the watt-ai 8B model, as it appears near the top of the leaderboard. It is a fine-tune of meta-llama/Llama-3.1-8B-Instruct and uses the same added_tokens_decoder; however, it uses a much simpler chat template. Also, it is supposed to be a function-calling model, but BFCL uses LlamaHandler (for chat) instead of LlamaFCHandler (for tool calling). I find this confusing. Can you please explain the choice?
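To make the distinction concrete, here is a rough sketch of what I understand the two modes to do, using the Hugging Face transformers chat-template API. This is only an illustration, not BFCL's actual handler code: the `get_weather` tool is made up, and the model ID is just the base model from above. An FC-style handler would rely on the chat template rendering tool schemas itself, while a prompt-style handler serializes them into an ordinary message and parses the reply as plain text.

```python
# Illustrative sketch only -- not BFCL's handler code. Assumes `transformers` is
# installed and you have access to the (gated) Llama-3.1 repo; the tool schema
# below is a made-up example.
import json
from transformers import AutoTokenizer

tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
messages = [{"role": "user", "content": "What's the weather in Boston?"}]

tok = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")

# Native tool calling (what an FC handler relies on): the chat template itself
# renders the tool schemas, so the template must support a `tools` variable.
fc_prompt = tok.apply_chat_template(
    messages, tools=[tool], add_generation_prompt=True, tokenize=False
)

# Prompt-based tool calling (what a plain chat handler does): the tool schemas
# are serialized into an ordinary message and the model's reply is parsed as
# text, so even a very simple chat template suffices.
prompted = [
    {
        "role": "system",
        "content": "You can call the following functions:\n" + json.dumps([tool], indent=2),
    },
    *messages,
]
chat_prompt = tok.apply_chat_template(
    prompted, add_generation_prompt=True, tokenize=False
)

print(fc_prompt)
print(chat_prompt)
```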
Replies: 1 comment 1 reply
Hey @bigrobinson,
Let me know if you have more questions!