@tnsai what version of the llama stack server / llama stack client package are you running?
The error message indicates you are running a newer version of the stack-client than the server. We deprecated the identifier field and renamed it to shield_id.
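For reference, a minimal sketch of the call shape after the rename, assuming the current llama-stack-client Python SDK; the shield name and message below are placeholders:

```python
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://127.0.0.1:5000")

# Newer clients send shield_id in the request body; an older server still
# validates the deprecated identifier field, which produces the
# "Field required" error reported below when the versions don't match.
response = client.safety.run_shield(
    shield_id="llama_guard",  # placeholder; use an identifier reported by client.shields.list()
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    params={},
)
print(response.violation)
```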
@dineshyv I'm running the pip-installed server version from last week, with some PR amendments for remote::vllm committed a few days ago, and the latest available pip-installed client from yesterday.
When was identifier renamed to shield_id (so I can feed this back to the vllm provider contributors)?
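One way to report the exact versions in play (a plain-Python sketch; the PyPI distribution names below are assumptions about how the packages were installed):

```python
# Print the installed versions of the server and client packages, if present.
import importlib.metadata as md

for pkg in ("llama-stack", "llama-stack-client"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")
```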
I can run the examples (with some small fixes for the latest Llama Stack). However, the safety example fails at this call:
response = client.safety.run_shield
with:
fastapi.exceptions.RequestValidationError: [{'type': 'missing', 'loc': ('body', 'identifier'), 'msg': 'Field required', 'input': None}]
The preceding code that lists the available shields outputs:
llama-guard is running fine, llama-stack is running on localhost port 5000, and the safety example is invoked with:
python safety.py 127.0.0.1 5000
What to do?
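For context, the setup described above roughly corresponds to this sketch (assuming the llama-stack-client Python SDK; the argument handling mirrors the invocation shown earlier, and the attributes of the listed shields are assumptions):

```python
import sys
from llama_stack_client import LlamaStackClient

# Invoked as: python safety.py <host> <port>
host, port = sys.argv[1], sys.argv[2]
client = LlamaStackClient(base_url=f"http://{host}:{port}")

# List the shields registered with the server; each entry should carry an
# identifier that can be passed to run_shield as shield_id.
for shield in client.shields.list():
    print(shield)
```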