Issues: meta-llama/llama-stack
#1043 [bug] The server forcefully shuts down upon receiving a signal. (opened Feb 11, 2025 by leseb; 1 of 2 tasks)
#1038 [bug] GET /models/{model-id} does not work with backslash in model identifier (opened Feb 11, 2025 by yanxi0830; 2 tasks)
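Identifiers that contain path separators generally need percent-encoding before they can be used as a single URL path segment. A minimal sketch using only the Python standard library; the `/models/{model-id}` path comes from the issue title, and the model identifier shown is illustrative:

```python
from urllib.parse import quote

# Model identifiers often contain separator characters
# (e.g. "meta-llama/Llama-3.2-3B-Instruct"). quote() with safe=""
# percent-encodes "/" (and "\") so the identifier survives as one
# path segment in GET /models/{model-id} instead of splitting the route.
model_id = "meta-llama/Llama-3.2-3B-Instruct"
encoded = quote(model_id, safe="")
url = f"/models/{encoded}"
print(url)  # /models/meta-llama%2FLlama-3.2-3B-Instruct
```

Whether the server decodes such segments correctly is exactly what this issue is about; the client-side encoding above is the standard first step.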
#1031 [enhancement] Support enhanced logging (e.g. debug) (opened Feb 10, 2025 by thoraxe)
#1029 [bug] llama client SDK cannot reach server using Ollama (opened Feb 10, 2025 by sancelot; 2 tasks)
#1027 [bug] Fix the Llama Stack README image after the refactor (opened Feb 10, 2025 by zanetworker; 2 tasks)
#1011 [enhancement] Propagate provider config into ProviderInfo (opened Feb 7, 2025 by noelo)
#1007 [bug] llama stack run does not have a venv mode of operation (opened Feb 7, 2025 by cdoern; 1 of 2 tasks)
#999 [bug] Sometimes getting "Error exporting span to SQLite: Cannot operate on a closed database." when running RAG agent example from getting started document (opened Feb 7, 2025 by booxter; 1 of 2 tasks)
#973 [bug] getting_started.ipynb web search doesn't seem to work (opened Feb 5, 2025 by DarrellKeller; 2 tasks)
#965 [enhancement] Support non-Llama models (opened Feb 5, 2025 by terrytangyuan)
#958 [enhancement, good first issue] Create Groq Distribution Template (opened Feb 4, 2025 by yanxi0830)
#919 How to build llamastack/distribution-meta-reference-gpu from a Dockerfile? (opened Feb 1, 2025 by alexhegit)
#904 [good first issue] Switch Ollama distro to use Ollama embeddings (opened Jan 30, 2025 by ashwinb)
#829 Conversion of RawTextItem to TextContentItem causes no tokens to be generated (opened Jan 19, 2025 by AidanFRyan; 2 tasks done)
#824 meta-llama/Llama-3.2-3B-Instruct-QLORA_INT4_EO8 not found (opened Jan 19, 2025 by AidanFRyan; 1 of 2 tasks)
#820 Want to use client_tool calls, but code_interpreter is used instead (opened Jan 18, 2025 by aidando73; 1 of 2 tasks)