Issues: SeldonIO/MLServer
Dockerfile has a conda version which is incompatible with Conda Lib Mamba
#1719 opened Apr 25, 2024 by ramonpzg
Inference parameters are not passed to the predict function in MLFlow runtime
#1660 opened Apr 3, 2024 by Okamille
Schema enforcement error when using PandasCodec
Labels: bug, codecs, inference request
#1625 opened Mar 6, 2024 by mfueller
[BUG]: Kafka Message cannot be serialized to JSON when dict contains bytes
#1621 opened Mar 5, 2024 by DerTiedemann
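For context, the failure mode in #1621 is generic Python behavior: the standard-library `json` module refuses `bytes` values. A minimal stand-alone illustration (not MLServer's Kafka code) with one common workaround, base64-encoding bytes via the `default` hook:

```python
import base64
import json

# A dict mixing JSON-safe values with raw bytes, as might arrive in a message payload.
payload = {"id": "req-1", "data": b"\x00\x01"}

# json.dumps raises TypeError on bytes values by default:
try:
    json.dumps(payload)
except TypeError:
    pass  # "Object of type bytes is not JSON serializable"

def bytes_default(obj):
    """Encode bytes/bytearray as base64 text; reject anything else."""
    if isinstance(obj, (bytes, bytearray)):
        return base64.b64encode(obj).decode("ascii")
    raise TypeError(f"Unserializable type: {type(obj)!r}")

encoded = json.dumps(payload, default=bytes_default)
```

The consumer must know which fields are base64-encoded in order to decode them back to bytes; that schema agreement is the part a library has to define.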
pydantic.errors.PydanticImportError: BaseSettings has been moved to the pydantic-settings package
#1594 opened Feb 29, 2024 by harupy
Select GPU to be used for each worker on parallel inference
#1570 opened Feb 15, 2024 by teddy-ambona
mlserver build fails if requirements.txt includes a git path
#1562 opened Feb 2, 2024 by isaac-smothers
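The "git path" in #1562 refers to pip's VCS requirement syntax, which is valid in a requirements.txt even when a build tool fails to handle it. A fragment of that syntax (the repository URL below is a hypothetical placeholder):

```
# requirements.txt -- pip VCS requirement syntax (example URL is hypothetical)
git+https://github.com/example-org/example-package.git@v1.0.0
```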
InferenceRequests with Tensor of Tensors does not decode properly
#1542 opened Jan 19, 2024 by GDegrove
Examples/clarification over using OpenTelemetry tracing in MLServer
#1513 opened Dec 15, 2023 by danielsoutar
Ability to serve swagger doc dependencies from the mlserver instance or use a custom CDN
#1470 opened Nov 5, 2023 by ichbinjakes
Enable PandasCodec.decode_request to restore the exact dataframe
#1390 opened Sep 12, 2023 by ysk24ok
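The underlying difficulty in #1390 is generic to any codec that serializes a DataFrame into a plain columns-of-values payload: dtype metadata is not part of the payload, so decoding cannot restore it exactly. A minimal pandas-only illustration (not MLServer's actual `PandasCodec`) of the lossy round trip:

```python
import pandas as pd

# A DataFrame with a dtype that plain JSON-style payloads cannot represent.
df = pd.DataFrame({
    "label": pd.Categorical(["a", "b", "a"]),
    "value": [1, 2, 3],
})

# Encode to a dict-of-lists payload; dtype metadata is dropped here.
payload = df.to_dict(orient="list")

# Decoding from the payload infers dtypes from the values alone.
restored = pd.DataFrame(payload)

# The categorical dtype degrades to plain object on the way back:
assert df["label"].dtype.name == "category"
assert restored["label"].dtype.name == "object"
```

Restoring the exact frame requires shipping dtype information alongside the values, which is what the feature request asks the codec to do.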