```python
from datetime import datetime

from kserve import KServeClient, constants
from kserve.models import (
    V1beta1InferenceService,
    V1beta1InferenceServiceSpec,
    V1beta1PredictorSpec,
    V1beta1SKLearnSpec,
)
from kubernetes import client

# Get the default target namespace
namespace = "admin"

now = datetime.now()
v = now.strftime("%Y-%m-%d--%H-%M-%S")

name = "iris-classifier"
kserve_version = "v1beta1"
api_version = constants.KSERVE_GROUP + "/" + kserve_version

# Define the InferenceService
isvc = V1beta1InferenceService(
    api_version=api_version,
    kind=constants.KSERVE_KIND,
    metadata=client.V1ObjectMeta(
        name=name,
        namespace=namespace,
        annotations={"sidecar.istio.io/inject": "false"},
    ),
    spec=V1beta1InferenceServiceSpec(
        predictor=V1beta1PredictorSpec(
            service_account_name="sa-minio-kserve",
            sklearn=V1beta1SKLearnSpec(
                storage_uri="s3://mlpipeline/models/iris_model.pkl"
            ),
        ),
    ),
)

# Create the InferenceService in KServe
kserve = KServeClient()
kserve.create(isvc)
```
Environment
AWS t3.2xlarge instance with 10 GB of storage
Installed Charmed Kubeflow, MinIO, and MLflow
Configured MinIO and MLflow access
Relevant Log Output
```
{"level":"error","ts":"2024-09-30T20:50:29Z","msg":"Reconciler error","controller":"inferenceservice","controllerGroup":"serving.kserve.io","controllerKind":"InferenceService","InferenceService":{"name":"iris-classifier","namespace":"admin"},"namespace":"admin","name":"iris-classifier","reconcileID":"57908d65-5f49-4305-ade5-3247160b89ec","error":"Internal error occurred: failed calling webhook \"inferenceservice.kserve-webhook-server.defaulter\": failed to call webhook: Post \"https://kserve-webhook-server-service.kubeflow.svc:443/mutate-serving-kserve-io-v1beta1-inferenceservice?timeout=10s\": tls: failed to verify certificate: x509: certificate signed by unknown authority","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
```
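The reconciler log is one JSON object per line, with the reconcile failure wrapped inside the "error" field. As a sketch (the function name and sample line below are illustrative, not part of the issue), the CA-trust failure can be detected programmatically rather than by eyeballing the escaped string:

```python
import json


def is_ca_trust_failure(log_line: str) -> bool:
    """Return True if a controller-runtime JSON log line reports the
    'x509: certificate signed by unknown authority' webhook failure."""
    entry = json.loads(log_line)
    return "x509: certificate signed by unknown authority" in entry.get("error", "")


# Illustrative, abbreviated log line in the same shape as the output above.
line = (
    '{"level":"error","msg":"Reconciler error","error":"Internal error occurred: '
    'failed calling webhook \\"inferenceservice.kserve-webhook-server.defaulter\\": '
    'failed to call webhook: tls: failed to verify certificate: '
    'x509: certificate signed by unknown authority"}'
)
print(is_ca_trust_failure(line))  # prints: True
```

This makes it easy to grep a whole controller log for occurrences of the same root cause.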
Additional Context
No response
Hi @ShrishtiKarkera
From the logs, it looks like there's an issue verifying the certificate of the KServe webhook referenced in the MutatingWebhookConfiguration object.
To debug this further, first we need to check the health of the admission webhook charm and workload.
Can you share:
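While gathering that, one quick sanity check is whether the MutatingWebhookConfiguration actually carries a PEM-shaped caBundle, since a missing or stale bundle is a common cause of this exact x509 error. Below is a minimal sketch (the helper name and the sample input are hypothetical); the real input would be the dict form of the object, e.g. `json.loads` of `kubectl get mutatingwebhookconfiguration <name> -o json`:

```python
import base64


def ca_bundle_status(webhook_cfg: dict) -> dict:
    """Report, per webhook, whether clientConfig.caBundle is present and PEM-shaped."""
    status = {}
    for wh in webhook_cfg.get("webhooks", []):
        bundle = wh.get("clientConfig", {}).get("caBundle")
        if not bundle:
            status[wh["name"]] = "caBundle missing"
            continue
        pem = base64.b64decode(bundle).decode("ascii", errors="replace")
        status[wh["name"]] = (
            "caBundle looks like PEM" if "BEGIN CERTIFICATE" in pem else "caBundle is not PEM"
        )
    return status


# Illustrative input only -- the real object comes from the cluster.
sample = {
    "webhooks": [
        {
            "name": "inferenceservice.kserve-webhook-server.defaulter",
            "clientConfig": {
                "caBundle": base64.b64encode(
                    b"-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n"
                ).decode("ascii"),
            },
        }
    ]
}
print(ca_bundle_status(sample))
```

Even when the bundle is present and well-formed, it can still mismatch the CA that actually signed the webhook's serving certificate, so this check rules out only the simplest failure mode.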
Bug Description
I'm unable to use a KServe InferenceService from a JupyterLab notebook. When I create an InferenceService, it throws this error:
"inferenceservice.kserve-webhook-server.defaulter": failed to call webhook: Post "https://kserve-webhook-server-service.kubeflow.svc:443/mutate-serving-kserve-io-v1beta1-inferenceservice?timeout=10s": tls: failed to verify certificate: x509: certificate signed by unknown authority
My InferenceService client code is shown above, and my model is stored in MinIO.
I checked the certs and found everything to be in place. I also tried restarting the mutatingwebhookconfiguration, but it didn't help.
To Reproduce
Note: I have the model in the MinIO bucket mlpipeline (upload the model.pkl file).