[Bug]: io.milvus.v2.exception.MilvusClientException: fail to Query on QueryNode 6: worker(6) query failed: getrandom #38265
Comments
/assign @aoiasd could you help take a look? Thanks.
And maybe a similar issue: #36271
@walker1024 Thanks.
@walker1024 What types of fields did you set in output_fields?
My collection has two fields, id and vector, so I set the output fields to id and vector.
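(For context, a two-field layout like this is what the Java SDK v2 quick-setup collection path creates by default: a primary key named id and a float vector field named vector. The sketch below is hypothetical rather than the reporter's actual code; the server URI and vector dimension are assumptions, and builder details may differ between SDK versions.)

```java
import io.milvus.v2.client.ConnectConfig;
import io.milvus.v2.client.MilvusClientV2;
import io.milvus.v2.service.collection.request.CreateCollectionReq;

public class CreateTwoFieldCollection {
    public static void main(String[] args) {
        // Hypothetical connection settings; adjust to your deployment.
        MilvusClientV2 client = new MilvusClientV2(ConnectConfig.builder()
                .uri("http://localhost:19530")
                .build());

        // Quick-setup creation: this path generates exactly two fields,
        // a primary key "id" and a float vector field "vector".
        client.createCollection(CreateCollectionReq.builder()
                .collectionName("img_coll_12_02") // collection name taken from the logs
                .dimension(768)                   // assumed vector dimension
                .build());
    }
}
```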
ok.
complete logs
@walker1024 We need the Milvus logs from all the pods. Could you please provide all of the logs? I tried to reproduce the issue in-house, but no luck.
…tion (#38326) relate: #38265 Signed-off-by: aoiasd <[email protected]>
Is there an existing issue for this?
Environment
Current Behavior
When I perform a vector ANN search and include output_fields in the request parameters, the service reports an error.
If I send the same request without the output_fields parameter, it returns normally.
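As a hedged illustration of the two requests described above, here is a minimal sketch using the Java SDK v2 (the package the reported exception comes from, io.milvus.v2). The collection name is taken from the attached logs; the query vector, topK, and exact builder signatures are assumptions based on a ~2.4.x SDK and may differ in other versions.

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

import io.milvus.v2.client.ConnectConfig;
import io.milvus.v2.client.MilvusClientV2;
import io.milvus.v2.service.vector.request.SearchReq;
import io.milvus.v2.service.vector.request.data.FloatVec;
import io.milvus.v2.service.vector.response.SearchResp;

public class SearchWithOutputFields {
    public static void main(String[] args) {
        MilvusClientV2 client = new MilvusClientV2(ConnectConfig.builder()
                .uri("http://localhost:19530")   // assumed server address
                .build());

        // Assumed query vector; the dimension must match the collection schema.
        List<Float> queryVector = Collections.nCopies(768, 0.1f);

        // Request WITHOUT output_fields: reported to return normally.
        SearchResp ok = client.search(SearchReq.builder()
                .collectionName("img_coll_12_02")
                .data(Collections.singletonList(new FloatVec(queryVector)))
                .topK(10)
                .build());

        // Request WITH output_fields ("id", "vector"): reported to fail with
        // io.milvus.v2.exception.MilvusClientException:
        //   fail to Query on QueryNode 6: worker(6) query failed: getrandom
        SearchResp failing = client.search(SearchReq.builder()
                .collectionName("img_coll_12_02")
                .data(Collections.singletonList(new FloatVec(queryVector)))
                .topK(10)
                .outputFields(Arrays.asList("id", "vector"))
                .build());
    }
}
```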
Expected Behavior
The search should return normally, including the requested output fields.
Steps To Reproduce
Milvus Log
{"log":"[2024/12/06 02:53:48.479 +00:00] [WARN] [proxy/impl.go:3478] ["Query failed to WaitToFinish"] [traceID=128f9f8ab5179b4d8f2ba0fb0534eb5a] [role=proxy] [db=default] [collection=img_coll_12_02] [partitions="[]"] [ConsistencyLevel=Strong] [useDefaultConsistency=false] [error="failed to query: failed to search/query delegator 6 for channel by-dev-rootcoord-dml_5_453943773523046964v0: fail to Query on QueryNode 6: worker(6) query failed: getrandom"] [errorVerbose="failed to query: failed to search/query delegator 6 for channel by-dev-rootcoord-dml_5_453943773523046964v0: fail to Query on QueryNode 6: worker(6) query failed: getrandom\n(1) attached stack trace\n -- stack trace:\n | github.com/milvus-io/milvus/internal/proxy.(*queryTask).Execute\n | \t/workspace/source/internal/proxy/task_query.go:471\n | github.com/milvus-io/milvus/internal/proxy.(*taskScheduler).processTask\n | \t/workspace/source/internal/proxy/task_scheduler.go:474\n | github.com/milvus-io/milvus/internal/proxy.(*taskScheduler).queryLoop.func1\n | \t/workspace/source/internal/proxy/task_scheduler.go:553\n | github.com/milvus-io/milvus/pkg/util/conc.(*Pool[...]).Submit.func1\n | \t/workspace/source/pkg/util/conc/pool.go:81\n | github.com/panjf2000/ants/v2.(*goWorker).run.func1\n | \t/go/pkg/mod/github.com/panjf2000/ants/[email protected]/worker.go:67\nWraps: (2) failed to query\nWraps: (3) attached stack trace\n -- stack trace:\n | github.com/milvus-io/milvus/internal/proxy.(*LBPolicyImpl).ExecuteWithRetry.func1\n | \t/workspace/source/internal/proxy/lb_policy.go:188\n | [...repeated from below...]\nWraps: (4) failed to search/query delegator 6 for channel by-dev-rootcoord-dml_5_453943773523046964v0\nWraps: (5) attached stack trace\n -- stack trace:\n | github.com/milvus-io/milvus/internal/proxy.(*queryTask).queryShard\n | \t/workspace/source/internal/proxy/task_query.go:566\n | github.com/milvus-io/milvus/internal/proxy.(*LBPolicyImpl).ExecuteWithRetry.func1\n | \t/workspace/source/internal/proxy/lb_policy.go:180\n | github.com/milvus-io/milvus/pkg/util/retry.Do\n | \t/workspace/source/pkg/util/retry/retry.go:44\n | github.com/milvus-io/milvus/internal/proxy.(*LBPolicyImpl).ExecuteWithRetry\n | \t/workspace/source/internal/proxy/lb_policy.go:154\n | github.com/milvus-io/milvus/internal/proxy.(*LBPolicyImpl).Execute.func2\n | \t/workspace/source/internal/proxy/lb_policy.go:218\n | golang.org/x/sync/errgroup.(*Group).Go.func1\n | \t/go/pkg/mod/golang.org/x/[email protected]/errgroup/errgroup.go:75\n | runtime.goexit\n | \t/usr/local/go/src/runtime/asm_amd64.s:1650\nWraps: (6) fail to Query on QueryNode 6\nWraps: (7) worker(6) query failed: getrandom\nError types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *withstack.withStack (6) *errutil.withPrefix (7) merr.milvusError"]\n","stream":"stdout","time":"2024-12-06T02:53:48.480521644Z"}
{"log":"[2024/12/06 02:53:48.480 +00:00] [WARN] [proxy/task_search.go:688] ["failed to requery"] [traceID=128f9f8ab5179b4d8f2ba0fb0534eb5a] [nq=1] [error="fail to Query on QueryNode 6: worker(6) query failed: getrandom"]\n","stream":"stdout","time":"2024-12-06T02:53:48.481061288Z"}
{"log":"[2024/12/06 02:53:48.481 +00:00] [WARN] [proxy/task_scheduler.go:485] ["Failed to post-execute task: "] [traceID=128f9f8ab5179b4d8f2ba0fb0534eb5a] [error="fail to Query on QueryNode 6: worker(6) query failed: getrandom"]\n","stream":"stdout","time":"2024-12-06T02:53:48.481649548Z"}
Anything else?
Looking at the service log, I also see another ERROR entry; I don't know whether it is related to this exception.