milvus: [Bug]: search times out
Is there an existing issue for this?
- I have searched the existing issues
Environment
- Milvus version: latest in the Helm chart, so 2.0
- Deployment mode(standalone or cluster): cluster
- SDK version(e.g. pymilvus v2.0.0rc2): latest, doesn’t matter
- CPU/Memory: plenty of both
- GPU: 0
- Others: etcd has been upgraded to the latest version
Current Behavior
When I run a search I get no results, only a timeout error.
Expected Behavior
When I run a search I should get results instead of a timeout.
Steps To Reproduce
1. Helm on AWS EKS cluster
2. Upgraded etcd to the latest version
3. Query nodes have their own node pool with a huge-memory instance type (controlled by taints and tolerations)
4. I have 4 fields in my schema: an ID (not auto-increment), a 512-dim float vector, and two int64 fields (a pymilvus sketch of this schema follows this list)
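For reference, here is a minimal pymilvus sketch of a collection with that shape. The reporter actually used the Node SDK; the connection address, collection name, and field names below are placeholders, not taken from the report, and keyword arguments may differ slightly between pymilvus releases.

```python
from pymilvus import (
    Collection,
    CollectionSchema,
    DataType,
    FieldSchema,
    connections,
)

# Connect to the cluster (address is a placeholder).
connections.connect(host="localhost", port="19530")

# Four fields as described above: an explicit int64 primary key, a 512-dim
# float vector, and two int64 metadata fields (names are made up).
fields = [
    FieldSchema(name="id", dtype=DataType.INT64, is_primary=True, auto_id=False),
    FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=512),
    FieldSchema(name="meta_a", dtype=DataType.INT64),
    FieldSchema(name="meta_b", dtype=DataType.INT64),
]
schema = CollectionSchema(fields, description="schema from the bug report")
collection = Collection(name="search_timeout_repro", schema=schema)
```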
Anything else?
I get the timeout both in the Attu UI and in the nodejs lib.
Here is a copy of my messages from the Slack thread:
Hi @yhmo. I'm having some trouble with my search query. I'm trying to find any logs related to it, but I'm not sure where to start checking. I thought it should be the querycoord pod, but I can't find anything related to my search query there. So the question for now: where can I see the error explaining why my search query timed out?
The latest Milvus Helm version for k8s, clustered, with a 1.5M-record test dataset. 3 fields: a 512-dim vector and two int64 fields. I'm trying to search both ways: with vectors only and by metadata only. Both time out.
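Hedged pymilvus equivalents of those two kinds of request, reusing the placeholder names from the schema sketch above (the actual calls were made through Attu and the Node SDK):

```python
# The collection has to be loaded onto the query nodes before searching.
collection.load()

# 1) Vector-only search: ANN search against the 512-dim field.
results = collection.search(
    data=[[0.0] * 512],  # a single dummy query vector
    anns_field="embedding",
    param={"metric_type": "L2", "params": {"nprobe": 16}},
    limit=10,
)

# 2) Metadata-only lookup: boolean expression over a scalar field.
rows = collection.query(expr="meta_a > 0", output_fields=["id", "meta_a"])
```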
Okay, so I found some logs in the proxy:
- Query enqueued
- Query PreExecute done
- Query Execute done
But on the Attu front-end I still don't have any results… like it timed out.
I also see something like this when I run a search query from the nodejs API:
[2022/02/04 17:06:22.898 +00:00] [DEBUG] [time_recorder.go:78] ["proxy execute search 430963038662950989: send search msg to message stream (5ms)"]
[2022/02/04 17:06:22.898 +00:00] [DEBUG] [time_recorder.go:78] ["proxy execute search 430963038662950989: done (5ms)"]
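As a side note, the proxy timings above are only a few milliseconds, so the waiting happens after the proxy hands the request off. If it is the client that gives up first, pymilvus accepts an explicit per-call timeout; a hedged example, again with the placeholder names from above (the value 60 is arbitrary):

```python
# Pass a generous client-side deadline (in seconds) so a slow cluster shows
# up as a late answer rather than an immediate SDK timeout.
results = collection.search(
    data=[[0.0] * 512],
    anns_field="embedding",
    param={"metric_type": "L2", "params": {"nprobe": 16}},
    limit=10,
    timeout=60,
)
```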
About this issue
- Original URL
- State: closed
- Created 2 years ago
- Comments: 40 (18 by maintainers)
Currently, search does not support returning the original vector; the SDK will raise an error if the vector field is requested.
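A hedged illustration of that limitation, reusing the placeholder names from the earlier sketch: on Milvus 2.0 only scalar fields should be requested as output fields.

```python
# Request only scalar output fields; asking for the vector field itself
# ("embedding") in output_fields is expected to fail on Milvus 2.0.
results = collection.search(
    data=[[0.0] * 512],
    anns_field="embedding",
    param={"metric_type": "L2", "params": {"nprobe": 16}},
    limit=10,
    output_fields=["meta_a", "meta_b"],
)
```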
localStorage.Path needs to point to fast storage; the default path is /var/lib/milvus/data/
It seems that the current querynode has weird error logs; is search still stuck?
This proxy log only indicates that the proxy successfully sent a search request to the querynode, but the proxy did not receive a search result.
@scipe thank you for your issue. Could you please also upload the Milvus logs, especially for the query nodes, query coordinator, and proxy? It would also be helpful if you could upload the etcd and pulsar logs for investigation.
/assign @scipe /unassign