Description
Hi, I am working with large image-descriptor datasets (1M 128-d vectors). Based on a few tests, HNSW has excellent scaling behavior. However, I ran into something unexpected in the memory footprint, and I am not sure whether I did or assumed something wrong: when I profiled the memory accesses of my search function, about 30% of them were writes. In my understanding, running a search should not generate that much writing. Why does the search function write that much to memory?
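For context on where writes can come from even in a "read-only" query: the layer search used by HNSW-style indexes maintains a visited set and two priority queues, and every insertion into those structures is a memory write. Below is a minimal, self-contained sketch of that best-first search (a simplification, not the library's actual implementation), with a counter that tallies the bookkeeping writes; `ef_search`, the toy ring graph, and the `writes` counter are all illustrative names introduced here.

```python
import heapq

def ef_search(neighbors, dist, entry, target, ef=1):
    # Greedy best-first search over a proximity graph, in the style of
    # the layer-0 search in HNSW. Although the query itself only reads
    # the vectors, the algorithm writes constantly: every visited-set
    # insertion and every heap push/pop is a store to memory.
    visited = {entry}
    d0 = dist(entry, target)
    candidates = [(d0, entry)]   # min-heap of nodes still to expand
    results = [(-d0, entry)]     # max-heap holding the best `ef` nodes
    writes = 3                   # visited insert + two heap inits
    while candidates:
        d, node = heapq.heappop(candidates)
        writes += 1
        if d > -results[0][0]:   # no candidate can improve the results
            break
        for nb in neighbors(node):
            if nb in visited:
                continue
            visited.add(nb)      # bookkeeping write per touched node
            writes += 1
            dn = dist(nb, target)
            if len(results) < ef or dn < -results[0][0]:
                heapq.heappush(candidates, (dn, nb))
                heapq.heappush(results, (-dn, nb))
                writes += 2
                if len(results) > ef:
                    heapq.heappop(results)
                    writes += 1
    return results[0][1], -results[0][0], writes

# Toy graph: 100 points on a ring, each linked to its two neighbours.
N = 100
nbrs = lambda i: [(i - 1) % N, (i + 1) % N]
d = lambda a, t: min(abs(a - t), N - abs(a - t))
node, best_d, writes = ef_search(nbrs, d, entry=0, target=50)
print(node, best_d, writes)  # the search finds node 50 at distance 0
```

Even in this toy setting, the write count is of the same order as the number of nodes visited, so a profiler attributing a substantial fraction of memory traffic to writes during search is not necessarily a bug.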