Wrong size of embeddings · Issue #47 · abetlen/llama-cpp-python
Closed
@djpadbit

Description


While playing around, I noticed the embeddings are only 512 floats rather than the 4096 you get when using the standalone application.
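For context, this is roughly how the mismatch shows up (the model path and exact high-level API calls here are just illustrative, not the exact code I ran):

from llama_cpp import Llama

# Load a 7B model with embeddings enabled (path is illustrative)
llm = Llama(model_path="./models/7B/ggml-model.bin", embedding=True)

# Request an embedding and check its length
emb = llm.create_embedding("Hello, world!")["data"][0]["embedding"]
print(len(emb))  # prints 512 (the context size) instead of the expected 4096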

So I went digging and found the culprit: a copy-paste leftover in the llama_n_embd function.

def llama_n_embd(ctx: llama_context_p) -> c_int:
    return _lib.llama_n_ctx(ctx)

It's calling llama_n_ctx rather than llama_n_embd.
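The fix should just be dispatching to the matching C function (a one-line sketch, assuming _lib exposes llama_n_embd like the other bindings):

def llama_n_embd(ctx: llama_context_p) -> c_int:
    return _lib.llama_n_embd(ctx)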

I don't think this warrants a pull request since it's a very easy fix, so I'm filing a simple issue instead.

Keep up the good work :)

Labels: bug (Something isn't working)