There's a comment marking some parameters as "ignored or currently unsupported", but I'm pretty sure some of them are supported (for instance, I think logprobs is?).
Also, when I generate an OpenAPI client off the openapi.json, there are errors because of these.
So either delete the things that are unsupported, or clarify that they're actually supported / provide more info on the existing support.
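For context, here's roughly how I'm spotting the mismatches. This is just a sketch, not the client generator itself; it assumes the server's openapi.json has been saved locally and that the request models keep their app.py names under FastAPI's default `components/schemas` layout:

```python
import json

# Sketch only: dump the parameters each request schema declares, so they can
# be compared against what Llama.__call__ / create_chat_completion accept.
with open("openapi.json") as f:
    schema = json.load(f)

for model_name in ("CreateCompletionRequest", "CreateChatCompletionRequest"):
    props = schema["components"]["schemas"][model_name]["properties"]
    print(model_name, "->", sorted(props))
```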
Just leaving some notes here for the parameters I see in app.py (a sketch of the resulting request models is below):

- model: ignored, but currently marked optional. On the one hand it could be marked required to make it explicit, in case the server ever supports multiple Llama models at the same time; on the other hand it could be deleted since it's ignored. Decision: mark it required for the sake of OpenAI API compatibility.
- n, presence_penalty, frequency_penalty, best_of, logit_bias, user: not supported; they're excluded from the calls into llama. Decision: delete them.
- logprobs (CreateCompletionRequest only): I think this is actually supported (it's in the arguments of Llama.__call__, which is how the completion is invoked). Decision: mark it as supported.
- top_k (CreateChatCompletionRequest only): llama.create_chat_completion definitely has a top_k argument, but it's missing from CreateChatCompletionRequest. Decision: add it.
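To make the decisions above concrete, here's a rough sketch of what the pydantic models could look like afterwards. Field names, defaults, and the exact set of remaining fields are illustrative, not the actual definitions in app.py:

```python
from typing import List, Optional, Union

from pydantic import BaseModel


class CreateCompletionRequest(BaseModel):
    # required for OpenAI API compatibility, even though the server ignores it
    model: str
    prompt: Union[str, List[str]]
    max_tokens: int = 16
    temperature: float = 0.8
    top_p: float = 0.95
    echo: bool = False
    stop: Optional[List[str]] = None
    stream: bool = False
    # supported: forwarded to Llama.__call__
    logprobs: Optional[int] = None
    # n, presence_penalty, frequency_penalty, best_of, logit_bias and user are
    # dropped entirely, since they were never forwarded to llama anyway


class ChatCompletionRequestMessage(BaseModel):
    role: str
    content: str


class CreateChatCompletionRequest(BaseModel):
    # required for OpenAI API compatibility
    model: str
    messages: List[ChatCompletionRequestMessage]
    temperature: float = 0.8
    top_p: float = 0.95
    # newly added: llama.create_chat_completion already accepts top_k
    top_k: int = 40
    stop: Optional[List[str]] = None
    stream: bool = False
    max_tokens: int = 128
```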