debug · themrzmaster/llama-cpp-python@c4b2f87 · GitHub
Commit c4b2f87

debug
1 parent 7dea619 commit c4b2f87

File tree

1 file changed: 2 additions, 1 deletion

llama_cpp/llama_chat_format.py

Lines changed: 2 additions & 1 deletion
@@ -2571,6 +2571,7 @@ def base_function_calling(
                 completions.append(completion_or_chunks)
                 completions_tool_name.append(tool_name)
                 prompt += completion_or_chunks["choices"][0]["text"]
+                print(prompt)
                 prompt += "\n"
                 response = llama.create_completion(
                     prompt=prompt,
@@ -2598,7 +2599,7 @@ def base_function_calling(
                 )

                 response = cast(llama_types.CreateCompletionResponse, response)
-
+                print(response["choices"][0])
                 if response["choices"][0]["text"] == "</done>":
                     break
                 tool_name = response["choices"][0]["text"][len("functions.") :].replace(":", "")
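For context, the two debug prints land inside the tool-calling loop of base_function_calling, which keeps feeding completions back into the prompt until the model emits a "</done>" sentinel. A minimal, self-contained sketch of that loop shape (fake_create_completion and run_tool_loop are hypothetical stand-ins for illustration, not the real llama-cpp-python API):

```python
def fake_create_completion(prompt):
    # Hypothetical stub: pretend the model is done after the first round.
    return {"choices": [{"text": "</done>"}]}


def run_tool_loop(initial_prompt, create_completion=fake_create_completion):
    prompt = initial_prompt
    completions = []
    while True:
        response = create_completion(prompt)
        # The commit adds prints like this to inspect the growing prompt
        # and the raw model output at each iteration.
        print(response["choices"][0])
        text = response["choices"][0]["text"]
        if text == "</done>":  # sentinel: no more tool calls to make
            break
        completions.append(text)
        prompt += text + "\n"  # feed the completion back into the prompt
    return completions


run_tool_loop("User: what's the weather?\n")
```

Printing the accumulated prompt and the raw choice dict at each turn is a quick way to see why a loop like this stalls or never reaches the sentinel, which matches the "debug" intent of this commit.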

0 commit comments

Comments
 (0)
0