README.md: +53 lines changed (53 additions & 0 deletions)
@@ -216,6 +216,59 @@ Note that `chat_format` option must be set for the particular model you are using.
Chat completion is available through the [`create_chat_completion`](https://llama-cpp-python.readthedocs.io/en/latest/api-reference/#llama_cpp.Llama.create_chat_completion) method of the [`Llama`](https://llama-cpp-python.readthedocs.io/en/latest/api-reference/#llama_cpp.Llama) class.
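
A minimal sketch of such a call is shown below; the model path and the `chat_format` value are placeholders, and you would substitute your own model and its matching chat format:

```python
from llama_cpp import Llama

# Load a local GGUF model; chat_format should match the model you are using.
llm = Llama(model_path="path/to/model.gguf", chat_format="llama-2")

# Messages follow the familiar OpenAI-style role/content structure.
response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
    ]
)

# The reply text lives in the first choice's message content.
print(response["choices"][0]["message"]["content"])
```
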
### JSON and JSON Schema Mode
If you want to constrain chat responses to only valid JSON or a specific JSON Schema you can use the `response_format` argument to the `create_chat_completion` method.
#### JSON Mode
The following example will constrain the response to be valid JSON.
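
A sketch of what that call could look like, assuming a local GGUF model path and the `chatml` chat format (adjust both to your model); passing `response_format={"type": "json_object"}` is what constrains the output to valid JSON:

```python
from llama_cpp import Llama

llm = Llama(model_path="path/to/model.gguf", chat_format="chatml")

response = llm.create_chat_completion(
    messages=[
        {
            "role": "system",
            "content": "You are a helpful assistant that outputs in JSON.",
        },
        {"role": "user", "content": "Who won the world series in 2020?"},
    ],
    # Constrain generation so the reply is always valid JSON.
    response_format={"type": "json_object"},
    temperature=0.7,
)

print(response["choices"][0]["message"]["content"])
```
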