docs: Fix whitespace · Nagibiku/llama-cpp-python@602ea64 · GitHub

Commit 602ea64

docs: Fix whitespace
1 parent 971864c commit 602ea64

File tree

1 file changed: +1 −0 lines


README.md

Lines changed: 1 addition & 0 deletions
````diff
@@ -135,6 +135,7 @@ Below is a short example demonstrating how to use the high-level API to generate
 ```
 
 ### Adjusting the Context Window
+
 The context window of the Llama models determines the maximum number of tokens that can be processed at once. By default, this is set to 512 tokens, but can be adjusted based on your requirements.
 
 For instance, if you want to work with larger contexts, you can expand the context window by setting the n_ctx parameter when initializing the Llama object:
````
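The README text in the diff above describes raising n_ctx when constructing a Llama object. A minimal sketch of that usage follows; it assumes llama-cpp-python is installed and that a GGUF model file exists at the placeholder path shown.

```python
# Minimal sketch: expanding the context window via n_ctx.
# Assumptions: llama-cpp-python is installed, and the model path below
# is a placeholder for a real GGUF file on your machine.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/7B/llama-model.gguf",  # placeholder path
    n_ctx=2048,  # raise the context window from the default 512 tokens
)
```

With the larger window, prompts and generated output together may span up to 2048 tokens before the model runs out of context.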

0 commit comments
