Bump version · bellofils/llama-cpp-python@d605875 · GitHub

Commit d605875

Bump version
1 parent b82b0e1 commit d605875

File tree

2 files changed: +9 −1 lines changed

2 files changed

+9
-1
lines changed

CHANGELOG.md

Lines changed: 8 additions & 0 deletions
@@ -7,6 +7,14 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 ## [Unreleased]
 
+## [0.2.40]
+
+- feat: Update llama.cpp to ggerganov/llama.cpp@3bdc4cd0f595a6096cca4a64aa75ffa8a3503465
+- feat: Generic chatml function calling using `chat_format="chatml-function-calling"` by @abetlen in #957
+- fix: Circular dependency preventing early Llama object free by @notwa in #1176
+- docs: Set the correct command for compiling with SYCL support by @akarshanbiswas in #1172
+- feat: Use GPU backend for clip if available by @iamlemec in #1175
+
 ## [0.2.39]
 
 - feat: Update llama.cpp to ggerganov/llama.cpp@b08f22c882a1443e6b97081f3ce718a4d1a741f8

llama_cpp/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 from .llama_cpp import *
 from .llama import *
 
-__version__ = "0.2.39"
+__version__ = "0.2.40"
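Since the package exposes `__version__` as a plain dotted string, downstream code can gate features on it, for example to require the function-calling support added in this release. A minimal sketch; the `version_tuple` helper is hypothetical and not part of llama-cpp-python:

```python
# Hypothetical helper: compare dotted version strings numerically.
def version_tuple(v: str) -> tuple:
    # "0.2.40" -> (0, 2, 40); tuple comparison avoids the string
    # pitfall where "0.2.9" > "0.2.40" lexicographically.
    return tuple(int(part) for part in v.split("."))

# e.g. only enable chatml-function-calling on 0.2.40 or newer:
assert version_tuple("0.2.40") >= version_tuple("0.2.40")
assert version_tuple("0.2.40") > version_tuple("0.2.9")  # plain string compare would get this wrong
```

In practice one would compare `llama_cpp.__version__` against the minimum required version; `packaging.version.Version` is the more robust choice when pre-release tags are possible.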
