@@ -5,8 +5,36 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+ ## v0.1.2 (2023-11-08)
+
+ ### New Features
+
+ - <csr-id-dcfccdf721eb47a364cce5b1c7a54bcf94335ac0/> more `async` function variants
+ - <csr-id-56285a119633682951f8748e85c6b8988e514232/> add `LlamaSession.model` (see the usage sketch after this release's commit details)
+
+ ### Commit Statistics
+
+ <csr-read-only-do-not-edit/>
+
+ - 2 commits contributed to the release.
+ - 2 commits were understood as [conventional](https://www.conventionalcommits.org).
+ - 0 issues like '(#ID)' were seen in commit messages
+
+ ### Commit Details
+
+ <csr-read-only-do-not-edit/>
+
+ <details><summary>view details</summary>
+
+ * **Uncategorized**
+ - More `async` function variants ([`dcfccdf`](https://github.com/binedge/llama_cpp-rs/commit/dcfccdf721eb47a364cce5b1c7a54bcf94335ac0))
+ - Add `LlamaSession.model` ([`56285a1`](https://github.com/binedge/llama_cpp-rs/commit/56285a119633682951f8748e85c6b8988e514232))
+ </details>
+
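As an editorial aside, here is a minimal sketch of how the additions called out above might fit together. Only `LlamaModel::load_from_file_async` and `LlamaSession.model` are named in the changelog; the tokio runtime, the model path, and the `create_session` constructor are illustrative assumptions, not confirmed API of these releases.

```rust
// Hedged sketch: everything except `load_from_file_async` and `model` is an
// assumption about the crate's surface at this point, not confirmed API.
use llama_cpp::LlamaModel;

#[tokio::main] // assumes an async runtime such as tokio
async fn main() {
    // v0.1.1 added async loading variants alongside the blocking ones.
    let model = LlamaModel::load_from_file_async("path/to/model.gguf")
        .await
        .expect("failed to load model");

    // `create_session` is an assumed name for the session constructor.
    let session = model.create_session();

    // v0.1.2 adds `LlamaSession.model`, letting a session hand back the model
    // it was created from (shown here as a method; it may be a field).
    let _model_again = session.model();
}
```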
## v0.1.1 (2023-11-08)

+ <csr-id-3eddbab3cc35a59acbe66fa4f5333a9ca0edb326/>
+
### Chore

- <csr-id-3eddbab3cc35a59acbe66fa4f5333a9ca0edb326/> Remove debug binary from Cargo.toml
@@ -26,7 +54,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

<csr-read-only-do-not-edit/>

- - 5 commits contributed to the release.
+ - 6 commits contributed to the release.
- 13 days passed between releases.
- 4 commits were understood as [conventional](https://www.conventionalcommits.org).
- 0 issues like '(#ID)' were seen in commit messages
@@ -38,6 +66,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
<details><summary>view details</summary>

* **Uncategorized**
+ - Release llama_cpp_sys v0.2.1, llama_cpp v0.1.1 ([`ef4e3f7`](https://github.com/binedge/llama_cpp-rs/commit/ef4e3f7a3c868a892f26acfae2a5211de4900d1c))
- Add `LlamaModel::load_from_file_async` ([`3bada65`](https://github.com/binedge/llama_cpp-rs/commit/3bada658c9139af1c3dcdb32c60c222efb87a9f6))
- Remove debug binary from Cargo.toml ([`3eddbab`](https://github.com/binedge/llama_cpp-rs/commit/3eddbab3cc35a59acbe66fa4f5333a9ca0edb326))
- Require `llama_context` is accessed from behind a mutex ([`b676baa`](https://github.com/binedge/llama_cpp-rs/commit/b676baa3c1a6863c7afd7a88b6f7e8ddd2a1b9bd))