llama : move vocab, grammar and sampling into separate files #8508
Merged
Changes from 1 commit (of 11).

Commits:
0ddc8e3  llama : move sampling code into llama-sampling (ggerganov)
675f305  llama : move grammar code into llama-grammar (ggerganov)
5a71d1a  cont (ggerganov)
b4b242e  cont : pre-fetch rules (ggerganov)
689d377  cont (ggerganov)
e7dffa6  llama : deprecate llama_sample_grammar (ggerganov)

Jul 19, 2024

8fef5b1  llama : move tokenizers into llama-vocab (ggerganov)
66ac80f  make : update llama.cpp deps [no ci] (ggerganov)
39fbaf9  llama : redirect external API to internal APIs (ggerganov)
dae3cae  llama : suffix the internal APIs with "_impl" (ggerganov)
fe28a7b  llama : clean-up (ggerganov)
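The commits "redirect external API to internal APIs" and "suffix the internal APIs with \"_impl\"" describe the redirection pattern behind the split. Below is a minimal, self-contained sketch of that pattern; the struct layouts and the forwarding body are simplified assumptions for illustration, not the actual llama.cpp declarations. The idea is that the public entry point keeps its signature and simply forwards to an _impl function that lives in the split-out translation unit, so callers are unaffected by the refactor.

// Hypothetical, simplified types standing in for the real internal state.
#include <cstdint>
#include <random>

// state that would live in llama-sampling.* after the split
struct llama_sampling {
    std::mt19937 rng;
};

// public-facing context, reduced to the one member this example needs
struct llama_context {
    llama_sampling sampling;
};

// internal API, suffixed with _impl (would be declared in the internal header)
static void llama_set_rng_seed_impl(llama_sampling & smpl, uint32_t seed) {
    smpl.rng.seed(seed); // the actual work happens in the internal function
}

// external API: unchanged signature, just redirects to the internal one
void llama_set_rng_seed(llama_context * ctx, uint32_t seed) {
    llama_set_rng_seed_impl(ctx->sampling, seed);
}

int main() {
    llama_context ctx;
    llama_set_rng_seed(&ctx, 42); // callers compile and behave exactly as before
    return 0;
}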
Commit 0ddc8e361c5ea1bdab8dc14c5658e95ea9e5e731: llama : move sampling code into llama-sampling (ggml-ci)
New file (50 lines added):

@@ -0,0 +1,50 @@
#pragma once

#define LLAMA_API_INTERNAL
#include "llama.h"

#include <array>
#include <set>
#include <map>
#include <cstdint>
#include <random>

#ifdef __has_include
#if __has_include(<unistd.h>)
#include <unistd.h>
#if defined(_POSIX_MAPPED_FILES)
#include <sys/mman.h>
#include <fcntl.h>
#endif
#if defined(_POSIX_MEMLOCK_RANGE)
#include <sys/resource.h>
#endif
#endif
#endif

// bump if necessary
#define LLAMA_MAX_NODES   8192
#define LLAMA_MAX_LAYERS  256
#define LLAMA_MAX_EXPERTS 160 // DeepSeekV2

#ifdef __GNUC__
#ifdef __MINGW32__
#define LLAMA_ATTRIBUTE_FORMAT(...) __attribute__((format(gnu_printf, __VA_ARGS__)))
#else
#define LLAMA_ATTRIBUTE_FORMAT(...) __attribute__((format(printf, __VA_ARGS__)))
#endif
#else
#define LLAMA_ATTRIBUTE_FORMAT(...)
#endif

//
// logging
//

LLAMA_ATTRIBUTE_FORMAT(2, 3)
void llama_log_internal        (ggml_log_level level, const char * format, ...);
void llama_log_callback_default(ggml_log_level level, const char * text, void * user_data);

#define LLAMA_LOG_INFO(...)  llama_log_internal(GGML_LOG_LEVEL_INFO , __VA_ARGS__)
#define LLAMA_LOG_WARN(...)  llama_log_internal(GGML_LOG_LEVEL_WARN , __VA_ARGS__)
#define LLAMA_LOG_ERROR(...) llama_log_internal(GGML_LOG_LEVEL_ERROR, __VA_ARGS__)
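The LLAMA_ATTRIBUTE_FORMAT(2, 3) annotation above tells GCC and Clang that llama_log_internal takes a printf-style format string as its second parameter, with the variadic arguments starting at the third, so mismatched format specifiers are diagnosed at compile time. A short usage sketch follows; it assumes the internal header shown above is included and that llama_log_internal is provided by the implementation file, and the example function itself is hypothetical.

// #include the internal header shown above (whatever name it ends up with)

// Hypothetical caller: any translation unit in the library can use the
// LLAMA_LOG_* macros once the internal header is included.
static void example_load(const char * path, int n_layers) {
    // %s / %s matches (const char *, const char *); checked at compile time
    LLAMA_LOG_INFO("%s: loading model from '%s'\n", __func__, path);

    if (n_layers > LLAMA_MAX_LAYERS) {
        // %d / %d matches the two int arguments; a mismatch here would
        // trigger a -Wformat warning thanks to LLAMA_ATTRIBUTE_FORMAT
        LLAMA_LOG_ERROR("%s: too many layers: %d (max %d)\n", __func__, n_layers, LLAMA_MAX_LAYERS);
    }
}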