Comparing v0.3.5-metal...main · abetlen/llama-cpp-python · GitHub
Comparing changes

base repository: abetlen/llama-cpp-python
base: v0.3.5-metal
head repository: abetlen/llama-cpp-python
compare: main
  • 17 commits
  • 9 files changed
  • 4 contributors

Commits on Dec 19, 2024

  1. feat: Update llama.cpp

    abetlen committed Dec 19, 2024
    SHA: 2bc1d97

Commits on Dec 30, 2024

  1. feat: Update llama.cpp

    abetlen committed Dec 30, 2024
    SHA: c9dfad4

Commits on Jan 8, 2025

  1. feat: Update llama.cpp

    abetlen committed Jan 8, 2025
    SHA: 1d5f534
  2. fix: streaming resource lock (#1879)

    * fix: correct issue with handling lock during streaming
    
move locking for streaming into the get_event_publisher call so it is locked and unlocked in the correct task for the streaming response
    
    * fix: simplify exit stack management for create_chat_completion and create_completion
    
    * fix: correct missing `async with` and format code
    
    * fix: remove unnecessary explicit use of AsyncExitStack
    
    fix: correct type hints for body_model
    
    ---------
    
    Co-authored-by: Andrei <abetlen@gmail.com>
    gjpower and abetlen authored Jan 8, 2025
    SHA: e8f14ce
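The locking fix above can be sketched in plain asyncio (illustrative names, not the actual server code): a lock acquired in a request handler would be released in a different task once the response starts streaming, so the fix moves `async with lock` into the publisher coroutine, keeping acquire and release in the same task that streams the chunks.

```python
import asyncio

# Sketch of the streaming-lock pattern described in the commit above.
# Names here are illustrative; the real code lives in the llama-cpp-python
# server's get_event_publisher.

lock = asyncio.Lock()

async def event_publisher(chunks):
    # Lock and unlock happen inside the streaming task itself, so both
    # operations run in the task that actually yields the response.
    async with lock:
        for chunk in chunks:
            yield chunk
            await asyncio.sleep(0)  # yield control between chunks

async def main():
    received = [c async for c in event_publisher(["a", "b", "c"])]
    print(received)
    return received

asyncio.run(main())  # prints ['a', 'b', 'c']
```

Acquiring the lock inside the generator also guarantees it is released when the client disconnects mid-stream, since the `async with` block unwinds in the same task.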
  3. chore: Bump version

    abetlen committed Jan 8, 2025
    SHA: 0580cf2

Commits on Jan 29, 2025

  1. feat: Update llama.cpp

    abetlen committed Jan 29, 2025
    SHA: 80be68a
  2. feat: Update llama.cpp

    abetlen committed Jan 29, 2025
    SHA: 0b89fe4
  3. SHA: 14879c7
  4. fix: error showing time spent in llama perf context print (#1898)

    * feat: Sync with llama.cpp
    
    Add `no_perf` field to `llama_context_params` to optionally disable performance timing measurements.
    
    * fix: Display performance metrics by default
    
    ---------
    
Co-authored-by: Andrei <abetlen@gmail.com>
    shakalaca and abetlen authored Jan 29, 2025
    SHA: 4442ff8
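The `no_perf` field mentioned in the commit above can be illustrated with a minimal ctypes structure. Only the `no_perf` field is taken from the commit; the rest of the layout is a hypothetical stand-in, not the real `llama_context_params` struct.

```python
import ctypes

# Hedged sketch: a toy structure mirroring the idea of the `no_perf`
# field added to llama_context_params. Field layout is illustrative.
class llama_context_params(ctypes.Structure):
    _fields_ = [
        ("n_ctx", ctypes.c_uint32),
        ("no_perf", ctypes.c_bool),  # True disables perf timing measurements
    ]

# ctypes zero-initializes unset fields, so no_perf defaults to False,
# matching the "display performance metrics by default" fix.
params = llama_context_params(n_ctx=2048)
print(params.no_perf)  # prints False
```

Because the C struct default is zero, leaving `no_perf` unset keeps performance metrics enabled, which is the behavior the second part of the commit restores.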
  5. chore: Bump version

    abetlen committed Jan 29, 2025
    SHA: 710e19a

Commits on Mar 12, 2025

  1. feat: Update llama.cpp

    abetlen committed Mar 12, 2025
    SHA: 344c106
  2. feat: Update llama.cpp

    abetlen committed Mar 12, 2025
    SHA: e232fae
  3. chore: Bump version

    abetlen committed Mar 12, 2025
    SHA: 37eb5f0

Commits on Apr 11, 2025

  1. feat: Update llama.cpp

    abetlen committed Apr 11, 2025
    SHA: 99f2ebf

Commits on May 8, 2025

  1. feat: Update llama.cpp

    abetlen committed May 8, 2025
    SHA: 4c6514d
  2. chore: Bump version

    abetlen committed May 8, 2025
    SHA: cb2edb9
  3. hotfix: Disable curl support

    abetlen committed May 8, 2025
    SHA: b1d23df