UX Request: Update readme to mention llamafile -m foo.llamafile as an option #511

Open · 4 tasks done
mofosyne opened this issue Jul 27, 2024 · 1 comment

mofosyne (Collaborator) commented Jul 27, 2024

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

Add a mention in the README that the -m flag can be used to run other llamafiles.

Motivation

On Windows, the 4 GB executable size limit means the llamafile you run directly has to stay under that limit, but a llamafile with embedded weights can easily be larger than that.

The README should be updated to mention that you can take a standalone llamafile engine with no weights and use it to bootstrap-run a normal llamafile via llamafile -m foo.llamafile.

I searched README.md for a mention of this, but didn't find one.
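
For illustration, a minimal sketch of the workflow being described, assuming the weights-free engine binary is simply named llamafile and the oversized model is packaged as foo.llamafile (both names are placeholders):

```sh
# Placeholder names for illustration:
#   ./llamafile    - the bare engine with no embedded weights, so it stays small
#   foo.llamafile  - a model packaged as its own llamafile, possibly over 4 GB
./llamafile -m foo.llamafile
```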

Possible Implementation

No response

jart (Collaborator) commented Jul 27, 2024

Pull requests welcome!
