llm-server-aio

An all-in-one LLM server configured from a Docker Compose file, using AnythingLLM and LocalAGI (forked from mudler/LocalAGI).

This guide is written for a Debian server.

Prerequisites

Nvidia drivers

These are the commands that I believe worked for me; I have not yet had time to retest them on a clean NVIDIA driver install.

apt update
apt install linux-headers-$(uname -r)
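
The headers only prepare the kernel module build; the driver package itself still has to be installed. The lines below are a minimal sketch of what that usually looks like on Debian, assuming the contrib and non-free (non-free-firmware on Debian 12) components are enabled in your apt sources; the Debian NvidiaGraphicsDrivers wiki page has the canonical instructions.

# Assumption: contrib/non-free components are enabled in /etc/apt/sources.list
apt install nvidia-driver firmware-misc-nonfree
reboot

# After rebooting, the card should be visible:
nvidia-smi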

Docker and Docker Compose

# https://docs.docker.com/engine/install/debian/
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh ./get-docker.sh
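
A quick check that Docker and the Compose plugin are in place, plus the NVIDIA Container Toolkit so the compose stack can reach the GPU from inside containers. The toolkit comes from NVIDIA's own apt repository rather than Debian's; the repository setup below follows NVIDIA's installation guide and is only a sketch, so double-check it against the linked page.

# Verify the Docker install
docker --version
docker compose version

# NVIDIA Container Toolkit for GPU access inside containers
# https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | \
  gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
  sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
  tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
apt update && apt install nvidia-container-toolkit
nvidia-ctk runtime configure --runtime=docker
systemctl restart docker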

Installation process

Clone the repository:

git clone https://github.com/AArbey/AIO-LLM.git
cd AIO-LLM

Copy .env.template to .env and edit it with your own settings:

cp .env.template .env
nano .env
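
The exact variables depend on what .env.template in this repository defines; the lines below are purely hypothetical placeholders meant to show the kind of values it asks for, not the real keys.

# Hypothetical example only -- use the variable names from .env.template
ANYTHINGLLM_PORT=3001      # port to expose the AnythingLLM UI on (assumed name)
LOCALAGI_PORT=8080         # port to expose LocalAGI on (assumed name)
MODELS_PATH=/opt/models    # host directory mounted into the containers for model files (assumed name)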

Run the stack with Docker Compose:

docker compose -f docker-compose-aio.yml up -d
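
Once the stack is up, the usual Docker Compose commands are enough to check that the services started, follow their logs, and stop everything again:

# List services and their state
docker compose -f docker-compose-aio.yml ps

# Follow the logs of all services (Ctrl+C to stop following)
docker compose -f docker-compose-aio.yml logs -f

# Stop and remove the stack
docker compose -f docker-compose-aio.yml down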
