leonardorifeli/local-llm-server
LLM Server with Ollama + Open WebUI

Commands

  • make up – Starts everything (Ollama + UI)
  • make down – Stops everything
  • make restart – Restarts the stack
  • make status – Shows container status
  • make logs – Streams logs in real time
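
A Makefile backing the targets above might look like the following minimal sketch. It assumes the stack is defined in a `docker-compose.yml` in the repo root; the actual file layout in this repository is not shown here.

```makefile
# Minimal sketch: each target wraps the corresponding docker compose command.
up:
	docker compose up -d

down:
	docker compose down

restart:
	docker compose restart

status:
	docker compose ps

logs:
	docker compose logs -f
```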

Access

Once the stack is up, Ollama's API listens on http://localhost:11434; open the Open WebUI in your browser (its port depends on the compose configuration).

How to use

git clone https://github.com/leonardorifeli/local-llm-server.git llm-server
cd llm-server
make up

⚠️ Requirements:

  • Docker installed
  • nvidia-container-toolkit configured (if using a GPU)
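
For GPU access, the Ollama service in the compose file typically needs a device reservation like the sketch below. This is the standard Docker Compose GPU syntax, not necessarily this repo's actual configuration; the image and port mapping shown are assumptions based on the defaults used elsewhere in this README.

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"   # Ollama API port used in the curl example below
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```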

Check GPU usage

watch -n1 nvidia-smi

Using the local model

curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Explique o que é aprendizado por reforço.",
  "stream": false
}'
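
The same request can be made from Python using only the standard library. This is a sketch against the `/api/generate` endpoint shown in the curl example above; the helper names (`build_payload`, `generate`) are illustrative, and the server from `make up` must be running for `generate()` to succeed.

```python
import json
import urllib.request

# Endpoint taken from the curl example above.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "mistral", stream: bool = False) -> bytes:
    """Encode the JSON body expected by Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream}).encode("utf-8")

def generate(prompt: str, model: str = "mistral") -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object whose
        # "response" field holds the full completion.
        return json.loads(resp.read())["response"]

# Usage (requires the server from `make up` to be running):
#   print(generate("Explain what reinforcement learning is."))
```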

About

Local LLM server using the Mistral model.
