DeepSeek¶
DeepSeek is a self-hosted AI model. This instance actually uses Open WebUI bundled with Ollama, which pulls the DeepSeek model.
Installation¶
Default Port: 8080
Config¶
If you'd also like to use OpenAI, create an account and generate an API key.
homelab/docker/deepseek/compose.yaml
---
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:git-b03fc97-ollama
    container_name: ${CONTAINER_NAME}
    network_mode: host
    restart: always
    env_file:
      - .env
    environment:
      OLLAMA_BASE_URL: http://127.0.0.1:11434
      OPENAI_API_KEY: ${OPENAI_API_KEY}
    volumes:
      - ollama:/root/.ollama
      - open-webui:/app/backend/data
volumes:
  ollama:
  open-webui:
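The compose file reads CONTAINER_NAME and the optional OPENAI_API_KEY from .env. A minimal example, using placeholder values, might look like:
CONTAINER_NAME=deepseek
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx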
Once Open WebUI is up and running, run the following command to pull the DeepSeek model.
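The exact command isn't reproduced here; a typical form, assuming the container is named deepseek and the deepseek-r1 model family, would be something like:
docker exec -it deepseek ollama pull deepseek-r1:1.5b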
Tip
The above command pulls the 1.5b tag. Other tags may be used.
The model may also be installed via the web UI, but that is a bit more involved and beyond the scope of this page.
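To confirm the model is available, you can list the models Ollama has pulled (again assuming the container is named deepseek):
docker exec -it deepseek ollama list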
Traefik¶
homelab/pve/traefik/conf.d/deepseek.yaml
---
http:
  #region routers
  routers:
    deepseek:
      entryPoints:
        - "websecure"
      rule: "Host(`deepseek.l.nicholaswilde.io`)"
      middlewares:
        - default-headers@file
        - https-redirectscheme@file
      tls: {}
      service: deepseek
  #endregion
  #region services
  services:
    deepseek:
      loadBalancer:
        servers:
          - url: "http://192.168.2.94:8080"
        passHostHeader: true
  #endregion
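Once Traefik picks up the file, the UI should be reachable through the router's hostname (assuming local DNS resolves it to the Traefik host):
curl -I https://deepseek.l.nicholaswilde.io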
Task List¶
task: Available tasks for this project:
* decrypt: Decrypt .env using SOPS
* deepseek: Pull deepseek
* encrypt: Encrypt .env using SOPS
* export: Export the task list
* init: Init .env
* prune: Prune Docker items
* pull: Pull docker images
* restart: Restart Docker containers
* status: Status
* stop: Stop registry container
* up: Run Docker compose in the foreground.
* up-d: Run Docker compose in the background.
* update: Update running containers
* upgrade: upgrade
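A typical sequence, assuming the tasks behave as their descriptions suggest, is to initialize the .env file, start the stack in the background, and then pull the model:
task init
task up-d
task deepseek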