A categorized index of all of Xinchen's original articles (with companion source code): https://github.com/zq2599/blog_demos
Available Commands:
serve Start ollama
create Create a model from a Modelfile
show Show information for a model
run Run a model
pull Pull a model from a registry
push Push a model to a registry
list List models
cp Copy a model
rm Remove a model
help Help about any command
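Putting the commands above together, a typical local workflow looks like the following sketch (the model name `qwen:1.8b` is the one used later in this article; any tag from the model library works the same way):

```shell
# Start the Ollama server in the background (it listens on port 11434 by default)
ollama serve &

# Pull a model from the registry, then run it interactively
ollama pull qwen:1.8b
ollama run qwen:1.8b

# List the models available locally, then remove one to free disk space
ollama list
ollama rm qwen:1.8b
```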
Model | Parameters | Size | Download command |
---|---|---|---|
Llama 3 | 8B | 4.7GB | ollama run llama3 |
Llama 3 | 70B | 40GB | ollama run llama3:70b |
Phi-3 | 3.8B | 2.3GB | ollama run phi3 |
Mistral | 7B | 4.1GB | ollama run mistral |
Neural Chat | 7B | 4.1GB | ollama run neural-chat |
Starling | 7B | 4.1GB | ollama run starling-lm |
Code Llama | 7B | 3.8GB | ollama run codellama |
Llama 2 Uncensored | 7B | 3.8GB | ollama run llama2-uncensored |
LLaVA | 7B | 4.5GB | ollama run llava |
Gemma | 2B | 1.4GB | ollama run gemma:2b |
Gemma | 7B | 4.8GB | ollama run gemma:7b |
Solar | 10.7B | 6.1GB | ollama run solar |
ollama
✗ free -g
total used free shared buff/cache available
Mem: 31 3 21 0 6 27
Swap: 7 0 7
spring.ai.ollama.base-url=http://ollama:11434
spring.ai.ollama.chat.options.model=qwen:1.8b
spring.ai.ollama.chat.options.temperature=0.7
spring.main.web-application-type=reactive
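With these properties, the Spring AI starter talks to the Ollama service at `ollama:11434` and uses the `qwen:1.8b` model. Once the whole stack is up, the Java app can be exercised with a plain HTTP call; note that the `/ai/chat` path and `prompt` parameter below are hypothetical placeholders, so substitute whatever endpoint the application's controller actually exposes:

```shell
# The java-app container publishes its service on host port 18080;
# /ai/chat and the prompt parameter are assumptions, not the app's confirmed API
curl 'http://localhost:18080/ai/chat?prompt=hello'
```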
version: '3.8'
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434
    volumes:
      - /home/will/data/ollama:/root/.ollama
    container_name: ollama
    pull_policy: if_not_present
    tty: true
    restart: always
    networks:
      - ollama-docker
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    pull_policy: if_not_present
    volumes:
      - /home/will/data/webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 13000:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      - 'WEBUI_SECRET_KEY=123456'
      - 'HF_ENDPOINT=https://hf-mirror.com'
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
    networks:
      - ollama-docker
  java-app:
    image: bolingcavalry/ollam-tutorial:0.0.1-SNAPSHOT
    volumes:
      - /home/will/temp/202405/15/application.properties:/app/application.properties
    container_name: java-app
    pull_policy: if_not_present
    depends_on:
      - ollama
    ports:
      - 18080:8080
    restart: always
    networks:
      - ollama-docker
networks:
  ollama-docker:
    external: false
docker-compose up -d
docker-compose down
docker-compose up -d
[+] Building 0.0s (0/0)
[+] Running 4/4
✔ Network files_ollama-docker Created 0.1s
✔ Container ollama Started 0.2s
✔ Container java-app Started 0.4s
✔ Container open-webui Started
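After the containers start, it is worth verifying that Ollama is reachable and that the model referenced in application.properties is actually present. Ollama's root endpoint answers with a plain-text health message, and the `ollama` CLI can be invoked inside the running container:

```shell
# The root endpoint returns "Ollama is running" when the server is healthy
curl http://localhost:11434

# Pull the model the Java app expects, using the CLI inside the ollama container
docker exec ollama ollama pull qwen:1.8b

# Confirm the model is now listed locally
docker exec ollama ollama list
```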