Compare commits

...

5 Commits

Author SHA1 Message Date
8d7a1be7f5 fix(docker): strip hardcoded model/provider assumptions from fleet doc
All checks were successful
ci/woodpecker/push/infra Pipeline was successful
Model choices and provider prereqs belong in onboarding/settings,
not static documentation.
2026-03-01 08:06:15 -06:00
89767e26ef fix(docker): generic naming (mosaic-*), env-var-only config, no hardcoded values
- Renamed all jarvis-* to mosaic-* (generic for any deployment)
- Config files are .json.template with ${VAR} placeholders
- entrypoint.sh renders templates via envsubst at startup
- Ollama is optional: set OLLAMA_BASE_URL to auto-inject provider
- Model is configurable via OPENCLAW_MODEL env var
- No hardcoded IPs, keys, model names, or user preferences
- Updated README with full env var reference
2026-03-01 08:02:31 -06:00
50f0dc6018 Revert "fix(docker): use envsubst template pattern — no hardcoded URLs or keys (MS22-P1a)"
This reverts commit 11136e2f23.
2026-03-01 07:55:32 -06:00
11136e2f23 fix(docker): use envsubst template pattern — no hardcoded URLs or keys (MS22-P1a)
2026-03-01 07:54:28 -06:00
256171cc62 feat(docker): OpenClaw agent fleet compose + real configs (MS22-P1a)
2026-03-01 07:54:03 -06:00
12 changed files with 481 additions and 0 deletions

docker/OPENCLAW-FLEET.md Normal file

@@ -0,0 +1,49 @@
# Mosaic Agent Fleet
Multi-agent deployment for Mosaic Stack using OpenClaw containers on Docker Swarm.
## Architecture
Each agent runs as an isolated OpenClaw Gateway instance with its own:
- **Workspace** — persistent volume for agent files and memory
- **State** — persistent volume for auth tokens and sessions
- **Config** — template rendered at startup from environment variables
Agents communicate with the Mosaic API via the OpenAI-compatible
`/v1/chat/completions` endpoint. The Mosaic WebUI routes chat requests
to agents through the `OpenClawGatewayModule`.
## Default Agent Roles
| Agent | Role | Description |
| ----------------- | ------------ | ------------------------------------------- |
| mosaic-main | Orchestrator | User-facing gateway, routes to other agents |
| mosaic-projects | Developer | Implementation, coding, PRs |
| mosaic-research | Research | Web search, analysis, discovery |
| mosaic-operations | Operations | Monitoring, health checks, alerts |
> **Models and providers are configured per-deployment** via environment
> variables and the Mosaic Settings UI — not hardcoded in these files.
> See the [Setup Guide](openclaw-instances/README.md) for env var reference.
## Prerequisites
- Docker Swarm initialized on target host
- Mosaic Stack running (`mosaic-stack_internal` network available)
- At least one LLM provider API key (Z.ai, OpenAI, Anthropic, etc.)
## Quick Start
1. **Configure** — Fill in `docker/openclaw-instances/*.env` files
2. **Deploy** — `docker stack deploy -c docker/openclaw-compose.yml mosaic-agents`
3. **Auth** — If needed, run `openclaw auth` inside a container (or via Mosaic terminal)
4. **Verify** — `docker stack services mosaic-agents`
See [openclaw-instances/README.md](openclaw-instances/README.md) for detailed setup.
## Future: Onboarding Wizard
Model assignments, provider configuration, and agent customization will be
managed through the Mosaic WebUI onboarding wizard and Settings pages (MS22-P4).
Until then, use environment variables per the README.

docker/openclaw-compose.yml Normal file

@@ -0,0 +1,150 @@
# Mosaic Agent Fleet — OpenClaw Docker Swarm Stack
# Deploy: docker stack deploy -c docker/openclaw-compose.yml mosaic-agents
# All config via env vars — see openclaw-instances/*.env
services:
mosaic-main:
image: alpine/openclaw:latest
command: ["/config/entrypoint.sh"]
env_file:
- ./openclaw-instances/mosaic-main.env
environment:
OPENCLAW_CONFIG_PATH: /tmp/openclaw.json
volumes:
- mosaic-main-config:/config:ro
- mosaic-main-state:/home/node/.openclaw
networks:
- mosaic-stack_internal
healthcheck:
test: ["CMD", "openclaw", "gateway", "health"]
interval: 30s
timeout: 10s
retries: 3
start_period: 20s
deploy:
replicas: 1
restart_policy:
condition: on-failure
delay: 5s
max_attempts: 3
resources:
limits:
memory: 2G
reservations:
memory: 512M
labels:
- com.mosaic.agent=mosaic-main
- com.mosaic.role=orchestrator
mosaic-projects:
image: alpine/openclaw:latest
command: ["/config/entrypoint.sh"]
env_file:
- ./openclaw-instances/mosaic-projects.env
environment:
OPENCLAW_CONFIG_PATH: /tmp/openclaw.json
volumes:
- mosaic-projects-config:/config:ro
- mosaic-projects-state:/home/node/.openclaw
networks:
- mosaic-stack_internal
healthcheck:
test: ["CMD", "openclaw", "gateway", "health"]
interval: 30s
timeout: 10s
retries: 3
start_period: 20s
deploy:
replicas: 1
restart_policy:
condition: on-failure
delay: 10s
max_attempts: 3
resources:
limits:
memory: 4G
reservations:
memory: 1G
labels:
- com.mosaic.agent=mosaic-projects
- com.mosaic.role=developer
mosaic-research:
image: alpine/openclaw:latest
command: ["/config/entrypoint.sh"]
env_file:
- ./openclaw-instances/mosaic-research.env
environment:
OPENCLAW_CONFIG_PATH: /tmp/openclaw.json
volumes:
- mosaic-research-config:/config:ro
- mosaic-research-state:/home/node/.openclaw
networks:
- mosaic-stack_internal
healthcheck:
test: ["CMD", "openclaw", "gateway", "health"]
interval: 30s
timeout: 10s
retries: 3
start_period: 20s
deploy:
replicas: 1
restart_policy:
condition: on-failure
delay: 10s
max_attempts: 3
resources:
limits:
memory: 1G
reservations:
memory: 256M
labels:
- com.mosaic.agent=mosaic-research
- com.mosaic.role=research
mosaic-operations:
image: alpine/openclaw:latest
command: ["/config/entrypoint.sh"]
env_file:
- ./openclaw-instances/mosaic-operations.env
environment:
OPENCLAW_CONFIG_PATH: /tmp/openclaw.json
volumes:
- mosaic-operations-config:/config:ro
- mosaic-operations-state:/home/node/.openclaw
networks:
- mosaic-stack_internal
healthcheck:
test: ["CMD", "openclaw", "gateway", "health"]
interval: 30s
timeout: 10s
retries: 3
start_period: 20s
deploy:
replicas: 1
restart_policy:
condition: on-failure
delay: 10s
max_attempts: 3
resources:
limits:
memory: 1G
reservations:
memory: 256M
labels:
- com.mosaic.agent=mosaic-operations
- com.mosaic.role=operations
networks:
mosaic-stack_internal:
external: true
volumes:
mosaic-main-config:
mosaic-main-state:
mosaic-projects-config:
mosaic-projects-state:
mosaic-research-config:
mosaic-research-state:
mosaic-operations-config:
mosaic-operations-state:


@@ -0,0 +1,97 @@
# Mosaic Agent Fleet — Setup Guide
## Prerequisites
- Docker Swarm initialized on target host
- Mosaic Stack running (Postgres, Valkey on `mosaic-stack_internal` network)
## 1. Configure Environment Variables
Copy and fill in each agent's `.env` file:
```bash
cd docker/openclaw-instances
# Required for each agent:
# ZAI_API_KEY — Your Z.ai API key (or other LLM provider key)
# OPENCLAW_GATEWAY_TOKEN — Unique bearer token per agent
# Generate unique tokens:
for agent in main projects research operations; do
echo "${agent}: OPENCLAW_GATEWAY_TOKEN=$(openssl rand -hex 32)"
done
```
### Optional: Local Ollama
If you have an Ollama instance, add to any agent's `.env`:
```bash
OLLAMA_BASE_URL=http://your-ollama-host:11434
OLLAMA_MODEL=cogito # or any model you have pulled
```
The entrypoint script will automatically inject the Ollama provider at startup.
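For reference, the provider entry the entrypoint injects has this shape (fields taken from the entrypoint script in this changeset; the host URL and model name are placeholders for whatever you set in `OLLAMA_BASE_URL` / `OLLAMA_MODEL`):

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "ollama": {
        "baseUrl": "http://your-ollama-host:11434/v1",
        "api": "openai-completions",
        "models": [
          {
            "id": "cogito",
            "name": "cogito (Local)",
            "reasoning": false,
            "input": ["text"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 128000,
            "maxTokens": 8192
          }
        ]
      }
    }
  }
}
```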
### Optional: Override Default Model
```bash
OPENCLAW_MODEL=anthropic/claude-sonnet-4-6
```
## 2. Populate Config Volumes
Each agent needs its `.json.template` file in its config volume:
```bash
# Create config directories and copy templates
for agent in main projects research operations; do
mkdir -p /var/lib/docker/volumes/mosaic-agents_mosaic-${agent}-config/_data/
cp openclaw-instances/mosaic-${agent}.json.template \
/var/lib/docker/volumes/mosaic-agents_mosaic-${agent}-config/_data/openclaw.json.template
cp openclaw-instances/entrypoint.sh \
/var/lib/docker/volumes/mosaic-agents_mosaic-${agent}-config/_data/entrypoint.sh
done
```
## 3. Deploy
```bash
docker stack deploy -c docker/openclaw-compose.yml mosaic-agents
docker stack services mosaic-agents
```
## 4. First-Time Auth (if needed)
For providers requiring OAuth (e.g., Anthropic):
```bash
docker exec -it $(docker ps -q -f name=mosaic-main) openclaw auth
```
Follow the device-code flow in your browser. Tokens persist in the state volume.
You can also use the Mosaic WebUI terminal (xterm.js) for this.
## 5. Verify
```bash
# Check health — the stack publishes no host ports, so run these from a
# container attached to the mosaic-stack_internal network (service name as host):
curl http://mosaic-main:18789/health
# Test chat completions endpoint
curl http://mosaic-main:18789/v1/chat/completions \
  -H "Authorization: Bearer YOUR_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model":"openclaw:main","messages":[{"role":"user","content":"hello"}]}'
```
## Environment Variable Reference
| Variable | Required | Description |
| ------------------------ | -------- | ------------------------------------------------- |
| `ZAI_API_KEY` | Yes\* | Z.ai API key (\*or other provider key) |
| `OPENCLAW_GATEWAY_TOKEN` | Yes | Bearer token for this agent (unique per instance) |
| `OPENCLAW_MODEL` | No | Override default model (default: `zai/glm-5`) |
| `OLLAMA_BASE_URL` | No | Ollama endpoint (e.g., `http://10.1.1.42:11434`) |
| `OLLAMA_MODEL` | No | Ollama model name (default: `cogito`) |


@@ -0,0 +1,53 @@
#!/bin/sh
# Mosaic Agent Fleet — OpenClaw container entrypoint
# Renders config template from env vars, optionally adds Ollama provider, starts gateway
set -e
TEMPLATE="/config/openclaw.json.template"
CONFIG="/tmp/openclaw.json"
if [ ! -f "$TEMPLATE" ]; then
echo "ERROR: Config template not found at $TEMPLATE"
echo "Mount your config volume at /config with a .json.template file"
exit 1
fi
# Validate required env vars
: "${OPENCLAW_GATEWAY_TOKEN:?OPENCLAW_GATEWAY_TOKEN is required (generate: openssl rand -hex 32)}"
# Resolve defaults before rendering: envsubst only expands $VAR / ${VAR},
# not the ${VAR:-default} form, so defaults must be set here first.
export OPENCLAW_MODEL="${OPENCLAW_MODEL:-zai/glm-5}"
# Render template with env var substitution
envsubst < "$TEMPLATE" > "$CONFIG"
# If OLLAMA_BASE_URL is set, inject Ollama provider into config
if [ -n "$OLLAMA_BASE_URL" ]; then
# Use python3 if available, fall back to node
if command -v python3 >/dev/null 2>&1; then
python3 -c "
import json, sys
with open('$CONFIG') as f: cfg = json.load(f)
cfg.setdefault('models', {})['mode'] = 'merge'
cfg['models'].setdefault('providers', {})['ollama'] = {
'baseUrl': '$OLLAMA_BASE_URL/v1',
'api': 'openai-completions',
'models': [{'id': '${OLLAMA_MODEL:-cogito}', 'name': '${OLLAMA_MODEL:-cogito} (Local)', 'reasoning': False, 'input': ['text'], 'cost': {'input':0,'output':0,'cacheRead':0,'cacheWrite':0}, 'contextWindow': 128000, 'maxTokens': 8192}]
}
with open('$CONFIG','w') as f: json.dump(cfg, f, indent=2)
"
echo "Ollama provider added: $OLLAMA_BASE_URL (model: ${OLLAMA_MODEL:-cogito})"
elif command -v node >/dev/null 2>&1; then
node -e "
const fs = require('fs');
const cfg = JSON.parse(fs.readFileSync('$CONFIG','utf8'));
cfg.models = cfg.models || {}; cfg.models.mode = 'merge';
cfg.models.providers = cfg.models.providers || {};
cfg.models.providers.ollama = {baseUrl:'$OLLAMA_BASE_URL/v1',api:'openai-completions',models:[{id:'${OLLAMA_MODEL:-cogito}',name:'${OLLAMA_MODEL:-cogito} (Local)',reasoning:false,input:['text'],cost:{input:0,output:0,cacheRead:0,cacheWrite:0},contextWindow:128000,maxTokens:8192}]};
fs.writeFileSync('$CONFIG', JSON.stringify(cfg, null, 2));
"
echo "Ollama provider added: $OLLAMA_BASE_URL (model: ${OLLAMA_MODEL:-cogito})"
else
echo "WARNING: OLLAMA_BASE_URL set but no python3/node available to inject provider"
fi
fi
export OPENCLAW_CONFIG_PATH="$CONFIG"
exec openclaw gateway run --bind lan --auth token "$@"


@@ -0,0 +1,14 @@
# Mosaic Agent: main
# Fill in all values before deploying.
# Required: LLM provider API key (Z.ai, OpenAI, etc.)
ZAI_API_KEY=
# Required: unique bearer token for this agent instance (generate: openssl rand -hex 32)
OPENCLAW_GATEWAY_TOKEN=
# Optional: override default model (default: zai/glm-5)
# OPENCLAW_MODEL=zai/glm-5
# Optional: Ollama endpoint for local inference (uncomment to enable)
# OLLAMA_BASE_URL=


@@ -0,0 +1,19 @@
{
"gateway": {
"mode": "local",
"port": 18789,
"bind": "lan",
"auth": { "mode": "token" },
"http": {
"endpoints": {
"chatCompletions": { "enabled": true }
}
}
},
"agents": {
"defaults": {
"workspace": "/home/node/workspace",
"model": { "primary": "${OPENCLAW_MODEL}" }
}
}
}


@@ -0,0 +1,14 @@
# Mosaic Agent: operations
# Fill in all values before deploying.
# Required: LLM provider API key (Z.ai, OpenAI, etc.)
ZAI_API_KEY=
# Required: unique bearer token for this agent instance (generate: openssl rand -hex 32)
OPENCLAW_GATEWAY_TOKEN=
# Optional: override default model (default: zai/glm-5)
# OPENCLAW_MODEL=zai/glm-5
# Optional: Ollama endpoint for local inference (uncomment to enable)
# OLLAMA_BASE_URL=


@@ -0,0 +1,19 @@
{
"gateway": {
"mode": "local",
"port": 18789,
"bind": "lan",
"auth": { "mode": "token" },
"http": {
"endpoints": {
"chatCompletions": { "enabled": true }
}
}
},
"agents": {
"defaults": {
"workspace": "/home/node/workspace",
"model": { "primary": "${OPENCLAW_MODEL}" }
}
}
}


@@ -0,0 +1,14 @@
# Mosaic Agent: projects
# Fill in all values before deploying.
# Required: LLM provider API key (Z.ai, OpenAI, etc.)
ZAI_API_KEY=
# Required: unique bearer token for this agent instance (generate: openssl rand -hex 32)
OPENCLAW_GATEWAY_TOKEN=
# Optional: override default model (default: zai/glm-5)
# OPENCLAW_MODEL=zai/glm-5
# Optional: Ollama endpoint for local inference (uncomment to enable)
# OLLAMA_BASE_URL=


@@ -0,0 +1,19 @@
{
"gateway": {
"mode": "local",
"port": 18789,
"bind": "lan",
"auth": { "mode": "token" },
"http": {
"endpoints": {
"chatCompletions": { "enabled": true }
}
}
},
"agents": {
"defaults": {
"workspace": "/home/node/workspace",
"model": { "primary": "${OPENCLAW_MODEL}" }
}
}
}


@@ -0,0 +1,14 @@
# Mosaic Agent: research
# Fill in all values before deploying.
# Required: LLM provider API key (Z.ai, OpenAI, etc.)
ZAI_API_KEY=
# Required: unique bearer token for this agent instance (generate: openssl rand -hex 32)
OPENCLAW_GATEWAY_TOKEN=
# Optional: override default model (default: zai/glm-5)
# OPENCLAW_MODEL=zai/glm-5
# Optional: Ollama endpoint for local inference (uncomment to enable)
# OLLAMA_BASE_URL=


@@ -0,0 +1,19 @@
{
"gateway": {
"mode": "local",
"port": 18789,
"bind": "lan",
"auth": { "mode": "token" },
"http": {
"endpoints": {
"chatCompletions": { "enabled": true }
}
}
},
"agents": {
"defaults": {
"workspace": "/home/node/workspace",
"model": { "primary": "${OPENCLAW_MODEL}" }
}
}
}