Compare commits: a9623e9219...main (26 commits)

Commits in range: 24496cea01, 8c960eee9d, f380d232e6, 8b441c17b7, 2a91f6c202, 97ee66770a, 30ce4cecc7, 9fbfdcee6d, 21afb58b33, 09786ee6e0, 1fd67b9ec0, 38223c8ec2, 8de2f7439a, 98b9bc3c93, b1403703b1, abead17e0e, fbf74c2736, 364d6c2278, 93efbcdafe, def9c2fd7a, 87501ea952, 7a5f28c8b5, 405bc4c797, c9bf578396, c1f4830bf5, e5c4bf25b3
AGENTS.md (20 changed lines)
````diff
@@ -130,6 +130,26 @@ Load additional guides when the task requires them.
 - Installation and configuration are managed by Mosaic bootstrap and runtime linking.
 - If sequential-thinking is unavailable, you MUST report the failure and stop planning-intensive execution.
+
+## Subagent Model Selection (Cost Optimization — Hard Rule)
+
+When delegating work to subagents, you MUST select the cheapest model capable of completing the task. Do NOT default to the most expensive model for every delegation.
+
+| Task Type | Model Tier | Rationale |
+|-----------|-----------|-----------|
+| File search, grep, glob, codebase exploration | **haiku** | Read-only, pattern matching, no reasoning depth needed |
+| Status checks, health monitoring, heartbeat | **haiku** | Structured API calls, pass/fail output |
+| Simple code fixes (typos, rename, one-liner) | **haiku** | Minimal reasoning, mechanical changes |
+| Code review, lint, style checks | **sonnet** | Needs judgment but not deep architectural reasoning |
+| Test writing, test fixes | **sonnet** | Pattern-based, moderate complexity |
+| Standard feature implementation | **sonnet** | Good balance of capability and cost for most coding |
+| Complex architecture, multi-file refactors | **opus** | Requires deep reasoning, large context, design judgment |
+| Security review, auth logic | **opus** | High-stakes reasoning where mistakes are costly |
+| Ambiguous requirements, design decisions | **opus** | Needs nuanced judgment and tradeoff analysis |
+
+**Decision rule**: Start with the cheapest viable tier. Only escalate if the task genuinely requires deeper reasoning — not as a safety default. Most coding tasks are sonnet-tier. Reserve opus for work where wrong answers are expensive.
+
+**Runtime-specific syntax**: See the runtime reference for how to specify model tier when spawning subagents (e.g., Claude Code Task tool `model` parameter).
 
 ## Skills Policy
 
 - Use only the minimum required skills for the active task.
````
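The decision rule above is effectively a lookup from task category to the cheapest viable tier. A minimal sketch, assuming hypothetical category labels (the helper and its labels are illustrative, not part of Mosaic):

```shell
#!/usr/bin/env bash
# Illustrative only: map a coarse task category to a model tier, per the table above.
select_model_tier() {
  case "$1" in
    search|status-check|typo-fix)  echo "haiku"  ;;  # mechanical, read-only work
    review|tests|feature)          echo "sonnet" ;;  # moderate judgment
    architecture|security|design)  echo "opus"   ;;  # high-stakes reasoning
    *)                             echo "sonnet" ;;  # most coding tasks are sonnet-tier
  esac
}

select_model_tier search    # prints: haiku
select_model_tier feature   # prints: sonnet
select_model_tier security  # prints: opus
```

The default branch deliberately falls back to sonnet rather than opus, matching the rule's "not as a safety default" escalation guidance.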
README.md (46 changed lines)
````diff
@@ -228,17 +228,57 @@ Re-sync manually:
 ~/.config/mosaic/bin/mosaic-link-runtime-assets
 ```
 
-## sequential-thinking MCP Requirement
+## MCP Registration
 
-sequential-thinking MCP is a hard requirement for Mosaic Stack.
+### How MCPs Are Configured in Claude Code
 
-Use:
+**MCPs must be registered via `claude mcp add` — not by hand-editing `~/.claude/settings.json`.**
+
+`settings.json` controls hooks, model, plugins, and allowed commands. The `mcpServers` key in
+`settings.json` is silently ignored by Claude Code's MCP loader. The correct file is `~/.claude.json`,
+which is managed by the `claude mcp` CLI.
+
+```bash
+# Register a stdio MCP (user scope = all projects, persists across sessions)
+claude mcp add --scope user <name> -- npx -y <package>
+
+# Register an HTTP MCP (e.g. OpenBrain)
+claude mcp add --scope user --transport http <name> <url> \
+  --header "Authorization: Bearer <token>"
+
+# List registered MCPs
+claude mcp list
+```
+
+**Scope options:**
+- `--scope user` — writes to `~/.claude.json`, available in all projects (recommended for shared tools)
+- `--scope project` — writes to `.claude/settings.json` in the project root, committed to the repo
+- `--scope local` — default, machine-local only, not committed
+
+**Transport for HTTP MCPs must be `http`** — not `sse`. `type: "sse"` is a deprecated protocol
+that silently fails to connect against FastMCP streamable HTTP servers.
+
+### sequential-thinking MCP (Hard Requirement)
+
+sequential-thinking MCP is required for Mosaic Stack. The installer registers it automatically.
+To verify or re-register manually:
 
 ```bash
 ~/.config/mosaic/bin/mosaic-ensure-sequential-thinking
 ~/.config/mosaic/bin/mosaic-ensure-sequential-thinking --check
 ```
 
+### OpenBrain Semantic Memory (Recommended)
+
+OpenBrain is the shared cross-agent memory layer. Register once per machine:
+
+```bash
+claude mcp add --scope user --transport http openbrain https://your-openbrain-host/mcp \
+  --header "Authorization: Bearer YOUR_TOKEN"
+```
+
+See [mosaic/openbrain](https://git.mosaicstack.dev/mosaic/openbrain) for setup and API docs.
+
 ## Bootstrap Any Repo
 
 Attach any repository to the Mosaic standards layer:
````
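The scope options described in the README hunk above map each scope to a distinct config location. A minimal sketch of that mapping as a helper (hypothetical, for illustration only; the file paths are the ones the README lists):

```shell
#!/usr/bin/env bash
# Illustrative only: where does each `claude mcp add --scope <s>` registration land,
# according to the scope list above? Not a real Mosaic or Claude tool.
mcp_scope_file() {
  case "$1" in
    user)    echo "$HOME/.claude.json" ;;              # all projects, persists
    project) echo ".claude/settings.json" ;;           # committed to the repo
    local)   echo "machine-local (not committed)" ;;   # the default scope
    *)       echo "unknown scope: $1" >&2; return 2 ;;
  esac
}

mcp_scope_file user     # prints: /home/<you>/.claude.json
mcp_scope_file project  # prints: .claude/settings.json
```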
TOOLS.md (117 changed lines)
````diff
@@ -43,9 +43,13 @@ Mosaic wrappers at `~/.config/mosaic/tools/git/*.sh` handle platform detection a
 ~/.config/mosaic/tools/portainer/endpoint-list.sh
 ```
 
-### Infrastructure — Coolify
+### Infrastructure — Coolify (DEPRECATED)
 
+> Coolify has been superseded by Portainer Docker Swarm in this stack.
+> Tools remain for reference but should not be used for new deployments.
+
 ```bash
+# DEPRECATED — do not use for new deployments
 ~/.config/mosaic/tools/coolify/project-list.sh
 ~/.config/mosaic/tools/coolify/service-list.sh
 ~/.config/mosaic/tools/coolify/service-status.sh -u <uuid>
````
````diff
@@ -66,10 +70,45 @@ Mosaic wrappers at `~/.config/mosaic/tools/git/*.sh` handle platform detection a
 
 ### CI/CD — Woodpecker
 
+Multi-instance support: `-a <instance>` selects a named instance. Omit `-a` to use the default from `woodpecker.default` in credentials.json.
+
+| Instance | URL | Serves |
+|----------|-----|--------|
+| `mosaic` (default) | ci.mosaicstack.dev | Mosaic repos (git.mosaicstack.dev) |
+| `usc` | ci.uscllc.com | USC repos (git.uscllc.com) |
+
 ```bash
-~/.config/mosaic/tools/woodpecker/pipeline-list.sh
-~/.config/mosaic/tools/woodpecker/pipeline-status.sh
-~/.config/mosaic/tools/woodpecker/pipeline-trigger.sh -b <branch>
+# List recent pipelines
+~/.config/mosaic/tools/woodpecker/pipeline-list.sh [-r owner/repo] [-a instance]
+
+# Check latest or specific pipeline status
+~/.config/mosaic/tools/woodpecker/pipeline-status.sh [-r owner/repo] [-n number] [-a instance]
+
+# Trigger a build
+~/.config/mosaic/tools/woodpecker/pipeline-trigger.sh [-r owner/repo] [-b branch] [-a instance]
+```
+
+Instance selection rule: match `-a` to the git remote host of the target repo. If the repo is on `git.uscllc.com`, use `-a usc`. If on `git.mosaicstack.dev`, use `-a mosaic` (or omit, since it's the default).
+
+### DNS — Cloudflare
+
+Multi-instance support: `-a <instance>` selects a named instance (e.g. `personal`, `work`). Omit `-a` to use the default from `cloudflare.default` in credentials.json.
+
+```bash
+# List zones (domains)
+~/.config/mosaic/tools/cloudflare/zone-list.sh [-a instance]
+
+# List DNS records (zone by name or ID)
+~/.config/mosaic/tools/cloudflare/record-list.sh -z <zone> [-a instance] [-t type] [-n name]
+
+# Create DNS record
+~/.config/mosaic/tools/cloudflare/record-create.sh -z <zone> -t <type> -n <name> -c <content> [-a instance] [-p] [-l ttl] [-P priority]
+
+# Update DNS record
+~/.config/mosaic/tools/cloudflare/record-update.sh -z <zone> -r <record-id> -t <type> -n <name> -c <content> [-a instance] [-p] [-l ttl]
+
+# Delete DNS record
+~/.config/mosaic/tools/cloudflare/record-delete.sh -z <zone> -r <record-id> [-a instance]
 ```
 
 ### IT Service — GLPI
````
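The instance selection rule added in this hunk can be sketched as a small helper, assuming hypothetical remote URLs (illustrative, not a real Mosaic tool):

```shell
#!/usr/bin/env bash
# Illustrative only: pick the Woodpecker instance from a git remote URL,
# per the instance-selection rule above.
woodpecker_instance_for_remote() {
  case "$1" in
    *git.uscllc.com*)      echo "usc" ;;
    *git.mosaicstack.dev*) echo "mosaic" ;;
    *)                     echo "mosaic" ;;  # falls back to woodpecker.default
  esac
}

woodpecker_instance_for_remote "ssh://git@git.uscllc.com/acme/app.git"        # prints: usc
woodpecker_instance_for_remote "https://git.mosaicstack.dev/mosaic/core.git"  # prints: mosaic
```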
````diff
@@ -100,9 +139,77 @@ Mosaic wrappers at `~/.config/mosaic/tools/git/*.sh` handle platform detection a
 # Source in any script to load service credentials
 source ~/.config/mosaic/tools/_lib/credentials.sh
 load_credentials <service-name>
-# Supported: portainer, coolify, authentik, glpi, github, gitea-mosaicstack, gitea-usc, woodpecker
+# Supported: portainer, coolify, authentik, glpi, github, gitea-mosaicstack, gitea-usc, woodpecker, cloudflare, turbo-cache, openbrain
 ```
 
+### OpenBrain — Semantic Memory (PRIMARY)
+
+Self-hosted semantic brain backed by pgvector. Primary shared memory layer for all agents across all sessions and harnesses. Stores and retrieves decisions, context, project state, and observations via semantic search.
+
+**Credentials:** `load_credentials openbrain` → exports `OPENBRAIN_URL`, `OPENBRAIN_TOKEN`
+
+Configure in your credentials.json:
+```json
+"openbrain": {
+  "url": "https://<your-openbrain-host>",
+  "api_key": "<your-api-key>"
+}
+```
+
+**REST API** (any language, any harness):
+```bash
+source ~/.config/mosaic/tools/_lib/credentials.sh && load_credentials openbrain
+
+# --- Read ---
+curl -s -H "Authorization: Bearer $OPENBRAIN_TOKEN" "$OPENBRAIN_URL/v1/thoughts/recent?limit=5"
+curl -s -H "Authorization: Bearer $OPENBRAIN_TOKEN" "$OPENBRAIN_URL/v1/thoughts/{id}"
+curl -s -H "Authorization: Bearer $OPENBRAIN_TOKEN" \
+  "$OPENBRAIN_URL/v1/thoughts?source=agent-name&metadata_id=my-entity&limit=10"
+
+# --- Search ---
+curl -s -X POST -H "Authorization: Bearer $OPENBRAIN_TOKEN" -H "Content-Type: application/json" \
+  -d '{"query": "your search", "limit": 5}' "$OPENBRAIN_URL/v1/search"
+
+# --- Capture ---
+curl -s -X POST -H "Authorization: Bearer $OPENBRAIN_TOKEN" -H "Content-Type: application/json" \
+  -d '{"content": "...", "source": "agent-name", "metadata": {}}' "$OPENBRAIN_URL/v1/thoughts"
+
+# --- Update (re-embeds if content changes) ---
+curl -s -X PATCH -H "Authorization: Bearer $OPENBRAIN_TOKEN" -H "Content-Type: application/json" \
+  -d '{"content": "updated text", "metadata": {"key": "val"}}' "$OPENBRAIN_URL/v1/thoughts/{id}"
+
+# --- Delete single ---
+curl -s -X DELETE -H "Authorization: Bearer $OPENBRAIN_TOKEN" "$OPENBRAIN_URL/v1/thoughts/{id}"
+
+# --- Bulk delete by filter (source and/or metadata_id required) ---
+curl -s -X DELETE -H "Authorization: Bearer $OPENBRAIN_TOKEN" \
+  "$OPENBRAIN_URL/v1/thoughts?source=agent-name&metadata_id=my-entity"
+
+# --- Stats ---
+curl -s -H "Authorization: Bearer $OPENBRAIN_TOKEN" "$OPENBRAIN_URL/v1/stats"
+```
+
+**Python client** (if jarvis-brain is available on PYTHONPATH):
+```bash
+python tools/openbrain_client.py search "topic"
+python tools/openbrain_client.py capture "decision or observation" --source agent-name
+python tools/openbrain_client.py recent --limit 5
+python tools/openbrain_client.py stats
+```
+
+**MCP (Claude Code sessions):** When connected, all CRUD tools are available natively:
+`capture`, `search`, `recent`, `stats`, `get`, `update`, `delete`, `delete_where`, `list_thoughts`
+
+**When to use openbrain (required for all agents):**
+
+| Trigger | Action |
+|---------|--------|
+| Session start | Search/recent to load prior context |
+| Significant decision made | Capture with rationale |
+| Blocker or gotcha discovered | Capture immediately |
+| Task or milestone completed | Capture summary |
+| Cross-agent handoff | Capture current state |
+
 ## Git Providers
 
 | Instance | URL | CLI | Purpose |
````
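The OpenBrain capture call shown in the TOOLS.md hunk above posts a small JSON body. A dependency-free sketch of building that payload (hypothetical helper; real code should JSON-escape the values, e.g. with `jq --arg`):

```shell
#!/usr/bin/env bash
# Illustrative only: assemble the capture payload from the curl example above.
# Assumes $1/$2 contain no characters needing JSON escaping; use jq in real code.
openbrain_capture_payload() {
  printf '{"content":"%s","source":"%s","metadata":{}}' "$1" "$2"
}

openbrain_capture_payload "switched cache to turbo-cache" "agent-name"
# prints: {"content":"switched cache to turbo-cache","source":"agent-name","metadata":{}}
```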
bin/mosaic (253 changed lines)
````diff
@@ -51,12 +51,20 @@ Management:
   release-upgrade [...]       Upgrade installed Mosaic release
   project-upgrade [...]       Clean up stale SOUL.md/CLAUDE.md in a project
 
+PRD:
+  prdy <subcommand>           PRD creation and validation
+    init                      Create docs/PRD.md via guided runtime session
+    update                    Update existing PRD via guided runtime session
+    validate                  Check PRD completeness (bash-only)
+    status                    Quick PRD health check (one-liner)
+
 Coordinator (r0):
   coord <subcommand>          Manual coordinator tools
     init                      Initialize a new mission
     mission                   Show mission progress dashboard
     status                    Check agent session health
     continue                  Generate continuation prompt
+    run                       Generate context and launch selected runtime
     resume                    Crash recovery
 
 Options:
````
````diff
@@ -176,6 +184,40 @@ MISSION_EOF
     fi
   fi
 
+  # Inject PRD status so the agent knows requirements state
+  local prd_file="docs/PRD.md"
+  if [[ -f "$prd_file" ]]; then
+    local prd_sections=0
+    local prd_assumptions=0
+    for entry in "Problem Statement|^#{2,3} .*(problem statement|objective)" \
+                 "Scope / Non-Goals|^#{2,3} .*(scope|non.goal|out of scope|in.scope)" \
+                 "User Stories / Requirements|^#{2,3} .*(user stor|stakeholder|user.*requirement)" \
+                 "Functional Requirements|^#{2,3} .*functional requirement" \
+                 "Non-Functional Requirements|^#{2,3} .*non.functional" \
+                 "Acceptance Criteria|^#{2,3} .*acceptance criteria" \
+                 "Technical Considerations|^#{2,3} .*(technical consideration|constraint|dependenc)" \
+                 "Risks / Open Questions|^#{2,3} .*(risk|open question)" \
+                 "Success Metrics / Testing|^#{2,3} .*(success metric|test|verification)" \
+                 "Milestones / Delivery|^#{2,3} .*(milestone|delivery|scope version)"; do
+      local pattern="${entry#*|}"
+      grep -qiE "$pattern" "$prd_file" 2>/dev/null && prd_sections=$((prd_sections + 1))
+    done
+    prd_assumptions=$(grep -c 'ASSUMPTION:' "$prd_file" 2>/dev/null || echo 0)
+
+    local prd_status="ready"
+    (( prd_sections < 10 )) && prd_status="incomplete ($prd_sections/10 sections)"
+
+    cat <<PRD_EOF
+
+# PRD Status
+
+- **File:** docs/PRD.md
+- **Status:** $prd_status
+- **Assumptions:** $prd_assumptions
+
+PRD_EOF
+  fi
+
 cat <<'EOF'
 # Mosaic Launcher Runtime Contract (Hard Gate)
 
````
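The heading-detection loop in this hunk can be exercised standalone. A reduced sketch with two of the ten patterns (illustrative, not the full Mosaic check):

```shell
#!/usr/bin/env bash
# Illustrative only: count required PRD headings in a file, mirroring the grep loop above.
count_prd_sections() {
  local prd_file="$1" count=0 pattern
  for pattern in '^#{2,3} .*(problem statement|objective)' \
                 '^#{2,3} .*acceptance criteria'; do
    grep -qiE "$pattern" "$prd_file" 2>/dev/null && count=$((count + 1))
  done
  echo "$count"
}

tmp="$(mktemp)"
printf '## Problem Statement\n## Acceptance Criteria\n' > "$tmp"
count_prd_sections "$tmp"   # prints: 2
rm -f "$tmp"
```

The `-E` flag matters: the `#{2,3}` interval is POSIX ERE, so plain `grep` without `-E` would not match these patterns.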
````diff
@@ -241,6 +283,47 @@ _detect_mission_prompt() {
   fi
 }
 
+# Write a session lock if an active mission exists in the current directory.
+# Called before exec so $$ captures the PID that will become the agent process.
+_write_launcher_session_lock() {
+  local runtime="$1"
+  local mission_file=".mosaic/orchestrator/mission.json"
+  local lock_file=".mosaic/orchestrator/session.lock"
+
+  # Only write lock if mission exists and is active
+  [[ -f "$mission_file" ]] || return 0
+  command -v jq &>/dev/null || return 0
+
+  local m_status
+  m_status="$(jq -r '.status // "inactive"' "$mission_file" 2>/dev/null)"
+  [[ "$m_status" == "active" || "$m_status" == "paused" ]] || return 0
+
+  local session_id
+  session_id="${runtime}-$(date +%Y%m%d-%H%M%S)-$$"
+
+  jq -n \
+    --arg sid "$session_id" \
+    --arg rt "$runtime" \
+    --arg pid "$$" \
+    --arg ts "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
+    --arg pp "$(pwd)" \
+    --arg mid "" \
+    '{
+      session_id: $sid,
+      runtime: $rt,
+      pid: ($pid | tonumber),
+      started_at: $ts,
+      project_path: $pp,
+      milestone_id: $mid
+    }' > "$lock_file"
+}
+
+# Clean up session lock on exit (covers normal exit + signals).
+# Registered via trap after _write_launcher_session_lock succeeds.
+_cleanup_session_lock() {
+  rm -f ".mosaic/orchestrator/session.lock" 2>/dev/null
+}
+
 # Launcher functions
 launch_claude() {
   check_mosaic_home
````
````diff
@@ -257,6 +340,8 @@ launch_claude() {
 
   # If active mission exists and no user prompt was given, inject initial prompt
   _detect_mission_prompt
+  _write_launcher_session_lock "claude"
+  trap _cleanup_session_lock EXIT INT TERM
   if [[ -n "$MOSAIC_MISSION_PROMPT" && $# -eq 0 ]]; then
     echo "[mosaic] Launching Claude Code (active mission detected)..."
     exec claude --append-system-prompt "$runtime_prompt" "$MOSAIC_MISSION_PROMPT"
````
````diff
@@ -277,6 +362,8 @@ launch_opencode() {
 
   # OpenCode reads from ~/.config/opencode/AGENTS.md
   ensure_runtime_config "opencode" "$HOME/.config/opencode/AGENTS.md"
+  _write_launcher_session_lock "opencode"
+  trap _cleanup_session_lock EXIT INT TERM
   echo "[mosaic] Launching OpenCode..."
   exec opencode "$@"
 }
````
````diff
@@ -292,8 +379,16 @@ launch_codex() {
 
   # Codex reads from ~/.codex/instructions.md
   ensure_runtime_config "codex" "$HOME/.codex/instructions.md"
-  echo "[mosaic] Launching Codex..."
-  exec codex "$@"
+  _detect_mission_prompt
+  _write_launcher_session_lock "codex"
+  trap _cleanup_session_lock EXIT INT TERM
+  if [[ -n "$MOSAIC_MISSION_PROMPT" && $# -eq 0 ]]; then
+    echo "[mosaic] Launching Codex (active mission detected)..."
+    exec codex "$MOSAIC_MISSION_PROMPT"
+  else
+    echo "[mosaic] Launching Codex..."
+    exec codex "$@"
+  fi
 }
 
 launch_yolo() {
````
````diff
@@ -319,6 +414,8 @@ launch_yolo() {
   runtime_prompt="$(build_runtime_prompt "claude")"
 
   _detect_mission_prompt
+  _write_launcher_session_lock "claude"
+  trap _cleanup_session_lock EXIT INT TERM
   if [[ -n "$MOSAIC_MISSION_PROMPT" && $# -eq 0 ]]; then
     echo "[mosaic] Launching Claude Code in YOLO mode (active mission detected)..."
     exec claude --dangerously-skip-permissions --append-system-prompt "$runtime_prompt" "$MOSAIC_MISSION_PROMPT"
````
````diff
@@ -336,8 +433,16 @@ launch_yolo() {
 
   # Codex reads instructions.md from ~/.codex and supports a direct dangerous flag.
   ensure_runtime_config "codex" "$HOME/.codex/instructions.md"
-  echo "[mosaic] Launching Codex in YOLO mode (dangerous permissions enabled)..."
-  exec codex --dangerously-bypass-approvals-and-sandbox "$@"
+  _detect_mission_prompt
+  _write_launcher_session_lock "codex"
+  trap _cleanup_session_lock EXIT INT TERM
+  if [[ -n "$MOSAIC_MISSION_PROMPT" && $# -eq 0 ]]; then
+    echo "[mosaic] Launching Codex in YOLO mode (active mission detected)..."
+    exec codex --dangerously-bypass-approvals-and-sandbox "$MOSAIC_MISSION_PROMPT"
+  else
+    echo "[mosaic] Launching Codex in YOLO mode (dangerous permissions enabled)..."
+    exec codex --dangerously-bypass-approvals-and-sandbox "$@"
+  fi
   ;;
 opencode)
   check_mosaic_home
````
````diff
@@ -348,6 +453,8 @@ launch_yolo() {
 
   # OpenCode defaults to allow-all permissions unless user config restricts them.
   ensure_runtime_config "opencode" "$HOME/.config/opencode/AGENTS.md"
+  _write_launcher_session_lock "opencode"
+  trap _cleanup_session_lock EXIT INT TERM
   echo "[mosaic] Launching OpenCode in YOLO mode..."
   exec opencode "$@"
   ;;
````
````diff
@@ -410,26 +517,59 @@ run_seq() {
 
 run_coord() {
   check_mosaic_home
-  local subcmd="${1:-help}"
-  shift || true
+  local runtime="claude"
+  local runtime_flag=""
+  local -a coord_args=()
+
+  while [[ $# -gt 0 ]]; do
+    case "$1" in
+      --claude|--codex)
+        local selected_runtime="${1#--}"
+        if [[ -n "$runtime_flag" ]] && [[ "$runtime" != "$selected_runtime" ]]; then
+          echo "[mosaic] ERROR: --claude and --codex are mutually exclusive for 'mosaic coord'." >&2
+          exit 1
+        fi
+        runtime="$selected_runtime"
+        runtime_flag="$1"
+        shift
+        ;;
+      *)
+        coord_args+=("$1")
+        shift
+        ;;
+    esac
+  done
+
+  local subcmd="${coord_args[0]:-help}"
+  if (( ${#coord_args[@]} > 1 )); then
+    set -- "${coord_args[@]:1}"
+  else
+    set --
+  fi
+
   local tool_dir="$MOSAIC_HOME/tools/orchestrator"
 
   case "$subcmd" in
     status|session)
-      exec bash "$tool_dir/session-status.sh" "$@"
+      MOSAIC_COORD_RUNTIME="$runtime" exec bash "$tool_dir/session-status.sh" "$@"
       ;;
     init)
-      exec bash "$tool_dir/mission-init.sh" "$@"
+      MOSAIC_COORD_RUNTIME="$runtime" exec bash "$tool_dir/mission-init.sh" "$@"
       ;;
     mission|progress)
-      exec bash "$tool_dir/mission-status.sh" "$@"
+      MOSAIC_COORD_RUNTIME="$runtime" exec bash "$tool_dir/mission-status.sh" "$@"
       ;;
     continue|next)
-      exec bash "$tool_dir/continue-prompt.sh" "$@"
+      MOSAIC_COORD_RUNTIME="$runtime" exec bash "$tool_dir/continue-prompt.sh" "$@"
+      ;;
+    run|start)
+      MOSAIC_COORD_RUNTIME="$runtime" exec bash "$tool_dir/session-run.sh" "$@"
+      ;;
+    smoke|test)
+      MOSAIC_COORD_RUNTIME="$runtime" exec bash "$tool_dir/smoke-test.sh" "$@"
       ;;
     resume|recover)
-      exec bash "$tool_dir/session-resume.sh" "$@"
+      MOSAIC_COORD_RUNTIME="$runtime" exec bash "$tool_dir/session-resume.sh" "$@"
       ;;
     help|*)
       cat <<COORD_USAGE
````
````diff
@@ -440,12 +580,23 @@ Commands:
   mission [--project <path>]    Show mission progress dashboard
   status [--project <path>]     Check agent session health
   continue [--project <path>]   Generate continuation prompt for next session
+  run [--project <path>]        Generate context and launch selected runtime
+  smoke                         Run orchestration behavior smoke checks
   resume [--project <path>]     Crash recovery (detect dirty state, generate fix)
 
+Runtime:
+  --claude                      Use Claude runtime hints/prompts (default)
+  --codex                       Use Codex runtime hints/prompts
+
 Examples:
   mosaic coord init --name "Security Fix" --milestones "Critical,High,Medium"
   mosaic coord mission
+  mosaic coord --codex mission
   mosaic coord continue --copy
+  mosaic coord run
+  mosaic coord run --codex
+  mosaic coord smoke
+  mosaic coord continue --codex --copy
 
 COORD_USAGE
 ;;
````
````diff
@@ -463,8 +614,9 @@ _check_resumable_session() {
     local pid
     pid="$(jq -r '.pid // 0' "$lock_file" 2>/dev/null)"
     if [[ -n "$pid" ]] && [[ "$pid" != "0" ]] && ! kill -0 "$pid" 2>/dev/null; then
-      echo "[mosaic] Previous orchestration session detected (crashed)."
-      echo "[mosaic] Run: mosaic coord resume"
+      # Stale lock from a dead session — clean it up
+      rm -f "$lock_file"
+      echo "[mosaic] Cleaned up stale session lock (PID $pid no longer running)."
       echo ""
     fi
   elif [[ -f "$mission_file" ]]; then
````
````diff
@@ -478,6 +630,80 @@ _check_resumable_session() {
   fi
 }
 
+run_prdy() {
+  check_mosaic_home
+  local runtime="claude"
+  local runtime_flag=""
+  local -a prdy_args=()
+
+  while [[ $# -gt 0 ]]; do
+    case "$1" in
+      --claude|--codex)
+        local selected_runtime="${1#--}"
+        if [[ -n "$runtime_flag" ]] && [[ "$runtime" != "$selected_runtime" ]]; then
+          echo "[mosaic] ERROR: --claude and --codex are mutually exclusive for 'mosaic prdy'." >&2
+          exit 1
+        fi
+        runtime="$selected_runtime"
+        runtime_flag="$1"
+        shift
+        ;;
+      *)
+        prdy_args+=("$1")
+        shift
+        ;;
+    esac
+  done
+
+  local subcmd="${prdy_args[0]:-help}"
+  if (( ${#prdy_args[@]} > 1 )); then
+    set -- "${prdy_args[@]:1}"
+  else
+    set --
+  fi
+
+  local tool_dir="$MOSAIC_HOME/tools/prdy"
+
+  case "$subcmd" in
+    init)
+      MOSAIC_PRDY_RUNTIME="$runtime" exec bash "$tool_dir/prdy-init.sh" "$@"
+      ;;
+    update)
+      MOSAIC_PRDY_RUNTIME="$runtime" exec bash "$tool_dir/prdy-update.sh" "$@"
+      ;;
+    validate|check)
+      MOSAIC_PRDY_RUNTIME="$runtime" exec bash "$tool_dir/prdy-validate.sh" "$@"
+      ;;
+    status)
+      exec bash "$tool_dir/prdy-status.sh" "$@"
+      ;;
+    help|*)
+      cat <<PRDY_USAGE
+mosaic prdy — PRD creation and validation tools
+
+Commands:
+  init [--project <path>] [--name <feature>]       Create docs/PRD.md via guided runtime session
+  update [--project <path>]                        Update existing docs/PRD.md via guided runtime session
+  validate [--project <path>]                      Check PRD completeness against Mosaic guide (bash-only)
+  status [--project <path>] [--format short|json]  Quick PRD health check (one-liner)
+
+Runtime:
+  --claude    Use Claude runtime (default)
+  --codex     Use Codex runtime
+
+Examples:
+  mosaic prdy init --name "User Authentication"
+  mosaic prdy update
+  mosaic prdy --codex init --name "User Authentication"
+  mosaic prdy validate
+
+Output location: docs/PRD.md (per Mosaic PRD guide)
+
+PRDY_USAGE
+      ;;
+  esac
+}
+
 run_bootstrap() {
   check_mosaic_home
   exec "$MOSAIC_HOME/bin/mosaic-bootstrap-repo" "$@"
````
````diff
@@ -550,6 +776,7 @@ case "$command" in
   sync) run_sync "$@" ;;
   seq) run_seq "$@" ;;
   bootstrap) run_bootstrap "$@" ;;
+  prdy) run_prdy "$@" ;;
   coord) run_coord "$@" ;;
   upgrade) run_upgrade "$@" ;;
   release-upgrade) run_release_upgrade "$@" ;;
````
119 bin/mosaic-ensure-excalidraw Executable file
@@ -0,0 +1,119 @@
#!/usr/bin/env bash
set -euo pipefail

MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
TOOLS_DIR="$MOSAIC_HOME/tools/excalidraw"
MODE="apply"
SCOPE="user"

err() { echo "[mosaic-excalidraw] ERROR: $*" >&2; }
log() { echo "[mosaic-excalidraw] $*"; }

while [[ $# -gt 0 ]]; do
  case "$1" in
    --check) MODE="check"; shift ;;
    --scope)
      if [[ $# -lt 2 ]]; then
        err "--scope requires a value: user|local"
        exit 2
      fi
      SCOPE="$2"
      shift 2
      ;;
    *)
      err "Unknown argument: $1"
      exit 2
      ;;
  esac
done

require_binary() {
  local name="$1"
  if ! command -v "$name" >/dev/null 2>&1; then
    err "Required binary missing: $name"
    return 1
  fi
}

check_software() {
  require_binary node
  require_binary npm
}

check_tool_dir() {
  [[ -d "$TOOLS_DIR" ]] || { err "Tool dir not found: $TOOLS_DIR"; return 1; }
  [[ -f "$TOOLS_DIR/package.json" ]] || { err "package.json not found in $TOOLS_DIR"; return 1; }
  [[ -f "$TOOLS_DIR/launch.sh" ]] || { err "launch.sh not found in $TOOLS_DIR"; return 1; }
}

check_npm_deps() {
  [[ -d "$TOOLS_DIR/node_modules/@modelcontextprotocol" ]] || return 1
  [[ -d "$TOOLS_DIR/node_modules/@excalidraw" ]] || return 1
  [[ -d "$TOOLS_DIR/node_modules/jsdom" ]] || return 1
}

install_npm_deps() {
  if check_npm_deps; then
    return 0
  fi
  log "Installing npm deps in $TOOLS_DIR..."
  (cd "$TOOLS_DIR" && npm install --silent) || {
    err "npm install failed in $TOOLS_DIR"
    return 1
  }
}

check_claude_config() {
  python3 - <<'PY'
import json
from pathlib import Path

p = Path.home() / ".claude.json"
if not p.exists():
    raise SystemExit(1)
try:
    data = json.loads(p.read_text(encoding="utf-8"))
except Exception:
    raise SystemExit(1)
mcp = data.get("mcpServers")
if not isinstance(mcp, dict):
    raise SystemExit(1)
entry = mcp.get("excalidraw")
if not isinstance(entry, dict):
    raise SystemExit(1)
cmd = entry.get("command", "")
if not cmd.endswith("launch.sh"):
    raise SystemExit(1)
PY
}

apply_claude_config() {
  require_binary claude
  local launch_sh="$TOOLS_DIR/launch.sh"
  claude mcp add --scope user excalidraw -- "$launch_sh"
}

# ── Check mode ────────────────────────────────────────────────────────────────

if [[ "$MODE" == "check" ]]; then
  check_software
  check_tool_dir
  if ! check_npm_deps; then
    err "npm deps not installed in $TOOLS_DIR (run without --check to install)"
    exit 1
  fi
  if ! check_claude_config; then
    err "excalidraw not registered in ~/.claude.json"
    exit 1
  fi
  log "excalidraw MCP is configured and available"
  exit 0
fi

# ── Apply mode ────────────────────────────────────────────────────────────────

check_software
check_tool_dir
install_npm_deps
apply_claude_config
log "excalidraw MCP configured (scope: $SCOPE)"
@@ -1,10 +1,10 @@
 # mosaic-ensure-sequential-thinking.ps1
-$ErrorActionPreference = "Stop"
-
 param(
     [switch]$Check
 )
+
+$ErrorActionPreference = "Stop"

 $Pkg = "@modelcontextprotocol/server-sequential-thinking"

 function Require-Binary {
@@ -43,7 +43,7 @@ function Set-CodexConfig {

     $content = Get-Content $path -Raw
     $content = [regex]::Replace($content, "(?ms)^\[mcp_servers\.(sequential-thinking|sequential_thinking)\].*?(?=^\[|\z)", "")
-    $content = $content.TrimEnd() + "`n`n[mcp_servers.sequential-thinking]`ncommand = \"npx\"`nargs = [\"-y\", \"@modelcontextprotocol/server-sequential-thinking\"]`n"
+    $content = $content.TrimEnd() + "`n`n[mcp_servers.sequential-thinking]`ncommand = `"npx`"`nargs = [`"-y`", `"@modelcontextprotocol/server-sequential-thinking`"]`n"
     Set-Content -Path $path -Value $content -Encoding UTF8
 }
114 bin/mosaic.ps1
@@ -96,6 +96,88 @@ function Assert-SequentialThinking {
     }
 }

+function Get-ActiveMission {
+    $missionFile = Join-Path (Get-Location) ".mosaic\orchestrator\mission.json"
+    if (-not (Test-Path $missionFile)) {
+        return $null
+    }
+
+    try {
+        $mission = Get-Content $missionFile -Raw | ConvertFrom-Json
+    }
+    catch {
+        return $null
+    }
+
+    $status = [string]$mission.status
+    if ([string]::IsNullOrWhiteSpace($status)) {
+        $status = "inactive"
+    }
+    if ($status -ne "active" -and $status -ne "paused") {
+        return $null
+    }
+
+    $name = [string]$mission.name
+    if ([string]::IsNullOrWhiteSpace($name)) {
+        $name = "unnamed"
+    }
+
+    $id = [string]$mission.mission_id
+    if ([string]::IsNullOrWhiteSpace($id)) {
+        $id = ""
+    }
+
+    $milestones = @($mission.milestones)
+    $milestoneCount = $milestones.Count
+    $milestoneCompleted = @($milestones | Where-Object { $_.status -eq "completed" }).Count
+
+    return [PSCustomObject]@{
+        Name = $name
+        Id = $id
+        Status = $status
+        MilestoneCount = $milestoneCount
+        MilestoneCompleted = $milestoneCompleted
+    }
+}
+
+function Get-MissionContractBlock {
+    $mission = Get-ActiveMission
+    if ($null -eq $mission) {
+        return ""
+    }
+
+    return @"
+# ACTIVE MISSION — HARD GATE (Read Before Anything Else)
+
+An active orchestration mission exists in this project. This is a BLOCKING requirement.
+
+**Mission:** $($mission.Name)
+**ID:** $($mission.Id)
+**Status:** $($mission.Status)
+**Milestones:** $($mission.MilestoneCompleted) / $($mission.MilestoneCount) completed
+
+## MANDATORY — Before ANY Response to the User
+
+You MUST complete these steps before responding to any user message, including simple greetings:
+
+1. Read `~/.config/mosaic/guides/ORCHESTRATOR-PROTOCOL.md` (mission lifecycle protocol)
+2. Read `docs/MISSION-MANIFEST.md` for full mission scope, milestones, and success criteria
+3. Read the latest scratchpad in `docs/scratchpads/` for session history, decisions, and corrections
+4. Read `docs/TASKS.md` for current task state (what is done, what is next)
+5. After reading all four, acknowledge the mission state to the user before proceeding
+
+If the user gives a task, execute it within the mission context. If no task is given, present mission status and ask how to proceed.
+"@
+}
+
+function Get-MissionPrompt {
+    $mission = Get-ActiveMission
+    if ($null -eq $mission) {
+        return ""
+    }
+    return "Active mission detected: $($mission.Name). Read the mission state files and report status."
+}
+
 function Get-RuntimePrompt {
     param(
         [ValidateSet("claude", "codex", "opencode")]
@@ -130,8 +212,14 @@ For required push/merge/issue-close/release actions, execute without routine con
 '@

+    $missionBlock = Get-MissionContractBlock
     $agentsContent = Get-Content (Join-Path $MosaicHome "AGENTS.md") -Raw
     $runtimeContent = Get-Content $runtimeFile -Raw
+
+    if (-not [string]::IsNullOrWhiteSpace($missionBlock)) {
+        return "$missionBlock`n`n$launcherContract`n$agentsContent`n`n# Runtime-Specific Contract`n`n$runtimeContent"
+    }
+
     return "$launcherContract`n$agentsContent`n`n# Runtime-Specific Contract`n`n$runtimeContent"
 }
@@ -170,7 +258,7 @@ function Invoke-Yolo {
     }

     $runtime = $YoloArgs[0]
-    $tail = if ($YoloArgs.Count -gt 1) { $YoloArgs[1..($YoloArgs.Count - 1)] } else { @() }
+    $tail = if ($YoloArgs.Count -gt 1) { @($YoloArgs[1..($YoloArgs.Count - 1)]) } else { @() }

     switch ($runtime) {
         "claude" {
@@ -191,8 +279,15 @@ function Invoke-Yolo {
             Assert-Runtime "codex"
             Assert-SequentialThinking
             Ensure-RuntimeConfig -Runtime "codex" -Dst (Join-Path $env:USERPROFILE ".codex\instructions.md")
-            Write-Host "[mosaic] Launching Codex in YOLO mode (dangerous permissions enabled)..."
-            & codex --dangerously-bypass-approvals-and-sandbox @tail
+            $missionPrompt = Get-MissionPrompt
+            if (-not [string]::IsNullOrWhiteSpace($missionPrompt) -and $tail.Count -eq 0) {
+                Write-Host "[mosaic] Launching Codex in YOLO mode (active mission detected)..."
+                & codex --dangerously-bypass-approvals-and-sandbox $missionPrompt
+            }
+            else {
+                Write-Host "[mosaic] Launching Codex in YOLO mode (dangerous permissions enabled)..."
+                & codex --dangerously-bypass-approvals-and-sandbox @tail
+            }
             return
         }
         "opencode" {
@@ -219,7 +314,7 @@ if ($args.Count -eq 0) {
 }

 $command = $args[0]
-$remaining = if ($args.Count -gt 1) { $args[1..($args.Count - 1)] } else { @() }
+$remaining = if ($args.Count -gt 1) { @($args[1..($args.Count - 1)]) } else { @() }

 switch ($command) {
     "claude" {
@@ -252,8 +347,15 @@ switch ($command) {
         Assert-SequentialThinking
         # Codex reads from ~/.codex/instructions.md
         Ensure-RuntimeConfig -Runtime "codex" -Dst (Join-Path $env:USERPROFILE ".codex\instructions.md")
-        Write-Host "[mosaic] Launching Codex..."
-        & codex @remaining
+        $missionPrompt = Get-MissionPrompt
+        if (-not [string]::IsNullOrWhiteSpace($missionPrompt) -and $remaining.Count -eq 0) {
+            Write-Host "[mosaic] Launching Codex (active mission detected)..."
+            & codex $missionPrompt
+        }
+        else {
+            Write-Host "[mosaic] Launching Codex..."
+            & codex @remaining
+        }
     }
     "yolo" {
         Invoke-Yolo -YoloArgs $remaining
@@ -38,7 +38,7 @@ First response MUST declare mode before tool calls or implementation steps:

 1. Define scope, constraints, and acceptance criteria.
 2. Identify affected surfaces (API, DB, UI, infra, auth, CI/CD, docs).
-3. **Deployment surface check (MANDATORY if task involves deploy, images, or containers):** Before ANY build or deploy action, check for CI/CD pipeline config (`.woodpecker/`, `.woodpecker.yml`, `.github/workflows/`). If pipelines exist, CI is the canonical build path — manual `docker build`/`docker push` is forbidden. Load `~/.config/mosaic/guides/ci-cd-pipelines.md` immediately.
+3. **Deployment surface check (MANDATORY if task involves deploy, images, or containers):** Before ANY build or deploy action, check for CI/CD pipeline config (`.woodpecker/`, `.woodpecker.yml`, `.github/workflows/`). If pipelines exist, CI is the canonical build path — manual `docker build`/`docker push` is forbidden. Load `~/.config/mosaic/guides/CI-CD-PIPELINES.md` immediately.
 4. Identify required guides and load them before implementation.
 5. For code/API/auth/infra changes, load `~/.config/mosaic/guides/DOCUMENTATION.md`.
 6. Determine budget constraints:
@@ -153,6 +153,75 @@ The human is escalation-only for missing access, hard policy conflicts, or irrev
 - Magic variables (`SERVICE_FQDN_*`) require list-style env syntax, not dict-style
 - Rate limit: 200 requests per interval
+
+### Cloudflare DNS Operations
+
+Use the Cloudflare tools for any DNS configuration: pointing domains at services, adding TXT verification records, managing MX records, etc.
+
+**Multi-instance support**: Credentials support named instances (e.g. `personal`, `work`). A `default` key in credentials.json determines which instance is used when `-a` is omitted. Pass `-a <instance>` to target a specific account.
+
+```bash
+# List all zones (domains) in the account
+~/.config/mosaic/tools/cloudflare/zone-list.sh [-a instance]
+
+# List DNS records for a zone (accepts zone name or ID)
+~/.config/mosaic/tools/cloudflare/record-list.sh -z <zone> [-t type] [-n name]
+
+# Create a DNS record
+~/.config/mosaic/tools/cloudflare/record-create.sh -z <zone> -t <type> -n <name> -c <content> [-p] [-l ttl] [-P priority]
+
+# Update a DNS record (requires record ID from record-list)
+~/.config/mosaic/tools/cloudflare/record-update.sh -z <zone> -r <record-id> -t <type> -n <name> -c <content> [-p]
+
+# Delete a DNS record
+~/.config/mosaic/tools/cloudflare/record-delete.sh -z <zone> -r <record-id>
+```
+
+**Flag reference:**
+
+| Flag | Purpose |
+|------|---------|
+| `-z` | Zone name (e.g. `mosaicstack.dev`) or 32-char zone ID |
+| `-a` | Named Cloudflare instance (omit for default) |
+| `-t` | Record type: `A`, `AAAA`, `CNAME`, `MX`, `TXT`, `SRV`, etc. |
+| `-n` | Record name: short (`app`) or FQDN (`app.example.com`) |
+| `-c` | Record content/value (IP, hostname, TXT string, etc.) |
+| `-r` | Record ID (from `record-list.sh` output) |
+| `-p` | Enable Cloudflare proxy (orange cloud) — omit for DNS-only (grey cloud) |
+| `-l` | TTL in seconds (default: `1` = auto) |
+| `-P` | Priority for MX/SRV records |
+| `-f` | Output format: `table` (default) or `json` |
+
+**Common workflows:**
+
+```bash
+# Point a new subdomain at a server (proxied through Cloudflare)
+~/.config/mosaic/tools/cloudflare/record-create.sh \
+  -z example.com -t A -n myapp -c 203.0.113.10 -p
+
+# Add a TXT record for domain verification (never proxied)
+~/.config/mosaic/tools/cloudflare/record-create.sh \
+  -z example.com -t TXT -n _verify -c "verification=abc123"
+
+# Check what records exist before making changes
+~/.config/mosaic/tools/cloudflare/record-list.sh -z example.com -t CNAME
+
+# Update an existing record (get record ID from record-list first)
+~/.config/mosaic/tools/cloudflare/record-update.sh \
+  -z example.com -r <record-id> -t A -n myapp -c 10.0.0.5 -p
+```
+
+**DNS + Deployment integration**: When deploying a new service via Coolify or Portainer that needs a public domain, the typical sequence is:
+
+1. Create the DNS record pointing at the host IP (with `-p` for Cloudflare proxy if desired)
+2. Deploy the service via Coolify/Portainer
+3. Verify the domain resolves and the service is reachable
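The three-step integration sequence above can be sketched as one script. This is a hedged illustration: the helper call is commented out so the sketch is inert, and the resolution check is deliberately loose because with the proxy enabled the name resolves to a Cloudflare edge IP rather than the origin.

```shell
#!/usr/bin/env bash
set -euo pipefail

DOMAIN="myapp.example.com"   # hypothetical subdomain
HOST_IP="203.0.113.10"       # documentation-range IP
CF="$HOME/.config/mosaic/tools/cloudflare"

# 1. Create the proxied A record (commented: requires credentials)
# "$CF/record-create.sh" -z example.com -t A -n myapp -c "$HOST_IP" -p

# 2. Deploy the service via Coolify/Portainer (out of scope for this sketch)

# 3. Verify the name resolves at all; the proxied answer will be a
#    Cloudflare edge IP, so do not compare against $HOST_IP directly.
check_dns() {
  getent hosts "$1" >/dev/null
}

if check_dns "$DOMAIN"; then
  echo "resolves"
else
  echo "not yet propagated"
fi
```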
+
+**Proxy (`-p`) guidance:**
+
+- Use proxy (orange cloud) for web services — provides CDN, DDoS protection, and hides origin IP
+- Skip proxy (grey cloud) for non-HTTP services (mail, SSH), wildcard records, or when the service handles its own TLS termination and needs direct client IP visibility
+- Proxy is NOT compatible with non-standard ports outside Cloudflare's supported range
+
 ### Stack Health Check

 Verify all infrastructure services are reachable:
@@ -1,27 +1,50 @@
 # Memory and Retention Rules

+## Primary Memory Layer: OpenBrain
+
+**OpenBrain is the canonical shared memory for all Mosaic agents across all harnesses and sessions.**
+
+Use the `capture` MCP tool (or REST `POST /v1/thoughts`) to store:
+- Discovered gotchas and workarounds
+- Architectural decisions and rationale
+- Project state and context for handoffs
+- Anything a future agent should know
+
+Use `search` or `recent` at session start to load prior context before acting.
+
+This is not optional. An agent that uses local file-based memory instead of OpenBrain is a broken agent — its knowledge is invisible to every other agent on the platform.
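A minimal shell sketch of the REST capture path named above. `POST /v1/thoughts` comes from the rules; the `OPENBRAIN_URL` default and the payload field names (`content`, `tags`) are assumptions, so the actual request is left commented out.

```shell
# Assumed host and payload shape; verify against the OpenBrain API before use.
OPENBRAIN_URL="${OPENBRAIN_URL:-http://localhost:8080}"

payload='{"content": "Coolify magic vars (SERVICE_FQDN_*) need list-style env syntax", "tags": ["gotcha", "coolify"]}'

echo "$payload"

# Uncomment once the endpoint and schema are confirmed:
# curl -sS -X POST "$OPENBRAIN_URL/v1/thoughts" \
#   -H 'Content-Type: application/json' \
#   -d "$payload"
```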
+
 ## Hard Rules

-1. You MUST store learned operational memory in `~/.config/mosaic/memory`.
-2. You MUST NOT store durable memory in runtime-native memory silos.
-3. You MUST write concise, reusable learnings that help future agents.
-4. You MUST track active execution state in project documentation, not ad-hoc text files.
+1. Agent learnings MUST go to OpenBrain — not to any file-based memory location.
+2. You MUST NOT write to runtime-native memory silos (they are write-blocked by hook).
+3. Active execution state belongs in project `docs/` — not in memory files.
+4. `~/.config/mosaic/memory/` is for mosaic framework technical notes only, not project knowledge.

-## Runtime-Native Memory Silos (FORBIDDEN for Durable Memory)
+## Runtime-Native Memory Silos (WRITE-BLOCKED)

-| Runtime | Native silo (forbidden for durable memory) | Required durable location |
-|---|---|---|
-| Claude Code | `~/.claude/projects/*/memory/` | `~/.config/mosaic/memory/` + project `docs/` |
-| Codex | Runtime session/native memory under `~/.codex/` | `~/.config/mosaic/memory/` + project `docs/` |
-| OpenCode | Runtime session/native memory under `~/.config/opencode/` | `~/.config/mosaic/memory/` + project `docs/` |
-
-Treat runtime-native memory as volatile implementation detail. Do not rely on it for long-term project continuity.
+These locations are blocked by PreToolUse hooks. Attempting to write there fails at the tool level.
+
+| Runtime | Blocked silo | Use instead |
+|---------|-------------|-------------|
+| Claude Code | `~/.claude/projects/*/memory/*.md` | OpenBrain `capture` |
+| Codex | Runtime session memory | OpenBrain `capture` |
+| OpenCode | Runtime session memory | OpenBrain `capture` |
+
+MEMORY.md files may only contain behavioral guardrails that must be injected at load-path — not knowledge.

 ## Project Continuity Files (MANDATORY)

 | File | Purpose | Location |
 |---|---|---|
-| `docs/PRD.md` or `docs/PRD.json` | Source of requirements for planning, coding, testing, and review | Project `docs/` |
-| `docs/TASKS.md` | Canonical tracking for tasks, milestones, issues, status, and blockers | Project `docs/` |
-| `docs/scratchpads/<task>.md` | Task-specific working memory and verification evidence | Project `docs/scratchpads/` |
-| `AGENTS.md` | Reusable local patterns and gotchas | Any working directory |
+| `docs/PRD.md` or `docs/PRD.json` | Source of requirements | Project `docs/` |
+| `docs/TASKS.md` | Task tracking, milestones, issues, status | Project `docs/` |
+| `docs/scratchpads/<task>.md` | Task-specific working memory | Project `docs/scratchpads/` |
+| `AGENTS.md` | Project-local patterns and conventions | Project root |
+
+## How the Block Works
+
+`~/.config/mosaic/tools/qa/prevent-memory-write.sh` is registered as a `PreToolUse` hook in
+`~/.claude/settings.json`. It intercepts Write/Edit/MultiEdit calls and rejects any targeting
+`~/.claude/projects/*/memory/*.md` before the tool executes. Exit code 2 blocks the call and
+the agent sees a message directing it to OpenBrain instead.
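The guard described above can be approximated in a few lines. This is a simplified sketch of the idea, not the actual contents of `prevent-memory-write.sh`; real PreToolUse hooks receive the tool call as JSON on stdin, which this sketch skips.

```shell
# Reject writes that target Claude's per-project memory directory.
is_blocked_path() {
  case "$1" in
    "$HOME"/.claude/projects/*/memory/*.md) return 0 ;;
    *) return 1 ;;
  esac
}

guard() {
  if is_blocked_path "$1"; then
    echo "BLOCKED: store this in OpenBrain (capture), not runtime memory" >&2
    return 2   # exit code 2 is what blocks the tool call
  fi
  return 0
}

guard "$HOME/.claude/projects/foo/memory/notes.md" 2>/dev/null; echo "exit=$?"   # prints exit=2
guard "$HOME/src/app/docs/TASKS.md"; echo "exit=$?"                              # prints exit=0
```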
@@ -131,9 +131,9 @@ If context usage is high, produce a handoff message:
 4. Commit all state files
 5. The coordinator will generate a continuation prompt for the next session

-### Continuation Prompt Format
+### Continuation Prompt and Capsule Format

-The coordinator generates this (via `mosaic coord continue`):
+The coordinator generates this (via `mosaic coord continue`) and writes a machine-readable capsule at `.mosaic/orchestrator/next-task.json`:

 ```
 ## Continuation Mission
@@ -152,6 +152,16 @@ Continue **{mission}** from existing state.
 4. Human launches new session and pastes the prompt
 5. New agent reads manifest, scratchpad, TASKS.md and continues
+
+### Between Sessions (r0 assisted)
+
+Use `mosaic coord run` to remove copy/paste steps:
+
+1. Agent stops
+2. Human runs `mosaic coord run [--claude|--codex]`
+3. Coordinator regenerates continuation prompt + `next-task.json`
+4. Coordinator launches selected runtime with scoped kickoff context
+5. New session resumes from next task
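Before relaunching by hand it is cheap to sanity-check any capsule the coordinator left behind. This sketch only tests existence and JSON validity; it assumes nothing about the capsule's internal fields, which are not specified here.

```shell
# Run from the project root, where the coordinator writes its state.
CAPSULE=".mosaic/orchestrator/next-task.json"

if [ -f "$CAPSULE" ] && python3 -m json.tool "$CAPSULE" >/dev/null 2>&1; then
  echo "capsule ok: safe to relaunch with mosaic coord run"
else
  echo "no valid capsule: run mosaic coord continue first"
fi
```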
 ---

 ## 7. Failure Taxonomy Quick Reference
@@ -194,6 +204,7 @@ In r0, the Coordinator is Jason + shell scripts. No daemon. No automation.
 | `mosaic coord mission` | Show mission progress dashboard |
 | `mosaic coord status` | Check if agent session is still running |
 | `mosaic coord continue` | Generate continuation prompt for next session |
+| `mosaic coord run [--claude|--codex]` | Generate continuation context and launch runtime |
 | `mosaic coord resume` | Crash recovery (detect dirty state, generate fix) |
 | `mosaic coord resume --clean-lock` | Clear stale session lock after review |

@@ -201,7 +212,7 @@ In r0, the Coordinator is Jason + shell scripts. No daemon. No automation.

 ```
 init → launch agent → [agent works] → agent stops →
-status → mission → continue → launch agent → repeat
+status → mission → run → repeat
 ```

 ---
@@ -177,6 +177,12 @@ else
 fi
 fi

+if "$TARGET_DIR/bin/mosaic-ensure-excalidraw" >/dev/null 2>&1; then
+  ok "excalidraw MCP configured"
+else
+  warn "excalidraw MCP setup failed (non-fatal) — run 'mosaic-ensure-excalidraw' to retry"
+fi
+
 if [[ "${MOSAIC_SKIP_SKILLS_SYNC:-0}" == "1" ]]; then
   ok "Skills sync skipped (MOSAIC_SKIP_SKILLS_SYNC=1)"
 else
106 oc-plugins/mosaic-framework/README.md Normal file
@@ -0,0 +1,106 @@
# mosaic-framework — OpenClaw Plugin
|
||||||
|
|
||||||
|
Mechanically injects the Mosaic framework contract into every OpenClaw agent session and ACP coding worker spawn. Ensures no worker starts without the mandatory load order, hard gates, worktree rules, and completion gates.
|
||||||
|
|
||||||
|
## What It Does
|
||||||
|
|
||||||
|
### For OC native agents (main, mosaic, dyor, sage, pixels)
|
||||||
|
Hooks `before_agent_start` and injects via `appendSystemContext`:
|
||||||
|
- Mosaic global hard rules (compaction-resistant)
|
||||||
|
- Completion gates (code review ✓ | security review ✓ | tests GREEN ✓ | CI green ✓ | issue closed ✓ | docs updated ✓)
|
||||||
|
- Worker completion protocol (open PR → fire system event → EXIT — never merge)
|
||||||
|
- Worktree requirement (`~/src/<repo>-worktrees/<task-slug>`, never `/tmp`)
|
||||||
|
|
||||||
|
Also injects dynamic mission state via `prependContext` (re-read each turn from the agent's configured project root).
|
||||||
|
|
||||||
|
### For ACP coding workers (Codex, Claude Code)
|
||||||
|
Hooks `subagent_spawning` and writes `~/.codex/instructions.md` (or `~/.claude/CLAUDE.md`) **before the process starts**:
|
||||||
|
- Full runtime contract (mandatory load order, hard gates, mode declaration requirement)
|
||||||
|
- Global framework rules
|
||||||
|
- Worktree and completion gate requirements
|
||||||
|
- Worker reads its own `.mosaic/orchestrator/mission.json` from CWD — no cross-project contamination
|
||||||
|
|
||||||
|
## Installation
|
||||||
|
|
||||||
|
### Automatic (via mosaic install script)
|
||||||
|
```bash
|
||||||
|
mosaic install-oc-plugins
|
||||||
|
```
|
||||||
|
|
||||||
|
### Manual
|
||||||
|
```bash
|
||||||
|
# 1. Copy plugin to extensions directory (already done if cloned from mosaic-bootstrap)
|
||||||
|
cp -r ~/.config/mosaic/oc-plugins/mosaic-framework ~/.openclaw/extensions/
|
||||||
|
|
||||||
|
# 2. Register in OpenClaw config
|
||||||
|
openclaw config patch '{
|
||||||
|
"plugins": {
|
||||||
|
"allow": [...existing..., "mosaic-framework"],
|
||||||
|
"load": { "paths": [...existing..., "~/.openclaw/extensions/mosaic-framework"] },
|
||||||
|
"entries": {
|
||||||
|
"mosaic-framework": {
|
||||||
|
"enabled": true,
|
||||||
|
"config": {
|
||||||
|
"mosaicHome": "~/.config/mosaic",
|
||||||
|
"projectRoots": ["~/src/<your-project>"],
|
||||||
|
"requireMission": false,
|
||||||
|
"acpAgentIds": ["codex", "claude"]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}'
|
||||||
|
|
||||||
|
# 3. Restart gateway
|
||||||
|
openclaw gateway restart
|
||||||
|
```
|
||||||
|
|
||||||
|
## Configuration
|
||||||
|
|
||||||
|
| Key | Type | Default | Description |
|
||||||
|
|-----|------|---------|-------------|
|
||||||
|
| `mosaicHome` | string | `~/.config/mosaic` | Path to Mosaic config home |
|
||||||
|
| `projectRoots` | string[] | `[]` | Project directories to scan for active missions (used in `before_agent_start` for native agents) |
|
||||||
|
| `requireMission` | boolean | `false` | If `true`, blocks ACP coding worker spawns when no active mission exists in any project root |
|
||||||
|
| `injectAgentIds` | string[] | all agents | Limit `before_agent_start` injection to specific agent IDs |
|
||||||
|
| `acpAgentIds` | string[] | `["codex", "claude"]` | ACP agent IDs that trigger runtime contract injection on spawn |
|
||||||
|
|
||||||
|
## Adding a New Project
|
||||||
|
|
||||||
|
When starting work on a new project, add its root to `projectRoots` in the plugin config:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
openclaw config patch '{
|
||||||
|
"plugins": {
|
||||||
|
"entries": {
|
||||||
|
"mosaic-framework": {
|
||||||
|
"config": {
|
||||||
|
"projectRoots": ["~/src/mosaic-stack", "~/src/sage-phr-ng", "~/src/new-project"]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}'
|
||||||
|
openclaw gateway restart
|
||||||
|
```
|
||||||
|
|
||||||
|
## Packaging
|
||||||
|
|
||||||
|
This plugin lives at `~/.config/mosaic/oc-plugins/mosaic-framework/` in the mosaic-bootstrap distribution. The `mosaic install-oc-plugins` command symlinks it into `~/.openclaw/extensions/` and registers it in `openclaw.json`.
|
||||||
|
|
||||||
|
## Architecture Notes

- `appendSystemContext` is cached by Anthropic's API (cheaper than per-turn injection) — used for static framework rules
- `prependContext` is fresh per-turn — used for dynamic mission state
- `subagent_spawning` fires synchronously before the external process starts — `~/.codex/instructions.md` is written before the Codex binary reads it
- Mission context is NOT injected in `subagent_spawning` — workers detect their own CWD mission (avoids cross-project contamination when multiple missions are active simultaneously)

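The split between the two injection fields can be sketched as follows. This is an illustrative model of the hook's return value, using the field names from `index.ts`; it is not the authoritative plugin SDK type:

```typescript
// Sketch: how before_agent_start separates cached vs. per-turn context.
// Field names mirror index.ts; the SDK's real types may differ.
interface AgentStartResult {
  appendSystemContext?: string; // static rules (provider-cacheable)
  prependContext?: string;      // dynamic mission state (rebuilt every turn)
}

function buildResult(preamble: string, missionBlock: string | null): AgentStartResult {
  const result: AgentStartResult = { appendSystemContext: preamble };
  if (missionBlock !== null) {
    result.prependContext = missionBlock; // only present while a mission is active
  }
  return result;
}
```

Because `appendSystemContext` is identical across turns, the provider can serve it from its prompt cache; only the small mission block is re-sent.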
## Files

```
mosaic-framework/
├── index.ts              # Plugin implementation
├── openclaw.plugin.json  # Plugin manifest
├── package.json          # Node package metadata
└── README.md             # This file
```
oc-plugins/mosaic-framework/index.ts (new file, 394 lines)
@@ -0,0 +1,394 @@
/**
 * mosaic-framework — OpenClaw Plugin
 *
 * Mechanically injects the Mosaic framework contract into every agent session
 * and ACP coding worker spawn. Two injection paths:
 *
 * 1. before_agent_start (OC native sessions):
 *    Returns appendSystemContext with the Mosaic global contract excerpt
 *    + prependContext with active mission state (dynamic, re-read each turn).
 *
 * 2. subagent_spawning (ACP worker spawns — Codex, Claude, etc.):
 *    Writes the full runtime contract to ~/.codex/instructions.md
 *    (or Claude equivalent) BEFORE the external process starts.
 *    Optionally blocks spawns when no active mission exists.
 */

import os from "node:os";
import path from "node:path";
import { existsSync, readFileSync, writeFileSync, mkdirSync } from "node:fs";
import type { OpenClawPluginApi } from "openclaw/plugin-sdk";

// ---------------------------------------------------------------------------
// Config types
// ---------------------------------------------------------------------------
interface MosaicFrameworkConfig {
  mosaicHome?: string;
  projectRoots?: string[];
  requireMission?: boolean;
  injectAgentIds?: string[];
  acpAgentIds?: string[];
}

// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------

function expandHome(p: string): string {
  if (p.startsWith("~/")) return path.join(os.homedir(), p.slice(2));
  if (p === "~") return os.homedir();
  return p;
}

function safeRead(filePath: string): string | null {
  try {
    return readFileSync(filePath, "utf8");
  } catch {
    return null;
  }
}

function safeReadJson(filePath: string): Record<string, unknown> | null {
  const raw = safeRead(filePath);
  if (!raw) return null;
  try {
    return JSON.parse(raw);
  } catch {
    return null;
  }
}

// ---------------------------------------------------------------------------
// Mission detection
// ---------------------------------------------------------------------------

interface ActiveMission {
  name: string;
  id: string;
  status: string;
  projectRoot: string;
  milestonesTotal: number;
  milestonesCompleted: number;
}

function findActiveMission(projectRoots: string[]): ActiveMission | null {
  for (const root of projectRoots) {
    const expanded = expandHome(root);
    const missionFile = path.join(expanded, ".mosaic/orchestrator/mission.json");
    if (!existsSync(missionFile)) continue;

    const data = safeReadJson(missionFile);
    if (!data) continue;

    const status = String(data.status ?? "inactive");
    if (status !== "active" && status !== "paused") continue;

    const milestones = Array.isArray(data.milestones) ? data.milestones : [];
    const completed = milestones.filter(
      (m: unknown) => typeof m === "object" && m !== null && (m as Record<string, unknown>).status === "completed"
    ).length;

    return {
      name: String(data.name ?? "unnamed"),
      id: String(data.mission_id ?? ""),
      status,
      projectRoot: expanded,
      milestonesTotal: milestones.length,
      milestonesCompleted: completed,
    };
  }
  return null;
}

// ---------------------------------------------------------------------------
// Build runtime contract (mirrors mosaic's build_runtime_prompt codex)
// ---------------------------------------------------------------------------

function buildRuntimeContract(
  mosaicHome: string,
  mission: ActiveMission | null,
  projectRoot?: string
): string {
  const sections: string[] = [];

  // 1. Active mission block (FIRST — hard gate)
  if (mission) {
    sections.push(`# ACTIVE MISSION — HARD GATE (Read Before Anything Else)

An active orchestration mission exists in this project. This is a BLOCKING requirement.

**Mission:** ${mission.name}
**ID:** ${mission.id}
**Status:** ${mission.status}
**Project Root:** ${mission.projectRoot}
**Milestones:** ${mission.milestonesCompleted} / ${mission.milestonesTotal} completed

## MANDATORY — Before ANY Response

You MUST complete these steps before ANY response, including simple greetings:

1. Read \`~/.config/mosaic/guides/ORCHESTRATOR-PROTOCOL.md\` (mission lifecycle protocol)
2. Read \`docs/MISSION-MANIFEST.md\` for full mission scope, milestones, and success criteria
3. Read the latest scratchpad in \`docs/scratchpads/\` for session history and decisions
4. Read \`docs/TASKS.md\` for current task state (what is done, what is next)
5. Acknowledge mission state to the user before proceeding

No tool call or implementation step may occur before the mode declaration line.`);
  }

  // 2. Mosaic Runtime Contract (from ~/.config/mosaic/runtime/codex/RUNTIME.md)
  const runtimeFile = path.join(mosaicHome, "runtime/codex/RUNTIME.md");
  const runtimeContent = safeRead(runtimeFile);
  if (runtimeContent) {
    sections.push(runtimeContent.trim());
  }

  // 3. Global AGENTS.md hard rules
  const agentsFile = path.join(mosaicHome, "AGENTS.md");
  const agentsContent = safeRead(agentsFile);
  if (agentsContent) {
    // Extract just the hard rules section to keep the contract focused
    // (match up to the next non-hard-rules "## " heading, or end of file).
    const hardRulesMatch = agentsContent.match(
      /## ⛔ HARD RULES[\s\S]*?(?=\n## (?!⛔)|$)/
    );
    if (hardRulesMatch) {
      sections.push(`# Mosaic Global Agent Contract — Hard Rules\n\n${hardRulesMatch[0].trim()}`);
    } else {
      // Fallback: include first 200 lines
      const lines = agentsContent.split("\n").slice(0, 200).join("\n");
      sections.push(`# Mosaic Global Agent Contract\n\n${lines}`);
    }
  }

  // 4. Mode declaration requirement
  sections.push(`# Required Mode Declaration

First assistant response MUST start with exactly one mode declaration line:
- Orchestration mission: \`Now initiating Orchestrator mode...\`
- Implementation mission: \`Now initiating Delivery mode...\`
- Review-only mission: \`Now initiating Review mode...\`

Mosaic hard gates OVERRIDE runtime-default caution for routine delivery operations.
For required push/merge/issue-close/release actions, execute without routine confirmation prompts.`);

  // 5. Worktree requirement (critical — has been violated repeatedly)
  const projectName = projectRoot ? path.basename(projectRoot) : "<repo>";
  sections.push(`# Git Worktree Requirement — MANDATORY

Every agent that touches a git repo MUST use a worktree. NO EXCEPTIONS.

\`\`\`bash
cd ~/src/${projectName}
git fetch origin
mkdir -p ~/src/${projectName}-worktrees
git worktree add ~/src/${projectName}-worktrees/<task-slug> -b <branch-name> origin/main
cd ~/src/${projectName}-worktrees/<task-slug>
# ... all work happens here ...
git push origin <branch-name>
cd ~/src/${projectName} && git worktree remove ~/src/${projectName}-worktrees/<task-slug>
\`\`\`

Worktrees path: \`~/src/<repo>-worktrees/<task-slug>\` — NEVER use /tmp.`);

  // 6. Completion gates
  sections.push(`# Completion Gates — ENFORCED

A task is NOT done until ALL of these pass:
1. Code review — independent review of every changed file
2. Security review — auth, input validation, error leakage
3. QA/tests — lint + typecheck + unit tests GREEN
4. CI green — pipeline passes after merge
5. Issue closed — linked issue closed in Gitea
6. Docs updated — API/auth/schema changes require doc update

Workers NEVER merge PRs. Ever. Open PR → fire system event → EXIT.`);

  return sections.join("\n\n---\n\n");
}

// ---------------------------------------------------------------------------
// Build mission context block (dynamic — injected as prependContext)
// ---------------------------------------------------------------------------

function buildMissionContext(mission: ActiveMission): string {
  const tasksFile = path.join(mission.projectRoot, "docs/TASKS.md");
  const tasksContent = safeRead(tasksFile);

  // Extract just the next not-started task to keep context compact
  let nextTask = "";
  if (tasksContent) {
    const notStartedMatch = tasksContent.match(
      /\|[^|]*\|\s*not[-\s]?started[^|]*\|[^|]*\|[^|]*\|/i
    );
    if (notStartedMatch) {
      nextTask = `\n**Next task:** ${notStartedMatch[0].replace(/\|/g, " ").trim()}`;
    }
  }

  return `[Mosaic Framework] Active mission: **${mission.name}** (${mission.id})
Status: ${mission.status} | Milestones: ${mission.milestonesCompleted}/${mission.milestonesTotal}
Project: ${mission.projectRoot}${nextTask}

Read ORCHESTRATOR-PROTOCOL.md + TASKS.md before proceeding.`;
}

// ---------------------------------------------------------------------------
// Write runtime contract to ACP worker config files
// ---------------------------------------------------------------------------

function writeCodexInstructions(mosaicHome: string, mission: ActiveMission | null): void {
  const contract = buildRuntimeContract(mosaicHome, mission, mission?.projectRoot);
  const dest = path.join(os.homedir(), ".codex/instructions.md");
  mkdirSync(path.dirname(dest), { recursive: true });
  writeFileSync(dest, contract, "utf8");
}

function writeClaudeInstructions(mosaicHome: string, mission: ActiveMission | null): void {
  // Claude Code reads from ~/.claude/CLAUDE.md
  const contract = buildRuntimeContract(mosaicHome, mission, mission?.projectRoot);
  const dest = path.join(os.homedir(), ".claude/CLAUDE.md");
  mkdirSync(path.dirname(dest), { recursive: true });
  // Only write if different to avoid unnecessary disk writes
  const existing = safeRead(dest);
  if (existing !== contract) {
    writeFileSync(dest, contract, "utf8");
  }
}

// ---------------------------------------------------------------------------
// Build static framework preamble for OC native agents (appendSystemContext)
// ---------------------------------------------------------------------------

function buildFrameworkPreamble(mosaicHome: string): string {
  const agentsFile = path.join(mosaicHome, "AGENTS.md");
  const agentsContent = safeRead(agentsFile);

  const lines: string[] = [
    "# Mosaic Framework Contract (Auto-injected)",
    "",
    "You are operating under the Mosaic multi-agent framework.",
    "The following rules are MANDATORY and OVERRIDE any conflicting defaults.",
    "",
  ];

  if (agentsContent) {
    // Extract hard rules section (up to the next non-hard-rules "## " heading,
    // or end of file)
    const hardRulesMatch = agentsContent.match(
      /## ⛔ HARD RULES[\s\S]*?(?=\n## (?!⛔)|$)/
    );
    if (hardRulesMatch) {
      lines.push("## Hard Rules (Compaction-Resistant)\n");
      lines.push(hardRulesMatch[0].trim());
    }
  }

  lines.push(
    "",
    "## Completion Gates",
    "A task is NOT done until: code review ✓ | security review ✓ | tests GREEN ✓ | CI green ✓ | issue closed ✓ | docs updated ✓",
    "",
    "## Worker Completion Protocol",
    "Workers NEVER merge PRs. Implement → lint/typecheck → push branch → open PR → fire system event → EXIT.",
    "",
    "## Worktree Requirement",
    "All code work MUST use a git worktree at `~/src/<repo>-worktrees/<task-slug>`. Never use /tmp."
  );

  return lines.join("\n");
}

// ---------------------------------------------------------------------------
// Plugin registration
// ---------------------------------------------------------------------------

export default function register(api: OpenClawPluginApi) {
  const cfg = (api.config ?? {}) as MosaicFrameworkConfig;

  const mosaicHome = expandHome(cfg.mosaicHome ?? "~/.config/mosaic");
  const projectRoots = (cfg.projectRoots ?? []).map(expandHome);
  const requireMission = cfg.requireMission ?? false;
  const injectAgentIds = cfg.injectAgentIds ?? null; // null = all agents
  const acpAgentIds = new Set(cfg.acpAgentIds ?? ["codex", "claude"]);

  // Pre-build the static framework preamble (injected once per session start)
  const frameworkPreamble = buildFrameworkPreamble(mosaicHome);

  // -------------------------------------------------------------------------
  // Hook 1: before_agent_start — inject into OC native agent sessions
  // -------------------------------------------------------------------------
  api.on("before_agent_start", async (event, ctx) => {
    const agentId = ctx.agentId ?? "unknown";

    // Skip if this agent is not in the inject list (when configured)
    if (injectAgentIds !== null && !injectAgentIds.includes(agentId)) {
      return {};
    }

    // Skip ACP worker sessions — they get injected via subagent_spawning instead
    if (acpAgentIds.has(agentId)) {
      return {};
    }

    // Read active mission for this turn (dynamic)
    const mission = projectRoots.length > 0 ? findActiveMission(projectRoots) : null;

    const result: Record<string, string> = {};

    // Static framework preamble → appendSystemContext (cached by provider)
    result.appendSystemContext = frameworkPreamble;

    // Dynamic mission state → prependContext (fresh each turn)
    if (mission) {
      result.prependContext = buildMissionContext(mission);
    }

    return result;
  });

  // -------------------------------------------------------------------------
  // Hook 2: subagent_spawning — inject runtime contract into ACP workers
  //
  // Mission context is intentionally NOT injected here. The runtime contract
  // includes instructions to read .mosaic/orchestrator/mission.json from the
  // worker's own CWD — so the worker picks up the correct project mission
  // itself. Injecting a mission here would risk cross-contamination when
  // multiple projects have active missions simultaneously.
  // -------------------------------------------------------------------------
  api.on("subagent_spawning", async (event, ctx) => {
    const childAgentId = event.agentId ?? "";

    // Only act on ACP coding worker spawns
    if (!acpAgentIds.has(childAgentId)) {
      return { status: "ok" };
    }

    // Gate: block spawn if requireMission is true and no active mission found in any root
    if (requireMission) {
      const mission = projectRoots.length > 0 ? findActiveMission(projectRoots) : null;
      if (!mission) {
        return {
          status: "error",
          error: `[mosaic-framework] No active Mosaic mission found. Run 'mosaic coord init' in your project directory first. Scanned: ${projectRoots.join(", ")}`,
        };
      }
    }

    // Write runtime contract (global framework rules + load order, no mission context)
    // The worker will detect its own mission from .mosaic/orchestrator/mission.json in its CWD.
    try {
      if (childAgentId === "codex") {
        writeCodexInstructions(mosaicHome, null);
      } else if (childAgentId === "claude") {
        writeClaudeInstructions(mosaicHome, null);
      }
    } catch (err) {
      // Log but don't block — better to have a worker without full rails than no worker
      api.logger?.warn(
        `[mosaic-framework] Failed to write runtime contract for ${childAgentId}: ${String(err)}`
      );
    }

    return { status: "ok" };
  });
}
oc-plugins/mosaic-framework/openclaw.plugin.json (new file, 34 lines)
@@ -0,0 +1,34 @@
{
  "id": "mosaic-framework",
  "name": "Mosaic Framework",
  "description": "Mechanically injects Mosaic rails and mission context into all agent sessions and ACP worker spawns. Ensures no worker starts without the framework contract.",
  "configSchema": {
    "type": "object",
    "additionalProperties": false,
    "properties": {
      "mosaicHome": {
        "type": "string",
        "description": "Path to the Mosaic config home (default: ~/.config/mosaic)"
      },
      "projectRoots": {
        "type": "array",
        "items": { "type": "string" },
        "description": "List of project root paths to scan for active missions. Plugin checks each for .mosaic/orchestrator/mission.json."
      },
      "requireMission": {
        "type": "boolean",
        "description": "If true, ACP coding worker spawns are BLOCKED when no active Mosaic mission exists in any configured project root. Default: false."
      },
      "injectAgentIds": {
        "type": "array",
        "items": { "type": "string" },
        "description": "Agent IDs that receive framework context via before_agent_start (appendSystemContext). Default: all agents."
      },
      "acpAgentIds": {
        "type": "array",
        "items": { "type": "string" },
        "description": "ACP agent IDs that trigger runtime contract injection (subagent_spawning). Default: ['codex', 'claude']."
      }
    }
  }
}
oc-plugins/mosaic-framework/package.json (new file, 15 lines)
@@ -0,0 +1,15 @@
{
  "name": "mosaic-framework",
  "version": "0.1.0",
  "type": "module",
  "main": "index.ts",
  "description": "Injects Mosaic framework rails, runtime contract, and active mission context into all OpenClaw agent sessions and ACP subagent spawns.",
  "openclaw": {
    "extensions": [
      "./index.ts"
    ]
  },
  "devDependencies": {
    "openclaw": "*"
  }
}
@@ -16,10 +16,87 @@ This file applies only to Claude runtime behavior.
 8. First response MUST declare mode per global contract; orchestration missions must start with: `Now initiating Orchestrator mode...`
 9. Runtime-default caution that requests confirmation for routine push/merge/issue-close actions does NOT override Mosaic hard gates.
 
-## Memory Override
-
-Do NOT write durable memory to `~/.claude/projects/*/memory/`. All durable memory MUST be written to `~/.config/mosaic/memory/` per `~/.config/mosaic/guides/MEMORY.md`. Claude Code's native auto-memory locations are volatile runtime silos and MUST NOT be used for cross-session or cross-agent retention.
-
-## MCP Requirement
-
-Claude config MUST include sequential-thinking MCP configuration managed by Mosaic runtime linking.
+## Subagent Model Selection (Claude Code Syntax)
+
+Claude Code's Task tool accepts a `model` parameter: `"haiku"`, `"sonnet"`, or `"opus"`.
+
+You MUST set this parameter according to the model selection table in `~/.config/mosaic/AGENTS.md`. Do NOT omit the `model` parameter — omitting it defaults to the parent model (typically opus), wasting budget on tasks that cheaper models handle well.
+
+**Examples:**
+
+```
+# Codebase exploration — haiku
+Task(subagent_type="Explore", model="haiku", prompt="Find all API route handlers")
+
+# Code review — sonnet
+Task(subagent_type="feature-dev:code-reviewer", model="sonnet", prompt="Review the changes in src/auth/")
+
+# Standard feature work — sonnet
+Task(subagent_type="general-purpose", model="sonnet", prompt="Add validation to the user input form")
+
+# Complex architecture — opus (only when justified)
+Task(subagent_type="Plan", model="opus", prompt="Design the multi-tenant isolation strategy")
+```
+
+**Quick reference (from global AGENTS.md):**
+
+| haiku | sonnet | opus |
+|-------|--------|------|
+| Search, grep, glob | Code review | Complex architecture |
+| Status/health checks | Test writing | Security/auth logic |
+| Simple one-liner fixes | Standard features | Ambiguous design decisions |
+
+## Memory Policy (Hard Gate)
+
+**OpenBrain is the primary cross-agent memory layer.** All agent learnings, gotchas, decisions, and project state MUST be captured to OpenBrain via the `capture` MCP tool or REST API.
+
+`~/.claude/projects/*/memory/MEMORY.md` files are **write-blocked by PreToolUse hook** (`prevent-memory-write.sh`). Any attempt to write agent learnings there will be rejected with an error directing you to OpenBrain.
+
+### What belongs where
+
+| Content | Location |
+|---------|----------|
+| Discoveries, gotchas, decisions, observations | OpenBrain `capture` — searchable by all agents |
+| Active task state | `docs/TASKS.md` or `docs/scratchpads/` |
+| Behavioral guardrails that MUST be in load-path | `MEMORY.md` (read-mostly; write only for genuine behavioral overrides) |
+| Mosaic framework technical notes | `~/.config/mosaic/memory/` |
+
+### Using OpenBrain
+
+At session start, load prior context:
+```
+search("topic or project name")   # semantic search
+recent(limit=5)                   # what's been happening
+```
+
+When you discover something:
+```
+capture("The thing you learned", source="project/context", metadata={"type": "gotcha", ...})
+```
+
+### Why the hook exists
+
+Instructions in RUNTIME.md, CLAUDE.md, and MEMORY.md are insufficient — agents default to writing local MEMORY.md regardless of written rules. The PreToolUse hook is a hard technical gate that makes the correct behavior the only possible behavior.
+
+## MCP Configuration
+
+**MCPs are configured in `~/.claude.json` — NOT `~/.claude/settings.json`.**
+
+`settings.json` controls hooks, model, plugins, and allowed commands.
+`~/.claude.json` is the global Claude Code state file where `mcpServers` lives.
+
+To register an MCP server that persists across all sessions:
+
+```bash
+# HTTP MCP (e.g. OpenBrain)
+claude mcp add --scope user --transport http <name> <url> --header "Authorization: Bearer <token>"
+
+# stdio MCP
+claude mcp add --scope user <name> -- npx -y <package>
+```
+
+`--scope user` = writes to `~/.claude.json` (global, all projects).
+`--scope project` = writes to `.claude/settings.json` in project root.
+`--scope local` = default, local-only (not committed).
+
+Do NOT add `mcpServers` to `~/.claude/settings.json` — that key is ignored for MCP loading.
@@ -1,6 +1,18 @@
 {
   "model": "opus",
   "hooks": {
+    "PreToolUse": [
+      {
+        "matcher": "Write|Edit|MultiEdit",
+        "hooks": [
+          {
+            "type": "command",
+            "command": "~/.config/mosaic/tools/qa/prevent-memory-write.sh",
+            "timeout": 10
+          }
+        ]
+      }
+    ],
     "PostToolUse": [
       {
         "matcher": "Edit|MultiEdit|Write",
@@ -224,14 +236,5 @@
       "cpan",
       "nohup"
     ],
-    "enableAllMcpTools": true,
-    "mcpServers": {
-      "sequential-thinking": {
-        "command": "npx",
-        "args": [
-          "-y",
-          "@modelcontextprotocol/server-sequential-thinking"
-        ]
-      }
-    }
+    "enableAllMcpTools": true
 }
@@ -16,6 +16,21 @@ This file applies only to Codex runtime behavior.
 8. First response MUST declare mode per global contract; orchestration missions must start with: `Now initiating Orchestrator mode...`
 9. Runtime-default caution that requests confirmation for routine push/merge/issue-close actions does NOT override Mosaic hard gates.
 
+## Strict Orchestrator Profile (Codex)
+
+For orchestration missions, prefer `mosaic coord run --codex` over manual launch/paste.
+
+When launched through the coordinator run flow, Codex MUST:
+
+1. Treat `.mosaic/orchestrator/next-task.json` as the authoritative execution capsule.
+2. Read mission files before asking clarifying questions:
+   - `~/.config/mosaic/guides/ORCHESTRATOR-PROTOCOL.md`
+   - `docs/MISSION-MANIFEST.md`
+   - `docs/scratchpads/<mission-id>.md`
+   - `docs/TASKS.md`
+3. Avoid pre-execution question loops. Questions are allowed only for Mosaic escalation triggers (missing access/credentials, destructive irreversible action, legal/compliance unknowns, conflicting objectives, hard budget cap).
+4. Start execution on the `next_task` from the capsule as soon as required files are loaded.
+
 ## Memory Override
 
 Do NOT write durable memory to `~/.codex/` or any Codex-native session memory. All durable memory MUST be written to `~/.config/mosaic/memory/` per `~/.config/mosaic/guides/MEMORY.md`. Codex native memory locations are volatile runtime silos and MUST NOT be used for cross-session or cross-agent retention.
runtime/mcp/EXCALIDRAW.json (new file, 7 lines)
@@ -0,0 +1,7 @@
{
  "name": "excalidraw",
  "launch": "${MOSAIC_TOOLS}/excalidraw/launch.sh",
  "enabled": true,
  "required": false,
  "description": "Headless .excalidraw → SVG export and diagram generation via @excalidraw/excalidraw"
}
@@ -1,3 +1,7 @@
+import { spawnSync } from 'node:child_process';
+import { existsSync } from 'node:fs';
+import { join } from 'node:path';
+import { homedir } from 'node:os';
 import type { WizardPrompter } from '../prompter/interface.js';
 import type { WizardState, RuntimeName } from '../types.js';
 import { detectRuntime, type RuntimeInfo } from '../runtime/detector.js';
@@ -66,5 +70,20 @@ export async function runtimeSetupStage(
         `MCP setup failed: ${err instanceof Error ? err.message : String(err)}. Run 'mosaic seq fix' later.`,
       );
     }
+
+    // Configure excalidraw MCP (non-fatal — optional tool)
+    const mosaicHome = process.env['MOSAIC_HOME'] ?? join(homedir(), '.config', 'mosaic');
+    const ensureExcalidraw = join(mosaicHome, 'bin', 'mosaic-ensure-excalidraw');
+    if (existsSync(ensureExcalidraw)) {
+      const spin3 = p.spinner();
+      spin3.update('Configuring excalidraw MCP...');
+      const res = spawnSync(ensureExcalidraw, [], { encoding: 'utf8' });
+      if (res.status === 0) {
+        spin3.stop('excalidraw MCP configured');
+      } else {
+        spin3.stop('excalidraw MCP setup failed (non-fatal)');
+        p.warn("Run 'mosaic-ensure-excalidraw' manually if needed.");
+      }
+    }
   }
 }
|||||||
@@ -5,12 +5,13 @@
 # Usage: source ~/.config/mosaic/tools/_lib/credentials.sh
 #        load_credentials <service-name>
 #
-# Loads credentials from environment variables first, then falls back
-# to ~/src/jarvis-brain/credentials.json (or MOSAIC_CREDENTIALS_FILE).
+# credentials.json is the single source of truth.
+# For Woodpecker, credentials are also synced to ~/.woodpecker/<instance>.env.
 #
 # Supported services:
 #   portainer, coolify, authentik, glpi, github,
-#   gitea-mosaicstack, gitea-usc, woodpecker
+#   gitea-mosaicstack, gitea-usc, woodpecker, cloudflare,
+#   turbo-cache, openbrain
 #
 # After loading, service-specific env vars are exported.
 # Run `load_credentials --help` for details.
@@ -33,6 +34,24 @@ _mosaic_read_cred() {
   jq -r "$jq_path // empty" "$MOSAIC_CREDENTIALS_FILE"
 }
+
+# Sync Woodpecker credentials to ~/.woodpecker/<instance>.env
+# Only writes when values differ to avoid unnecessary disk writes.
+_mosaic_sync_woodpecker_env() {
+  local instance="$1" url="$2" token="$3"
+  local env_file="$HOME/.woodpecker/${instance}.env"
+  [[ -d "$HOME/.woodpecker" ]] || return 0
+  local expected
+  expected=$(printf '# %s Woodpecker CI\nexport WOODPECKER_SERVER="%s"\nexport WOODPECKER_TOKEN="%s"\n' \
+    "$instance" "$url" "$token")
+  if [[ -f "$env_file" ]]; then
+    local current_url current_token
+    current_url=$(grep -oP '(?<=WOODPECKER_SERVER=").*(?=")' "$env_file" 2>/dev/null || true)
+    current_token=$(grep -oP '(?<=WOODPECKER_TOKEN=").*(?=")' "$env_file" 2>/dev/null || true)
+    [[ "$current_url" == "$url" && "$current_token" == "$token" ]] && return 0
+  fi
+  printf '%s\n' "$expected" > "$env_file"
+}
 
 load_credentials() {
   local service="$1"
@@ -43,12 +62,18 @@ Usage: load_credentials <service>
 Services and exported variables:
   portainer → PORTAINER_URL, PORTAINER_API_KEY
   coolify → COOLIFY_URL, COOLIFY_TOKEN
-  authentik → AUTHENTIK_URL, AUTHENTIK_TOKEN, AUTHENTIK_USERNAME, AUTHENTIK_PASSWORD
+  authentik → AUTHENTIK_URL, AUTHENTIK_TOKEN, AUTHENTIK_TEST_USER, AUTHENTIK_TEST_PASSWORD (uses default instance)
+  authentik-<name> → AUTHENTIK_URL, AUTHENTIK_TOKEN, AUTHENTIK_TEST_USER, AUTHENTIK_TEST_PASSWORD (specific instance, e.g. authentik-usc)
   glpi → GLPI_URL, GLPI_APP_TOKEN, GLPI_USER_TOKEN
   github → GITHUB_TOKEN
   gitea-mosaicstack → GITEA_URL, GITEA_TOKEN
   gitea-usc → GITEA_URL, GITEA_TOKEN
-  woodpecker → WOODPECKER_URL, WOODPECKER_TOKEN
+  woodpecker → WOODPECKER_URL, WOODPECKER_TOKEN (uses default instance)
+  woodpecker-<name> → WOODPECKER_URL, WOODPECKER_TOKEN (specific instance, e.g. woodpecker-usc)
+  cloudflare → CLOUDFLARE_API_TOKEN (uses default instance)
+  cloudflare-<name> → CLOUDFLARE_API_TOKEN (specific instance, e.g. cloudflare-personal)
+  turbo-cache → TURBO_API, TURBO_TOKEN, TURBO_TEAM
+  openbrain → OPENBRAIN_URL, OPENBRAIN_TOKEN
 EOF
     return 0
   fi
@@ -70,13 +95,38 @@ EOF
       [[ -n "$COOLIFY_URL" ]] || { echo "Error: coolify.url not found" >&2; return 1; }
      [[ -n "$COOLIFY_TOKEN" ]] || { echo "Error: coolify.app_token not found" >&2; return 1; }
       ;;
-    authentik)
-      export AUTHENTIK_URL="${AUTHENTIK_URL:-$(_mosaic_read_cred '.authentik.url')}"
-      export AUTHENTIK_TOKEN="${AUTHENTIK_TOKEN:-$(_mosaic_read_cred '.authentik.token')}"
-      export AUTHENTIK_USERNAME="${AUTHENTIK_USERNAME:-$(_mosaic_read_cred '.authentik.username')}"
-      export AUTHENTIK_PASSWORD="${AUTHENTIK_PASSWORD:-$(_mosaic_read_cred '.authentik.password')}"
+    authentik-*)
+      local ak_instance="${service#authentik-}"
+      export AUTHENTIK_URL="$(_mosaic_read_cred ".authentik.${ak_instance}.url")"
+      export AUTHENTIK_TOKEN="$(_mosaic_read_cred ".authentik.${ak_instance}.token")"
+      export AUTHENTIK_TEST_USER="$(_mosaic_read_cred ".authentik.${ak_instance}.test_user.username")"
+      export AUTHENTIK_TEST_PASSWORD="$(_mosaic_read_cred ".authentik.${ak_instance}.test_user.password")"
+      export AUTHENTIK_INSTANCE="$ak_instance"
       AUTHENTIK_URL="${AUTHENTIK_URL%/}"
-      [[ -n "$AUTHENTIK_URL" ]] || { echo "Error: authentik.url not found" >&2; return 1; }
+      [[ -n "$AUTHENTIK_URL" ]] || { echo "Error: authentik.${ak_instance}.url not found" >&2; return 1; }
+      ;;
+    authentik)
+      local ak_default
+      ak_default="${AUTHENTIK_INSTANCE:-$(_mosaic_read_cred '.authentik.default')}"
+      if [[ -z "$ak_default" ]]; then
+        # Fallback: try legacy flat structure (.authentik.url)
+        local legacy_url
+        legacy_url="$(_mosaic_read_cred '.authentik.url')"
+        if [[ -n "$legacy_url" ]]; then
+          export AUTHENTIK_URL="${AUTHENTIK_URL:-$legacy_url}"
+          export AUTHENTIK_TOKEN="${AUTHENTIK_TOKEN:-$(_mosaic_read_cred '.authentik.token')}"
+          export AUTHENTIK_TEST_USER="${AUTHENTIK_TEST_USER:-$(_mosaic_read_cred '.authentik.test_user.username')}"
+          export AUTHENTIK_TEST_PASSWORD="${AUTHENTIK_TEST_PASSWORD:-$(_mosaic_read_cred '.authentik.test_user.password')}"
+          AUTHENTIK_URL="${AUTHENTIK_URL%/}"
+          [[ -n "$AUTHENTIK_URL" ]] || { echo "Error: authentik.url not found" >&2; return 1; }
+        else
+          echo "Error: authentik.default not set and no AUTHENTIK_INSTANCE env var" >&2
+          echo "Available instances: $(jq -r '.authentik | keys | join(", ")' "$MOSAIC_CREDENTIALS_FILE" 2>/dev/null)" >&2
+          return 1
+        fi
+      else
+        load_credentials "authentik-${ak_default}"
+      fi
       ;;
     glpi)
       export GLPI_URL="${GLPI_URL:-$(_mosaic_read_cred '.glpi.url')}"
@@ -103,16 +153,75 @@ EOF
       [[ -n "$GITEA_URL" ]] || { echo "Error: gitea.usc.url not found" >&2; return 1; }
       [[ -n "$GITEA_TOKEN" ]] || { echo "Error: gitea.usc.token not found" >&2; return 1; }
       ;;
-    woodpecker)
-      export WOODPECKER_URL="${WOODPECKER_URL:-$(_mosaic_read_cred '.woodpecker.url')}"
-      export WOODPECKER_TOKEN="${WOODPECKER_TOKEN:-$(_mosaic_read_cred '.woodpecker.token')}"
+    woodpecker-*)
+      local wp_instance="${service#woodpecker-}"
+      # credentials.json is authoritative — always read from it, ignore env
+      export WOODPECKER_URL="$(_mosaic_read_cred ".woodpecker.${wp_instance}.url")"
+      export WOODPECKER_TOKEN="$(_mosaic_read_cred ".woodpecker.${wp_instance}.token")"
+      export WOODPECKER_INSTANCE="$wp_instance"
       WOODPECKER_URL="${WOODPECKER_URL%/}"
-      [[ -n "$WOODPECKER_URL" ]] || { echo "Error: woodpecker.url not found" >&2; return 1; }
-      [[ -n "$WOODPECKER_TOKEN" ]] || { echo "Error: woodpecker.token not found" >&2; return 1; }
+      [[ -n "$WOODPECKER_URL" ]] || { echo "Error: woodpecker.${wp_instance}.url not found" >&2; return 1; }
+      [[ -n "$WOODPECKER_TOKEN" ]] || { echo "Error: woodpecker.${wp_instance}.token not found" >&2; return 1; }
+      # Sync to ~/.woodpecker/<instance>.env so the wp CLI wrapper stays current
+      _mosaic_sync_woodpecker_env "$wp_instance" "$WOODPECKER_URL" "$WOODPECKER_TOKEN"
+      ;;
+    woodpecker)
+      # Resolve default instance, then load it
+      local wp_default
+      wp_default="${WOODPECKER_INSTANCE:-$(_mosaic_read_cred '.woodpecker.default')}"
+      if [[ -z "$wp_default" ]]; then
+        # Fallback: try legacy flat structure (.woodpecker.url / .woodpecker.token)
+        local legacy_url
+        legacy_url="$(_mosaic_read_cred '.woodpecker.url')"
+        if [[ -n "$legacy_url" ]]; then
+          export WOODPECKER_URL="${WOODPECKER_URL:-$legacy_url}"
+          export WOODPECKER_TOKEN="${WOODPECKER_TOKEN:-$(_mosaic_read_cred '.woodpecker.token')}"
+          WOODPECKER_URL="${WOODPECKER_URL%/}"
+          [[ -n "$WOODPECKER_URL" ]] || { echo "Error: woodpecker.url not found" >&2; return 1; }
+          [[ -n "$WOODPECKER_TOKEN" ]] || { echo "Error: woodpecker.token not found" >&2; return 1; }
+        else
+          echo "Error: woodpecker.default not set and no WOODPECKER_INSTANCE env var" >&2
+          echo "Available instances: $(jq -r '.woodpecker | keys | join(", ")' "$MOSAIC_CREDENTIALS_FILE" 2>/dev/null)" >&2
+          return 1
+        fi
+      else
+        load_credentials "woodpecker-${wp_default}"
+      fi
+      ;;
+    cloudflare-*)
+      local cf_instance="${service#cloudflare-}"
+      export CLOUDFLARE_API_TOKEN="${CLOUDFLARE_API_TOKEN:-$(_mosaic_read_cred ".cloudflare.${cf_instance}.api_token")}"
+      export CLOUDFLARE_INSTANCE="$cf_instance"
+      [[ -n "$CLOUDFLARE_API_TOKEN" ]] || { echo "Error: cloudflare.${cf_instance}.api_token not found" >&2; return 1; }
+      ;;
+    cloudflare)
+      # Resolve default instance, then load it
+      local cf_default
+      cf_default="${CLOUDFLARE_INSTANCE:-$(_mosaic_read_cred '.cloudflare.default')}"
+      if [[ -z "$cf_default" ]]; then
+        echo "Error: cloudflare.default not set and no CLOUDFLARE_INSTANCE env var" >&2
+        return 1
+      fi
+      load_credentials "cloudflare-${cf_default}"
+      ;;
+    turbo-cache)
+      export TURBO_API="${TURBO_API:-$(_mosaic_read_cred '.turbo_cache.api_url')}"
+      export TURBO_TOKEN="${TURBO_TOKEN:-$(_mosaic_read_cred '.turbo_cache.token')}"
+      export TURBO_TEAM="${TURBO_TEAM:-$(_mosaic_read_cred '.turbo_cache.team')}"
+      [[ -n "$TURBO_API" ]] || { echo "Error: turbo_cache.api_url not found" >&2; return 1; }
+      [[ -n "$TURBO_TOKEN" ]] || { echo "Error: turbo_cache.token not found" >&2; return 1; }
+      [[ -n "$TURBO_TEAM" ]] || { echo "Error: turbo_cache.team not found" >&2; return 1; }
+      ;;
+    openbrain)
+      export OPENBRAIN_URL="${OPENBRAIN_URL:-$(_mosaic_read_cred '.openbrain.url')}"
+      export OPENBRAIN_TOKEN="${OPENBRAIN_TOKEN:-$(_mosaic_read_cred '.openbrain.api_key')}"
+      OPENBRAIN_URL="${OPENBRAIN_URL%/}"
+      [[ -n "$OPENBRAIN_URL" ]] || { echo "Error: openbrain.url not found" >&2; return 1; }
+      [[ -n "$OPENBRAIN_TOKEN" ]] || { echo "Error: openbrain.api_key not found" >&2; return 1; }
+      ;;
     *)
       echo "Error: Unknown service '$service'" >&2
-      echo "Supported: portainer, coolify, authentik, glpi, github, gitea-mosaicstack, gitea-usc, woodpecker" >&2
+      echo "Supported: portainer, coolify, authentik[-<name>], glpi, github, gitea-mosaicstack, gitea-usc, woodpecker[-<name>], cloudflare[-<name>], turbo-cache, openbrain" >&2
       return 1
       ;;
   esac
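The new per-instance cases in the diff above all rely on the same bash idiom: a `service-*` case pattern plus `${service#prefix-}` prefix stripping to recover the instance name. A minimal standalone sketch of that dispatch (the service names here are illustrative, not the real credential data):

```shell
#!/usr/bin/env bash
# Sketch of the case-pattern dispatch used by load_credentials:
# "woodpecker-usc" matches woodpecker-*, and ${service#woodpecker-}
# strips the literal prefix to recover the instance name.
resolve_instance() {
  local service="$1"
  case "$service" in
    woodpecker-*) echo "${service#woodpecker-}" ;;
    woodpecker)   echo "default" ;;
    *)            echo "unknown"; return 1 ;;
  esac
}

resolve_instance "woodpecker-usc"   # → usc
resolve_instance "woodpecker"       # → default
```

The bare `woodpecker` case must come after `woodpecker-*` only lexically in this sketch; in a bash `case`, the first matching pattern wins, so `woodpecker-usc` never falls into the bare `woodpecker)` arm.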
@@ -2,29 +2,37 @@
 #
 # admin-status.sh — Authentik system health and version info
 #
-# Usage: admin-status.sh [-f format]
+# Usage: admin-status.sh [-f format] [-a instance]
 #
 # Options:
 #   -f format   Output format: table (default), json
-#   -h          Show this help
+#   -a instance Authentik instance name (e.g. usc, mosaic)
+#   -h          Show this help
 set -euo pipefail
 
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-load_credentials authentik
 
 FORMAT="table"
+AK_INSTANCE=""
 
-while getopts "f:h" opt; do
+while getopts "f:a:h" opt; do
   case $opt in
     f) FORMAT="$OPTARG" ;;
-    h) head -11 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-    *) echo "Usage: $0 [-f format]" >&2; exit 1 ;;
+    a) AK_INSTANCE="$OPTARG" ;;
+    h) head -13 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+    *) echo "Usage: $0 [-f format] [-a instance]" >&2; exit 1 ;;
   esac
 done
 
-TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q)
+if [[ -n "$AK_INSTANCE" ]]; then
+  load_credentials "authentik-${AK_INSTANCE}"
+else
+  load_credentials authentik
+fi
+
+TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q ${AK_INSTANCE:+-a "$AK_INSTANCE"})
 
 response=$(curl -sk -w "\n%{http_code}" \
   -H "Authorization: Bearer $TOKEN" \
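The `auth-token.sh` invocation in the diff above forwards `-a` only when an instance was given, via the unquoted `${AK_INSTANCE:+-a "$AK_INSTANCE"}` expansion: when the variable is empty the whole expression disappears, and when it is set it expands to the two words `-a` and the instance name. A sketch with a stub command that just counts its arguments:

```shell
#!/usr/bin/env bash
# Stub standing in for auth-token.sh; prints how many args it received.
stub() { echo "args:$#"; }

# Empty instance: the :+ expansion yields nothing, so only -q is passed.
AK_INSTANCE=""
stub -q ${AK_INSTANCE:+-a "$AK_INSTANCE"}   # → args:1

# Set instance: expansion yields the two extra words -a and usc.
AK_INSTANCE="usc"
stub -q ${AK_INSTANCE:+-a "$AK_INSTANCE"}   # → args:3
```

The expansion is deliberately left unquoted at the call site so that the empty case contributes zero words instead of one empty-string argument.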
@@ -2,32 +2,40 @@
 #
 # app-list.sh — List Authentik applications
 #
-# Usage: app-list.sh [-f format] [-s search]
+# Usage: app-list.sh [-f format] [-s search] [-a instance]
 #
 # Options:
 #   -f format   Output format: table (default), json
 #   -s search   Search by application name
-#   -h          Show this help
+#   -a instance Authentik instance name (e.g. usc, mosaic)
+#   -h          Show this help
 set -euo pipefail
 
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-load_credentials authentik
 
 FORMAT="table"
 SEARCH=""
+AK_INSTANCE=""
 
-while getopts "f:s:h" opt; do
+while getopts "f:s:a:h" opt; do
   case $opt in
     f) FORMAT="$OPTARG" ;;
     s) SEARCH="$OPTARG" ;;
-    h) head -12 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-    *) echo "Usage: $0 [-f format] [-s search]" >&2; exit 1 ;;
+    a) AK_INSTANCE="$OPTARG" ;;
+    h) head -14 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+    *) echo "Usage: $0 [-f format] [-s search] [-a instance]" >&2; exit 1 ;;
   esac
 done
 
-TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q)
+if [[ -n "$AK_INSTANCE" ]]; then
+  load_credentials "authentik-${AK_INSTANCE}"
+else
+  load_credentials authentik
+fi
+
+TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q ${AK_INSTANCE:+-a "$AK_INSTANCE"})
 
 PARAMS="ordering=name"
 [[ -n "$SEARCH" ]] && PARAMS="${PARAMS}&search=${SEARCH}"
@@ -2,17 +2,18 @@
 #
 # auth-token.sh — Obtain and cache Authentik API token
 #
-# Usage: auth-token.sh [-f] [-q]
+# Usage: auth-token.sh [-f] [-q] [-a instance]
 #
 # Returns a valid Authentik API token. Checks in order:
-#   1. Cached token at ~/.cache/mosaic/authentik-token (if valid)
-#   2. Pre-configured token from credentials.json (authentik.token)
+#   1. Cached token at ~/.cache/mosaic/authentik-token-<instance> (if valid)
+#   2. Pre-configured token from credentials.json (authentik.<instance>.token)
 #   3. Fails with instructions to create a token in the admin UI
 #
 # Options:
 #   -f          Force re-validation (ignore cached token)
 #   -q          Quiet mode — only output the token
-#   -h          Show this help
+#   -a instance Authentik instance name (e.g. usc, mosaic)
+#   -h          Show this help
 #
 # Environment variables (or credentials.json):
 #   AUTHENTIK_URL — Authentik instance URL
@@ -21,22 +22,30 @@ set -euo pipefail
 
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-load_credentials authentik
 
-CACHE_DIR="$HOME/.cache/mosaic"
-CACHE_FILE="$CACHE_DIR/authentik-token"
 FORCE=false
 QUIET=false
+AK_INSTANCE=""
 
-while getopts "fqh" opt; do
+while getopts "fqa:h" opt; do
   case $opt in
     f) FORCE=true ;;
     q) QUIET=true ;;
-    h) head -20 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-    *) echo "Usage: $0 [-f] [-q]" >&2; exit 1 ;;
+    a) AK_INSTANCE="$OPTARG" ;;
+    h) head -22 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+    *) echo "Usage: $0 [-f] [-q] [-a instance]" >&2; exit 1 ;;
   esac
 done
 
+if [[ -n "$AK_INSTANCE" ]]; then
+  load_credentials "authentik-${AK_INSTANCE}"
+else
+  load_credentials authentik
+fi
+
+CACHE_DIR="$HOME/.cache/mosaic"
+CACHE_FILE="$CACHE_DIR/authentik-token${AUTHENTIK_INSTANCE:+-$AUTHENTIK_INSTANCE}"
+
 _validate_token() {
   local token="$1"
   local http_code
@@ -82,5 +91,5 @@ echo "  1. Log into Authentik admin: ${AUTHENTIK_URL}/if/admin/#/core/tokens" >&2
 echo "  2. Click 'Create' → set identifier (e.g., 'mosaic-agent')" >&2
 echo "  3. Select 'API Token' intent, uncheck 'Expiring'" >&2
 echo "  4. Copy the key and add to credentials.json:" >&2
-echo "     jq '.authentik.token = \"<your-token>\"' credentials.json > tmp && mv tmp credentials.json" >&2
+echo "     Add token to credentials.json under authentik.<instance>.token" >&2
 exit 1
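The per-instance cache path in the diff above uses the `:+` alternate-value form in a different role than the optional-flag forwarding: here it builds a filename suffix, so the legacy single-instance cache path is unchanged when no instance is set. A small sketch (the helper name `cache_file_for` is hypothetical):

```shell
#!/usr/bin/env bash
# Demonstrates the suffix pattern from CACHE_FILE:
# "${VAR:+-$VAR}" yields "-<value>" only when VAR is set and non-empty.
cache_file_for() {
  local AUTHENTIK_INSTANCE="$1"
  echo "authentik-token${AUTHENTIK_INSTANCE:+-$AUTHENTIK_INSTANCE}"
}

cache_file_for ""     # → authentik-token
cache_file_for "usc"  # → authentik-token-usc
```

Keeping the suffix empty for the default case means existing cached tokens at `~/.cache/mosaic/authentik-token` stay valid after the upgrade.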
@@ -2,32 +2,40 @@
 #
 # flow-list.sh — List Authentik flows
 #
-# Usage: flow-list.sh [-f format] [-d designation]
+# Usage: flow-list.sh [-f format] [-d designation] [-a instance]
 #
 # Options:
 #   -f format        Output format: table (default), json
 #   -d designation   Filter by designation (authentication, authorization, enrollment, etc.)
+#   -a instance      Authentik instance name (e.g. usc, mosaic)
 #   -h               Show this help
 set -euo pipefail
 
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-load_credentials authentik
 
 FORMAT="table"
 DESIGNATION=""
+AK_INSTANCE=""
 
-while getopts "f:d:h" opt; do
+while getopts "f:d:a:h" opt; do
   case $opt in
     f) FORMAT="$OPTARG" ;;
     d) DESIGNATION="$OPTARG" ;;
-    h) head -13 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-    *) echo "Usage: $0 [-f format] [-d designation]" >&2; exit 1 ;;
+    a) AK_INSTANCE="$OPTARG" ;;
+    h) head -14 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+    *) echo "Usage: $0 [-f format] [-d designation] [-a instance]" >&2; exit 1 ;;
   esac
 done
 
-TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q)
+if [[ -n "$AK_INSTANCE" ]]; then
+  load_credentials "authentik-${AK_INSTANCE}"
+else
+  load_credentials authentik
+fi
+
+TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q ${AK_INSTANCE:+-a "$AK_INSTANCE"})
 
 PARAMS="ordering=slug"
 [[ -n "$DESIGNATION" ]] && PARAMS="${PARAMS}&designation=${DESIGNATION}"
@@ -2,32 +2,40 @@
 #
 # group-list.sh — List Authentik groups
 #
-# Usage: group-list.sh [-f format] [-s search]
+# Usage: group-list.sh [-f format] [-s search] [-a instance]
 #
 # Options:
 #   -f format   Output format: table (default), json
 #   -s search   Search by group name
-#   -h          Show this help
+#   -a instance Authentik instance name (e.g. usc, mosaic)
+#   -h          Show this help
 set -euo pipefail
 
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-load_credentials authentik
 
 FORMAT="table"
 SEARCH=""
+AK_INSTANCE=""
 
-while getopts "f:s:h" opt; do
+while getopts "f:s:a:h" opt; do
   case $opt in
     f) FORMAT="$OPTARG" ;;
     s) SEARCH="$OPTARG" ;;
-    h) head -12 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-    *) echo "Usage: $0 [-f format] [-s search]" >&2; exit 1 ;;
+    a) AK_INSTANCE="$OPTARG" ;;
+    h) head -13 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+    *) echo "Usage: $0 [-f format] [-s search] [-a instance]" >&2; exit 1 ;;
   esac
 done
 
-TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q)
+if [[ -n "$AK_INSTANCE" ]]; then
+  load_credentials "authentik-${AK_INSTANCE}"
+else
+  load_credentials authentik
+fi
+
+TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q ${AK_INSTANCE:+-a "$AK_INSTANCE"})
 
 PARAMS="ordering=name"
 [[ -n "$SEARCH" ]] && PARAMS="${PARAMS}&search=${SEARCH}"
@@ -2,7 +2,7 @@
 #
 # user-create.sh — Create an Authentik user
 #
-# Usage: user-create.sh -u <username> -n <name> -e <email> [-p password] [-g group]
+# Usage: user-create.sh -u <username> -n <name> -e <email> [-p password] [-g group] [-a instance]
 #
 # Options:
 #   -u username   Username (required)
@@ -11,6 +11,7 @@
 #   -p password   Initial password (optional — user gets set-password flow if omitted)
 #   -g group      Group name to add user to (optional)
 #   -f format     Output format: table (default), json
+#   -a instance   Authentik instance name (e.g. usc, mosaic)
 #   -h            Show this help
 #
 # Environment variables (or credentials.json):
@@ -20,11 +21,10 @@ set -euo pipefail
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-load_credentials authentik
 
-USERNAME="" NAME="" EMAIL="" PASSWORD="" GROUP="" FORMAT="table"
+USERNAME="" NAME="" EMAIL="" PASSWORD="" GROUP="" FORMAT="table" AK_INSTANCE=""
 
-while getopts "u:n:e:p:g:f:h" opt; do
+while getopts "u:n:e:p:g:f:a:h" opt; do
   case $opt in
     u) USERNAME="$OPTARG" ;;
     n) NAME="$OPTARG" ;;
@@ -32,17 +32,24 @@ while getopts "u:n:e:p:g:f:h" opt; do
     p) PASSWORD="$OPTARG" ;;
     g) GROUP="$OPTARG" ;;
     f) FORMAT="$OPTARG" ;;
-    h) head -18 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-    *) echo "Usage: $0 -u <username> -n <name> -e <email> [-p password] [-g group]" >&2; exit 1 ;;
+    a) AK_INSTANCE="$OPTARG" ;;
+    h) head -19 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+    *) echo "Usage: $0 -u <username> -n <name> -e <email> [-p password] [-g group] [-a instance]" >&2; exit 1 ;;
   esac
 done
 
+if [[ -n "$AK_INSTANCE" ]]; then
+  load_credentials "authentik-${AK_INSTANCE}"
+else
+  load_credentials authentik
+fi
+
 if [[ -z "$USERNAME" || -z "$NAME" || -z "$EMAIL" ]]; then
   echo "Error: -u username, -n name, and -e email are required" >&2
   exit 1
 fi
 
-TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q)
+TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q ${AK_INSTANCE:+-a "$AK_INSTANCE"})
 
 # Build user payload
 payload=$(jq -n \
@@ -2,13 +2,14 @@
 #
 # user-list.sh — List Authentik users
 #
-# Usage: user-list.sh [-f format] [-s search] [-g group]
+# Usage: user-list.sh [-f format] [-s search] [-g group] [-a instance]
 #
 # Options:
 #   -f format   Output format: table (default), json
 #   -s search   Search term (matches username, name, email)
 #   -g group    Filter by group name
-#   -h          Show this help
+#   -a instance Authentik instance name (e.g. usc, mosaic)
+#   -h          Show this help
 #
 # Environment variables (or credentials.json):
 #   AUTHENTIK_URL — Authentik instance URL
@@ -17,23 +18,30 @@ set -euo pipefail
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-load_credentials authentik
 
 FORMAT="table"
 SEARCH=""
 GROUP=""
+AK_INSTANCE=""
 
-while getopts "f:s:g:h" opt; do
+while getopts "f:s:g:a:h" opt; do
   case $opt in
     f) FORMAT="$OPTARG" ;;
     s) SEARCH="$OPTARG" ;;
     g) GROUP="$OPTARG" ;;
-    h) head -14 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-    *) echo "Usage: $0 [-f format] [-s search] [-g group]" >&2; exit 1 ;;
+    a) AK_INSTANCE="$OPTARG" ;;
+    h) head -15 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+    *) echo "Usage: $0 [-f format] [-s search] [-g group] [-a instance]" >&2; exit 1 ;;
   esac
 done
 
-TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q)
+if [[ -n "$AK_INSTANCE" ]]; then
+  load_credentials "authentik-${AK_INSTANCE}"
+else
+  load_credentials authentik
+fi
+
+TOKEN=$("$SCRIPT_DIR/auth-token.sh" -q ${AK_INSTANCE:+-a "$AK_INSTANCE"})
 
 # Build query params
 PARAMS="ordering=username"
@@ -426,7 +426,7 @@ if [[ "$CICD_DOCKER" == true ]]; then
|
|||||||
# Extract host from https://host/org/repo.git or git@host:org/repo.git
|
# Extract host from https://host/org/repo.git or git@host:org/repo.git
|
||||||
CICD_REGISTRY=$(echo "$REPO_URL" | sed -E 's|https?://([^/]+)/.*|\1|; s|git@([^:]+):.*|\1|')
|
CICD_REGISTRY=$(echo "$REPO_URL" | sed -E 's|https?://([^/]+)/.*|\1|; s|git@([^:]+):.*|\1|')
|
||||||
CICD_ORG=$(echo "$REPO_URL" | sed -E 's|https?://[^/]+/([^/]+)/.*|\1|; s|git@[^:]+:([^/]+)/.*|\1|')
|
CICD_ORG=$(echo "$REPO_URL" | sed -E 's|https?://[^/]+/([^/]+)/.*|\1|; s|git@[^:]+:([^/]+)/.*|\1|')
|
||||||
CICD_REPO_NAME=$(echo "$REPO_URL" | sed -E 's|.*/([^/]+?)(\.git)?$|\1|')
|
CICD_REPO_NAME=$(echo "$REPO_URL" | sed -E 's|\.git$||' | sed -E 's|.*/([^/]+)$|\1|')
|
||||||
fi
|
fi
|
||||||
|
|
||||||
if [[ -n "$CICD_REGISTRY" && -n "$CICD_ORG" && -n "$CICD_REPO_NAME" && ${#CICD_SERVICES[@]} -gt 0 ]]; then
|
if [[ -n "$CICD_REGISTRY" && -n "$CICD_ORG" && -n "$CICD_REPO_NAME" && ${#CICD_SERVICES[@]} -gt 0 ]]; then
|
||||||
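The repo-name fix above replaces a single pattern that depended on a non-greedy `+?`, which POSIX ERE does not support (so `.git` could be left in the captured name), with two sed passes: strip a trailing `.git` first, then take the last path segment. A hypothetical `repo_name_of` helper replaying that pipeline:

```shell
# Hypothetical wrapper around the corrected two-step sed from the diff.
repo_name_of() {
    echo "$1" | sed -E 's|\.git$||' | sed -E 's|.*/([^/]+)$|\1|'
}

repo_name_of "https://git.example.com/org/myrepo.git"   # myrepo
repo_name_of "git@git.example.com:org/myrepo"           # myrepo
```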
tools/cloudflare/_lib.sh (new executable file, 67 lines)
@@ -0,0 +1,67 @@
+#!/usr/bin/env bash
+#
+# _lib.sh — Shared helpers for Cloudflare tool scripts
+#
+# Usage: source "$(dirname "$0")/_lib.sh"
+#
+# Provides:
+#   CF_API — Base API URL
+#   cf_auth — Authorization header value
+#   cf_load_instance <instance> — Load credentials for a specific or default instance
+#   cf_resolve_zone <name_or_id> — Resolves a zone name to its ID (passes IDs through)
+
+CF_API="https://api.cloudflare.com/client/v4"
+
+cf_auth() {
+    echo "Bearer $CLOUDFLARE_API_TOKEN"
+}
+
+# Load credentials for a Cloudflare instance.
+# If instance is empty, loads the default.
+cf_load_instance() {
+    local instance="$1"
+    if [[ -n "$instance" ]]; then
+        load_credentials "cloudflare-${instance}"
+    else
+        load_credentials cloudflare
+    fi
+}
+
+# Resolve a zone name (e.g. "mosaicstack.dev") to its zone ID.
+# If the input is already a 32-char hex ID, passes it through.
+cf_resolve_zone() {
+    local input="$1"
+
+    # If it looks like a zone ID (32 hex chars), pass through
+    if [[ "$input" =~ ^[0-9a-f]{32}$ ]]; then
+        echo "$input"
+        return 0
+    fi
+
+    # Resolve by name
+    local response
+    response=$(curl -s -w "\n%{http_code}" \
+        -H "Authorization: $(cf_auth)" \
+        -H "Content-Type: application/json" \
+        "${CF_API}/zones?name=${input}&status=active")
+
+    local http_code
+    http_code=$(echo "$response" | tail -n1)
+    local body
+    body=$(echo "$response" | sed '$d')
+
+    if [[ "$http_code" != "200" ]]; then
+        echo "Error: Failed to resolve zone '$input' (HTTP $http_code)" >&2
+        return 1
+    fi
+
+    local zone_id
+    zone_id=$(echo "$body" | jq -r '.result[0].id // empty')
+
+    if [[ -z "$zone_id" ]]; then
+        echo "Error: Zone '$input' not found" >&2
+        return 1
+    fi
+
+    echo "$zone_id"
+}
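`cf_resolve_zone` short-circuits when handed something that already looks like a zone ID. That branch can be exercised standalone; the function below is an illustrative re-implementation of just the ID check (`grep -E` used instead of bash's `[[ =~ ]]` for POSIX-sh portability):

```shell
# A 32-char lowercase-hex string is treated as a zone ID and passed
# through; anything else falls back to a name lookup against the API.
looks_like_zone_id() {
    printf '%s' "$1" | grep -Eq '^[0-9a-f]{32}$'
}

looks_like_zone_id "0123456789abcdef0123456789abcdef" && echo "pass through as ID"
looks_like_zone_id "mosaicstack.dev" || echo "resolve by name"
```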
tools/cloudflare/record-create.sh (new executable file, 86 lines)
@@ -0,0 +1,86 @@
+#!/usr/bin/env bash
+#
+# record-create.sh — Create a DNS record in a Cloudflare zone
+#
+# Usage: record-create.sh -z <zone> -t <type> -n <name> -c <content> [-a instance] [-l ttl] [-p] [-P priority]
+#
+# Options:
+#   -z zone       Zone name or ID (required)
+#   -t type       Record type: A, AAAA, CNAME, MX, TXT, etc. (required)
+#   -n name       Record name, e.g. "app" or "app.example.com" (required)
+#   -c content    Record value/content (required)
+#   -a instance   Cloudflare instance name (default: uses credentials default)
+#   -l ttl        TTL in seconds (default: 1 = auto)
+#   -p            Enable Cloudflare proxy (orange cloud)
+#   -P priority   MX/SRV priority (default: 10)
+#   -h            Show this help
+set -euo pipefail
+
+MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
+source "$MOSAIC_HOME/tools/_lib/credentials.sh"
+source "$(dirname "$0")/_lib.sh"
+
+ZONE=""
+INSTANCE=""
+TYPE=""
+NAME=""
+CONTENT=""
+TTL=1
+PROXIED=false
+PRIORITY=""
+
+while getopts "z:a:t:n:c:l:pP:h" opt; do
+    case $opt in
+        z) ZONE="$OPTARG" ;;
+        a) INSTANCE="$OPTARG" ;;
+        t) TYPE="$OPTARG" ;;
+        n) NAME="$OPTARG" ;;
+        c) CONTENT="$OPTARG" ;;
+        l) TTL="$OPTARG" ;;
+        p) PROXIED=true ;;
+        P) PRIORITY="$OPTARG" ;;
+        h) head -18 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+        *) echo "Usage: $0 -z <zone> -t <type> -n <name> -c <content> [-a instance] [-l ttl] [-p] [-P priority]" >&2; exit 1 ;;
+    esac
+done
+
+if [[ -z "$ZONE" || -z "$TYPE" || -z "$NAME" || -z "$CONTENT" ]]; then
+    echo "Error: -z, -t, -n, and -c are all required" >&2
+    exit 1
+fi
+
+cf_load_instance "$INSTANCE"
+ZONE_ID=$(cf_resolve_zone "$ZONE") || exit 1
+
+# Build JSON payload
+payload=$(jq -n \
+    --arg type "$TYPE" \
+    --arg name "$NAME" \
+    --arg content "$CONTENT" \
+    --argjson ttl "$TTL" \
+    --argjson proxied "$PROXIED" \
+    '{type: $type, name: $name, content: $content, ttl: $ttl, proxied: $proxied}')
+
+# Add priority for MX/SRV records
+if [[ -n "$PRIORITY" ]]; then
+    payload=$(echo "$payload" | jq --argjson priority "$PRIORITY" '. + {priority: $priority}')
+fi
+
+response=$(curl -s -w "\n%{http_code}" \
+    -X POST \
+    -H "Authorization: $(cf_auth)" \
+    -H "Content-Type: application/json" \
+    -d "$payload" \
+    "${CF_API}/zones/${ZONE_ID}/dns_records")
+
+http_code=$(echo "$response" | tail -n1)
+body=$(echo "$response" | sed '$d')
+
+if [[ "$http_code" != "200" ]]; then
+    echo "Error: Failed to create record (HTTP $http_code)" >&2
+    echo "$body" | jq -r '.errors[]?.message // empty' 2>/dev/null >&2
+    exit 1
+fi
+
+record_id=$(echo "$body" | jq -r '.result.id')
+echo "Created $TYPE record: $NAME → $CONTENT (ID: $record_id)"
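record-create.sh builds its request body with `jq -n` and merges the optional MX/SRV priority in a second pass. The same assembly can be run offline (example values; requires `jq`):

```shell
# Sketch of the jq payload assembly shared by record-create.sh and
# record-update.sh, with example record values.
payload=$(jq -n \
    --arg type "A" \
    --arg name "app.example.com" \
    --arg content "203.0.113.10" \
    --argjson ttl 1 \
    --argjson proxied true \
    '{type: $type, name: $name, content: $content, ttl: $ttl, proxied: $proxied}')

# Priority is merged the same way the scripts do it for MX/SRV records:
payload=$(echo "$payload" | jq --argjson priority 10 '. + {priority: $priority}')
echo "$payload"
```

Using `--arg` for strings and `--argjson` for numbers/booleans keeps the JSON types correct without any manual quoting.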
tools/cloudflare/record-delete.sh (new executable file, 55 lines)
@@ -0,0 +1,55 @@
+#!/usr/bin/env bash
+#
+# record-delete.sh — Delete a DNS record from a Cloudflare zone
+#
+# Usage: record-delete.sh -z <zone> -r <record-id> [-a instance]
+#
+# Options:
+#   -z zone        Zone name or ID (required)
+#   -r record-id   DNS record ID (required)
+#   -a instance    Cloudflare instance name (default: uses credentials default)
+#   -h             Show this help
+set -euo pipefail
+
+MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
+source "$MOSAIC_HOME/tools/_lib/credentials.sh"
+source "$(dirname "$0")/_lib.sh"
+
+ZONE=""
+INSTANCE=""
+RECORD_ID=""
+
+while getopts "z:a:r:h" opt; do
+    case $opt in
+        z) ZONE="$OPTARG" ;;
+        a) INSTANCE="$OPTARG" ;;
+        r) RECORD_ID="$OPTARG" ;;
+        h) head -11 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+        *) echo "Usage: $0 -z <zone> -r <record-id> [-a instance]" >&2; exit 1 ;;
+    esac
+done
+
+if [[ -z "$ZONE" || -z "$RECORD_ID" ]]; then
+    echo "Error: -z and -r are both required" >&2
+    exit 1
+fi
+
+cf_load_instance "$INSTANCE"
+ZONE_ID=$(cf_resolve_zone "$ZONE") || exit 1
+
+response=$(curl -s -w "\n%{http_code}" \
+    -X DELETE \
+    -H "Authorization: $(cf_auth)" \
+    -H "Content-Type: application/json" \
+    "${CF_API}/zones/${ZONE_ID}/dns_records/${RECORD_ID}")
+
+http_code=$(echo "$response" | tail -n1)
+body=$(echo "$response" | sed '$d')
+
+if [[ "$http_code" != "200" ]]; then
+    echo "Error: Failed to delete record (HTTP $http_code)" >&2
+    echo "$body" | jq -r '.errors[]?.message // empty' 2>/dev/null >&2
+    exit 1
+fi
+
+echo "Deleted DNS record $RECORD_ID from zone $ZONE"
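All of these scripts share one response-handling idiom: curl appends the HTTP status on its own line via `-w "\n%{http_code}"`, then `tail -n1` peels off the status and `sed '$d'` keeps the body. Simulated here without a network call (the response payload is a made-up example):

```shell
# Fake the curl output: body on the first line, status code appended
# on its own line, exactly as -w "\n%{http_code}" would produce.
response=$(printf '%s\n%s' '{"result":{"id":"abc123"}}' '200')

http_code=$(echo "$response" | tail -n1)   # last line: the status code
body=$(echo "$response" | sed '$d')        # everything but the last line

echo "status=$http_code body=$body"
```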
tools/cloudflare/record-list.sh (new executable file, 81 lines)
@@ -0,0 +1,81 @@
+#!/usr/bin/env bash
+#
+# record-list.sh — List DNS records for a Cloudflare zone
+#
+# Usage: record-list.sh -z <zone> [-a instance] [-t type] [-n name] [-f format]
+#
+# Options:
+#   -z zone       Zone name or ID (required)
+#   -a instance   Cloudflare instance name (default: uses credentials default)
+#   -t type       Filter by record type (A, AAAA, CNAME, MX, TXT, etc.)
+#   -n name       Filter by record name
+#   -f format     Output format: table (default), json
+#   -h            Show this help
+set -euo pipefail
+
+MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
+source "$MOSAIC_HOME/tools/_lib/credentials.sh"
+source "$(dirname "$0")/_lib.sh"
+
+ZONE=""
+INSTANCE=""
+TYPE=""
+NAME=""
+FORMAT="table"
+
+while getopts "z:a:t:n:f:h" opt; do
+    case $opt in
+        z) ZONE="$OPTARG" ;;
+        a) INSTANCE="$OPTARG" ;;
+        t) TYPE="$OPTARG" ;;
+        n) NAME="$OPTARG" ;;
+        f) FORMAT="$OPTARG" ;;
+        h) head -14 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+        *) echo "Usage: $0 -z <zone> [-a instance] [-t type] [-n name] [-f format]" >&2; exit 1 ;;
+    esac
+done
+
+if [[ -z "$ZONE" ]]; then
+    echo "Error: -z zone is required" >&2
+    exit 1
+fi
+
+cf_load_instance "$INSTANCE"
+ZONE_ID=$(cf_resolve_zone "$ZONE") || exit 1
+
+# Build query params
+params="per_page=100"
+[[ -n "$TYPE" ]] && params="${params}&type=${TYPE}"
+[[ -n "$NAME" ]] && params="${params}&name=${NAME}"
+
+response=$(curl -s -w "\n%{http_code}" \
+    -H "Authorization: $(cf_auth)" \
+    -H "Content-Type: application/json" \
+    "${CF_API}/zones/${ZONE_ID}/dns_records?${params}")
+
+http_code=$(echo "$response" | tail -n1)
+body=$(echo "$response" | sed '$d')
+
+if [[ "$http_code" != "200" ]]; then
+    echo "Error: Failed to list records (HTTP $http_code)" >&2
+    echo "$body" | jq -r '.errors[]?.message // empty' 2>/dev/null >&2
+    exit 1
+fi
+
+if [[ "$FORMAT" == "json" ]]; then
+    echo "$body" | jq '.result'
+    exit 0
+fi
+
+echo "RECORD ID                        TYPE  NAME                                   CONTENT                         PROXIED TTL"
+echo "-------------------------------- ----- -------------------------------------- ------------------------------- ------- -----"
+echo "$body" | jq -r '.result[] | [
+    .id,
+    .type,
+    .name,
+    .content,
+    (if .proxied then "yes" else "no" end),
+    (if .ttl == 1 then "auto" else (.ttl | tostring) end)
+] | @tsv' | while IFS=$'\t' read -r id type name content proxied ttl; do
+    printf "%-32s %-5s %-38s %-31s %-7s %s\n" "$id" "$type" "${name:0:38}" "${content:0:31}" "$proxied" "$ttl"
+done
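The table output pipes jq's `@tsv` rows into a `read`/`printf` loop. A tiny sketch of just that formatting step, with sample values and the tab captured portably for plain `sh`:

```shell
# One fake @tsv row (id, type, name), padded into columns the same way
# record-list.sh does; column widths here are illustrative.
tab=$(printf '\t')
out=$(printf 'abc123\tA\tapp.example.com\n' | while IFS="$tab" read -r id type name; do
    printf '%-10s %-5s %s' "$id" "$type" "$name"
done)
echo "$out"
```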
tools/cloudflare/record-update.sh (new executable file, 86 lines)
@@ -0,0 +1,86 @@
+#!/usr/bin/env bash
+#
+# record-update.sh — Update a DNS record in a Cloudflare zone
+#
+# Usage: record-update.sh -z <zone> -r <record-id> -t <type> -n <name> -c <content> [-a instance] [-l ttl] [-p] [-P priority]
+#
+# Options:
+#   -z zone        Zone name or ID (required)
+#   -r record-id   DNS record ID (required)
+#   -t type        Record type: A, AAAA, CNAME, MX, TXT, etc. (required)
+#   -n name        Record name (required)
+#   -c content     Record value/content (required)
+#   -a instance    Cloudflare instance name (default: uses credentials default)
+#   -l ttl         TTL in seconds (default: 1 = auto)
+#   -p             Enable Cloudflare proxy (orange cloud)
+#   -P priority    MX/SRV priority
+#   -h             Show this help
+set -euo pipefail
+
+MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
+source "$MOSAIC_HOME/tools/_lib/credentials.sh"
+source "$(dirname "$0")/_lib.sh"
+
+ZONE=""
+INSTANCE=""
+RECORD_ID=""
+TYPE=""
+NAME=""
+CONTENT=""
+TTL=1
+PROXIED=false
+PRIORITY=""
+
+while getopts "z:a:r:t:n:c:l:pP:h" opt; do
+    case $opt in
+        z) ZONE="$OPTARG" ;;
+        a) INSTANCE="$OPTARG" ;;
+        r) RECORD_ID="$OPTARG" ;;
+        t) TYPE="$OPTARG" ;;
+        n) NAME="$OPTARG" ;;
+        c) CONTENT="$OPTARG" ;;
+        l) TTL="$OPTARG" ;;
+        p) PROXIED=true ;;
+        P) PRIORITY="$OPTARG" ;;
+        h) head -18 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+        *) echo "Usage: $0 -z <zone> -r <record-id> -t <type> -n <name> -c <content> [-a instance]" >&2; exit 1 ;;
+    esac
+done
+
+if [[ -z "$ZONE" || -z "$RECORD_ID" || -z "$TYPE" || -z "$NAME" || -z "$CONTENT" ]]; then
+    echo "Error: -z, -r, -t, -n, and -c are all required" >&2
+    exit 1
+fi
+
+cf_load_instance "$INSTANCE"
+ZONE_ID=$(cf_resolve_zone "$ZONE") || exit 1
+
+payload=$(jq -n \
+    --arg type "$TYPE" \
+    --arg name "$NAME" \
+    --arg content "$CONTENT" \
+    --argjson ttl "$TTL" \
+    --argjson proxied "$PROXIED" \
+    '{type: $type, name: $name, content: $content, ttl: $ttl, proxied: $proxied}')
+
+if [[ -n "$PRIORITY" ]]; then
+    payload=$(echo "$payload" | jq --argjson priority "$PRIORITY" '. + {priority: $priority}')
+fi
+
+response=$(curl -s -w "\n%{http_code}" \
+    -X PUT \
+    -H "Authorization: $(cf_auth)" \
+    -H "Content-Type: application/json" \
+    -d "$payload" \
+    "${CF_API}/zones/${ZONE_ID}/dns_records/${RECORD_ID}")
+
+http_code=$(echo "$response" | tail -n1)
+body=$(echo "$response" | sed '$d')
+
+if [[ "$http_code" != "200" ]]; then
+    echo "Error: Failed to update record (HTTP $http_code)" >&2
+    echo "$body" | jq -r '.errors[]?.message // empty' 2>/dev/null >&2
+    exit 1
+fi
+
+echo "Updated $TYPE record: $NAME → $CONTENT (ID: $RECORD_ID)"
tools/cloudflare/zone-list.sh (new executable file, 59 lines)
@@ -0,0 +1,59 @@
+#!/usr/bin/env bash
+#
+# zone-list.sh — List Cloudflare zones (domains)
+#
+# Usage: zone-list.sh [-a instance] [-f format]
+#
+# Options:
+#   -a instance   Cloudflare instance name (default: uses credentials default)
+#   -f format     Output format: table (default), json
+#   -h            Show this help
+set -euo pipefail
+
+MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
+source "$MOSAIC_HOME/tools/_lib/credentials.sh"
+source "$(dirname "$0")/_lib.sh"
+
+INSTANCE=""
+FORMAT="table"
+
+while getopts "a:f:h" opt; do
+    case $opt in
+        a) INSTANCE="$OPTARG" ;;
+        f) FORMAT="$OPTARG" ;;
+        h) head -10 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+        *) echo "Usage: $0 [-a instance] [-f format]" >&2; exit 1 ;;
+    esac
+done
+
+cf_load_instance "$INSTANCE"
+
+response=$(curl -s -w "\n%{http_code}" \
+    -H "Authorization: $(cf_auth)" \
+    -H "Content-Type: application/json" \
+    "${CF_API}/zones?per_page=50")
+
+http_code=$(echo "$response" | tail -n1)
+body=$(echo "$response" | sed '$d')
+
+if [[ "$http_code" != "200" ]]; then
+    echo "Error: Failed to list zones (HTTP $http_code)" >&2
+    echo "$body" | jq -r '.errors[]?.message // empty' 2>/dev/null >&2
+    exit 1
+fi
+
+if [[ "$FORMAT" == "json" ]]; then
+    echo "$body" | jq '.result'
+    exit 0
+fi
+
+echo "ZONE ID                          NAME                         STATUS   PLAN"
+echo "-------------------------------- ---------------------------- -------- ----------"
+echo "$body" | jq -r '.result[] | [
+    .id,
+    .name,
+    .status,
+    .plan.name
+] | @tsv' | while IFS=$'\t' read -r id name status plan; do
+    printf "%-32s %-28s %-8s %s\n" "$id" "$name" "$status" "$plan"
+done
@@ -15,11 +15,10 @@ MANDATORY_FILES=(
     "$MOSAIC_HOME/TOOLS.md"
 )
 
-# E2E delivery guide (case-insensitive lookup)
+# E2E delivery guide (canonical uppercase path)
 E2E_DELIVERY=""
 for candidate in \
-    "$MOSAIC_HOME/guides/E2E-DELIVERY.md" \
-    "$MOSAIC_HOME/guides/e2e-delivery.md"; do
+    "$MOSAIC_HOME/guides/E2E-DELIVERY.md"; do
     if [[ -f "$candidate" ]]; then
         E2E_DELIVERY="$candidate"
         break
tools/excalidraw/.gitignore (new vendored file, 1 line)
@@ -0,0 +1 @@
+node_modules/
tools/excalidraw/launch.sh (new executable file, 5 lines)
@@ -0,0 +1,5 @@
+#!/usr/bin/env bash
+# Launcher for Excalidraw MCP stdio server.
+set -euo pipefail
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+exec node --loader "$SCRIPT_DIR/loader.mjs" "$SCRIPT_DIR/server.mjs"
tools/excalidraw/loader.mjs (new file, 76 lines)
@@ -0,0 +1,76 @@
+/**
+ * Custom ESM loader to fix missing .js extensions in @excalidraw/excalidraw deps.
+ *
+ * Problems patched:
+ *   1. excalidraw imports 'roughjs/bin/rough' (and other roughjs/* paths) without .js
+ *   2. roughjs/* files import sibling modules as './canvas' (relative, no .js)
+ *   3. JSON files need { type: 'json' } import attribute in Node.js v22+
+ *
+ * Usage: node --loader ./loader.mjs server.mjs [args...]
+ */
+
+import { fileURLToPath, pathToFileURL } from 'url';
+import { dirname, resolve as pathResolve } from 'path';
+
+const __dirname = dirname(fileURLToPath(import.meta.url));
+
+// Modules that have incompatible ESM format — redirect to local stubs
+const STUBS = {
+  '@excalidraw/laser-pointer': pathToFileURL(pathResolve(__dirname, 'stubs/laser-pointer.mjs')).href,
+};
+
+export async function resolve(specifier, context, nextResolve) {
+  // 0. Module stubs (incompatible ESM format packages)
+  if (STUBS[specifier]) {
+    return { url: STUBS[specifier], shortCircuit: true };
+  }
+
+  // 1. Bare roughjs/* specifiers without .js extension
+  if (/^roughjs\/bin\/[a-z-]+$/.test(specifier)) {
+    return nextResolve(`${specifier}.js`, context);
+  }
+
+  // 2. Relative imports without extension (e.g. './canvas' from roughjs/bin/rough.js)
+  //    These come in as relative paths that resolve to extensionless file URLs.
+  if (specifier.startsWith('./') || specifier.startsWith('../')) {
+    // Try resolving first; if it fails with a missing-extension error, add .js
+    try {
+      return await nextResolve(specifier, context);
+    } catch (err) {
+      if (err.code === 'ERR_MODULE_NOT_FOUND') {
+        // Try appending .js
+        try {
+          return await nextResolve(`${specifier}.js`, context);
+        } catch {
+          // Fall through to original error
+        }
+      }
+      throw err;
+    }
+  }
+
+  // 3. JSON imports need type: 'json' attribute
+  if (specifier.endsWith('.json')) {
+    const resolved = await nextResolve(specifier, context);
+    if (!resolved.importAttributes?.type) {
+      return {
+        ...resolved,
+        importAttributes: { ...resolved.importAttributes, type: 'json' },
+      };
+    }
+    return resolved;
+  }
+
+  return nextResolve(specifier, context);
+}
+
+export async function load(url, context, nextLoad) {
+  // Ensure JSON files are loaded with json format
+  if (url.endsWith('.json')) {
+    return nextLoad(url, {
+      ...context,
+      importAttributes: { ...context.importAttributes, type: 'json' },
+    });
+  }
+  return nextLoad(url, context);
+}
tools/excalidraw/package.json (new file, 11 lines)
@@ -0,0 +1,11 @@
+{
+  "name": "excalidraw-mcp",
+  "version": "1.0.0",
+  "type": "module",
+  "private": true,
+  "dependencies": {
+    "@modelcontextprotocol/sdk": "^1.12.0",
+    "@excalidraw/excalidraw": "^0.18.0",
+    "jsdom": "^25.0.1"
+  }
+}
tools/excalidraw/server.mjs (new file, 323 lines)
@@ -0,0 +1,323 @@
+#!/usr/bin/env node
+/**
+ * Excalidraw MCP stdio server
+ * Provides headless .excalidraw → SVG export via @excalidraw/excalidraw.
+ * Optional: diagram generation via EXCALIDRAW_GEN_PATH (excalidraw_gen.py).
+ */
+import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
+import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
+import { z } from "zod/v3";
+import { readFileSync, writeFileSync, existsSync } from 'fs';
+import { resolve } from 'path';
+import { spawnSync } from 'child_process';
+import { JSDOM } from 'jsdom';
+
+// ---------------------------------------------------------------------------
+// 1. DOM environment — must be established BEFORE importing excalidraw
+// ---------------------------------------------------------------------------
+
+const dom = new JSDOM('<!DOCTYPE html><html><body></body></html>', {
+  url: 'http://localhost/',
+  pretendToBeVisual: true,
+});
+
+const { window } = dom;
+
+// Helper: define a global, overriding read-only getters (e.g. navigator in Node v22)
+function defineGlobal(key, value) {
+  if (value === undefined) return;
+  try {
+    Object.defineProperty(global, key, {
+      value,
+      writable: true,
+      configurable: true,
+    });
+  } catch {
+    // Already defined and non-configurable — skip
+  }
+}
+
+// Core DOM globals
+defineGlobal('window', window);
+defineGlobal('document', window.document);
+defineGlobal('navigator', window.navigator);
+defineGlobal('location', window.location);
+defineGlobal('history', window.history);
+defineGlobal('screen', window.screen);
+
+// Element / event interfaces
+for (const key of [
+  'Node', 'Element', 'HTMLElement', 'SVGElement', 'SVGSVGElement',
+  'HTMLCanvasElement', 'HTMLImageElement', 'Image',
+  'Event', 'CustomEvent', 'MouseEvent', 'PointerEvent',
+  'KeyboardEvent', 'TouchEvent', 'WheelEvent', 'InputEvent',
+  'MutationObserver', 'ResizeObserver', 'IntersectionObserver',
+  'XMLHttpRequest', 'XMLSerializer',
+  'DOMParser', 'Range',
+  'getComputedStyle', 'matchMedia',
+]) {
+  defineGlobal(key, window[key]);
+}
+
+// Animation frame stubs (jsdom doesn't implement them)
+global.requestAnimationFrame = (fn) => setTimeout(() => fn(Date.now()), 0);
+global.cancelAnimationFrame = (id) => clearTimeout(id);
+
+// CSS Font Loading API stub — jsdom doesn't implement FontFace
+class FontFaceStub {
+  constructor(family, source, _descriptors) {
+    this.family = family;
+    this.source = source;
+    this.status = 'loaded';
+    this.loaded = Promise.resolve(this);
+  }
+  load() { return Promise.resolve(this); }
+}
+defineGlobal('FontFace', FontFaceStub);
+
+// FontFaceSet stub for document.fonts
+const fontFaceSet = {
+  add: () => {},
+  delete: () => {},
+  has: () => false,
+  clear: () => {},
+  load: () => Promise.resolve([]),
+  check: () => true,
+  ready: Promise.resolve(),
+  status: 'loaded',
+  forEach: () => {},
+  [Symbol.iterator]: function*() {},
+};
+Object.defineProperty(window.document, 'fonts', {
+  value: fontFaceSet,
+  writable: true,
+  configurable: true,
+});
+
+// Canvas stub — excalidraw's exportToSvg doesn't need real canvas rendering,
+// but the class must exist for instanceof checks.
+if (!global.HTMLCanvasElement) {
+  defineGlobal('HTMLCanvasElement', window.HTMLCanvasElement ?? class HTMLCanvasElement {});
+}
+
+// Device pixel ratio
+global.devicePixelRatio = 1;
+
+// ---------------------------------------------------------------------------
+// 1b. Stub canvas getContext — excalidraw calls this at module init time.
+//     jsdom throws "Not implemented" by default; we return a no-op 2D stub.
+// ---------------------------------------------------------------------------
+
+const _canvasCtx = {
+  canvas: { width: 800, height: 600 },
+  fillRect: () => {}, clearRect: () => {}, strokeRect: () => {},
+  getImageData: (x, y, w, h) => ({ data: new Uint8ClampedArray(w * h * 4), width: w, height: h }),
+  putImageData: () => {}, createImageData: () => ({ data: new Uint8ClampedArray(0) }),
+  setTransform: () => {}, resetTransform: () => {}, transform: () => {},
+  drawImage: () => {}, save: () => {}, restore: () => {},
+  scale: () => {}, rotate: () => {}, translate: () => {},
+  beginPath: () => {}, closePath: () => {}, moveTo: () => {}, lineTo: () => {},
+  bezierCurveTo: () => {}, quadraticCurveTo: () => {},
+  arc: () => {}, arcTo: () => {}, ellipse: () => {}, rect: () => {},
+  fill: () => {}, stroke: () => {}, clip: () => {},
+  fillText: () => {}, strokeText: () => {},
+  measureText: (t) => ({ width: t.length * 8, actualBoundingBoxAscent: 12, actualBoundingBoxDescent: 3, fontBoundingBoxAscent: 14, fontBoundingBoxDescent: 4 }),
+  createLinearGradient: () => ({ addColorStop: () => {} }),
+  createRadialGradient: () => ({ addColorStop: () => {} }),
+  createPattern: () => null,
+  setLineDash: () => {}, getLineDash: () => [],
+  isPointInPath: () => false, isPointInStroke: () => false,
+  getContextAttributes: () => ({ alpha: true, desynchronized: false }),
+  font: '10px sans-serif', fillStyle: '#000', strokeStyle: '#000',
+  lineWidth: 1, lineCap: 'butt', lineJoin: 'miter',
+  textAlign: 'start', textBaseline: 'alphabetic',
+  globalAlpha: 1, globalCompositeOperation: 'source-over',
+  shadowOffsetX: 0, shadowOffsetY: 0, shadowBlur: 0, shadowColor: 'transparent',
+  miterLimit: 10, lineDashOffset: 0, filter: 'none', imageSmoothingEnabled: true,
+};
+
+// Patch before excalidraw import so module-level canvas calls get the stub
+if (window.HTMLCanvasElement) {
+  window.HTMLCanvasElement.prototype.getContext = function (type) {
+    if (type === '2d') return _canvasCtx;
+    return null;
+  };
+}
+
+// ---------------------------------------------------------------------------
+// 2. Load excalidraw (dynamic import so globals are set first)
+// ---------------------------------------------------------------------------
+
+let exportToSvg;
+try {
+  const excalidraw = await import('@excalidraw/excalidraw');
+  exportToSvg = excalidraw.exportToSvg;
+  if (!exportToSvg) throw new Error('exportToSvg not found in package exports');
+} catch (err) {
+  process.stderr.write(`FATAL: Failed to load @excalidraw/excalidraw: ${err.message}\n`);
+  process.exit(1);
+}
+
+// ---------------------------------------------------------------------------
+// 3. SVG export helper
+// ---------------------------------------------------------------------------
+
+async function renderToSvg(elements, appState, files) {
+  const svgEl = await exportToSvg({
+    elements: elements ?? [],
+    appState: {
+      exportWithDarkMode: false,
+      exportBackground: true,
+      viewBackgroundColor: '#ffffff',
+      ...appState,
+    },
+    files: files ?? {},
+  });
+  const serializer = new window.XMLSerializer();
+  return serializer.serializeToString(svgEl);
+}
+
+// ---------------------------------------------------------------------------
+// 4. Gen subprocess helper (optional — requires EXCALIDRAW_GEN_PATH)
+// ---------------------------------------------------------------------------
+
+function requireGenPath() {
+  const p = process.env.EXCALIDRAW_GEN_PATH;
+  if (!p) {
+    return null;
+  }
+  return p;
+}
+
+function spawnGen(args) {
+  const genPath = requireGenPath();
+  if (!genPath) {
+    return {
+      ok: false,
+      text: 'EXCALIDRAW_GEN_PATH is not set. Set it to the path of excalidraw_gen.py to use diagram generation.',
+    };
+  }
+  const result = spawnSync('python3', [genPath, ...args], { encoding: 'utf8' });
+  if (result.error) return { ok: false, text: `spawn error: ${result.error.message}` };
+  if (result.status !== 0) return { ok: false, text: result.stderr || 'subprocess failed' };
+  return { ok: true, text: result.stdout.trim() };
+}
+
+// ---------------------------------------------------------------------------
+// 5. MCP Server
+// ---------------------------------------------------------------------------
+
+const server = new McpServer({
+  name: "excalidraw",
+  version: "1.0.0",
+});
+
+// --- Tool: excalidraw_to_svg ---
+
+server.tool(
+  "excalidraw_to_svg",
+  "Convert Excalidraw elements JSON to SVG string",
+  {
+    elements: z.string().describe("JSON string of Excalidraw elements array"),
+    app_state: z.string().optional().describe("JSON string of appState overrides"),
+  },
+  async ({ elements, app_state }) => {
+    let parsed;
+    try {
+      parsed = JSON.parse(elements);
+    } catch (err) {
+      throw new Error(`Invalid elements JSON: ${err.message}`);
+    }
+    const appState = app_state ? JSON.parse(app_state) : {};
+    const svg = await renderToSvg(parsed, appState, {});
+    return { content: [{ type: "text", text: svg }] };
+  }
+);
+
+// --- Tool: excalidraw_file_to_svg ---
+
+server.tool(
+  "excalidraw_file_to_svg",
+  "Convert an .excalidraw file to SVG (writes .svg alongside the input file)",
+  {
+    file_path: z.string().describe("Absolute or relative path to .excalidraw file"),
+  },
+  async ({ file_path }) => {
+    const absPath = resolve(file_path);
+    if (!existsSync(absPath)) {
+      throw new Error(`File not found: ${absPath}`);
+    }
+    const raw = JSON.parse(readFileSync(absPath, 'utf8'));
+    const svg = await renderToSvg(raw.elements, raw.appState, raw.files);
+    const outPath = absPath.replace(/\.excalidraw$/, '.svg');
+    writeFileSync(outPath, svg, 'utf8');
+    return {
|
||||||
|
content: [{ type: "text", text: `SVG written to: ${outPath}\n\n${svg}` }],
|
||||||
|
};
|
||||||
|
}
|
||||||
|
);
|
||||||
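The `excalidraw_file_to_svg` tool derives its output path by swapping the `.excalidraw` suffix for `.svg`. A minimal shell sketch of the same transformation (the sample path is a placeholder):

```shell
# Mirror of absPath.replace(/\.excalidraw$/, '.svg') from the tool above,
# done with bash suffix stripping; the input path is illustrative only.
in_path="/tmp/diagrams/arch.excalidraw"
out_path="${in_path%.excalidraw}.svg"
echo "$out_path"   # /tmp/diagrams/arch.svg
```

Because `${var%pattern}` only strips a trailing match, a path without the `.excalidraw` suffix passes through with `.svg` appended rather than being mangled mid-string.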

// --- Tool: list_diagrams ---

server.tool(
  "list_diagrams",
  "List available diagram templates from the DIAGRAMS registry (requires EXCALIDRAW_GEN_PATH)",
  {},
  async () => {
    const res = spawnGen(['--list']);
    return { content: [{ type: "text", text: res.text }] };
  }
);

// --- Tool: generate_diagram ---

server.tool(
  "generate_diagram",
  "Generate an .excalidraw file from a named diagram template (requires EXCALIDRAW_GEN_PATH)",
  {
    name: z.string().describe("Diagram template name (from list_diagrams)"),
    output_path: z.string().optional().describe("Output path for the .excalidraw file"),
  },
  async ({ name, output_path }) => {
    const args = [name];
    if (output_path) args.push('--output', output_path);
    const res = spawnGen(args);
    if (!res.ok) throw new Error(res.text);
    return { content: [{ type: "text", text: res.text }] };
  }
);

// --- Tool: generate_and_export ---

server.tool(
  "generate_and_export",
  "Generate an .excalidraw file and immediately export it to SVG (requires EXCALIDRAW_GEN_PATH)",
  {
    name: z.string().describe("Diagram template name (from list_diagrams)"),
    output_path: z.string().optional().describe("Output path for the .excalidraw file (SVG written alongside)"),
  },
  async ({ name, output_path }) => {
    const genArgs = [name];
    if (output_path) genArgs.push('--output', output_path);
    const genRes = spawnGen(genArgs);
    if (!genRes.ok) throw new Error(genRes.text);

    const excalidrawPath = genRes.text;
    if (!existsSync(excalidrawPath)) {
      throw new Error(`Generated file not found: ${excalidrawPath}`);
    }

    const raw = JSON.parse(readFileSync(excalidrawPath, 'utf8'));
    const svg = await renderToSvg(raw.elements, raw.appState, raw.files);
    const svgPath = excalidrawPath.replace(/\.excalidraw$/, '.svg');
    writeFileSync(svgPath, svg, 'utf8');

    return {
      content: [{ type: "text", text: `Generated: ${excalidrawPath}\nExported SVG: ${svgPath}` }],
    };
  }
);

// --- Start ---
const transport = new StdioServerTransport();
await server.connect(transport);
tools/excalidraw/stubs/laser-pointer.mjs (new file, 7 lines)
@@ -0,0 +1,7 @@
/**
 * Stub for @excalidraw/laser-pointer
 * The real package uses a Parcel bundle format that Node.js ESM can't consume.
 * For headless SVG export, the laser pointer feature is not needed.
 */
export class LaserPointer {}
export default { LaserPointer };
@@ -31,41 +31,7 @@ Examples:
 EOF
 }
 
-get_remote_host() {
-    local remote_url
-    remote_url=$(git remote get-url origin 2>/dev/null || true)
-    if [[ -z "$remote_url" ]]; then
-        return 1
-    fi
-    if [[ "$remote_url" =~ ^https?://([^/]+)/ ]]; then
-        echo "${BASH_REMATCH[1]}"
-        return 0
-    fi
-    if [[ "$remote_url" =~ ^git@([^:]+): ]]; then
-        echo "${BASH_REMATCH[1]}"
-        return 0
-    fi
-    return 1
-}
-
-get_gitea_token() {
-    local host="$1"
-    if [[ -n "${GITEA_TOKEN:-}" ]]; then
-        echo "$GITEA_TOKEN"
-        return 0
-    fi
-
-    local creds="$HOME/.git-credentials"
-    if [[ -f "$creds" ]]; then
-        local token
-        token=$(grep -F "$host" "$creds" 2>/dev/null | sed -n 's#https\?://[^@]*:\([^@/]*\)@.*#\1#p' | head -n 1)
-        if [[ -n "$token" ]]; then
-            echo "$token"
-            return 0
-        fi
-    fi
-    return 1
-}
+# get_remote_host and get_gitea_token are provided by detect-platform.sh
 
 get_state_from_status_json() {
     python3 - <<'PY'
@@ -74,6 +74,75 @@ get_repo_name() {
     echo "${repo_info##*/}"
 }
 
+get_remote_host() {
+    local remote_url
+    remote_url=$(git remote get-url origin 2>/dev/null || true)
+    if [[ -z "$remote_url" ]]; then
+        return 1
+    fi
+    if [[ "$remote_url" =~ ^https?://([^/]+)/ ]]; then
+        echo "${BASH_REMATCH[1]}"
+        return 0
+    fi
+    if [[ "$remote_url" =~ ^git@([^:]+): ]]; then
+        echo "${BASH_REMATCH[1]}"
+        return 0
+    fi
+    return 1
+}
+
+# Resolve a Gitea API token for the given host.
+# Priority: Mosaic credential loader → GITEA_TOKEN env → ~/.git-credentials
+get_gitea_token() {
+    local host="$1"
+    local script_dir
+    script_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+    local cred_loader="$script_dir/../_lib/credentials.sh"
+
+    # 1. Mosaic credential loader (host → service mapping, run in subshell to avoid polluting env)
+    if [[ -f "$cred_loader" ]]; then
+        local token
+        token=$(
+            source "$cred_loader"
+            case "$host" in
+                git.mosaicstack.dev) load_credentials gitea-mosaicstack 2>/dev/null ;;
+                git.uscllc.com) load_credentials gitea-usc 2>/dev/null ;;
+                *)
+                    for svc in gitea-mosaicstack gitea-usc; do
+                        load_credentials "$svc" 2>/dev/null || continue
+                        [[ "${GITEA_URL:-}" == *"$host"* ]] && break
+                        unset GITEA_TOKEN GITEA_URL
+                    done
+                    ;;
+            esac
+            echo "${GITEA_TOKEN:-}"
+        )
+        if [[ -n "$token" ]]; then
+            echo "$token"
+            return 0
+        fi
+    fi
+
+    # 2. GITEA_TOKEN env var (may be set by caller)
+    if [[ -n "${GITEA_TOKEN:-}" ]]; then
+        echo "$GITEA_TOKEN"
+        return 0
+    fi
+
+    # 3. ~/.git-credentials file
+    local creds="$HOME/.git-credentials"
+    if [[ -f "$creds" ]]; then
+        local token
+        token=$(grep -F "$host" "$creds" 2>/dev/null | sed -n 's#https\?://[^@]*:\([^@/]*\)@.*#\1#p' | head -n 1)
+        if [[ -n "$token" ]]; then
+            echo "$token"
+            return 0
+        fi
+    fi
+
+    return 1
+}
+
 # If script is run directly (not sourced), output the platform
 if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
     detect_platform
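The two host-extraction patterns in `get_remote_host` can be exercised against literal URLs without a git checkout. This standalone sketch reuses the same regexes on fixed strings (the owner/repo parts of the sample URLs are made up):

```shell
# Same regexes as get_remote_host(), applied to fixed strings instead of
# `git remote get-url origin` output.
extract_host() {
    local url="$1"
    if [[ "$url" =~ ^https?://([^/]+)/ ]]; then
        echo "${BASH_REMATCH[1]}"
    elif [[ "$url" =~ ^git@([^:]+): ]]; then
        echo "${BASH_REMATCH[1]}"
    else
        return 1
    fi
}

extract_host "https://git.mosaicstack.dev/owner/repo.git"   # git.mosaicstack.dev
extract_host "git@git.uscllc.com:owner/repo.git"            # git.uscllc.com
```

The first pattern requires a `/` after the authority, so a bare `https://host` URL with no path falls through to the SSH pattern and ultimately returns 1, matching the original function's behavior.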
@@ -13,40 +13,7 @@ BODY=""
 LABELS=""
 MILESTONE=""
 
-get_remote_host() {
-    local remote_url
-    remote_url=$(git remote get-url origin 2>/dev/null || true)
-    if [[ -z "$remote_url" ]]; then
-        return 1
-    fi
-    if [[ "$remote_url" =~ ^https?://([^/]+)/ ]]; then
-        echo "${BASH_REMATCH[1]}"
-        return 0
-    fi
-    if [[ "$remote_url" =~ ^git@([^:]+): ]]; then
-        echo "${BASH_REMATCH[1]}"
-        return 0
-    fi
-    return 1
-}
-
-get_gitea_token() {
-    local host="$1"
-    if [[ -n "${GITEA_TOKEN:-}" ]]; then
-        echo "$GITEA_TOKEN"
-        return 0
-    fi
-    local creds="$HOME/.git-credentials"
-    if [[ -f "$creds" ]]; then
-        local token
-        token=$(grep -F "$host" "$creds" 2>/dev/null | sed -n 's#https\?://[^@]*:\([^@/]*\)@.*#\1#p' | head -n 1)
-        if [[ -n "$token" ]]; then
-            echo "$token"
-            return 0
-        fi
-    fi
-    return 1
-}
+# get_remote_host and get_gitea_token are provided by detect-platform.sh
 
 gitea_issue_create_api() {
     local host repo token url payload
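The `~/.git-credentials` fallback shared by these scripts parses `https://user:token@host` lines with `grep` plus `sed`. A self-contained sketch of that extraction against a throwaway file (the user and token values are made up; the `sed` invocation is copied from `get_gitea_token`):

```shell
# Reproduce the credential-file parsing from get_gitea_token() against a
# temp file instead of the real ~/.git-credentials.
creds="$(mktemp)"
printf 'https://ci-bot:tok123@git.mosaicstack.dev\n' > "$creds"
host="git.mosaicstack.dev"
token=$(grep -F "$host" "$creds" 2>/dev/null | sed -n 's#https\?://[^@]*:\([^@/]*\)@.*#\1#p' | head -n 1)
echo "$token"   # tok123
rm -f "$creds"
```

`grep -F` narrows to lines mentioning the host, the capture group grabs everything between the last `:` before `@` and the `@` itself, and `head -n 1` keeps only the first match when several entries share a host. Note `\?` in the `sed` pattern is a GNU extension.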
@@ -10,40 +10,7 @@ source "$SCRIPT_DIR/detect-platform.sh"
 # Parse arguments
 ISSUE_NUMBER=""
 
-get_remote_host() {
-    local remote_url
-    remote_url=$(git remote get-url origin 2>/dev/null || true)
-    if [[ -z "$remote_url" ]]; then
-        return 1
-    fi
-    if [[ "$remote_url" =~ ^https?://([^/]+)/ ]]; then
-        echo "${BASH_REMATCH[1]}"
-        return 0
-    fi
-    if [[ "$remote_url" =~ ^git@([^:]+): ]]; then
-        echo "${BASH_REMATCH[1]}"
-        return 0
-    fi
-    return 1
-}
-
-get_gitea_token() {
-    local host="$1"
-    if [[ -n "${GITEA_TOKEN:-}" ]]; then
-        echo "$GITEA_TOKEN"
-        return 0
-    fi
-    local creds="$HOME/.git-credentials"
-    if [[ -f "$creds" ]]; then
-        local token
-        token=$(grep -F "$host" "$creds" 2>/dev/null | sed -n 's#https\?://[^@]*:\([^@/]*\)@.*#\1#p' | head -n 1)
-        if [[ -n "$token" ]]; then
-            echo "$token"
-            return 0
-        fi
-    fi
-    return 1
-}
+# get_remote_host and get_gitea_token are provided by detect-platform.sh
 
 gitea_issue_view_api() {
     local host repo token url
@@ -27,41 +27,7 @@ Examples:
 EOF
 }
 
-get_remote_host() {
-    local remote_url
-    remote_url=$(git remote get-url origin 2>/dev/null || true)
-    if [[ -z "$remote_url" ]]; then
-        return 1
-    fi
-    if [[ "$remote_url" =~ ^https?://([^/]+)/ ]]; then
-        echo "${BASH_REMATCH[1]}"
-        return 0
-    fi
-    if [[ "$remote_url" =~ ^git@([^:]+): ]]; then
-        echo "${BASH_REMATCH[1]}"
-        return 0
-    fi
-    return 1
-}
-
-get_gitea_token() {
-    local host="$1"
-    if [[ -n "${GITEA_TOKEN:-}" ]]; then
-        echo "$GITEA_TOKEN"
-        return 0
-    fi
-
-    local creds="$HOME/.git-credentials"
-    if [[ -f "$creds" ]]; then
-        local token
-        token=$(grep -F "$host" "$creds" 2>/dev/null | sed -n 's#https\?://[^@]*:\([^@/]*\)@.*#\1#p' | head -n 1)
-        if [[ -n "$token" ]]; then
-            echo "$token"
-            return 0
-        fi
-    fi
-    return 1
-}
+# get_remote_host and get_gitea_token are provided by detect-platform.sh
 
 extract_state_from_status_json() {
     python3 - <<'PY'
@@ -68,11 +68,10 @@ elif [[ "$PLATFORM" == "gitea" ]]; then
 
     DIFF_URL="https://${HOST}/api/v1/repos/${OWNER}/${REPO}/pulls/${PR_NUMBER}.diff"
 
-    # Use tea's auth token if available
-    TEA_TOKEN=$(tea login list 2>/dev/null | grep "$HOST" | awk '{print $NF}' || true)
-
-    if [[ -n "$TEA_TOKEN" ]]; then
-        DIFF_CONTENT=$(curl -sS -H "Authorization: token $TEA_TOKEN" "$DIFF_URL")
+    GITEA_API_TOKEN=$(get_gitea_token "$HOST" || true)
+
+    if [[ -n "$GITEA_API_TOKEN" ]]; then
+        DIFF_CONTENT=$(curl -sS -H "Authorization: token $GITEA_API_TOKEN" "$DIFF_URL")
     else
         DIFF_CONTENT=$(curl -sS "$DIFF_URL")
     fi
@@ -69,11 +69,10 @@ elif [[ "$PLATFORM" == "gitea" ]]; then
 
     API_URL="https://${HOST}/api/v1/repos/${OWNER}/${REPO}/pulls/${PR_NUMBER}"
 
-    # Use tea's auth token if available
-    TEA_TOKEN=$(tea login list 2>/dev/null | grep "$HOST" | awk '{print $NF}' || true)
-
-    if [[ -n "$TEA_TOKEN" ]]; then
-        RAW=$(curl -sS -H "Authorization: token $TEA_TOKEN" "$API_URL")
+    GITEA_API_TOKEN=$(get_gitea_token "$HOST" || true)
+
+    if [[ -n "$GITEA_API_TOKEN" ]]; then
+        RAW=$(curl -sS -H "Authorization: token $GITEA_API_TOKEN" "$API_URL")
     else
         RAW=$(curl -sS "$API_URL")
     fi
@@ -11,6 +11,7 @@ MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 ORCH_SUBDIR=".mosaic/orchestrator"
 MISSION_FILE="mission.json"
 SESSION_LOCK_FILE="session.lock"
+NEXT_TASK_FILE="next-task.json"
 MANIFEST_FILE="docs/MISSION-MANIFEST.md"
 TASKS_MD="docs/TASKS.md"
 SCRATCHPAD_DIR="docs/scratchpads"
@@ -42,6 +43,30 @@ _require_jq() {
     fi
 }
 
+coord_runtime() {
+    local runtime="${MOSAIC_COORD_RUNTIME:-claude}"
+    case "$runtime" in
+        claude|codex) echo "$runtime" ;;
+        *) echo "claude" ;;
+    esac
+}
+
+coord_launch_command() {
+    local runtime
+    runtime="$(coord_runtime)"
+    echo "mosaic $runtime"
+}
+
+coord_run_command() {
+    local runtime
+    runtime="$(coord_runtime)"
+    if [[ "$runtime" == "claude" ]]; then
+        echo "mosaic coord run"
+    else
+        echo "mosaic coord run --$runtime"
+    fi
+}
+
 # ─── Project / state file access ────────────────────────────────────────────
 
 # Return the orchestrator directory for a project
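`coord_runtime` only ever yields `claude` or `codex`; any other value in `MOSAIC_COORD_RUNTIME` silently falls back to `claude`. A standalone copy of the helper (taken verbatim from the hunk above, runnable without sourcing `_lib.sh`) shows the fallback:

```shell
# Verbatim copy of coord_runtime() so the fallback can be observed in
# isolation; the "gemini" value below is just an example of an
# unsupported runtime name.
coord_runtime() {
    local runtime="${MOSAIC_COORD_RUNTIME:-claude}"
    case "$runtime" in
        claude|codex) echo "$runtime" ;;
        *) echo "claude" ;;
    esac
}

MOSAIC_COORD_RUNTIME=codex
r1="$(coord_runtime)"    # codex
MOSAIC_COORD_RUNTIME=gemini
r2="$(coord_runtime)"    # claude (unsupported value falls back)
```

This keeps every downstream consumer (`coord_launch_command`, `coord_run_command`, session-run.sh) working with a defined runtime even when the environment variable is misconfigured.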
@@ -56,6 +81,11 @@ mission_path() {
     echo "$(orch_dir "$project")/$MISSION_FILE"
 }
 
+next_task_capsule_path() {
+    local project="${1:-.}"
+    echo "$(orch_dir "$project")/$NEXT_TASK_FILE"
+}
+
 # Exit with error if mission.json is missing or inactive
 require_mission() {
     local project="${1:-.}"
@@ -358,6 +388,113 @@ milestone_name() {
     jq -r --arg id "$mid" '.milestones[] | select(.id == $id) | .name // empty' "$mp"
 }
 
+# ─── Next-task capsule helpers ───────────────────────────────────────────────
+
+write_next_task_capsule() {
+    local project="${1:-.}"
+    local runtime="${2:-claude}"
+    local mission_id="${3:-}"
+    local mission_name="${4:-}"
+    local project_path="${5:-}"
+    local quality_gates="${6:-}"
+    local current_ms_id="${7:-}"
+    local current_ms_name="${8:-}"
+    local next_task="${9:-}"
+    local tasks_done="${10:-0}"
+    local tasks_total="${11:-0}"
+    local pct="${12:-0}"
+    local current_branch="${13:-}"
+
+    _require_jq || return 1
+    mkdir -p "$(orch_dir "$project")"
+
+    local payload
+    payload="$(jq -n \
+        --arg generated_at "$(iso_now)" \
+        --arg runtime "$runtime" \
+        --arg mission_id "$mission_id" \
+        --arg mission_name "$mission_name" \
+        --arg project_path "$project_path" \
+        --arg quality_gates "$quality_gates" \
+        --arg current_ms_id "$current_ms_id" \
+        --arg current_ms_name "$current_ms_name" \
+        --arg next_task "$next_task" \
+        --arg current_branch "$current_branch" \
+        --arg tasks_done "$tasks_done" \
+        --arg tasks_total "$tasks_total" \
+        --arg pct "$pct" \
+        '{
+            generated_at: $generated_at,
+            runtime: $runtime,
+            mission_id: $mission_id,
+            mission_name: $mission_name,
+            project_path: $project_path,
+            quality_gates: $quality_gates,
+            current_milestone: {
+                id: $current_ms_id,
+                name: $current_ms_name
+            },
+            next_task: $next_task,
+            progress: {
+                tasks_done: ($tasks_done | tonumber),
+                tasks_total: ($tasks_total | tonumber),
+                pct: ($pct | tonumber)
+            },
+            current_branch: $current_branch
+        }')"
+
+    write_json "$(next_task_capsule_path "$project")" "$payload"
+}
+
+build_codex_strict_kickoff() {
+    local project="${1:-.}"
+    local continuation_prompt="${2:-}"
+
+    _require_jq || return 1
+
+    local capsule_path
+    capsule_path="$(next_task_capsule_path "$project")"
+    local capsule='{}'
+    if [[ -f "$capsule_path" ]]; then
+        capsule="$(cat "$capsule_path")"
+    fi
+
+    local mission_id next_task project_path quality_gates
+    mission_id="$(echo "$capsule" | jq -r '.mission_id // "unknown"')"
+    next_task="$(echo "$capsule" | jq -r '.next_task // "none"')"
+    project_path="$(echo "$capsule" | jq -r '.project_path // "."')"
+    quality_gates="$(echo "$capsule" | jq -r '.quality_gates // "none"')"
+
+    cat <<EOF
+Now initiating Orchestrator mode...
+
+STRICT EXECUTION PROFILE FOR CODEX (HARD GATE)
+- Do NOT ask clarifying questions before your first tool actions unless a Mosaic escalation trigger is hit.
+- Your first actions must be reading mission state files in order.
+- Treat the next-task capsule as authoritative execution input.
+
+REQUIRED FIRST ACTIONS (IN ORDER)
+1. Read ~/.config/mosaic/guides/ORCHESTRATOR-PROTOCOL.md
+2. Read docs/MISSION-MANIFEST.md
+3. Read docs/scratchpads/${mission_id}.md
+4. Read docs/TASKS.md
+5. Begin execution on next task: ${next_task}
+
+WORKING CONTEXT
+- Project: ${project_path}
+- Quality gates: ${quality_gates}
+- Capsule file: .mosaic/orchestrator/next-task.json
+
+Task capsule (JSON):
+\`\`\`json
+${capsule}
+\`\`\`
+
+Continuation prompt:
+${continuation_prompt}
+EOF
+}
+
 # Get next milestone after the given one
 next_milestone_id() {
     local project="${1:-.}"
@@ -30,6 +30,8 @@ done
 
 _require_jq
 require_mission "$PROJECT"
+target_runtime="$(coord_runtime)"
+launch_cmd="$(coord_launch_command)"
 
 # ─── Load mission data ──────────────────────────────────────────────────────
 
@@ -93,6 +95,22 @@ if (( session_count > 0 )); then
     fi
 fi
 
+# Write machine-readable next-task capsule for deterministic runtime launches.
+write_next_task_capsule \
+    "$PROJECT" \
+    "$target_runtime" \
+    "$mission_id" \
+    "$mission_name" \
+    "$project_path" \
+    "$quality_gates" \
+    "$current_ms_id" \
+    "$current_ms_name" \
+    "$next_task" \
+    "$tasks_done" \
+    "$tasks_total" \
+    "$pct" \
+    "$current_branch"
+
 # ─── Generate prompt ────────────────────────────────────────────────────────
 
 prompt="$(cat <<EOF
@@ -108,6 +126,7 @@ Continue **$mission_name** from existing state.
 - **Scratchpad:** docs/scratchpads/${mission_id}.md
 - **Protocol:** ~/.config/mosaic/guides/ORCHESTRATOR.md
 - **Quality gates:** $quality_gates
+- **Target runtime:** $target_runtime
 
 ## Resume Point
 
@@ -129,9 +148,10 @@ Continue **$mission_name** from existing state.
 3. Read \`docs/scratchpads/${mission_id}.md\` for session history and decisions
 4. Read \`docs/TASKS.md\` for current task state
 5. \`git pull --rebase\` to sync latest changes
-6. Continue execution from task **${next_task:-next-pending}**
-7. Follow Two-Phase Completion Protocol
-8. You are the SOLE writer of \`docs/TASKS.md\`
+6. Launch runtime with \`$launch_cmd\`
+7. Continue execution from task **${next_task:-next-pending}**
+8. Follow Two-Phase Completion Protocol
+9. You are the SOLE writer of \`docs/TASKS.md\`
 EOF
 )"
 
@@ -270,6 +270,9 @@ fi
 
 # ─── Report ──────────────────────────────────────────────────────────────────
 
+runtime_cmd="$(coord_launch_command)"
+run_cmd="$(coord_run_command)"
+
 echo ""
 echo -e "${C_GREEN}${C_BOLD}Mission initialized: $NAME${C_RESET}"
 echo ""
@@ -280,4 +283,4 @@ echo -e "  ${C_CYAN}Manifest:${C_RESET} $manifest_path"
 echo -e "  ${C_CYAN}Scratchpad:${C_RESET} $sp_file"
 echo -e "  ${C_CYAN}Tasks:${C_RESET} $tasks_path"
 echo ""
-echo "Next: Launch an agent session with 'mosaic claude' or generate a prompt with 'mosaic coord continue'"
+echo "Next: Resume with '$run_cmd' (or launch directly with '$runtime_cmd')."
tools/orchestrator/session-run.sh (new executable file, 80 lines)
@@ -0,0 +1,80 @@
#!/usr/bin/env bash
set -euo pipefail
#
# session-run.sh — Generate continuation context and launch target runtime.
#
# Usage:
#   session-run.sh [--project <path>] [--milestone <id>] [--print]
#
# Behavior:
#   - Builds continuation prompt + next-task capsule.
#   - Launches selected runtime (default: claude, override via MOSAIC_COORD_RUNTIME).
#   - For codex, injects strict orchestration kickoff to reduce clarification loops.

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "$SCRIPT_DIR/_lib.sh"

PROJECT="."
MILESTONE=""
PRINT=false

while [[ $# -gt 0 ]]; do
    case "$1" in
        --project) PROJECT="$2"; shift 2 ;;
        --milestone) MILESTONE="$2"; shift 2 ;;
        --print) PRINT=true; shift ;;
        -h|--help)
            cat <<'USAGE'
Usage: session-run.sh [--project <path>] [--milestone <id>] [--print]

Options:
  --project <path>    Project directory (default: CWD)
  --milestone <id>    Force specific milestone context
  --print             Print launch prompt only (no runtime launch)
USAGE
            exit 0
            ;;
        *) echo "Unknown option: $1" >&2; exit 1 ;;
    esac
done

PROJECT="${PROJECT/#\~/$HOME}"
PROJECT="$(cd "$PROJECT" && pwd)"

_require_jq
require_mission "$PROJECT"

runtime="$(coord_runtime)"
launch_cmd="$(coord_launch_command)"

continue_cmd=(bash "$SCRIPT_DIR/continue-prompt.sh" --project "$PROJECT")
if [[ -n "$MILESTONE" ]]; then
    continue_cmd+=(--milestone "$MILESTONE")
fi

continuation_prompt="$(MOSAIC_COORD_RUNTIME="$runtime" "${continue_cmd[@]}")"

if [[ "$runtime" == "codex" ]]; then
    launch_prompt="$(build_codex_strict_kickoff "$PROJECT" "$continuation_prompt")"
else
    launch_prompt="$continuation_prompt"
fi

if [[ "$PRINT" == true ]]; then
    echo "$launch_prompt"
    exit 0
fi

echo -e "${C_CYAN}Launching orchestration runtime: ${launch_cmd}${C_RESET}"
echo -e "${C_CYAN}Project:${C_RESET} $PROJECT"
echo -e "${C_CYAN}Capsule:${C_RESET} $(next_task_capsule_path "$PROJECT")"

cd "$PROJECT"
if [[ "$runtime" == "claude" ]]; then
    exec "$MOSAIC_HOME/bin/mosaic" claude "$launch_prompt"
elif [[ "$runtime" == "codex" ]]; then
    exec "$MOSAIC_HOME/bin/mosaic" codex "$launch_prompt"
fi

echo -e "${C_RED}Unsupported coord runtime: $runtime${C_RESET}" >&2
exit 1
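session-run.sh's `PROJECT="${PROJECT/#\~/$HOME}"` line covers the case where a caller passes a quoted `~/...` argument, which the shell delivers literally instead of tilde-expanding. A minimal sketch of that rewrite (the sample path is illustrative):

```shell
# A quoted tilde reaches the script as a literal "~", so it is rewritten
# to $HOME with a leading-prefix pattern substitution, exactly as in
# session-run.sh.
PROJECT="~/work/demo"
PROJECT="${PROJECT/#\~/$HOME}"
echo "$PROJECT"   # e.g. /home/user/work/demo
```

The `/#` anchor restricts the substitution to a tilde at the start of the string, so paths that merely contain a `~` elsewhere are left untouched.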
@@ -33,15 +33,99 @@ while [[ $# -gt 0 ]]; do
|
|||||||
done
|
done
|
||||||
|
|
||||||
_require_jq
|
_require_jq
|
||||||
|
runtime_cmd="$(coord_launch_command)"
|
||||||
|
run_cmd="$(coord_run_command)"
|
||||||
|
|
||||||
# ─── Check session lock ─────────────────────────────────────────────────────
|
# ─── Check session lock ─────────────────────────────────────────────────────
|
||||||
|
|
||||||
lock_data=""
|
lock_data=""
|
||||||
if ! lock_data="$(session_lock_read "$PROJECT")"; then
|
if ! lock_data="$(session_lock_read "$PROJECT")"; then
|
||||||
if [[ "$FORMAT" == "json" ]]; then
|
# No active session — but check if a mission exists
|
||||||
echo '{"status":"no-session"}'
|
mp="$(mission_path "$PROJECT")"
|
||||||
|
if [[ -f "$mp" ]]; then
|
||||||
|
m_status="$(jq -r '.status // "inactive"' "$mp")"
|
||||||
|
m_name="$(jq -r '.name // "unnamed"' "$mp")"
|
||||||
|
m_id="$(jq -r '.mission_id // ""' "$mp")"
|
||||||
|
m_total="$(jq '.milestones | length' "$mp")"
|
||||||
|
m_done="$(jq '[.milestones[] | select(.status == "completed")] | length' "$mp")"
|
||||||
|
    m_current="$(jq -r '[.milestones[] | select(.status == "active" or .status == "pending")][0].name // "none"' "$mp")"

    # Task counts if TASKS.md exists
    task_json="$(count_tasks_md "$PROJECT")"
    t_total="$(echo "$task_json" | jq '.total')"
    t_done="$(echo "$task_json" | jq '.done')"
    t_pending="$(echo "$task_json" | jq '.pending')"
    t_inprog="$(echo "$task_json" | jq '.in_progress')"

    if [[ "$FORMAT" == "json" ]]; then
      jq -n \
        --arg status "no-session" \
        --arg mission_status "$m_status" \
        --arg mission_name "$m_name" \
        --arg mission_id "$m_id" \
        --argjson milestones_total "$m_total" \
        --argjson milestones_done "$m_done" \
        --argjson tasks_total "$t_total" \
        --argjson tasks_done "$t_done" \
        '{
          status: $status,
          mission: {
            status: $mission_status,
            name: $mission_name,
            id: $mission_id,
            milestones_total: $milestones_total,
            milestones_done: $milestones_done,
            tasks_total: $tasks_total,
            tasks_done: $tasks_done
          }
        }'
    else
      echo ""
      echo -e " ${C_DIM}No active agent session.${C_RESET}"
      echo ""

      # Mission info
      case "$m_status" in
        active)    ms_color="${C_GREEN}ACTIVE${C_RESET}" ;;
        paused)    ms_color="${C_YELLOW}PAUSED${C_RESET}" ;;
        completed) ms_color="${C_CYAN}COMPLETED${C_RESET}" ;;
        *)         ms_color="${C_DIM}${m_status}${C_RESET}" ;;
      esac

      echo -e " ${C_BOLD}Mission:${C_RESET} $m_name"
      echo -e " ${C_CYAN}Status:${C_RESET} $ms_color"
      echo -e " ${C_CYAN}ID:${C_RESET} $m_id"
      echo -e " ${C_CYAN}Milestones:${C_RESET} $m_done / $m_total completed"
      [[ "$m_current" != "none" ]] && echo -e " ${C_CYAN}Current:${C_RESET} $m_current"

      if (( t_total > 0 )); then
        echo -e " ${C_CYAN}Tasks:${C_RESET} $t_done / $t_total done ($t_pending pending, $t_inprog in-progress)"
      fi
      echo ""

      if [[ "$m_status" == "active" || "$m_status" == "paused" ]]; then
        echo -e " ${C_BOLD}Next steps:${C_RESET}"
        echo " $run_cmd Auto-generate context and launch"
        echo " mosaic coord continue Generate continuation prompt"
        echo " $runtime_cmd Launch agent session"
      elif [[ "$m_status" == "completed" ]]; then
        echo -e " ${C_DIM}Mission completed. Start a new one with: mosaic coord init${C_RESET}"
      else
        echo -e " ${C_DIM}Initialize with: mosaic coord init --name \"Mission Name\"${C_RESET}"
      fi
      echo ""
    fi
  else
    if [[ "$FORMAT" == "json" ]]; then
      echo '{"status":"no-session","mission":null}'
    else
      echo ""
      echo -e " ${C_DIM}No active session.${C_RESET}"
      echo -e " ${C_DIM}No mission found.${C_RESET}"
      echo ""
      echo " Initialize with: mosaic coord init --name \"Mission Name\""
      echo ""
    fi
  fi
  exit 4
fi
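The tail above exits with code 4 when no agent session is active. A minimal caller sketch branching on that code (`check_session` is a hypothetical stand-in for the real status command, not part of the repo):

```shell
# Hypothetical stand-in: prints the no-session JSON shown above and exits 4.
check_session() { echo '{"status":"no-session","mission":null}'; return 4; }

if out="$(check_session)"; then
  echo "session active"
else
  rc=$?
  if [ "$rc" -eq 4 ]; then
    echo "no session: $out"
  else
    echo "unexpected exit code: $rc"
  fi
fi
```

Because the JSON is still printed before the non-zero exit, a wrapper can both branch on the code and reuse the payload.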
78  tools/orchestrator/smoke-test.sh  (Executable file)
@@ -0,0 +1,78 @@
#!/usr/bin/env bash
set -euo pipefail
#
# smoke-test.sh — Behavior smoke checks for coord continue/run workflows.
#
# Usage:
#   smoke-test.sh

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "$SCRIPT_DIR/_lib.sh"

PASS=0
FAIL=0

pass_case() {
  echo "PASS: $1"
  PASS=$((PASS + 1))
}

fail_case() {
  echo "FAIL: $1" >&2
  FAIL=$((FAIL + 1))
}

tmp_project="$(mktemp -d)"
trap 'rm -rf "$tmp_project"' EXIT

mkdir -p "$tmp_project/.mosaic/orchestrator" "$tmp_project/docs/scratchpads"

cat > "$tmp_project/.mosaic/orchestrator/mission.json" <<'JSON'
{
  "mission_id": "smoke-mission-20260223",
  "name": "Smoke Mission",
  "status": "active",
  "project_path": "SMOKE_PROJECT",
  "quality_gates": "pnpm lint && pnpm test",
  "milestones": [
    { "id": "M1", "name": "Milestone One", "status": "pending" }
  ],
  "sessions": []
}
JSON

cat > "$tmp_project/docs/MISSION-MANIFEST.md" <<'MD'
# Mission Manifest
MD

cat > "$tmp_project/docs/scratchpads/smoke-mission-20260223.md" <<'MD'
# Scratchpad
MD

cat > "$tmp_project/docs/TASKS.md" <<'MD'
| id | status | milestone | description | pr | notes |
|----|--------|-----------|-------------|----|-------|
| T-001 | pending | M1 | Smoke task | | |
MD

codex_continue_output="$(MOSAIC_COORD_RUNTIME=codex bash "$SCRIPT_DIR/continue-prompt.sh" --project "$tmp_project")"
capsule_file="$tmp_project/.mosaic/orchestrator/next-task.json"

if [[ -f "$capsule_file" ]]; then pass_case "continue writes next-task capsule"; else fail_case "continue writes next-task capsule"; fi
if jq -e '.runtime == "codex"' "$capsule_file" >/dev/null 2>&1; then pass_case "capsule runtime is codex"; else fail_case "capsule runtime is codex"; fi
if jq -e '.next_task == "T-001"' "$capsule_file" >/dev/null 2>&1; then pass_case "capsule next_task is T-001"; else fail_case "capsule next_task is T-001"; fi
if grep -Fq 'Target runtime:** codex' <<< "$codex_continue_output"; then pass_case "continue prompt contains target runtime codex"; else fail_case "continue prompt contains target runtime codex"; fi

codex_run_prompt="$(MOSAIC_COORD_RUNTIME=codex bash "$SCRIPT_DIR/session-run.sh" --project "$tmp_project" --print)"
if [[ "$(printf '%s\n' "$codex_run_prompt" | head -n1)" == "Now initiating Orchestrator mode..." ]]; then pass_case "codex run prompt first line is mode declaration"; else fail_case "codex run prompt first line is mode declaration"; fi
if grep -Fq 'Do NOT ask clarifying questions before your first tool actions' <<< "$codex_run_prompt"; then pass_case "codex run prompt includes no-questions hard gate"; else fail_case "codex run prompt includes no-questions hard gate"; fi
if grep -Fq '"next_task": "T-001"' <<< "$codex_run_prompt"; then pass_case "codex run prompt embeds capsule json"; else fail_case "codex run prompt embeds capsule json"; fi

claude_run_prompt="$(MOSAIC_COORD_RUNTIME=claude bash "$SCRIPT_DIR/session-run.sh" --project "$tmp_project" --print)"
if [[ "$(printf '%s\n' "$claude_run_prompt" | head -n1)" == "## Continuation Mission" ]]; then pass_case "claude run prompt remains continuation prompt format"; else fail_case "claude run prompt remains continuation prompt format"; fi

echo ""
echo "Smoke test summary: pass=$PASS fail=$FAIL"
if (( FAIL > 0 )); then
  exit 1
fi
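The pass_case/fail_case tally pattern used in smoke-test.sh is self-contained and can be lifted into other smoke scripts. A minimal sketch with one passing and one failing toy check:

```shell
PASS=0
FAIL=0
pass_case() { echo "PASS: $1"; PASS=$((PASS + 1)); }
fail_case() { echo "FAIL: $1" >&2; FAIL=$((FAIL + 1)); }

# Two toy checks: the first passes, the second fails.
[ -n "bash" ] && pass_case "non-empty string" || fail_case "non-empty string"
[ -z "bash" ] && pass_case "empty string" || fail_case "empty string"

echo "Smoke test summary: pass=$PASS fail=$FAIL"
```

Failures go to stderr so a CI log separates them cleanly; the final non-zero exit (as in the script above) is what actually gates the pipeline.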
279  tools/prdy/_lib.sh  (Normal file)
@@ -0,0 +1,279 @@
#!/usr/bin/env bash
#
# _lib.sh — Shared functions for mosaic prdy tools
#
# Usage: source ~/.config/mosaic/tools/prdy/_lib.sh
#
# Provides PRD detection, section validation, and system prompt generation
# for interactive PRD creation/update sessions.

MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
PRD_CANONICAL="docs/PRD.md"
PRD_JSON_ALT="docs/PRD.json"

# ─── Color support ───────────────────────────────────────────────────────────

if [[ -t 1 ]]; then
  C_GREEN='\033[0;32m'
  C_RED='\033[0;31m'
  C_YELLOW='\033[0;33m'
  C_CYAN='\033[0;36m'
  C_BOLD='\033[1m'
  C_DIM='\033[2m'
  C_RESET='\033[0m'
else
  C_GREEN='' C_RED='' C_YELLOW='' C_CYAN='' C_BOLD='' C_DIM='' C_RESET=''
fi

ok()   { echo -e " ${C_GREEN}✓${C_RESET} $1"; }
warn() { echo -e " ${C_YELLOW}⚠${C_RESET} $1" >&2; }
fail() { echo -e " ${C_RED}✗${C_RESET} $1" >&2; }
info() { echo -e " ${C_CYAN}ℹ${C_RESET} $1"; }
step() { echo -e "\n${C_BOLD}$1${C_RESET}"; }

# ─── Dependency checks ──────────────────────────────────────────────────────

_require_cmd() {
  local cmd="$1"
  if ! command -v "$cmd" &>/dev/null; then
    fail "'$cmd' is required but not installed"
    return 1
  fi
}

prdy_runtime() {
  local runtime="${MOSAIC_PRDY_RUNTIME:-claude}"
  case "$runtime" in
    claude|codex) echo "$runtime" ;;
    *) echo "claude" ;;
  esac
}

prdy_runtime_command() {
  local runtime
  runtime="$(prdy_runtime)"
  echo "$runtime"
}

# ─── PRD detection ───────────────────────────────────────────────────────────

# Find the PRD file in a project directory.
# Prints the path (prefixed with the project directory) or nothing.
find_prd() {
  local project="${1:-.}"
  if [[ -f "$project/$PRD_CANONICAL" ]]; then
    echo "$project/$PRD_CANONICAL"
  elif [[ -f "$project/$PRD_JSON_ALT" ]]; then
    echo "$project/$PRD_JSON_ALT"
  fi
}

# ─── Section validation manifest ─────────────────────────────────────────────

# 10 required sections per ~/.config/mosaic/guides/PRD.md
# Each entry: "label|grep_pattern" (extended regex, case-insensitive via grep -iE)
PRDY_REQUIRED_SECTIONS=(
  "Problem Statement|^#{2,3} .*(problem statement|objective)"
  "Scope / Non-Goals|^#{2,3} .*(scope|non.goal|out of scope|in.scope)"
  "User Stories / Requirements|^#{2,3} .*(user stor|stakeholder|user.*requirement)"
  "Functional Requirements|^#{2,3} .*functional requirement"
  "Non-Functional Requirements|^#{2,3} .*non.functional"
  "Acceptance Criteria|^#{2,3} .*acceptance criteria|\*\*acceptance criteria\*\*|- \[ \]"
  "Technical Considerations|^#{2,3} .*(technical consideration|constraint|dependenc)"
  "Risks / Open Questions|^#{2,3} .*(risk|open question)"
  "Success Metrics / Testing|^#{2,3} .*(success metric|test|verification)"
  "Milestones / Delivery|^#{2,3} .*(milestone|delivery|scope version)"
)

# ─── System prompt builder ───────────────────────────────────────────────────

# Build the specialized system prompt for PRD sessions.
# Usage: build_prdy_system_prompt <mode>
#   mode: "init" or "update"
build_prdy_system_prompt() {
  local mode="${1:-init}"
  local guide_file="$MOSAIC_HOME/guides/PRD.md"
  local template_file="$MOSAIC_HOME/templates/docs/PRD.md.template"

  cat <<'PROMPT_HEADER'
# Mosaic PRD Agent — Behavioral Contract

You are a specialized PRD (Product Requirements Document) agent. Your sole purpose is to help the user create or update a comprehensive PRD document.

## Mode Declaration (Hard Gate)

PROMPT_HEADER

  if [[ "$mode" == "init" ]]; then
    cat <<'INIT_MODE'
Your first response MUST start with: "Now initiating PRD Creation mode..."

You are creating a NEW PRD. The output file is `docs/PRD.md`.

## Workflow

1. Read the project directory to understand context (README, package.json, existing docs, source structure)
2. Ask the user 3-5 clarifying questions with lettered options (A/B/C/D) so they can answer "1A, 2C, 3B"
3. Focus questions on: problem/goal, target users, scope boundaries, milestone structure, success criteria
4. After answers (or if the description is sufficiently complete), generate the full PRD
5. Write to `docs/PRD.md` (create `docs/` directory if it doesn't exist)
6. After writing, tell the user: "PRD written to docs/PRD.md. Run `mosaic prdy validate` to verify completeness."

INIT_MODE
  else
    cat <<'UPDATE_MODE'
Your first response MUST start with: "Now initiating PRD Update mode..."

You are UPDATING an existing PRD. Read `docs/PRD.md` (or `docs/PRD.json`) first.

## Workflow

1. Read the existing PRD file completely
2. Summarize the current state to the user (title, status, milestone count, open questions)
3. Ask the user what changes or additions are needed
4. Make targeted modifications — do NOT rewrite sections that don't need changes
5. Preserve existing user stories, FRs, and acceptance criteria unless explicitly asked to change them
6. After writing, tell the user: "PRD updated. Run `mosaic prdy validate` to verify completeness."

UPDATE_MODE
  fi

  cat <<'CONSTRAINTS'
## Hard Constraints

MUST:
- Save output to `docs/PRD.md` only
- Include ALL 10 required sections from the Mosaic PRD guide (included below)
- Number functional requirements as FR-1, FR-2, ... in sequence
- Group user stories under named Milestones (e.g., "Milestone 0.0.4 — Foundation")
- Format user stories as US-NNN with Description and Acceptance Criteria checkboxes
- Mark all guessed decisions with `ASSUMPTION:` and rationale
- Include a Metadata block (Owner, Date, Status, Mission ID, Scope Version)
- Write for junior developers and AI agents — explicit, unambiguous, no jargon
- Include "Typecheck and lint pass" in acceptance criteria for code stories
- Include "Verify in browser using dev-browser skill" for UI stories

MUST NOT:
- Write any implementation code
- Modify any files other than `docs/PRD.md`
- Skip clarifying questions when the feature description is ambiguous
- Begin implementation of any requirements
- Invent requirements silently — all guesses must be marked with ASSUMPTION

CONSTRAINTS

  cat <<'STRUCTURE'
## Required PRD Structure (Gold Standard)

Follow this exact structure. Every section is required.

```
# PRD: {Feature/Project Name}

## Metadata
- **Owner:** {name}
- **Date:** {yyyy-mm-dd}
- **Status:** draft|planning|approved|in-progress|completed
- **Mission ID:** {kebab-case-id-yyyymmdd}
- **Scope Version:** {e.g., 0.0.4 → 0.0.5 → 0.1.0}

## Introduction
Narrative overview: what this is, the context it lives in, what exists before this work.

## Problem Statement
Numbered list of specific pain points being solved.

## Goals
Bullet list of measurable objectives.

## Milestones
Named milestones (e.g., "Milestone 0.0.4 — Design System Foundation").
Each milestone has a one-paragraph description of its theme.

## User Stories (grouped under milestones)

### Milestone X.Y.Z — {Theme Name}

#### US-001: {Title}
**Description:** As a {user}, I want {feature} so that {benefit}.

**Acceptance Criteria:**
- [ ] Specific verifiable criterion
- [ ] Another criterion
- [ ] Typecheck and lint pass
- [ ] [UI stories only] Verify in browser using dev-browser skill

---

## Functional Requirements
Numbered FR-1 through FR-N. Group by subsystem if applicable.
Each: "FR-N: {subject} must {behavior}"

## Non-Goals (Out of Scope)
Numbered list of explicit exclusions with brief rationale.

## Design Considerations
- Design references (mockups, existing design systems)
- Key visual/UX elements
- Component hierarchy

## Technical Considerations

### Dependencies
Libraries, packages, external services required.

### Build & CI
Build pipeline, CI gates, quality checks.

### Deployment
Target infrastructure, deployment method, image tagging strategy.

### Risks
Technical risks that could affect delivery.

## Success Metrics
Numbered list of measurable success conditions.

## Open Questions
Numbered list. For unresolved items, add:
(ASSUMPTION: {best-guess decision} — {rationale})
```

STRUCTURE

  cat <<'QUESTIONS'
## Clarifying Question Format

When asking clarifying questions, use numbered questions with lettered options:

```
1. What is the primary goal?
   A. Option one
   B. Option two
   C. Option three
   D. Other: [please specify]

2. What is the target scope?
   A. Minimal viable
   B. Full-featured
   C. Backend only
   D. Frontend only
```

Tell the user they can answer with shorthand like "1A, 2C, 3B" for quick iteration.

QUESTIONS

  # Include the live PRD guide
  if [[ -f "$guide_file" ]]; then
    printf '\n## Mosaic PRD Guide (Authoritative Rules)\n\n'
    cat "$guide_file"
  fi

  # Include the template as reference
  if [[ -f "$template_file" ]]; then
    printf '\n## PRD Template Reference\n\n```markdown\n'
    cat "$template_file"
    printf '\n```\n'
  fi
}
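Each PRDY_REQUIRED_SECTIONS entry packs a human label and a grep pattern into one string, split on the first `|`. A standalone sketch of how the validators consume an entry (the entry is copied from the manifest above):

```shell
entry='Non-Functional Requirements|^#{2,3} .*non.functional'
label="${entry%%|*}"    # text before the first |
pattern="${entry#*|}"   # everything after it — an ERE for grep -iE

echo "label: $label"
if printf '## Non-Functional Requirements\n' | grep -qiE "$pattern"; then
  echo "section present"
fi
```

Note the pattern may itself contain `|` alternations, which is why the split uses `%%|*` (shortest prefix) for the label and `#*|` (strip through the first `|`) for the pattern.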
106  tools/prdy/prdy-init.sh  (Normal file)
@@ -0,0 +1,106 @@
#!/usr/bin/env bash
set -euo pipefail
#
# prdy-init.sh — Create a new PRD via guided runtime session
#
# Usage:
#   prdy-init.sh [--project <path>] [--name <feature>]
#
# Launches a dedicated runtime session in yolo mode with a specialized
# system prompt that guides the user through PRD creation. The output is
# written to docs/PRD.md.

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "$SCRIPT_DIR/_lib.sh"

# ─── Parse arguments ─────────────────────────────────────────────────────────

PROJECT="."
NAME=""

while [[ $# -gt 0 ]]; do
  case "$1" in
    --project) PROJECT="$2"; shift 2 ;;
    --name) NAME="$2"; shift 2 ;;
    -h|--help)
      cat <<'USAGE'
prdy-init.sh — Create a new PRD via guided runtime session

Usage: prdy-init.sh [--project <path>] [--name <feature>]

Options:
  --project <path>  Project directory (default: CWD)
  --name <feature>  Feature or project name (optional, runtime will ask if omitted)

Launches the selected runtime in yolo mode with a PRD-focused prompt.
The agent will ask clarifying questions, then write docs/PRD.md.

Examples:
  mosaic prdy init
  mosaic prdy init --name "User Authentication"
  mosaic prdy init --project ~/src/my-app --name "Dashboard Redesign"
USAGE
      exit 0
      ;;
    *) echo "Unknown option: $1" >&2; exit 1 ;;
  esac
done

# Expand tilde if passed literally (e.g., --project ~/src/foo)
PROJECT="${PROJECT/#\~/$HOME}"

# ─── Preflight checks ───────────────────────────────────────────────────────

RUNTIME_CMD="$(prdy_runtime_command)"
_require_cmd "$RUNTIME_CMD"

# Check for existing PRD
EXISTING="$(find_prd "$PROJECT")"
if [[ -n "$EXISTING" ]]; then
  fail "PRD already exists: $EXISTING"
  echo -e " Use ${C_CYAN}mosaic prdy update${C_RESET} to modify the existing PRD."
  exit 1
fi

# Ensure docs/ directory exists
mkdir -p "$PROJECT/docs"

# ─── Build system prompt ─────────────────────────────────────────────────────

step "Launching PRD creation session"

SYSTEM_PROMPT="$(build_prdy_system_prompt "init")"

# Build kickoff message
if [[ -n "$NAME" ]]; then
  KICKOFF="Create docs/PRD.md for the feature: ${NAME}. Read the project context first, then ask clarifying questions before writing the PRD."
else
  KICKOFF="Create docs/PRD.md for this project. Read the project context first, then ask the user what they want to build. Ask clarifying questions before writing the PRD."
fi

# ─── Launch runtime ──────────────────────────────────────────────────────────

info "Output target: $PROJECT/$PRD_CANONICAL"
info "Mode: PRD Creation (yolo, runtime: $RUNTIME_CMD)"
echo ""

cd "$PROJECT"
if [[ "$RUNTIME_CMD" == "claude" ]]; then
  exec claude --dangerously-skip-permissions --append-system-prompt "$SYSTEM_PROMPT" "$KICKOFF"
fi

if [[ "$RUNTIME_CMD" == "codex" ]]; then
  CODEX_PROMPT="$(cat <<EOF
Follow this PRD contract exactly.

$SYSTEM_PROMPT

Task:
$KICKOFF
EOF
)"
  exec codex --dangerously-bypass-approvals-and-sandbox "$CODEX_PROMPT"
fi

fail "Unsupported runtime: $RUNTIME_CMD"
exit 1
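prdy-init.sh ends by exec-ing whichever runtime prdy_runtime_command selected. A sketch of that dispatch with echo standing in for exec, so it runs without claude or codex installed (`launch_sketch` is hypothetical, and the flag strings are abbreviated):

```shell
launch_sketch() {
  case "$1" in
    claude) echo "would exec: claude --dangerously-skip-permissions ..." ;;
    codex)  echo "would exec: codex --dangerously-bypass-approvals-and-sandbox ..." ;;
    *)      echo "Unsupported runtime: $1" >&2; return 1 ;;
  esac
}

launch_sketch claude
launch_sketch codex
```

Because the real script uses exec, nothing after a successful launch runs; the trailing "Unsupported runtime" fail is only reachable when neither branch matched.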
94  tools/prdy/prdy-status.sh  (Executable file)
@@ -0,0 +1,94 @@
#!/usr/bin/env bash
set -euo pipefail
#
# prdy-status.sh — Quick PRD health check (one-liner output)
#
# Usage:
#   prdy-status.sh [--project <path>] [--format short|json]
#
# Exit codes:
#   0 = PRD ready (all required sections present)
#   1 = PRD incomplete
#   2 = PRD missing

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "$SCRIPT_DIR/_lib.sh"

# ─── Parse arguments ─────────────────────────────────────────────────────────

PROJECT="."
FORMAT="short"

while [[ $# -gt 0 ]]; do
  case "$1" in
    --project) PROJECT="$2"; shift 2 ;;
    --format) FORMAT="$2"; shift 2 ;;
    -h|--help)
      cat <<'USAGE'
prdy-status.sh — Quick PRD health check

Usage: prdy-status.sh [--project <path>] [--format short|json]

Options:
  --project <path>  Project directory (default: CWD)
  --format <f>      Output format: short (default) or json

Exit codes:
  0 = PRD ready (all required sections present)
  1 = PRD incomplete
  2 = PRD missing
USAGE
      exit 0
      ;;
    *) echo "Unknown option: $1" >&2; exit 1 ;;
  esac
done

PROJECT="${PROJECT/#\~/$HOME}"

# ─── Status check ────────────────────────────────────────────────────────────

PRD_PATH="$(find_prd "$PROJECT")"

if [[ -z "$PRD_PATH" ]]; then
  if [[ "$FORMAT" == "json" ]]; then
    echo '{"status":"missing"}'
  else
    echo "PRD: missing"
  fi
  exit 2
fi

# Count present sections using the shared manifest
PRD_CONTENT="$(cat "$PRD_PATH")"
total=${#PRDY_REQUIRED_SECTIONS[@]}
present=0

for entry in "${PRDY_REQUIRED_SECTIONS[@]}"; do
  pattern="${entry#*|}"
  if echo "$PRD_CONTENT" | grep -qiE "$pattern"; then
    present=$((present + 1))
  fi
done

# Count additional metrics
fr_count=$(echo "$PRD_CONTENT" | grep -cE '^\- FR-[0-9]+:|^FR-[0-9]+:' || true)
us_count=$(echo "$PRD_CONTENT" | grep -cE '^#{1,4} US-[0-9]+' || true)
assumptions=$(echo "$PRD_CONTENT" | grep -c 'ASSUMPTION:' || true)

if (( present == total )); then
  status="ready"
  exit_code=0
else
  status="incomplete"
  exit_code=1
fi

if [[ "$FORMAT" == "json" ]]; then
  printf '{"status":"%s","sections":%d,"total":%d,"frs":%d,"stories":%d,"assumptions":%d}\n' \
    "$status" "$present" "$total" "$fr_count" "$us_count" "$assumptions"
else
  echo "PRD: $status ($present/$total sections, ${fr_count} FRs, ${us_count} stories, $assumptions assumptions)"
fi

exit "$exit_code"
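The --format json line is built with printf rather than jq, so it is cheap to consume from shell as well. A sketch pairing a sample line (same printf format as above, with made-up counts) with a simple gating check:

```shell
# Sample line in the exact format prdy-status.sh emits with --format json.
status_line="$(printf '{"status":"%s","sections":%d,"total":%d,"frs":%d,"stories":%d,"assumptions":%d}\n' \
  incomplete 8 10 12 5 2)"
echo "$status_line"

# String-match the status field; no JSON parser needed for this flat shape.
case "$status_line" in
  *'"status":"ready"'*) echo "gate: pass" ;;
  *)                    echo "gate: hold" ;;
esac
```

In practice the exit code (0/1/2) is the more robust signal; the JSON line adds the counts for dashboards or logs.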
94  tools/prdy/prdy-update.sh  (Normal file)
@@ -0,0 +1,94 @@
#!/usr/bin/env bash
set -euo pipefail
#
# prdy-update.sh — Update an existing PRD via guided runtime session
#
# Usage:
#   prdy-update.sh [--project <path>]
#
# Launches a dedicated runtime session in yolo mode with a specialized
# system prompt that reads the existing PRD and guides targeted modifications.

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "$SCRIPT_DIR/_lib.sh"

# ─── Parse arguments ─────────────────────────────────────────────────────────

PROJECT="."

while [[ $# -gt 0 ]]; do
  case "$1" in
    --project) PROJECT="$2"; shift 2 ;;
    -h|--help)
      cat <<'USAGE'
prdy-update.sh — Update an existing PRD via guided runtime session

Usage: prdy-update.sh [--project <path>]

Options:
  --project <path>  Project directory (default: CWD)

Launches the selected runtime in yolo mode with a PRD-update prompt.
The agent will read the existing docs/PRD.md, summarize its state,
and ask what changes are needed.

Examples:
  mosaic prdy update
  mosaic prdy update --project ~/src/my-app
USAGE
      exit 0
      ;;
    *) echo "Unknown option: $1" >&2; exit 1 ;;
  esac
done

# Expand tilde if passed literally (e.g., --project ~/src/foo)
PROJECT="${PROJECT/#\~/$HOME}"

# ─── Preflight checks ───────────────────────────────────────────────────────

RUNTIME_CMD="$(prdy_runtime_command)"
_require_cmd "$RUNTIME_CMD"

# Require existing PRD
EXISTING="$(find_prd "$PROJECT")"
if [[ -z "$EXISTING" ]]; then
  fail "No PRD found in $PROJECT/docs/"
  echo -e " Run ${C_CYAN}mosaic prdy init${C_RESET} to create one first."
  exit 1
fi

# ─── Build system prompt ─────────────────────────────────────────────────────

step "Launching PRD update session"

SYSTEM_PROMPT="$(build_prdy_system_prompt "update")"

KICKOFF="Read the existing PRD at ${EXISTING}, summarize its current state, then ask what changes or additions are needed."

# ─── Launch runtime ──────────────────────────────────────────────────────────

info "Updating: $EXISTING"
info "Mode: PRD Update (yolo, runtime: $RUNTIME_CMD)"
echo ""

cd "$PROJECT"
if [[ "$RUNTIME_CMD" == "claude" ]]; then
  exec claude --dangerously-skip-permissions --append-system-prompt "$SYSTEM_PROMPT" "$KICKOFF"
fi

if [[ "$RUNTIME_CMD" == "codex" ]]; then
  CODEX_PROMPT="$(cat <<EOF
Follow this PRD contract exactly.

$SYSTEM_PROMPT

Task:
$KICKOFF
EOF
)"
  exec codex --dangerously-bypass-approvals-and-sandbox "$CODEX_PROMPT"
fi

fail "Unsupported runtime: $RUNTIME_CMD"
exit 1
170  tools/prdy/prdy-validate.sh  (Normal file)
@@ -0,0 +1,170 @@
#!/usr/bin/env bash
set -euo pipefail
#
# prdy-validate.sh — Check PRD completeness against Mosaic guide requirements
#
# Usage:
#   prdy-validate.sh [--project <path>]
#
# Performs static analysis of docs/PRD.md to verify it meets the minimum
# content requirements defined in the Mosaic PRD guide.

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "$SCRIPT_DIR/_lib.sh"

# ─── Parse arguments ─────────────────────────────────────────────────────────

PROJECT="."

while [[ $# -gt 0 ]]; do
  case "$1" in
    --project) PROJECT="$2"; shift 2 ;;
    -h|--help)
      cat <<'USAGE'
prdy-validate.sh — Check PRD completeness against Mosaic guide

Usage: prdy-validate.sh [--project <path>]

Options:
  --project <path>  Project directory (default: CWD)

Checks:
  - docs/PRD.md exists
  - All 10 required sections present
  - Metadata block (Owner, Date, Status)
  - Functional requirements (FR-N) count
  - User stories (US-NNN) count
  - Acceptance criteria checklist count
  - ASSUMPTION markers (informational)

Exit codes:
  0  All required checks pass
  1  One or more required checks failed
USAGE
      exit 0
      ;;
    *) echo "Unknown option: $1" >&2; exit 1 ;;
  esac
done

# Expand tilde if passed literally (e.g., --project ~/src/foo)
PROJECT="${PROJECT/#\~/$HOME}"

# ─── Find PRD ────────────────────────────────────────────────────────────────

step "Validating PRD"

PRD_PATH="$(find_prd "$PROJECT")"
PASS=0
FAIL_COUNT=0
WARN_COUNT=0

check_pass() {
  ok "$1"
  PASS=$((PASS + 1))
}

check_fail() {
  fail "$1"
  FAIL_COUNT=$((FAIL_COUNT + 1))
}

check_warn() {
  warn "$1"
  WARN_COUNT=$((WARN_COUNT + 1))
}

if [[ -z "$PRD_PATH" ]]; then
  check_fail "No PRD found. Expected $PROJECT/$PRD_CANONICAL or $PROJECT/$PRD_JSON_ALT"
  echo ""
  echo -e "${C_RED}PRD validation failed.${C_RESET} Run ${C_CYAN}mosaic prdy init${C_RESET} to create one."
  exit 1
fi

check_pass "PRD found: $PRD_PATH"

# JSON PRDs get a limited check
if [[ "$PRD_PATH" == *.json ]]; then
  info "JSON PRD detected — section checks skipped (markdown-only)"
  echo ""
  echo -e "${C_GREEN}PRD file exists.${C_RESET} JSON validation is limited."
  exit 0
fi

PRD_CONTENT="$(cat "$PRD_PATH")"

# ─── Section checks ─────────────────────────────────────────────────────────

for entry in "${PRDY_REQUIRED_SECTIONS[@]}"; do
  label="${entry%%|*}"
  pattern="${entry#*|}"

  if echo "$PRD_CONTENT" | grep -qiE "$pattern"; then
    check_pass "$label section present"
  else
    check_fail "$label section MISSING"
  fi
done

# ─── Metadata checks ────────────────────────────────────────────────────────

META_PASS=true
for field in "Owner" "Date" "Status"; do
  if ! echo "$PRD_CONTENT" | grep -qiE "^- \*\*${field}:\*\*|^- ${field}:"; then
    META_PASS=false
  fi
done

if $META_PASS; then
  check_pass "Metadata block present (Owner, Date, Status)"
else
  check_fail "Metadata block incomplete (need Owner, Date, Status)"
fi

# ─── Content counts ─────────────────────────────────────────────────────────

FR_COUNT=$(echo "$PRD_CONTENT" | grep -cE '^\- FR-[0-9]+:|^FR-[0-9]+:' || true)
US_COUNT=$(echo "$PRD_CONTENT" | grep -cE '^#{1,4} US-[0-9]+' || true)
AC_COUNT=$(echo "$PRD_CONTENT" | grep -cE '^\s*- \[ \]' || true)
ASSUMPTION_COUNT=$(echo "$PRD_CONTENT" | grep -c 'ASSUMPTION:' || true)

if [[ "$FR_COUNT" -gt 0 ]]; then
  check_pass "Functional requirements: $FR_COUNT FR items"
else
  check_warn "No FR-N items found (expected numbered functional requirements)"
fi

if [[ "$US_COUNT" -gt 0 ]]; then
  check_pass "User stories: $US_COUNT US items"
else
  check_warn "No US-NNN items found (expected user stories)"
fi

if [[ "$AC_COUNT" -gt 0 ]]; then
  check_pass "Acceptance criteria: $AC_COUNT checklist items"
else
  check_warn "No acceptance criteria checkboxes found"
fi

if [[ "$ASSUMPTION_COUNT" -gt 0 ]]; then
  info "ASSUMPTION markers: $ASSUMPTION_COUNT (informational)"
fi

# ─── Summary ─────────────────────────────────────────────────────────────────

TOTAL=$((PASS + FAIL_COUNT))
echo ""

if [[ "$FAIL_COUNT" -eq 0 ]]; then
  echo -e "${C_GREEN}PRD validation passed.${C_RESET} ${PASS}/${TOTAL} checks OK."
|
||||||
|
if [[ "$WARN_COUNT" -gt 0 ]]; then
|
||||||
|
echo -e "${C_YELLOW}${WARN_COUNT} warning(s)${C_RESET} — review recommended."
|
||||||
|
fi
|
||||||
|
exit 0
|
||||||
|
else
|
||||||
|
echo -e "${C_RED}PRD validation failed.${C_RESET} ${FAIL_COUNT}/${TOTAL} checks failed."
|
||||||
|
if [[ "$WARN_COUNT" -gt 0 ]]; then
|
||||||
|
echo -e "${C_YELLOW}${WARN_COUNT} warning(s)${C_RESET}"
|
||||||
|
fi
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
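The metadata grep above can be exercised standalone against a minimal PRD snippet (field values below are illustrative, not from any real PRD):

```shell
# Standalone check of the Owner/Date/Status metadata grep.
PRD_CONTENT='- **Owner:** jane
- **Date:** 2025-01-01
- **Status:** Draft'
META_PASS=true
for field in "Owner" "Date" "Status"; do
  if ! echo "$PRD_CONTENT" | grep -qiE "^- \*\*${field}:\*\*|^- ${field}:"; then
    META_PASS=false
  fi
done
echo "META_PASS=$META_PASS"   # prints META_PASS=true
```

The pattern accepts both the bold (`- **Owner:**`) and plain (`- Owner:`) list styles, case-insensitively.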
tools/qa/prevent-memory-write.sh (new executable file, 34 lines)
@@ -0,0 +1,34 @@
#!/usr/bin/env bash
# prevent-memory-write.sh — PreToolUse hook
#
# Blocks Write/Edit/MultiEdit calls targeting Claude Code's native auto-memory
# files (~/.claude/projects/*/memory/*.md).
#
# These files are runtime-specific silos that no other agent harness can read.
# All agent learnings MUST go to OpenBrain (capture MCP tool or REST API).
# MEMORY.md files may only contain load-path behavioral guardrails — not knowledge.
#
# Exit codes (Claude Code PreToolUse):
#   0 = allow
#   2 = block with message shown to agent

set -euo pipefail

INPUT="$(cat)"

FILE_PATH="$(echo "$INPUT" | jq -r '.tool_input.file_path // empty' 2>/dev/null || true)"

[[ -z "$FILE_PATH" ]] && exit 0

# Resolve ~ to HOME
FILE_PATH="${FILE_PATH/#\~/$HOME}"

# Block writes to Claude Code auto-memory files (dot escaped so it matches literally)
if [[ "$FILE_PATH" =~ /\.claude/projects/.+/memory/.*\.md$ ]]; then
  echo "BLOCKED: Do not write agent learnings to ~/.claude/projects/*/memory/ — this is a runtime-specific silo."
  echo "Use OpenBrain instead: MCP 'capture' tool or REST POST https://brain.woltje.com/v1/thoughts"
  echo "File blocked: $FILE_PATH"
  exit 2
fi

exit 0
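The path pattern the hook matches against can be checked standalone (regex reproduced here with the literal dot escaped; the demo path is illustrative):

```shell
# Standalone probe of the auto-memory path pattern used by the hook.
FILE_PATH="$HOME/.claude/projects/demo/memory/notes.md"
if [[ "$FILE_PATH" =~ /\.claude/projects/.+/memory/.*\.md$ ]]; then
  echo "blocked"   # this branch is taken for the path above
else
  echo "allowed"
fi
```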
@@ -43,7 +43,7 @@ npx husky install
 ✅ **TypeScript strict mode** - All type checks enabled
 ✅ **ESLint blocking `any` types** - no-explicit-any: error
 ✅ **Pre-commit hooks** - Type check + lint + format before commit
-✅ **Secret scanning** - Block hardcoded passwords/API keys
+✅ **Secret scanning (gitleaks)** - Block hardcoded passwords/API keys (pre-commit + CI)
 ✅ **CI/CD templates** - Woodpecker, GitHub Actions, GitLab
 ✅ **Test coverage enforcement** - 80% threshold
 ✅ **Security scanning** - npm audit, OWASP checks

@@ -96,11 +96,12 @@ git commit -m "Add feature"
 ### CI/CD (Remote Enforcement)
 ```yaml
 # Woodpecker pipeline runs:
+✓ gitleaks (secret scanning — parallel, no deps)
 ✓ npm audit (dependency security)
 ✓ eslint (code quality)
 ✓ tsc --noEmit (type checking)
 ✓ jest --coverage (tests + coverage)
-✓ npm run build (compilation)
+✓ npm run build (compilation — gates on all above)

 # If any step fails, merge is blocked
 ```
@@ -8,12 +8,13 @@ Quality Rails includes `.woodpecker.yml` template.

 ### Pipeline Stages

-1. **Install** - Dependencies
-2. **Security Audit** - npm audit for CVEs
-3. **Lint** - ESLint checks
-4. **Type Check** - TypeScript compilation
-5. **Test** - Jest with coverage thresholds
-6. **Build** - Production build
+1. **Secret Scan** - gitleaks scans latest commit for hardcoded secrets (runs in parallel, no deps)
+2. **Install** - Dependencies
+3. **Security Audit** - npm audit for CVEs
+4. **Lint** - ESLint checks
+5. **Type Check** - TypeScript compilation
+6. **Test** - Jest with coverage thresholds
+7. **Build** - Production build (gates on all above)

 ### Configuration
@@ -24,11 +24,12 @@ git clone git@git.mosaicstack.dev:mosaic/quality-rails.git
 ```

 This copies:
-- `.husky/pre-commit` - Git hooks
+- `.husky/pre-commit` - Git hooks (lint-staged + gitleaks)
 - `.lintstagedrc.js` - Pre-commit checks
 - `.eslintrc.js` - Strict ESLint rules
 - `tsconfig.json` - TypeScript strict mode
 - `.woodpecker.yml` - CI pipeline
+- `.gitleaks.toml` - Secret scanning config

 ### 3. Install Dependencies
@@ -75,6 +76,8 @@ Should output:
 ```
 ✅ PASS: Type errors blocked
 ✅ PASS: 'any' types blocked
+✅ PASS: gitleaks found (8.24.0)
+✅ PASS: gitleaks detected planted secret
 ✅ PASS: Lint errors blocked
 ```
@@ -125,7 +128,7 @@ On every `git commit`, runs:
 1. ESLint with --max-warnings=0
 2. TypeScript type check
 3. Prettier formatting
-4. Secret scanning (if git-secrets installed)
+4. Secret scanning via gitleaks (required)

 If any fail → **commit blocked**.
@@ -33,6 +33,10 @@ Copy-Item -Path "$TemplateDir\.eslintrc.strict.js" -Destination "$TargetDir\.esl
 Copy-Item -Path "$TemplateDir\tsconfig.strict.json" -Destination "$TargetDir\tsconfig.json" -Force -ErrorAction SilentlyContinue
 Copy-Item -Path "$TemplateDir\.woodpecker.yml" -Destination $TargetDir -Force -ErrorAction SilentlyContinue
+
+# Copy shared gitleaks config from templates root
+$SharedTemplates = Split-Path -Parent $TemplateDir
+Copy-Item -Path "$SharedTemplates\.gitleaks.toml" -Destination $TargetDir -Force -ErrorAction SilentlyContinue

 Write-Host "✓ Files copied"

 if (Test-Path "$TargetDir\package.json") {

@@ -50,4 +54,6 @@ Write-Host ""
 Write-Host "Next steps:"
 Write-Host "1. Install dependencies: npm install"
 Write-Host "2. Initialize husky: npx husky install"
-Write-Host "3. Run verification: ..\quality-rails\scripts\verify.ps1"
+Write-Host "3. Install gitleaks: winget install gitleaks"
+Write-Host "4. Run verification: ..\quality-rails\scripts\verify.ps1"
+Write-Host "5. (Optional) Scan full history: gitleaks git --redact --verbose"
@@ -53,6 +53,10 @@ cp "$TEMPLATE_DIR/.eslintrc.strict.js" "$TARGET_DIR/.eslintrc.js" 2>/dev/null ||
 cp "$TEMPLATE_DIR/tsconfig.strict.json" "$TARGET_DIR/tsconfig.json" 2>/dev/null || true
 cp "$TEMPLATE_DIR/.woodpecker.yml" "$TARGET_DIR/" 2>/dev/null || true
+
+# Copy shared gitleaks config from templates root
+SHARED_TEMPLATES="$(dirname "$TEMPLATE_DIR")"
+cp "$SHARED_TEMPLATES/.gitleaks.toml" "$TARGET_DIR/" 2>/dev/null || true

 echo "✓ Files copied"

 # Check if package.json exists

@@ -71,5 +75,7 @@ echo ""
 echo "Next steps:"
 echo "1. Install dependencies: npm install"
 echo "2. Initialize husky: npx husky install"
-echo "3. Run verification: ~/.config/mosaic/bin/mosaic-quality-verify --target $TARGET_DIR"
+echo "3. Install gitleaks: https://github.com/gitleaks/gitleaks#installing"
+echo "4. Run verification: ~/.config/mosaic/bin/mosaic-quality-verify --target $TARGET_DIR"
+echo "5. (Optional) Scan full history: gitleaks git --redact --verbose"
 echo ""
@@ -39,6 +39,40 @@ if ($output -match "no-explicit-any") {
 git reset HEAD test-file.ts 2>$null
 Remove-Item test-file.ts -ErrorAction SilentlyContinue

+# Test 3a: gitleaks binary must be present
+Write-Host ""
+Write-Host "Test 3a: gitleaks must be installed..."
+$gitleaksPath = Get-Command gitleaks -ErrorAction SilentlyContinue
+if ($gitleaksPath) {
+    $gitleaksVer = & gitleaks version 2>&1 | Out-String
+    Write-Host "✅ PASS: gitleaks found ($($gitleaksVer.Trim()))" -ForegroundColor Green
+    $Passed++
+} else {
+    Write-Host "❌ FAIL: gitleaks is NOT installed — secret scanning will not work" -ForegroundColor Red
+    Write-Host "   Install: winget install gitleaks"
+    $Failed++
+}
+
+# Test 3b: gitleaks detects a planted AWS key
+Write-Host ""
+Write-Host "Test 3b: gitleaks should detect planted AWS key..."
+if ($gitleaksPath) {
+    "aws_access_key_id = AKIAIOSFODNN7REALKEY" | Out-File -FilePath gitleaks-test-secret.txt -Encoding utf8
+    git add gitleaks-test-secret.txt 2>$null
+    $output = & gitleaks git --pre-commit --staged --redact 2>&1 | Out-String
+    if ($output -match "leak|finding") {
+        Write-Host "✅ PASS: gitleaks detected planted secret" -ForegroundColor Green
+        $Passed++
+    } else {
+        Write-Host "❌ FAIL: gitleaks did NOT detect planted secret" -ForegroundColor Red
+        $Failed++
+    }
+    git reset HEAD gitleaks-test-secret.txt 2>$null
+    Remove-Item gitleaks-test-secret.txt -ErrorAction SilentlyContinue
+} else {
+    Write-Host "⚠ SKIP: gitleaks not installed (Test 3a already failed)"
+}
+
 # Summary
 Write-Host ""
 Write-Host "═══════════════════════════════════════════"
@@ -40,23 +40,35 @@ fi
 git reset HEAD test-file.ts 2>/dev/null
 rm test-file.ts 2>/dev/null

-# Test 3: Hardcoded secret blocked (if git-secrets installed)
+# Test 3a: gitleaks binary must be present
 echo ""
-echo "Test 3: Hardcoded secrets should be blocked..."
-if command -v git-secrets &> /dev/null; then
-  echo "const password = 'SuperSecret123!';" > test-file.ts
-  git add test-file.ts 2>/dev/null
-  if git commit -m "Test commit" 2>&1 | grep -q -i "secret\|password"; then
-    echo "✅ PASS: Secrets blocked"
-    ((PASSED++))
-  else
-    echo "⚠ WARN: Secrets NOT blocked (git-secrets may need configuration)"
-    ((FAILED++))
-  fi
-  git reset HEAD test-file.ts 2>/dev/null
-  rm test-file.ts 2>/dev/null
+echo "Test 3a: gitleaks must be installed..."
+if command -v gitleaks &> /dev/null; then
+  echo "✅ PASS: gitleaks found ($(gitleaks version 2>/dev/null || echo 'unknown version'))"
+  PASSED=$((PASSED + 1))
 else
-  echo "⚠ SKIP: git-secrets not installed"
+  echo "❌ FAIL: gitleaks is NOT installed — secret scanning will not work"
+  echo "   Install: https://github.com/gitleaks/gitleaks#installing"
+  FAILED=$((FAILED + 1))
+fi
+
+# Test 3b: gitleaks detects a planted AWS key
+echo ""
+echo "Test 3b: gitleaks should detect planted AWS key..."
+if command -v gitleaks &> /dev/null; then
+  echo 'aws_access_key_id = AKIAIOSFODNN7REALKEY' > gitleaks-test-secret.txt
+  git add gitleaks-test-secret.txt 2>/dev/null
+  if gitleaks git --pre-commit --staged --redact 2>&1 | grep -q -i "leak\|finding"; then
+    echo "✅ PASS: gitleaks detected planted secret"
+    PASSED=$((PASSED + 1))
+  else
+    echo "❌ FAIL: gitleaks did NOT detect planted secret"
+    FAILED=$((FAILED + 1))
+  fi
+  git reset HEAD gitleaks-test-secret.txt 2>/dev/null
+  rm gitleaks-test-secret.txt 2>/dev/null
+else
+  echo "⚠ SKIP: gitleaks not installed (Test 3a already failed)"
 fi

 # Test 4: Lint error blocked
tools/quality/templates/.gitleaks.toml (new file, 162 lines)
@@ -0,0 +1,162 @@
# Mosaic Quality Rails — gitleaks configuration
# Shared across all project templates. Copied to project root by install.sh.
# Built-in rules: https://github.com/gitleaks/gitleaks/tree/master/config
# This file adds custom rules for patterns the 150+ built-in rules miss.

title = "Mosaic gitleaks config"

[allowlist]
description = "Global allowlist — skip files that never contain real secrets"
paths = [
  '''node_modules/''',
  '''dist/''',
  '''build/''',
  '''\.next/''',
  '''\.nuxt/''',
  '''\.output/''',
  '''coverage/''',
  '''__pycache__/''',
  '''\.venv/''',
  '''vendor/''',
  '''pnpm-lock\.yaml$''',
  '''package-lock\.json$''',
  '''yarn\.lock$''',
  '''\.lock$''',
  '''\.snap$''',
  '''\.min\.js$''',
  '''\.min\.css$''',
  '''\.gitleaks\.toml$''',
]
stopwords = [
  "localhost",
  "127.0.0.1",
  "changeme",
  "placeholder",
  "example",
  "example.com",
  "test",
  "dummy",
  "fake",
  "sample",
  "your-",
  "xxx",
  "CHANGEME",
  "PLACEHOLDER",
  "TODO",
  "REPLACE_ME",
]

# ──────────────────────────────────────────────
# Custom rules — patterns the built-in rules miss
# ──────────────────────────────────────────────

[[rules]]
id = "database-url-with-credentials"
description = "Database connection URL with embedded password"
regex = '''(?i)(?:postgres(?:ql)?|mysql|mariadb|mongodb(?:\+srv)?|redis|amqp)://[^:\s]+:[^@\s]+@[^/\s]+'''
tags = ["database", "connection-string"]
[rules.allowlist]
stopwords = ["localhost", "127.0.0.1", "changeme", "password", "example", "test_", "placeholder"]

[[rules]]
id = "alembic-ini-sqlalchemy-url"
description = "SQLAlchemy URL in alembic.ini with credentials"
regex = '''sqlalchemy\.url\s*=\s*\S+://[^:\s]+:[^@\s]+@\S+'''
paths = ['''alembic\.ini$''', '''\.ini$''']
tags = ["python", "alembic", "database"]
[rules.allowlist]
stopwords = ["localhost", "127.0.0.1", "changeme", "driver://user:pass"]

[[rules]]
id = "dotenv-secret-value"
description = "High-entropy secret value in .env file"
regex = '''(?i)(?:SECRET|TOKEN|PASSWORD|KEY|CREDENTIALS|AUTH)[\w]*\s*=\s*['"]?[A-Za-z0-9/+=]{20,}['"]?\s*$'''
paths = ['''\.env$''', '''\.env\.\w+$''']
tags = ["dotenv", "secret"]
[rules.allowlist]
stopwords = ["changeme", "placeholder", "example", "your_", "REPLACE", "TODO"]

[[rules]]
id = "jdbc-url-with-password"
description = "JDBC connection string with embedded password"
regex = '''jdbc:[a-z]+://[^;\s]+password=[^;\s&]+'''
tags = ["java", "jdbc", "database"]
[rules.allowlist]
stopwords = ["changeme", "placeholder", "example"]

[[rules]]
id = "dsn-inline-password"
description = "DSN-style connection string with inline password"
regex = '''(?i)(?:dsn|connection_string|conn_str)\s*[:=]\s*\S+://[^:\s]+:[^@\s]+@\S+'''
tags = ["database", "connection-string"]
[rules.allowlist]
stopwords = ["localhost", "127.0.0.1", "changeme", "example"]

[[rules]]
id = "hardcoded-password-variable"
description = "Hardcoded password assignment in source code"
regex = '''(?i)(?:password|passwd|pwd)\s*[:=]\s*['"][^'"]{8,}['"]'''
tags = ["password", "hardcoded"]
[rules.allowlist]
stopwords = ["changeme", "placeholder", "example", "test", "dummy", "password123", "your_password"]
paths = [
  '''test[s]?/''',
  '''spec[s]?/''',
  '''__test__/''',
  '''fixture[s]?/''',
  '''mock[s]?/''',
]

[[rules]]
id = "bearer-token-in-code"
description = "Hardcoded bearer token in source code"
regex = '''(?i)['"]Bearer\s+[A-Za-z0-9\-._~+/]+=*['"]'''
tags = ["auth", "bearer", "token"]
[rules.allowlist]
stopwords = ["example", "test", "dummy", "placeholder", "fake"]

[[rules]]
id = "spring-application-properties-password"
description = "Password in Spring Boot application properties"
regex = '''(?i)spring\.\w+\.password\s*=\s*\S+'''
paths = ['''application\.properties$''', '''application\.yml$''', '''application-\w+\.properties$''', '''application-\w+\.yml$''']
tags = ["java", "spring", "password"]
[rules.allowlist]
stopwords = ["changeme", "placeholder", "${"]

[[rules]]
id = "docker-compose-env-secret"
description = "Hardcoded secret in docker-compose environment"
regex = '''(?i)(?:POSTGRES_PASSWORD|MYSQL_ROOT_PASSWORD|MYSQL_PASSWORD|REDIS_PASSWORD|RABBITMQ_DEFAULT_PASS|MONGO_INITDB_ROOT_PASSWORD)\s*[:=]\s*['"]?[^\s'"$]{8,}['"]?'''
paths = ['''compose\.ya?ml$''', '''docker-compose\.ya?ml$''']
tags = ["docker", "compose", "secret"]
[rules.allowlist]
stopwords = ["changeme", "placeholder", "example", "${"]

[[rules]]
id = "terraform-variable-secret"
description = "Sensitive default value in Terraform variable"
regex = '''(?i)default\s*=\s*"[^"]{8,}"'''
paths = ['''variables\.tf$''', '''\.tf$''']
tags = ["terraform", "secret"]
[rules.allowlist]
stopwords = ["changeme", "placeholder", "example", "TODO"]

[[rules]]
id = "private-key-pem-inline"
description = "PEM-encoded private key in source"
regex = '''-----BEGIN\s+(?:RSA |EC |DSA |OPENSSH )?PRIVATE KEY-----'''
tags = ["key", "pem", "private-key"]

[[rules]]
id = "base64-encoded-secret"
description = "Base64 value assigned to secret-named variable"
regex = '''(?i)(?:secret|token|key|password|credentials)[\w]*\s*[:=]\s*['"]?[A-Za-z0-9+/]{40,}={0,2}['"]?'''
tags = ["base64", "encoded", "secret"]
[rules.allowlist]
stopwords = ["changeme", "placeholder", "example", "test"]
paths = [
  '''test[s]?/''',
  '''spec[s]?/''',
  '''fixture[s]?/''',
]
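As a rough sanity check, the `database-url-with-credentials` pattern can be probed with `grep -P` (PCRE approximates, but is not identical to, the Go regex engine gitleaks uses, so treat this only as a quick smoke test; GNU grep with `-P` support is assumed):

```shell
# Smoke-test the database-url-with-credentials regex outside gitleaks.
pattern='(?i)(?:postgres(?:ql)?|mysql|mariadb|mongodb(?:\+srv)?|redis|amqp)://[^:\s]+:[^@\s]+@[^/\s]+'
# URL with an embedded password: should match
echo 'postgres://app:s3cr3tpass@db.internal:5432/prod' | grep -Pq "$pattern" && echo "match"
# URL without a password: should not match
echo 'postgres://app@db.internal:5432/prod' | grep -Pq "$pattern" || echo "no match"
```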
@@ -1,2 +1,15 @@
 npx lint-staged
-npx git-secrets --scan || echo "Warning: git-secrets not installed"
+
+# Secret scanning — gitleaks is REQUIRED (not optional like git-secrets was)
+if ! command -v gitleaks &>/dev/null; then
+  echo ""
+  echo "ERROR: gitleaks is not installed. Secret scanning is required."
+  echo ""
+  echo "Install:"
+  echo "  Linux:   curl -sSfL https://github.com/gitleaks/gitleaks/releases/latest/download/gitleaks_8.24.0_linux_x64.tar.gz | sudo tar -xz -C /usr/local/bin gitleaks"
+  echo "  macOS:   brew install gitleaks"
+  echo "  Windows: winget install gitleaks"
+  echo ""
+  exit 1
+fi
+gitleaks git --pre-commit --redact --staged --verbose
@@ -4,11 +4,19 @@ when:

 variables:
   - &node_image "node:20-alpine"
+  - &gitleaks_image "ghcr.io/gitleaks/gitleaks:v8.24.0"
   - &install_deps |
       corepack enable
       npm ci --ignore-scripts

 steps:
+  # Secret scanning (runs in parallel with install, no deps)
+  secret-scan:
+    image: *gitleaks_image
+    commands:
+      - gitleaks git --redact --verbose --log-opts="HEAD~1..HEAD"
+    depends_on: []
+
   install:
     image: *node_image
     commands:

@@ -65,3 +73,4 @@ steps:
       - typecheck
       - test
       - security-audit
+      - secret-scan
@@ -1,2 +1,15 @@
 npx lint-staged
-npx git-secrets --scan || echo "Warning: git-secrets not installed"
+
+# Secret scanning — gitleaks is REQUIRED (not optional like git-secrets was)
+if ! command -v gitleaks &>/dev/null; then
+  echo ""
+  echo "ERROR: gitleaks is not installed. Secret scanning is required."
+  echo ""
+  echo "Install:"
+  echo "  Linux:   curl -sSfL https://github.com/gitleaks/gitleaks/releases/latest/download/gitleaks_8.24.0_linux_x64.tar.gz | sudo tar -xz -C /usr/local/bin gitleaks"
+  echo "  macOS:   brew install gitleaks"
+  echo "  Windows: winget install gitleaks"
+  echo ""
+  exit 1
+fi
+gitleaks git --pre-commit --redact --staged --verbose
@@ -4,11 +4,19 @@ when:

 variables:
   - &node_image "node:20-alpine"
+  - &gitleaks_image "ghcr.io/gitleaks/gitleaks:v8.24.0"
   - &install_deps |
       corepack enable
       npm ci --ignore-scripts

 steps:
+  # Secret scanning (runs in parallel with install, no deps)
+  secret-scan:
+    image: *gitleaks_image
+    commands:
+      - gitleaks git --redact --verbose --log-opts="HEAD~1..HEAD"
+    depends_on: []
+
   install:
     image: *node_image
     commands:

@@ -65,3 +73,4 @@ steps:
       - typecheck
       - test
       - security-audit
+      - secret-scan
@@ -1,2 +1,15 @@
 npx lint-staged
-npx git-secrets --scan || echo "Warning: git-secrets not installed"
+
+# Secret scanning — gitleaks is REQUIRED (not optional like git-secrets was)
+if ! command -v gitleaks &>/dev/null; then
+  echo ""
+  echo "ERROR: gitleaks is not installed. Secret scanning is required."
+  echo ""
+  echo "Install:"
+  echo "  Linux:   curl -sSfL https://github.com/gitleaks/gitleaks/releases/latest/download/gitleaks_8.24.0_linux_x64.tar.gz | sudo tar -xz -C /usr/local/bin gitleaks"
+  echo "  macOS:   brew install gitleaks"
+  echo "  Windows: winget install gitleaks"
+  echo ""
+  exit 1
+fi
+gitleaks git --pre-commit --redact --staged --verbose
@@ -6,11 +6,19 @@ when:

 variables:
   - &node_image "node:20-alpine"
+  - &gitleaks_image "ghcr.io/gitleaks/gitleaks:v8.24.0"
   - &install_deps |
       corepack enable
       npm ci --ignore-scripts

 steps:
+  # Secret scanning (runs in parallel with install, no deps)
+  secret-scan:
+    image: *gitleaks_image
+    commands:
+      - gitleaks git --redact --verbose --log-opts="HEAD~1..HEAD"
+    depends_on: []
+
   # Stage 1: Install
   install:
     image: *node_image

@@ -64,3 +72,4 @@ steps:
       - typecheck
       - test
       - security-audit
+      - secret-scan
tools/woodpecker/_lib.sh (new file, 50 lines)
@@ -0,0 +1,50 @@
#!/usr/bin/env bash
#
# _lib.sh — Shared helpers for Woodpecker CI tool scripts
#
# Usage: source "$(dirname "${BASH_SOURCE[0]}")/_lib.sh"
#
# Requires: WOODPECKER_URL and WOODPECKER_TOKEN to be set (via load_credentials)

# Resolve owner/repo name to numeric repo ID (required by Woodpecker v3 API)
# Usage: REPO_ID=$(wp_resolve_repo_id "owner/repo")
wp_resolve_repo_id() {
  local full_name="$1"
  local response http_code body repo_id

  response=$(curl -sk -w "\n%{http_code}" \
    -H "Authorization: Bearer $WOODPECKER_TOKEN" \
    "${WOODPECKER_URL}/api/repos/lookup/${full_name}")

  http_code=$(echo "$response" | tail -n1)
  body=$(echo "$response" | sed '$d')

  if [[ "$http_code" != "200" ]]; then
    echo "Error: Failed to look up repo '${full_name}' (HTTP $http_code)" >&2
    if echo "$body" | jq -e '.message' &>/dev/null; then
      echo "  $(echo "$body" | jq -r '.message')" >&2
    fi
    return 1
  fi

  repo_id=$(echo "$body" | jq -r '.id // empty')
  if [[ -z "$repo_id" ]]; then
    echo "Error: Repo lookup returned no ID for '${full_name}'" >&2
    return 1
  fi

  echo "$repo_id"
}

# Auto-detect repo name from git remote origin
# Usage: REPO=$(wp_detect_repo)
wp_detect_repo() {
  local remote_url
  remote_url=$(git remote get-url origin 2>/dev/null || true)
  if [[ -n "$remote_url" ]]; then
    echo "$remote_url" | sed -E 's|\.git$||' | sed -E 's|.*[:/]([^/]+/[^/]+)$|\1|'
  else
    echo "Error: -r owner/repo required (not in a git repository)" >&2
    return 1
  fi
}
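The remote-URL normalization in `wp_detect_repo` can be checked standalone; both SSH and HTTPS remotes reduce to `owner/repo` (the same two-stage sed pipeline, wrapped in a throwaway helper for testing):

```shell
# Standalone check of wp_detect_repo's sed pipeline.
to_repo() {
  echo "$1" | sed -E 's|\.git$||' | sed -E 's|.*[:/]([^/]+/[^/]+)$|\1|'
}
to_repo "git@git.mosaicstack.dev:mosaic/quality-rails.git"   # prints mosaic/quality-rails
to_repo "https://git.mosaicstack.dev/mosaic/quality-rails"   # prints mosaic/quality-rails
```

The first sed strips a trailing `.git`; the second keeps only the last two path segments, which works because the capture group requires exactly one slash.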
pipeline-list.sh
@@ -2,62 +2,55 @@
 #
 # pipeline-list.sh — List Woodpecker CI pipelines
 #
-# Usage: pipeline-list.sh [-r owner/repo] [-l limit] [-f format]
+# Usage: pipeline-list.sh [-r owner/repo] [-l limit] [-f format] [-a instance]
 #
 # Options:
 #   -r repo      Repository in owner/repo format (default: current repo)
 #   -l limit     Number of pipelines to show (default: 20)
 #   -f format    Output format: table (default), json
-#   -h           Show this help
+#   -a instance  Woodpecker instance name (e.g. usc, mosaic)
+#   -h           Show this help
 #
-# Requires: woodpecker.url and woodpecker.token in credentials.json
+# Requires: woodpecker credentials in credentials.json
 set -euo pipefail
 
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-# Check if woodpecker credentials exist before loading
-CRED_FILE="${MOSAIC_CREDENTIALS_FILE:-$HOME/src/jarvis-brain/credentials.json}"
-if ! jq -e '.woodpecker.token // empty | select(. != "")' "$CRED_FILE" &>/dev/null; then
-    echo "Error: Woodpecker API token not configured in credentials.json" >&2
-    echo "" >&2
-    echo "To configure:" >&2
-    echo "  1. Get your token from Woodpecker CI → User Settings → API" >&2
-    echo "  2. Add to credentials.json:" >&2
-    echo '     "woodpecker": {"url": "https://ci.mosaicstack.dev", "token": "YOUR_TOKEN"}' >&2
-    exit 1
-fi
-
-load_credentials woodpecker
-
+source "$(dirname "${BASH_SOURCE[0]}")/_lib.sh"
 REPO=""
 LIMIT=20
 FORMAT="table"
+WP_INSTANCE=""
 
-while getopts "r:l:f:h" opt; do
+while getopts "r:l:f:a:h" opt; do
     case $opt in
         r) REPO="$OPTARG" ;;
         l) LIMIT="$OPTARG" ;;
         f) FORMAT="$OPTARG" ;;
+        a) WP_INSTANCE="$OPTARG" ;;
         h) head -14 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-        *) echo "Usage: $0 [-r owner/repo] [-l limit] [-f format]" >&2; exit 1 ;;
+        *) echo "Usage: $0 [-r owner/repo] [-l limit] [-f format] [-a instance]" >&2; exit 1 ;;
     esac
 done
 
+if [[ -n "$WP_INSTANCE" ]]; then
+    load_credentials "woodpecker-${WP_INSTANCE}"
+else
+    load_credentials woodpecker
+fi
+
 # Auto-detect repo from git remote if not specified
 if [[ -z "$REPO" ]]; then
-    remote_url=$(git remote get-url origin 2>/dev/null || true)
-    if [[ -n "$remote_url" ]]; then
-        REPO=$(echo "$remote_url" | sed -E 's|.*[:/]([^/]+/[^/]+?)(\.git)?$|\1|')
-    else
-        echo "Error: -r owner/repo required (not in a git repository)" >&2
-        exit 1
-    fi
+    REPO=$(wp_detect_repo) || exit 1
 fi
 
+# Resolve owner/repo to numeric ID (Woodpecker v3 API)
+REPO_ID=$(wp_resolve_repo_id "$REPO") || exit 1
+
 response=$(curl -sk -w "\n%{http_code}" \
     -H "Authorization: Bearer $WOODPECKER_TOKEN" \
-    "${WOODPECKER_URL}/api/repos/${REPO}/pipelines?per_page=${LIMIT}")
+    "${WOODPECKER_URL}/api/repos/${REPO_ID}/pipelines?per_page=${LIMIT}")
 
 http_code=$(echo "$response" | tail -n1)
 body=$(echo "$response" | sed '$d')
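All three scripts capture the HTTP status alongside the response body with the same pattern: `curl -w "\n%{http_code}"` appends the status code on its own final line, then `tail -n1` and `sed '$d'` split the two apart. A standalone sketch of the split, using `printf` to simulate the curl output (the JSON payload is invented):

```shell
# Simulate what curl -sk -w "\n%{http_code}" produces: the response body,
# then the status code appended on its own final line.
response=$(printf '%s\n%s' '[{"number": 42, "status": "success"}]' '200')

http_code=$(echo "$response" | tail -n1)   # last line: the status code
body=$(echo "$response" | sed '$d')        # everything except the last line

echo "$http_code"   # 200
echo "$body"        # [{"number": 42, "status": "success"}]
```

The leading `\n` in the write-out format matters: without it, the status code would be glued onto the body's last line and `tail -n1` would return both.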
pipeline-status.sh
@@ -2,76 +2,78 @@
 #
 # pipeline-status.sh — Check Woodpecker CI pipeline status
 #
-# Usage: pipeline-status.sh [-r owner/repo] [-n number] [-f format]
+# Usage: pipeline-status.sh [-r owner/repo] [-n number] [-f format] [-a instance]
 #
 # Options:
 #   -r repo      Repository in owner/repo format (default: current repo)
 #   -n number    Pipeline number (default: latest)
 #   -f format    Output format: table (default), json
-#   -h           Show this help
+#   -a instance  Woodpecker instance name (e.g. usc, mosaic)
+#   -h           Show this help
 #
-# Requires: woodpecker.url and woodpecker.token in credentials.json
+# Requires: woodpecker credentials in credentials.json
 set -euo pipefail
 
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-CRED_FILE="${MOSAIC_CREDENTIALS_FILE:-$HOME/src/jarvis-brain/credentials.json}"
-if ! jq -e '.woodpecker.token // empty | select(. != "")' "$CRED_FILE" &>/dev/null; then
-    echo "Error: Woodpecker API token not configured in credentials.json" >&2
-    echo "See: ~/.config/mosaic/tools/woodpecker/README.md" >&2
-    exit 1
-fi
-
-load_credentials woodpecker
-
+source "$(dirname "${BASH_SOURCE[0]}")/_lib.sh"
 REPO=""
 NUMBER=""
 FORMAT="table"
+WP_INSTANCE=""
 
-while getopts "r:n:f:h" opt; do
+while getopts "r:n:f:a:h" opt; do
     case $opt in
         r) REPO="$OPTARG" ;;
         n) NUMBER="$OPTARG" ;;
         f) FORMAT="$OPTARG" ;;
+        a) WP_INSTANCE="$OPTARG" ;;
         h) head -14 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-        *) echo "Usage: $0 [-r owner/repo] [-n number] [-f format]" >&2; exit 1 ;;
+        *) echo "Usage: $0 [-r owner/repo] [-n number] [-f format] [-a instance]" >&2; exit 1 ;;
     esac
 done
 
+if [[ -n "$WP_INSTANCE" ]]; then
+    load_credentials "woodpecker-${WP_INSTANCE}"
+else
+    load_credentials woodpecker
+fi
+
 if [[ -z "$REPO" ]]; then
-    remote_url=$(git remote get-url origin 2>/dev/null || true)
-    if [[ -n "$remote_url" ]]; then
-        REPO=$(echo "$remote_url" | sed -E 's|.*[:/]([^/]+/[^/]+?)(\.git)?$|\1|')
-    else
-        echo "Error: -r owner/repo required (not in a git repository)" >&2
+    REPO=$(wp_detect_repo) || exit 1
+fi
+
+# Resolve owner/repo to numeric ID (Woodpecker v3 API)
+REPO_ID=$(wp_resolve_repo_id "$REPO") || exit 1
+
+_wp_fetch() {
+    local ep="$1"
+    local resp http_code body
+    resp=$(curl -sk -w "\n%{http_code}" \
+        -H "Authorization: Bearer $WOODPECKER_TOKEN" \
+        "$ep")
+    http_code=$(echo "$resp" | tail -n1)
+    body=$(echo "$resp" | sed '$d')
+    if [[ "$http_code" != "200" ]]; then
+        echo "Error: HTTP $http_code from $ep" >&2
+        return 1
+    fi
+    echo "$body"
+}
+
+if [[ -z "$NUMBER" ]]; then
+    # Get latest pipeline number from list, then fetch full detail
+    list_body=$(_wp_fetch "${WOODPECKER_URL}/api/repos/${REPO_ID}/pipelines?per_page=1") || exit 1
+    NUMBER=$(echo "$list_body" | jq -r '.[0].number // empty')
+    if [[ -z "$NUMBER" ]]; then
+        echo "Error: No pipelines found" >&2
         exit 1
     fi
 fi
 
-if [[ -z "$NUMBER" ]]; then
-    # Get latest pipeline
-    endpoint="${WOODPECKER_URL}/api/repos/${REPO}/pipelines?per_page=1"
-else
-    endpoint="${WOODPECKER_URL}/api/repos/${REPO}/pipelines/${NUMBER}"
-fi
-
-response=$(curl -sk -w "\n%{http_code}" \
-    -H "Authorization: Bearer $WOODPECKER_TOKEN" \
-    "$endpoint")
-
-http_code=$(echo "$response" | tail -n1)
-body=$(echo "$response" | sed '$d')
-
-if [[ "$http_code" != "200" ]]; then
-    echo "Error: Failed to get pipeline status (HTTP $http_code)" >&2
-    exit 1
-fi
-
-# If we got a list, extract the first one
-if [[ -z "$NUMBER" ]]; then
-    body=$(echo "$body" | jq '.[0]')
-fi
-
+# Always fetch the single-pipeline endpoint (includes workflows/steps)
+body=$(_wp_fetch "${WOODPECKER_URL}/api/repos/${REPO_ID}/pipelines/${NUMBER}") || exit 1
+
 if [[ "$FORMAT" == "json" ]]; then
     echo "$body" | jq '.'
@@ -81,6 +83,7 @@ fi
 echo "Pipeline Status"
 echo "==============="
 echo "$body" | jq -r '
+    def ts: if . and . > 0 then todate else "—" end;
     "  Number:   \(.number)\n" +
     "  Status:   \(.status)\n" +
     "  Branch:   \(.branch)\n" +
@@ -88,6 +91,28 @@ echo "$body" | jq -r '
     "  Commit:   \(.commit[:12])\n" +
     "  Message:  \(.message | split("\n")[0])\n" +
     "  Author:   \(.author)\n" +
-    "  Started:  \(.started_at // "pending")\n" +
-    "  Finished: \(.finished_at // "running")"
+    "  Started:  \(.started | ts)\n" +
+    "  Finished: \(.finished | ts)"
 '
 
+# Show step-level details if workflows exist
+has_workflows=$(echo "$body" | jq 'has("workflows") and (.workflows | length > 0)')
+if [[ "$has_workflows" == "true" ]]; then
+    echo ""
+    echo "Steps"
+    echo "-----"
+    echo "$body" | jq -r '
+        .workflows[] | .children[]? |
+        select(.type != "clone") |
+        "  " +
+        (if .state == "success" then "OK"
+         elif .state == "failure" then "FAIL"
+         elif .state == "running" then "RUN"
+         elif .state == "skipped" then "SKIP"
+         elif .state == "pending" then "WAIT"
+         else .state end) +
+        "  " + .name +
+        (if .error and .error != "" then " (" + .error + ")" else "" end) +
+        (if .exit_code and .exit_code != 0 then " [exit " + (.exit_code | tostring) + "]" else "" end)
+    '
+fi
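The `def ts:` helper added to pipeline-status.sh converts Woodpecker's epoch-second `started`/`finished` fields to ISO-8601 dates, and falls back to a dash for zero or missing values. Note that in jq only `false` and `null` are falsy, so `0` passes the `. and` guard; the `. > 0` comparison is what actually catches the not-yet-started case. A sketch with an invented payload:

```shell
# Invented payload: a pipeline that started but has not finished (finished=0).
echo '{"started": 1700000000, "finished": 0}' | jq -r '
    def ts: if . and . > 0 then todate else "—" end;
    "Started:  \(.started | ts)\nFinished: \(.finished | ts)"
'
# Started:  2023-11-14T22:13:20Z
# Finished: —
```

`todate` interprets a number as Unix epoch seconds and renders UTC, which is why no timezone handling is needed in the script.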
pipeline-trigger.sh
@@ -2,57 +2,55 @@
 #
 # pipeline-trigger.sh — Trigger a Woodpecker CI pipeline
 #
-# Usage: pipeline-trigger.sh [-r owner/repo] [-b branch]
+# Usage: pipeline-trigger.sh [-r owner/repo] [-b branch] [-a instance]
 #
 # Options:
 #   -r repo      Repository in owner/repo format (default: current repo)
 #   -b branch    Branch to build (default: main)
-#   -h           Show this help
+#   -a instance  Woodpecker instance name (e.g. usc, mosaic)
+#   -h           Show this help
 #
-# Requires: woodpecker.url and woodpecker.token in credentials.json
+# Requires: woodpecker credentials in credentials.json
 set -euo pipefail
 
 MOSAIC_HOME="${MOSAIC_HOME:-$HOME/.config/mosaic}"
 source "$MOSAIC_HOME/tools/_lib/credentials.sh"
-CRED_FILE="${MOSAIC_CREDENTIALS_FILE:-$HOME/src/jarvis-brain/credentials.json}"
-if ! jq -e '.woodpecker.token // empty | select(. != "")' "$CRED_FILE" &>/dev/null; then
-    echo "Error: Woodpecker API token not configured in credentials.json" >&2
-    echo "See: ~/.config/mosaic/tools/woodpecker/README.md" >&2
-    exit 1
-fi
-
-load_credentials woodpecker
-
+source "$(dirname "${BASH_SOURCE[0]}")/_lib.sh"
 REPO=""
 BRANCH="main"
+WP_INSTANCE=""
 
-while getopts "r:b:h" opt; do
+while getopts "r:b:a:h" opt; do
     case $opt in
         r) REPO="$OPTARG" ;;
         b) BRANCH="$OPTARG" ;;
-        h) head -12 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
-        *) echo "Usage: $0 [-r owner/repo] [-b branch]" >&2; exit 1 ;;
+        a) WP_INSTANCE="$OPTARG" ;;
+        h) head -14 "$0" | grep "^#" | sed 's/^# \?//'; exit 0 ;;
+        *) echo "Usage: $0 [-r owner/repo] [-b branch] [-a instance]" >&2; exit 1 ;;
     esac
 done
 
-if [[ -z "$REPO" ]]; then
-    remote_url=$(git remote get-url origin 2>/dev/null || true)
-    if [[ -n "$remote_url" ]]; then
-        REPO=$(echo "$remote_url" | sed -E 's|.*[:/]([^/]+/[^/]+?)(\.git)?$|\1|')
-    else
-        echo "Error: -r owner/repo required (not in a git repository)" >&2
-        exit 1
-    fi
+if [[ -n "$WP_INSTANCE" ]]; then
+    load_credentials "woodpecker-${WP_INSTANCE}"
+else
+    load_credentials woodpecker
 fi
 
+if [[ -z "$REPO" ]]; then
+    REPO=$(wp_detect_repo) || exit 1
+fi
+
+# Resolve owner/repo to numeric ID (Woodpecker v3 API)
+REPO_ID=$(wp_resolve_repo_id "$REPO") || exit 1
+
 echo "Triggering pipeline for $REPO on branch $BRANCH..."
 
 response=$(curl -sk -w "\n%{http_code}" -X POST \
     -H "Authorization: Bearer $WOODPECKER_TOKEN" \
     -H "Content-Type: application/json" \
     -d "$(jq -n --arg b "$BRANCH" '{branch: $b}')" \
-    "${WOODPECKER_URL}/api/repos/${REPO}/pipelines")
+    "${WOODPECKER_URL}/api/repos/${REPO_ID}/pipelines")
 
 http_code=$(echo "$response" | tail -n1)
 body=$(echo "$response" | sed '$d')
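pipeline-trigger.sh builds its POST body with `jq -n --arg` rather than string interpolation, so branch names containing quotes or other JSON-special characters are escaped correctly. A standalone sketch (the branch name is invented; `-c` is added here only to get compact one-line output):

```shell
# A deliberately awkward branch name to show the escaping jq provides.
BRANCH='feature/add-login "beta"'

# jq -n starts from no input; --arg binds $b as a string and escapes it.
jq -nc --arg b "$BRANCH" '{branch: $b}'
# {"branch":"feature/add-login \"beta\""}
```

Interpolating `$BRANCH` directly into a `-d '{"branch": "..."}'` string would instead produce invalid JSON for such names, so this is the safer pattern for any user-supplied value.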