# Compare commits

`21c045559d...docs/insta` — 169 commits:

d6117f1a4a, a4c94d9a90, cee838d22e, 732f8a49cf, be917e2496, cd8b1f666d, 8fa5995bde, 25cada7735, be6553101c, 417805f330, 2472ce52e8, 597eb232d7, afe997db82, b9d464de61, 872c124581, a531029c5b, 35ab619bd0, 831193cdd8, df460d5a49, 119ff0eb1b, 3abd63ea5c, 641e4604d5, 9b5ecc0171, a00325da0e, 4ebce3422d, 751e0ee330, 54b2920ef3, 5917016509, 7b4f1d249d, 5425f9268e, febd866098, 2446593fff, 651426cf2e, cf46f6e0ae, 6f15a84ccf, c39433c361, 257796ce87, 2357602f50, 1230f6b984, 14b775f1b9, c7691d9807, 9a53d55678, 31008ef7ff, 621ab260c0, 2b1840214e, 5cfccc2ead, 774b76447d, 80994bdc8e, 2e31626f87, 255ba46a4d, 10285933a0, 543388e18b, 07a1f5d594, c6fc090c98, 9723b6b948, c0d0fd44b7, 30c0fb1308, 26fac4722f, e3f64c79d9, cbd5e8c626, 7560c7dee7, 982a0e8f83, fc7fa11923, 86d6c214fe, 39ccba95d0, 202e375f41, d0378c5723, d6f04a0757, afedb8697e, 1274df7ffc, 1b4767bd8b, 0b0fe10b37, acfb31f8f6, fd83bd4f2d, ce3ca1dbd1, 95e7b071d4, d4c5797a65, 70a51ba711, db8023bdbb, 9e597ecf87, a23c117ea4, 0cf80dab8c, 04a80fb9ba, 626adac363, 35fbd88a1d, 381b0eed7b, 25383ea645, e7db9ddf98, 7bb878718d, 46a31d4e71, e128a7a322, 27b1898ec6, d19ef45bb0, 5e852df6c3, e0eca771c6, 9d22ef4cc9, 41961a6980, e797676a02, 05d61e62be, 73043773d8, 0be9729e40, e83674ac51, a6e59bf829, e46f0641f6, 07efaa9580, 361fece023, 80e69016b0, e084a88a9d, 990a88362f, ea9782b2dc, 8efbaf100e, 15830e2f2a, 04db8591af, 785d30e065, e57a10913d, 0d12471868, ea371d760d, 3b9104429b, 8a83aed9b1, 2f68237046, 45f5b9062e, 147f5f1bec, f05b198882, d0a484cbb7, 6e6ee37da0, 53199122d8, b38cfac760, f3cb3e6852, e599f5fe38, 6357a3fc9c, 92998e6e65, 2394a2a0dd, 13934d4879, aa80013811, 2ee7206c3a, be74ca3cf9, 35123b21ce, 492dc18e14, a824a43ed1, 9b72f0ea14, d367f00077, 31a5751c6c, fa43989cd5, 1b317e8a0a, 316807581c, 3321d4575a, 85d4527701, 47b7509288, 34fad9da81, 48be0aa195, f544cc65d2, 41e8f91b2d, f161e3cb62, da41724490, 281e636e4d, 87dcd12a65, d3fdc4ff54, 9690aba0f5, 10689a30d2, 40c068fcbc, a9340adad7, 5cb72e8ca6, 48323e7d6e, 01259f56cd, 472f046a85, dfaf5a52df, 93b3322e45, a532fd43b2, 701bb69e6c
**`.env.example`** (12 changed lines)

```diff
@@ -23,8 +23,8 @@ VALKEY_URL=redis://localhost:6380
 
 # ─── Gateway ─────────────────────────────────────────────────────────────────
-# TCP port the NestJS/Fastify gateway listens on (default: 4000)
-GATEWAY_PORT=4000
+# TCP port the NestJS/Fastify gateway listens on (default: 14242)
+GATEWAY_PORT=14242
 
 # Comma-separated list of allowed CORS origins.
 # Must include the web app origin in production.
@@ -37,12 +37,12 @@ GATEWAY_CORS_ORIGIN=http://localhost:3000
 BETTER_AUTH_SECRET=change-me-to-a-random-32-char-string
 
 # Public base URL of the gateway (used by BetterAuth for callback URLs)
-BETTER_AUTH_URL=http://localhost:4000
+BETTER_AUTH_URL=http://localhost:14242
 
 
 # ─── Web App (Next.js) ───────────────────────────────────────────────────────
 # Public gateway URL — accessible from the browser, not just the server.
-NEXT_PUBLIC_GATEWAY_URL=http://localhost:4000
+NEXT_PUBLIC_GATEWAY_URL=http://localhost:14242
 
 
 # ─── OpenTelemetry ───────────────────────────────────────────────────────────
@@ -121,12 +121,12 @@ OTEL_SERVICE_NAME=mosaic-gateway
 # ─── Discord Plugin (optional — set DISCORD_BOT_TOKEN to enable) ─────────────
 # DISCORD_BOT_TOKEN=
 # DISCORD_GUILD_ID=
-# DISCORD_GATEWAY_URL=http://localhost:4000
+# DISCORD_GATEWAY_URL=http://localhost:14242
 
 
 # ─── Telegram Plugin (optional — set TELEGRAM_BOT_TOKEN to enable) ───────────
 # TELEGRAM_BOT_TOKEN=
-# TELEGRAM_GATEWAY_URL=http://localhost:4000
+# TELEGRAM_GATEWAY_URL=http://localhost:14242
 
 
 # ─── SSO Providers (add credentials to enable) ───────────────────────────────
```
**`.npmrc`** (2 changed lines)

```diff
@@ -1 +1 @@
-@mosaic:registry=https://git.mosaicstack.dev/api/packages/mosaic/npm/
+@mosaicstack:registry=https://git.mosaicstack.dev/api/packages/mosaicstack/npm/
```
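Since `.npmrc` is plain `key=value` lines, the scope rename above can be sanity-checked with basic text tools. The snippet below is a hypothetical stand-in for `npm config get @mosaicstack:registry`, run against a throwaway file:

```shell
# Write a demo .npmrc and extract the registry mapped to the scope.
rc=$(mktemp)
cat > "$rc" <<'EOF'
@mosaicstack:registry=https://git.mosaicstack.dev/api/packages/mosaicstack/npm/
EOF
grep '^@mosaicstack:registry=' "$rc" | cut -d= -f2-
# → https://git.mosaicstack.dev/api/packages/mosaicstack/npm/
```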
```diff
@@ -15,6 +15,7 @@ steps:
     image: *node_image
     commands:
       - corepack enable
+      - apk add --no-cache python3 make g++
       - pnpm install --frozen-lockfile
 
   typecheck:
@@ -44,18 +45,30 @@ steps:
 
   test:
     image: *node_image
+    environment:
+      DATABASE_URL: postgresql://mosaic:mosaic@postgres:5432/mosaic
     commands:
       - *enable_pnpm
+      # Install postgresql-client for pg_isready
+      - apk add --no-cache postgresql-client
+      # Wait up to 30s for postgres to be ready
+      - |
+        for i in $(seq 1 30); do
+          pg_isready -h postgres -p 5432 -U mosaic && break
+          echo "Waiting for postgres ($i/30)..."
+          sleep 1
+        done
+      # Run migrations (DATABASE_URL is set in environment above)
+      - pnpm --filter @mosaicstack/db run db:migrate
+      # Run all tests
       - pnpm test
     depends_on:
       - typecheck
 
   build:
     image: *node_image
     commands:
       - *enable_pnpm
       - pnpm build
     depends_on:
       - lint
       - format
       - test
 services:
   postgres:
+    image: pgvector/pgvector:pg17
     environment:
       POSTGRES_USER: mosaic
       POSTGRES_PASSWORD: mosaic
       POSTGRES_DB: mosaic
```
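The wait loop added to the test step generalizes to any readiness probe. A minimal standalone sketch, where the `wait_for` helper and its arguments are illustrative rather than part of the pipeline, and `true` stands in for `pg_isready -h postgres -p 5432 -U mosaic`:

```shell
# Retry a readiness command up to N times, pausing 1s between attempts.
wait_for() {
  cmd="$1"
  tries="${2:-30}"
  i=1
  while [ "$i" -le "$tries" ]; do
    if $cmd; then
      echo "ready after $i attempt(s)"
      return 0
    fi
    echo "Waiting ($i/$tries)..."
    sleep 1
    i=$((i + 1))
  done
  return 1
}

# 'true' succeeds immediately, so this reports readiness on attempt 1.
wait_for true 5
# → ready after 1 attempt(s)
```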
**`.woodpecker/publish.yml`** (new file, 140 lines)

```yaml
# Build, publish npm packages, and push Docker images
# Runs only on main branch push/tag

variables:
  - &node_image 'node:22-alpine'
  - &enable_pnpm 'corepack enable'

when:
  - branch: [main]
    event: [push, manual, tag]

steps:
  install:
    image: *node_image
    commands:
      - corepack enable
      - pnpm install --frozen-lockfile

  build:
    image: *node_image
    commands:
      - *enable_pnpm
      - pnpm build
    depends_on:
      - install

  publish-npm:
    image: *node_image
    environment:
      NPM_TOKEN:
        from_secret: gitea_token
    commands:
      - *enable_pnpm
      # Configure auth for Gitea npm registry
      - |
        echo "//git.mosaicstack.dev/api/packages/mosaicstack/npm/:_authToken=$NPM_TOKEN" > ~/.npmrc
        echo "@mosaicstack:registry=https://git.mosaicstack.dev/api/packages/mosaicstack/npm/" >> ~/.npmrc
      # Publish non-private packages to Gitea.
      #
      # The only publish failure we tolerate is "version already exists" —
      # that legitimately happens when only some packages were bumped in
      # the merge. Any other failure (registry 404, auth error, network
      # error) MUST fail the pipeline loudly: the previous
      # `|| echo "... continuing"` fallback silently hid a 404 from the
      # Gitea org rename and caused every @mosaicstack/* publish to fall
      # on the floor while CI still reported green.
      - |
        # Portable sh (Alpine ash) — avoid bashisms like PIPESTATUS.
        set +e
        pnpm --filter "@mosaicstack/*" --filter "!@mosaicstack/web" publish --no-git-checks --access public >/tmp/publish.log 2>&1
        EXIT=$?
        set -e
        cat /tmp/publish.log
        if [ "$EXIT" -eq 0 ]; then
          echo "[publish] all packages published successfully"
          exit 0
        fi
        # Hard registry / auth / network errors → fatal. Match npm's own
        # error lines specifically to avoid false positives on arbitrary
        # log text that happens to contain "E404" etc.
        if grep -qE "npm (error|ERR!) code (E404|E401|ENEEDAUTH|ECONNREFUSED|ETIMEDOUT|ENOTFOUND)" /tmp/publish.log; then
          echo "[publish] FATAL: registry/auth/network error detected — failing pipeline" >&2
          exit 1
        fi
        # Only tolerate the explicit "version already published" case.
        # npm returns this as E403 with body "You cannot publish over..."
        # or EPUBLISHCONFLICT depending on version.
        if grep -qE "EPUBLISHCONFLICT|You cannot publish over|previously published" /tmp/publish.log; then
          echo "[publish] some packages already at this version — continuing (non-fatal)"
          exit 0
        fi
        echo "[publish] FATAL: publish failed with unrecognized error — failing pipeline" >&2
        exit 1
    depends_on:
      - build
```
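The triage in `publish-npm` boils down to mapping an exit code plus a log file to one of three outcomes. The sketch below factors that into a hypothetical `classify_publish` helper that mirrors the same grep patterns (trimmed for brevity), exercised against a fabricated log line:

```shell
# Map (exit code, log file) to a publish outcome, mirroring the pipeline's
# ordering: success, then hard registry/auth errors, then the tolerated
# "already published" conflict, then everything else.
classify_publish() {
  code="$1"
  log="$2"
  if [ "$code" -eq 0 ]; then
    echo "published"
  elif grep -qE "npm (error|ERR!) code (E404|E401|ENEEDAUTH)" "$log"; then
    echo "fatal: registry/auth error"
  elif grep -qE "EPUBLISHCONFLICT|You cannot publish over" "$log"; then
    echo "already published (non-fatal)"
  else
    echo "fatal: unrecognized error"
  fi
}

# Fabricated log standing in for pnpm's captured output.
log=$(mktemp)
echo "npm error code EPUBLISHCONFLICT" > "$log"
classify_publish 1 "$log"
# → already published (non-fatal)
```

Keeping the classification in one function makes the ordering explicit: hard errors are checked before the tolerated conflict, so a 404 can never be mistaken for "already published".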
```yaml
  # TODO: Uncomment when ready to publish to npmjs.org
  # publish-npmjs:
  #   image: *node_image
  #   environment:
  #     NPM_TOKEN:
  #       from_secret: npmjs_token
  #   commands:
  #     - *enable_pnpm
  #     - apk add --no-cache jq bash
  #     - bash scripts/publish-npmjs.sh
  #   depends_on:
  #     - build
  #   when:
  #     - event: [tag]

  build-gateway:
    image: gcr.io/kaniko-project/executor:debug
    environment:
      REGISTRY_USER:
        from_secret: gitea_username
      REGISTRY_PASS:
        from_secret: gitea_password
      CI_COMMIT_BRANCH: ${CI_COMMIT_BRANCH}
      CI_COMMIT_TAG: ${CI_COMMIT_TAG}
      CI_COMMIT_SHA: ${CI_COMMIT_SHA}
    commands:
      - mkdir -p /kaniko/.docker
      - echo "{\"auths\":{\"git.mosaicstack.dev\":{\"username\":\"$REGISTRY_USER\",\"password\":\"$REGISTRY_PASS\"}}}" > /kaniko/.docker/config.json
      - |
        DESTINATIONS="--destination git.mosaicstack.dev/mosaicstack/mosaic-stack/gateway:sha-${CI_COMMIT_SHA:0:7}"
        if [ "$CI_COMMIT_BRANCH" = "main" ]; then
          DESTINATIONS="$DESTINATIONS --destination git.mosaicstack.dev/mosaicstack/mosaic-stack/gateway:latest"
        fi
        if [ -n "$CI_COMMIT_TAG" ]; then
          DESTINATIONS="$DESTINATIONS --destination git.mosaicstack.dev/mosaicstack/mosaic-stack/gateway:$CI_COMMIT_TAG"
        fi
        /kaniko/executor --context . --dockerfile docker/gateway.Dockerfile $DESTINATIONS
    depends_on:
      - build

  build-web:
    image: gcr.io/kaniko-project/executor:debug
    environment:
      REGISTRY_USER:
        from_secret: gitea_username
      REGISTRY_PASS:
        from_secret: gitea_password
      CI_COMMIT_BRANCH: ${CI_COMMIT_BRANCH}
      CI_COMMIT_TAG: ${CI_COMMIT_TAG}
      CI_COMMIT_SHA: ${CI_COMMIT_SHA}
    commands:
      - mkdir -p /kaniko/.docker
      - echo "{\"auths\":{\"git.mosaicstack.dev\":{\"username\":\"$REGISTRY_USER\",\"password\":\"$REGISTRY_PASS\"}}}" > /kaniko/.docker/config.json
      - |
        DESTINATIONS="--destination git.mosaicstack.dev/mosaicstack/mosaic-stack/web:sha-${CI_COMMIT_SHA:0:7}"
        if [ "$CI_COMMIT_BRANCH" = "main" ]; then
          DESTINATIONS="$DESTINATIONS --destination git.mosaicstack.dev/mosaicstack/mosaic-stack/web:latest"
        fi
        if [ -n "$CI_COMMIT_TAG" ]; then
          DESTINATIONS="$DESTINATIONS --destination git.mosaicstack.dev/mosaicstack/mosaic-stack/web:$CI_COMMIT_TAG"
        fi
        /kaniko/executor --context . --dockerfile docker/web.Dockerfile $DESTINATIONS
    depends_on:
      - build
```
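The tag-selection logic shared by `build-gateway` and `build-web` can be sketched as a standalone function; `build_destinations`, the repo path, and the sample values below are illustrative, not the pipeline's actual code:

```shell
# Build the kaniko --destination flag string: always a short-SHA tag,
# plus :latest on main, plus the git tag when one is set.
build_destinations() {
  repo="$1" sha="$2" branch="$3" tag="$4"
  dests="--destination $repo:sha-$sha"
  [ "$branch" = "main" ] && dests="$dests --destination $repo:latest"
  [ -n "$tag" ] && dests="$dests --destination $repo:$tag"
  echo "$dests"
}

build_destinations example.dev/org/web abc1234 main v1.2.0
# → --destination example.dev/org/web:sha-abc1234 --destination example.dev/org/web:latest --destination example.dev/org/web:v1.2.0
```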
**`AGENTS.md`** (12 changed lines)

```diff
@@ -21,11 +21,11 @@ Mosaic Stack is a self-hosted, multi-user AI agent platform. TypeScript monorepo
 | `apps/web` | Next.js dashboard | React 19, Tailwind |
 | `packages/types` | Shared TypeScript contracts | class-validator |
 | `packages/db` | Drizzle ORM schema + migrations | drizzle-orm, postgres |
-| `packages/auth` | BetterAuth configuration | better-auth, @mosaic/db |
-| `packages/brain` | Data layer (PG-backed) | @mosaic/db |
+| `packages/auth` | BetterAuth configuration | better-auth, @mosaicstack/db |
+| `packages/brain` | Data layer (PG-backed) | @mosaicstack/db |
 | `packages/queue` | Valkey task queue + MCP | ioredis |
-| `packages/coord` | Mission coordination | @mosaic/queue |
-| `packages/cli` | Unified CLI + Pi TUI | Ink, Pi SDK |
+| `packages/coord` | Mission coordination | @mosaicstack/queue |
+| `packages/mosaic` | Unified `mosaic` CLI + TUI | Ink, Pi SDK, commander |
 | `plugins/discord` | Discord channel plugin | discord.js |
 | `plugins/telegram` | Telegram channel plugin | Telegraf |
 
@@ -33,9 +33,9 @@ Mosaic Stack is a self-hosted, multi-user AI agent platform. TypeScript monorepo
 
 1. Gateway is the single API surface — all clients connect through it
 2. Pi SDK is ESM-only — gateway and CLI must use ESM
-3. Socket.IO typed events defined in `@mosaic/types` enforce compile-time contracts
+3. Socket.IO typed events defined in `@mosaicstack/types` enforce compile-time contracts
 4. OTEL auto-instrumentation loads before NestJS bootstrap
-5. BetterAuth manages auth tables; schema defined in `@mosaic/db`
+5. BetterAuth manages auth tables; schema defined in `@mosaicstack/db`
 6. Docker Compose provides PG (5433), Valkey (6380), OTEL Collector (4317/4318), Jaeger (16686)
 7. Explicit `@Inject()` decorators required in NestJS (tsx/esbuild doesn't emit decorator metadata)
```
**`CLAUDE.md`** (10 changed lines)

````diff
@@ -10,7 +10,7 @@ Self-hosted, multi-user AI agent platform. TypeScript monorepo.
 - **Web**: Next.js 16 + React 19 (`apps/web`)
 - **ORM**: Drizzle ORM + PostgreSQL 17 + pgvector (`packages/db`)
 - **Auth**: BetterAuth (`packages/auth`)
-- **Agent**: Pi SDK (`packages/agent`, `packages/cli`)
+- **Agent**: Pi SDK (`packages/agent`, `packages/mosaic`)
 - **Queue**: Valkey 8 (`packages/queue`)
 - **Build**: pnpm workspaces + Turborepo
 - **CI**: Woodpecker CI
@@ -26,13 +26,13 @@ pnpm test # Vitest (all packages)
 pnpm build # Build all packages
 
 # Database
-pnpm --filter @mosaic/db db:push # Push schema to PG (dev)
-pnpm --filter @mosaic/db db:generate # Generate migrations
-pnpm --filter @mosaic/db db:migrate # Run migrations
+pnpm --filter @mosaicstack/db db:push # Push schema to PG (dev)
+pnpm --filter @mosaicstack/db db:generate # Generate migrations
+pnpm --filter @mosaicstack/db db:migrate # Run migrations
 
 # Dev
 docker compose up -d # Start PG, Valkey, OTEL, Jaeger
-pnpm --filter @mosaic/gateway exec tsx src/main.ts # Start gateway
+pnpm --filter @mosaicstack/gateway exec tsx src/main.ts # Start gateway
 ```
 
 ## Conventions
````
**`README.md`** (new file, 348 lines)

# Mosaic Stack

Self-hosted, multi-user AI agent platform. One config, every runtime, same standards.

Mosaic gives you a unified launcher for Claude Code, Codex, OpenCode, and Pi — injecting consistent system prompts, guardrails, skills, and mission context into every session. A NestJS gateway provides the API surface, a Next.js dashboard gives you the UI, and a plugin system connects Discord, Telegram, and more.

## Quick Install

```bash
bash <(curl -fsSL https://git.mosaicstack.dev/mosaicstack/mosaic-stack/raw/branch/main/tools/install.sh)
```

The installer auto-launches the setup wizard, which walks you through gateway install and verification. Flags for non-interactive use:

```bash
bash <(curl -fsSL …) --yes                  # Accept all defaults
bash <(curl -fsSL …) --yes --no-auto-launch # Install only, skip wizard
```

This installs both components:

| Component | What | Where |
| --- | --- | --- |
| **Framework** | Bash launcher, guides, runtime configs, tools, skills | `~/.config/mosaic/` |
| **@mosaicstack/mosaic** | Unified `mosaic` CLI — TUI, gateway client, wizard, auto-updater | `~/.npm-global/bin/` |

After install, the wizard runs automatically, or you can invoke it manually:

```bash
mosaic wizard # Full guided setup (gateway install → verify)
```

### Requirements

- Node.js ≥ 20
- npm (for the global @mosaicstack/mosaic install)
- One or more runtimes: [Claude Code](https://docs.anthropic.com/en/docs/claude-code), [Codex](https://github.com/openai/codex), [OpenCode](https://opencode.ai), or [Pi](https://github.com/mariozechner/pi-coding-agent)

## Usage

### Launching Agent Sessions

```bash
mosaic pi       # Launch Pi with Mosaic injection
mosaic claude   # Launch Claude Code with Mosaic injection
mosaic codex    # Launch Codex with Mosaic injection
mosaic opencode # Launch OpenCode with Mosaic injection

mosaic yolo claude # Claude with dangerous-permissions mode
mosaic yolo pi     # Pi in yolo mode
```

The launcher verifies your config, checks for `SOUL.md`, injects your `AGENTS.md` standards into the runtime, and forwards all arguments.

### TUI & Gateway

```bash
mosaic tui           # Interactive TUI connected to the gateway
mosaic gateway login # Authenticate with a gateway instance
mosaic sessions list # List active agent sessions
```

### Gateway Management

```bash
mosaic gateway install              # Install and configure the gateway service
mosaic gateway verify               # Post-install health check
mosaic gateway login                # Authenticate and store a session token
mosaic gateway config rotate-token  # Rotate your API token
mosaic gateway config recover-token # Recover a token via BetterAuth cookie
```

If you already have a gateway account but no token, use `mosaic gateway config recover-token` to retrieve one without recreating your account.

### Configuration

```bash
mosaic config show            # Print full config as JSON
mosaic config get <key>       # Read a specific key
mosaic config set <key> <val> # Write a key
mosaic config edit            # Open config in $EDITOR
mosaic config path            # Print config file path
```

### Management

```bash
mosaic doctor           # Health audit — detect drift and missing files
mosaic sync             # Sync skills from canonical source
mosaic update           # Check for and install CLI updates
mosaic wizard           # Full guided setup wizard
mosaic bootstrap <path> # Bootstrap a repo with Mosaic standards
mosaic coord init       # Initialize a new orchestration mission
mosaic prdy init        # Create a PRD via guided session
```

### Sub-package Commands

Each Mosaic sub-package exposes its API surface through the unified CLI:

```bash
# User management
mosaic auth users list
mosaic auth users create
mosaic auth sso

# Agent brain (projects, missions, tasks)
mosaic brain projects
mosaic brain missions
mosaic brain tasks
mosaic brain conversations

# Agent forge pipeline
mosaic forge run
mosaic forge status
mosaic forge resume
mosaic forge personas

# Structured logging
mosaic log tail
mosaic log search
mosaic log export
mosaic log level

# MACP protocol
mosaic macp tasks
mosaic macp submit
mosaic macp gate
mosaic macp events

# Agent memory
mosaic memory search
mosaic memory stats
mosaic memory insights
mosaic memory preferences

# Task queue (Valkey)
mosaic queue list
mosaic queue stats
mosaic queue pause
mosaic queue resume
mosaic queue jobs
mosaic queue drain

# Object storage
mosaic storage status
mosaic storage tier
mosaic storage export
mosaic storage import
mosaic storage migrate
```

### Telemetry

```bash
# Local observability (OTEL / Jaeger)
mosaic telemetry local status
mosaic telemetry local tail
mosaic telemetry local jaeger

# Remote telemetry (dry-run by default)
mosaic telemetry status
mosaic telemetry opt-in
mosaic telemetry opt-out
mosaic telemetry test
mosaic telemetry upload # Dry-run unless opted in
```

Consent state is persisted in config. Remote upload is a no-op until you run `mosaic telemetry opt-in`.
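A minimal sketch of how such a consent gate can work, assuming a marker file under the config directory; the path, file format, and `telemetry_upload` helper are illustrative, not Mosaic's actual layout:

```shell
# Upload is a no-op unless an explicit opt-in marker exists in config.
telemetry_upload() {
  consent_file="$1"
  if [ "$(cat "$consent_file" 2>/dev/null)" = "opt-in" ]; then
    echo "uploading telemetry"
  else
    echo "dry-run: not opted in, nothing sent"
  fi
}

demo=$(mktemp -d)
telemetry_upload "$demo/consent"   # no marker yet → dry-run
echo "opt-in" > "$demo/consent"    # what `telemetry opt-in` would persist
telemetry_upload "$demo/consent"
```

The important property is that the default (missing or unreadable marker) is the safe path: nothing leaves the machine.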
## Development

### Prerequisites

- Node.js ≥ 20
- pnpm 10.6+
- Docker & Docker Compose

### Setup

```bash
git clone git@git.mosaicstack.dev:mosaicstack/mosaic-stack.git
cd mosaic-stack

# Start infrastructure (Postgres, Valkey, Jaeger)
docker compose up -d

# Install dependencies
pnpm install

# Run migrations
pnpm --filter @mosaicstack/db run db:migrate

# Start all services in dev mode
pnpm dev
```

### Infrastructure

Docker Compose provides:

| Service | Port | Purpose |
| --- | --- | --- |
| PostgreSQL (pgvector) | 5433 | Primary database |
| Valkey | 6380 | Task queue + caching |
| Jaeger | 16686 | Distributed tracing UI |
| OTEL Collector | 4317/4318 | Telemetry ingestion |

### Quality Gates

```bash
pnpm typecheck    # TypeScript type checking (all packages)
pnpm lint         # ESLint (all packages)
pnpm test         # Vitest (all packages)
pnpm format:check # Prettier check
pnpm format       # Prettier auto-fix
```

### CI

Woodpecker CI runs on every push:

- `pnpm install --frozen-lockfile`
- Database migration against a fresh Postgres
- `pnpm test` (Turbo-orchestrated across all packages)

npm packages are published to the Gitea package registry on merges to main.

## Architecture

```
mosaic-stack/
├── apps/
│   ├── gateway/          NestJS API + WebSocket hub (Fastify, Socket.IO, OTEL)
│   └── web/              Next.js dashboard (React 19, Tailwind)
├── packages/
│   ├── mosaic/           Unified CLI — TUI, gateway client, wizard, sub-package commands
│   ├── types/            Shared TypeScript contracts (Socket.IO typed events)
│   ├── db/               Drizzle ORM schema + migrations (pgvector)
│   ├── auth/             BetterAuth configuration
│   ├── brain/            Data layer (PG-backed)
│   ├── queue/            Valkey task queue + MCP
│   ├── coord/            Mission coordination
│   ├── forge/            Multi-stage AI pipeline (intake → board → plan → code → review)
│   ├── macp/             MACP protocol — credential resolution, gate runner, events
│   ├── agent/            Agent session management
│   ├── memory/           Agent memory layer
│   ├── log/              Structured logging
│   ├── prdy/             PRD creation and validation
│   ├── quality-rails/    Quality templates (TypeScript, Next.js, monorepo)
│   └── design-tokens/    Shared design tokens
├── plugins/
│   ├── discord/          Discord channel plugin (discord.js)
│   ├── telegram/         Telegram channel plugin (Telegraf)
│   ├── macp/             OpenClaw MACP runtime plugin
│   └── mosaic-framework/ OpenClaw framework injection plugin
├── tools/
│   └── install.sh        Unified installer (framework + npm CLI, --yes / --no-auto-launch)
├── scripts/agent/        Agent session lifecycle scripts
├── docker-compose.yml    Dev infrastructure
└── .woodpecker/          CI pipeline configs
```

### Key Design Decisions

- **Gateway is the single API surface** — all clients (TUI, web, Discord, Telegram) connect through it
- **ESM everywhere** — `"type": "module"`, `.js` extensions in imports, NodeNext resolution
- **Socket.IO typed events** — defined in `@mosaicstack/types`, enforced at compile time
- **OTEL auto-instrumentation** — loads before NestJS bootstrap
- **Explicit `@Inject()` decorators** — required since tsx/esbuild doesn't emit decorator metadata

### Framework (`~/.config/mosaic/`)

The framework is the bash-based standards layer installed on every developer machine:

```
~/.config/mosaic/
├── AGENTS.md    ← Central standards (loaded into every runtime)
├── SOUL.md      ← Agent identity (name, style, guardrails)
├── USER.md      ← User profile (name, timezone, preferences)
├── TOOLS.md     ← Machine-level tool reference
├── bin/mosaic   ← Unified launcher (claude, codex, opencode, pi, yolo)
├── guides/      ← E2E delivery, orchestrator protocol, PRD, etc.
├── runtime/     ← Per-runtime configs (claude/, codex/, opencode/, pi/)
├── skills/      ← Universal skills (synced from agent-skills repo)
├── tools/       ← Tool suites (orchestrator, git, quality, prdy, etc.)
└── memory/      ← Persistent agent memory (preserved across upgrades)
```

### Forge Pipeline

Forge is a multi-stage AI pipeline for autonomous feature delivery:

```
Intake → Discovery → Board Review → Planning (3 stages) → Coding → Review → Remediation → Test → Deploy
```

Each stage has a dispatch mode (`exec` for research/review, `yolo` for coding), quality gates, and timeouts. The board review uses multiple AI personas (CEO, CTO, CFO, COO + specialists) to evaluate briefs before committing resources.

## Upgrading

Run the installer again — it handles upgrades automatically:

```bash
bash <(curl -fsSL https://git.mosaicstack.dev/mosaicstack/mosaic-stack/raw/branch/main/tools/install.sh)
```

Or use the CLI:

```bash
mosaic update         # Check + install CLI updates
mosaic update --check # Check only, don't install
```

The CLI also performs a background update check on every invocation (cached for 1 hour).
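One common way to implement an hourly-cached check is a timestamp stamp file; the sketch below is a hypothetical mechanism, not the CLI's actual implementation, and the stamp path and registry command are illustrative:

```shell
# Return 0 (do the check) only if the stamp is absent or older than 3600s;
# refresh the stamp when a check is performed.
should_check_updates() {
  stamp="$1" now="$2"
  last=$(cat "$stamp" 2>/dev/null || echo 0)
  if [ $((now - last)) -ge 3600 ]; then
    echo "$now" > "$stamp"
    return 0
  fi
  return 1
}

stamp=$(mktemp -u)   # nonexistent path → first run always checks
if should_check_updates "$stamp" "$(date +%s)"; then
  echo "checking registry for a newer version"
fi
# → checking registry for a newer version
```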
### Installer Flags

```bash
bash tools/install.sh --check          # Version check only
bash tools/install.sh --framework      # Framework only (skip npm CLI)
bash tools/install.sh --cli            # npm CLI only (skip framework)
bash tools/install.sh --ref v1.0       # Install from a specific git ref
bash tools/install.sh --yes            # Non-interactive, accept all defaults
bash tools/install.sh --no-auto-launch # Skip auto-launch of the wizard
```

## Contributing

```bash
# Create a feature branch
git checkout -b feat/my-feature

# Make changes, then verify
pnpm typecheck && pnpm lint && pnpm test && pnpm format:check

# Commit (husky runs lint-staged automatically)
git commit -m "feat: description of change"

# Push and create a PR
git push -u origin feat/my-feature
```

DTOs go in `*.dto.ts` files at module boundaries. Scratchpads (`docs/scratchpads/`) are mandatory for non-trivial tasks. See `AGENTS.md` for the full standards reference.

## License

Proprietary — all rights reserved.
```diff
@@ -1,9 +1,23 @@
 {
-  "name": "@mosaic/gateway",
-  "version": "0.0.0",
-  "private": true,
+  "name": "@mosaicstack/gateway",
+  "version": "0.0.6",
+  "repository": {
+    "type": "git",
+    "url": "https://git.mosaicstack.dev/mosaicstack/mosaic-stack.git",
+    "directory": "apps/gateway"
+  },
   "type": "module",
   "main": "dist/main.js",
+  "bin": {
+    "mosaic-gateway": "dist/main.js"
+  },
+  "files": [
+    "dist"
+  ],
+  "publishConfig": {
+    "registry": "https://git.mosaicstack.dev/api/packages/mosaicstack/npm/",
+    "access": "public"
+  },
   "scripts": {
     "build": "tsc",
     "dev": "tsx watch src/main.ts",
@@ -14,26 +28,28 @@
   "dependencies": {
     "@anthropic-ai/sdk": "^0.80.0",
     "@fastify/helmet": "^13.0.2",
-    "@mariozechner/pi-ai": "~0.57.1",
-    "@mariozechner/pi-coding-agent": "~0.57.1",
+    "@mariozechner/pi-ai": "^0.65.0",
+    "@mariozechner/pi-coding-agent": "^0.65.0",
     "@modelcontextprotocol/sdk": "^1.27.1",
-    "@mosaic/auth": "workspace:^",
-    "@mosaic/brain": "workspace:^",
-    "@mosaic/coord": "workspace:^",
-    "@mosaic/db": "workspace:^",
-    "@mosaic/discord-plugin": "workspace:^",
-    "@mosaic/log": "workspace:^",
-    "@mosaic/memory": "workspace:^",
-    "@mosaic/queue": "workspace:^",
-    "@mosaic/telegram-plugin": "workspace:^",
-    "@mosaic/types": "workspace:^",
+    "@mosaicstack/auth": "workspace:^",
+    "@mosaicstack/brain": "workspace:^",
+    "@mosaicstack/config": "workspace:^",
+    "@mosaicstack/coord": "workspace:^",
+    "@mosaicstack/db": "workspace:^",
+    "@mosaicstack/discord-plugin": "workspace:^",
+    "@mosaicstack/log": "workspace:^",
+    "@mosaicstack/memory": "workspace:^",
+    "@mosaicstack/queue": "workspace:^",
+    "@mosaicstack/storage": "workspace:^",
+    "@mosaicstack/telegram-plugin": "workspace:^",
+    "@mosaicstack/types": "workspace:^",
     "@nestjs/common": "^11.0.0",
     "@nestjs/core": "^11.0.0",
     "@nestjs/platform-fastify": "^11.0.0",
     "@nestjs/platform-socket.io": "^11.0.0",
     "@nestjs/throttler": "^6.5.0",
     "@nestjs/websockets": "^11.0.0",
-    "@opentelemetry/auto-instrumentations-node": "^0.71.0",
+    "@opentelemetry/auto-instrumentations-node": "^0.72.0",
     "@opentelemetry/exporter-metrics-otlp-http": "^0.213.0",
     "@opentelemetry/exporter-trace-otlp-http": "^0.213.0",
     "@opentelemetry/resources": "^2.6.0",
@@ -42,6 +58,7 @@
     "@opentelemetry/semantic-conventions": "^1.40.0",
     "@sinclair/typebox": "^0.34.48",
     "better-auth": "^1.5.5",
+    "bullmq": "^5.71.0",
     "class-transformer": "^0.5.1",
     "class-validator": "^0.15.1",
     "dotenv": "^17.3.1",
```
||||
@@ -12,7 +12,7 @@ import { BadRequestException, NotFoundException } from '@nestjs/common';
|
||||
import { describe, expect, it, vi, beforeEach } from 'vitest';
|
||||
import type { ConversationHistoryMessage } from '../agent/agent.service.js';
|
||||
import { ConversationsController } from '../conversations/conversations.controller.js';
|
||||
import type { Message } from '@mosaic/brain';
|
||||
import type { Message } from '@mosaicstack/brain';
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// Shared test data
|
||||
|
||||
@@ -18,13 +18,13 @@
 */

import { afterAll, beforeAll, beforeEach, describe, expect, it } from 'vitest';
import { createDb } from '@mosaic/db';
import { createConversationsRepo } from '@mosaic/brain';
import { createAgentsRepo } from '@mosaic/brain';
import { createPreferencesRepo, createInsightsRepo } from '@mosaic/memory';
import { users, conversations, messages, agents, preferences, insights } from '@mosaic/db';
import { eq } from '@mosaic/db';
import type { DbHandle } from '@mosaic/db';
import { createDb } from '@mosaicstack/db';
import { createConversationsRepo } from '@mosaicstack/brain';
import { createAgentsRepo } from '@mosaicstack/brain';
import { createPreferencesRepo, createInsightsRepo } from '@mosaicstack/memory';
import { users, conversations, messages, agents, preferences, insights } from '@mosaicstack/db';
import { eq } from '@mosaicstack/db';
import type { DbHandle } from '@mosaicstack/db';

// ─── Fixed IDs so the afterAll cleanup is deterministic ──────────────────────
377  apps/gateway/src/__tests__/session-hardening.test.ts  Normal file
@@ -0,0 +1,377 @@
/**
 * M5-008: Session hardening verification tests.
 *
 * Verifies:
 * 1. /model command switches model → session:info reflects updated modelId
 * 2. /agent command switches agent config → system prompt / agentName changes
 * 3. Session resume binds to a conversation (history injected via conversationHistory option)
 * 4. Session metrics track token usage and message count correctly
 */

import { describe, it, expect, vi, beforeEach } from 'vitest';
import type {
  AgentSession,
  AgentSessionOptions,
  ConversationHistoryMessage,
} from '../agent/agent.service.js';
import type { SessionInfoDto, SessionMetrics, SessionTokenMetrics } from '../agent/session.dto.js';
// ---------------------------------------------------------------------------
// Helpers — minimal AgentSession fixture
// ---------------------------------------------------------------------------

function makeMetrics(overrides?: Partial<SessionMetrics>): SessionMetrics {
  return {
    tokens: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
    modelSwitches: 0,
    messageCount: 0,
    lastActivityAt: new Date().toISOString(),
    ...overrides,
  };
}

function makeSession(overrides?: Partial<AgentSession>): AgentSession {
  return {
    id: 'session-001',
    provider: 'anthropic',
    modelId: 'claude-3-5-sonnet-20241022',
    piSession: {} as AgentSession['piSession'],
    listeners: new Set(),
    unsubscribe: vi.fn(),
    createdAt: Date.now(),
    promptCount: 0,
    channels: new Set(),
    skillPromptAdditions: [],
    sandboxDir: '/tmp',
    allowedTools: null,
    metrics: makeMetrics(),
    ...overrides,
  };
}

function sessionToInfo(session: AgentSession): SessionInfoDto {
  return {
    id: session.id,
    provider: session.provider,
    modelId: session.modelId,
    ...(session.agentName ? { agentName: session.agentName } : {}),
    createdAt: new Date(session.createdAt).toISOString(),
    promptCount: session.promptCount,
    channels: Array.from(session.channels),
    durationMs: Date.now() - session.createdAt,
    metrics: { ...session.metrics },
  };
}
// ---------------------------------------------------------------------------
// Replicated AgentService methods (tested in isolation without full DI setup)
// ---------------------------------------------------------------------------

function updateSessionModel(session: AgentSession, modelId: string): void {
  session.modelId = modelId;
  session.metrics.modelSwitches += 1;
  session.metrics.lastActivityAt = new Date().toISOString();
}

function applyAgentConfig(
  session: AgentSession,
  agentConfigId: string,
  agentName: string,
  modelId?: string,
): void {
  session.agentConfigId = agentConfigId;
  session.agentName = agentName;
  if (modelId) {
    updateSessionModel(session, modelId);
  }
}

function recordTokenUsage(session: AgentSession, tokens: SessionTokenMetrics): void {
  session.metrics.tokens.input += tokens.input;
  session.metrics.tokens.output += tokens.output;
  session.metrics.tokens.cacheRead += tokens.cacheRead;
  session.metrics.tokens.cacheWrite += tokens.cacheWrite;
  session.metrics.tokens.total += tokens.total;
  session.metrics.lastActivityAt = new Date().toISOString();
}

function recordMessage(session: AgentSession): void {
  session.metrics.messageCount += 1;
  session.metrics.lastActivityAt = new Date().toISOString();
}
// ---------------------------------------------------------------------------
// 1. /model command — switches model → session:info updated
// ---------------------------------------------------------------------------

describe('/model command — model switch reflected in session:info', () => {
  let session: AgentSession;

  beforeEach(() => {
    session = makeSession();
  });

  it('updates modelId when /model is called with a model name', () => {
    updateSessionModel(session, 'claude-opus-4-5-20251001');

    expect(session.modelId).toBe('claude-opus-4-5-20251001');
  });

  it('increments modelSwitches metric after /model command', () => {
    expect(session.metrics.modelSwitches).toBe(0);

    updateSessionModel(session, 'gpt-4o');
    expect(session.metrics.modelSwitches).toBe(1);

    updateSessionModel(session, 'claude-3-5-sonnet-20241022');
    expect(session.metrics.modelSwitches).toBe(2);
  });

  it('session:info DTO reflects the new modelId after switch', () => {
    updateSessionModel(session, 'claude-haiku-3-5-20251001');

    const info = sessionToInfo(session);

    expect(info.modelId).toBe('claude-haiku-3-5-20251001');
    expect(info.metrics.modelSwitches).toBe(1);
  });

  it('lastActivityAt is updated after model switch', () => {
    const before = session.metrics.lastActivityAt;
    // Ensure at least 1ms passes
    vi.setSystemTime(Date.now() + 1);
    updateSessionModel(session, 'new-model');
    vi.useRealTimers();

    expect(session.metrics.lastActivityAt).not.toBe(before);
  });
});
// ---------------------------------------------------------------------------
// 2. /agent command — switches agent config → system prompt / agentName updated
// ---------------------------------------------------------------------------

describe('/agent command — agent config applied to session', () => {
  let session: AgentSession;

  beforeEach(() => {
    session = makeSession();
  });

  it('sets agentConfigId and agentName on the session', () => {
    applyAgentConfig(session, 'agent-uuid-001', 'CodeReviewer');

    expect(session.agentConfigId).toBe('agent-uuid-001');
    expect(session.agentName).toBe('CodeReviewer');
  });

  it('also updates modelId when agent config carries a model', () => {
    applyAgentConfig(session, 'agent-uuid-002', 'DataAnalyst', 'gpt-4o-mini');

    expect(session.agentName).toBe('DataAnalyst');
    expect(session.modelId).toBe('gpt-4o-mini');
    expect(session.metrics.modelSwitches).toBe(1);
  });

  it('does NOT update modelId when agent config has no model', () => {
    const originalModel = session.modelId;
    applyAgentConfig(session, 'agent-uuid-003', 'Planner', undefined);

    expect(session.modelId).toBe(originalModel);
    expect(session.metrics.modelSwitches).toBe(0);
  });

  it('session:info DTO reflects agentName after /agent switch', () => {
    applyAgentConfig(session, 'agent-uuid-004', 'DevBot');

    const info = sessionToInfo(session);

    expect(info.agentName).toBe('DevBot');
  });

  it('multiple /agent calls update to the latest agent', () => {
    applyAgentConfig(session, 'agent-001', 'FirstAgent');
    applyAgentConfig(session, 'agent-002', 'SecondAgent');

    expect(session.agentConfigId).toBe('agent-002');
    expect(session.agentName).toBe('SecondAgent');
  });
});
// ---------------------------------------------------------------------------
// 3. Session resume — binds to conversation via conversationHistory
// ---------------------------------------------------------------------------

describe('Session resume — binds to conversation', () => {
  it('conversationHistory option is preserved in session options', () => {
    const history: ConversationHistoryMessage[] = [
      {
        role: 'user',
        content: 'Hello, what is TypeScript?',
        createdAt: new Date('2026-01-01T00:01:00Z'),
      },
      {
        role: 'assistant',
        content: 'TypeScript is a typed superset of JavaScript.',
        createdAt: new Date('2026-01-01T00:01:05Z'),
      },
    ];

    const options: AgentSessionOptions = {
      conversationHistory: history,
      provider: 'anthropic',
      modelId: 'claude-3-5-sonnet-20241022',
    };

    expect(options.conversationHistory).toHaveLength(2);
    expect(options.conversationHistory![0]!.role).toBe('user');
    expect(options.conversationHistory![1]!.role).toBe('assistant');
  });

  it('session with conversationHistory option carries the conversation binding', () => {
    const CONV_ID = 'conv-resume-001';
    const history: ConversationHistoryMessage[] = [
      { role: 'user', content: 'Prior question', createdAt: new Date('2026-01-01T00:01:00Z') },
    ];

    // Simulate what ChatGateway does: pass conversationId + history to createSession
    const options: AgentSessionOptions = {
      conversationHistory: history,
    };

    // The session ID is the conversationId in the gateway
    const session = makeSession({ id: CONV_ID });

    expect(session.id).toBe(CONV_ID);
    expect(options.conversationHistory).toHaveLength(1);
  });

  it('empty conversationHistory is valid (new conversation)', () => {
    const options: AgentSessionOptions = {
      conversationHistory: [],
    };

    expect(options.conversationHistory).toHaveLength(0);
  });

  it('resumed session preserves all message roles', () => {
    const history: ConversationHistoryMessage[] = [
      { role: 'system', content: 'You are a helpful assistant.', createdAt: new Date() },
      { role: 'user', content: 'Question 1', createdAt: new Date() },
      { role: 'assistant', content: 'Answer 1', createdAt: new Date() },
      { role: 'user', content: 'Question 2', createdAt: new Date() },
    ];

    const roles = history.map((m) => m.role);
    expect(roles).toEqual(['system', 'user', 'assistant', 'user']);
  });
});
// ---------------------------------------------------------------------------
// 4. Session metrics — token usage and message count
// ---------------------------------------------------------------------------

describe('Session metrics — token usage and message count', () => {
  let session: AgentSession;

  beforeEach(() => {
    session = makeSession();
  });

  it('starts with zero metrics', () => {
    expect(session.metrics.tokens.input).toBe(0);
    expect(session.metrics.tokens.output).toBe(0);
    expect(session.metrics.tokens.total).toBe(0);
    expect(session.metrics.messageCount).toBe(0);
    expect(session.metrics.modelSwitches).toBe(0);
  });

  it('accumulates token usage across multiple turns', () => {
    recordTokenUsage(session, {
      input: 100,
      output: 50,
      cacheRead: 0,
      cacheWrite: 0,
      total: 150,
    });
    recordTokenUsage(session, {
      input: 200,
      output: 80,
      cacheRead: 10,
      cacheWrite: 5,
      total: 295,
    });

    expect(session.metrics.tokens.input).toBe(300);
    expect(session.metrics.tokens.output).toBe(130);
    expect(session.metrics.tokens.cacheRead).toBe(10);
    expect(session.metrics.tokens.cacheWrite).toBe(5);
    expect(session.metrics.tokens.total).toBe(445);
  });

  it('increments message count with each recordMessage call', () => {
    expect(session.metrics.messageCount).toBe(0);

    recordMessage(session);
    expect(session.metrics.messageCount).toBe(1);

    recordMessage(session);
    recordMessage(session);
    expect(session.metrics.messageCount).toBe(3);
  });

  it('session:info DTO exposes correct metrics snapshot', () => {
    recordTokenUsage(session, {
      input: 500,
      output: 100,
      cacheRead: 20,
      cacheWrite: 10,
      total: 630,
    });
    recordMessage(session);
    recordMessage(session);
    updateSessionModel(session, 'claude-haiku-3-5-20251001');

    const info = sessionToInfo(session);

    expect(info.metrics.tokens.input).toBe(500);
    expect(info.metrics.tokens.output).toBe(100);
    expect(info.metrics.tokens.total).toBe(630);
    expect(info.metrics.messageCount).toBe(2);
    expect(info.metrics.modelSwitches).toBe(1);
  });

  it('metrics are independent per session', () => {
    const sessionA = makeSession({ id: 'session-A' });
    const sessionB = makeSession({ id: 'session-B' });

    recordTokenUsage(sessionA, { input: 100, output: 50, cacheRead: 0, cacheWrite: 0, total: 150 });
    recordMessage(sessionA);

    // Session B should remain at zero
    expect(sessionB.metrics.tokens.input).toBe(0);
    expect(sessionB.metrics.messageCount).toBe(0);

    // Session A should have updated values
    expect(sessionA.metrics.tokens.input).toBe(100);
    expect(sessionA.metrics.messageCount).toBe(1);
  });

  it('lastActivityAt is updated after recording tokens', () => {
    const before = session.metrics.lastActivityAt;
    vi.setSystemTime(new Date(Date.now() + 100));
    recordTokenUsage(session, { input: 10, output: 5, cacheRead: 0, cacheWrite: 0, total: 15 });
    vi.useRealTimers();

    expect(session.metrics.lastActivityAt).not.toBe(before);
  });

  it('lastActivityAt is updated after recording a message', () => {
    const before = session.metrics.lastActivityAt;
    vi.setSystemTime(new Date(Date.now() + 100));
    recordMessage(session);
    vi.useRealTimers();

    expect(session.metrics.lastActivityAt).not.toBe(before);
  });
});
@@ -1,6 +1,6 @@
import { Controller, Get, Inject, UseGuards } from '@nestjs/common';
import { sql, type Db } from '@mosaic/db';
import { createQueue } from '@mosaic/queue';
import { sql, type Db } from '@mosaicstack/db';
import { createQueue } from '@mosaicstack/queue';
import { DB } from '../database/database.module.js';
import { AgentService } from '../agent/agent.service.js';
import { ProviderService } from '../agent/provider.service.js';
128  apps/gateway/src/admin/admin-jobs.controller.ts  Normal file
@@ -0,0 +1,128 @@
import {
  Controller,
  Get,
  HttpCode,
  HttpStatus,
  Inject,
  NotFoundException,
  Optional,
  Param,
  Post,
  Query,
  UseGuards,
} from '@nestjs/common';
import { AdminGuard } from './admin.guard.js';
import { QueueService } from '../queue/queue.service.js';
import type { JobDto, JobListDto, JobStatus, QueueListDto } from '../queue/queue-admin.dto.js';

@Controller('api/admin/jobs')
@UseGuards(AdminGuard)
export class AdminJobsController {
  constructor(
    @Optional()
    @Inject(QueueService)
    private readonly queueService: QueueService | null,
  ) {}

  /**
   * GET /api/admin/jobs
   * List jobs across all queues. Optional ?status=active|completed|failed|waiting|delayed
   */
  @Get()
  async listJobs(@Query('status') status?: string): Promise<JobListDto> {
    if (!this.queueService) {
      return { jobs: [], total: 0 };
    }

    const validStatuses: JobStatus[] = ['active', 'completed', 'failed', 'waiting', 'delayed'];
    const normalised = status as JobStatus | undefined;

    if (normalised && !validStatuses.includes(normalised)) {
      return { jobs: [], total: 0 };
    }

    const jobs: JobDto[] = await this.queueService.listJobs(normalised);
    return { jobs, total: jobs.length };
  }
  /**
   * POST /api/admin/jobs/:id/retry
   * Retry a specific failed job. The id is "<queue>__<bullmq-job-id>".
   */
  @Post(':id/retry')
  @HttpCode(HttpStatus.OK)
  async retryJob(@Param('id') id: string): Promise<{ ok: boolean; message: string }> {
    if (!this.queueService) {
      throw new NotFoundException('Queue service is not available');
    }

    const result = await this.queueService.retryJob(id);
    if (!result.ok) {
      throw new NotFoundException(result.message);
    }

    return result;
  }

  /**
   * GET /api/admin/jobs/queues
   * Return status for all managed queues.
   */
  @Get('queues')
  async listQueues(): Promise<QueueListDto> {
    if (!this.queueService) {
      return { queues: [] };
    }

    const health = await this.queueService.getHealthStatus();
    const queues = Object.entries(health.queues).map(([name, stats]) => ({
      name,
      waiting: stats.waiting,
      active: stats.active,
      completed: stats.completed,
      failed: stats.failed,
      delayed: 0,
      paused: stats.paused,
    }));

    return { queues };
  }

  /**
   * POST /api/admin/jobs/queues/:name/pause
   * Pause the named queue.
   */
  @Post('queues/:name/pause')
  @HttpCode(HttpStatus.OK)
  async pauseQueue(@Param('name') name: string): Promise<{ ok: boolean; message: string }> {
    if (!this.queueService) {
      throw new NotFoundException('Queue service is not available');
    }

    const result = await this.queueService.pauseQueue(name);
    if (!result.ok) {
      throw new NotFoundException(result.message);
    }

    return result;
  }

  /**
   * POST /api/admin/jobs/queues/:name/resume
   * Resume the named queue.
   */
  @Post('queues/:name/resume')
  @HttpCode(HttpStatus.OK)
  async resumeQueue(@Param('name') name: string): Promise<{ ok: boolean; message: string }> {
    if (!this.queueService) {
      throw new NotFoundException('Queue service is not available');
    }

    const result = await this.queueService.resumeQueue(name);
    if (!result.ok) {
      throw new NotFoundException(result.message);
    }

    return result;
  }
}
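The retry route above folds both the queue name and the BullMQ job id into a single path parameter, `"<queue>__<bullmq-job-id>"`. A minimal sketch of how such a composite id could be built and split — the helper names `makeJobId`/`parseJobId` are hypothetical; the real parsing lives inside `QueueService`, which is not shown in this diff:

```typescript
// Hypothetical helpers illustrating the "<queue>__<bullmq-job-id>" id format
// accepted by POST /api/admin/jobs/:id/retry.
function makeJobId(queue: string, bullJobId: string): string {
  return `${queue}__${bullJobId}`;
}

function parseJobId(id: string): { queue: string; jobId: string } | null {
  const sep = id.indexOf('__');
  if (sep <= 0) return null; // missing separator, or empty queue name
  return { queue: id.slice(0, sep), jobId: id.slice(sep + 2) };
}
```

Splitting on the first `__` keeps the scheme unambiguous as long as queue names themselves never contain a double underscore.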
90  apps/gateway/src/admin/admin-tokens.controller.ts  Normal file
@@ -0,0 +1,90 @@
import {
  Body,
  Controller,
  Delete,
  Get,
  HttpCode,
  HttpStatus,
  Inject,
  Param,
  Post,
  UseGuards,
} from '@nestjs/common';
import { randomBytes, createHash } from 'node:crypto';
import { eq, type Db, adminTokens } from '@mosaicstack/db';
import { v4 as uuid } from 'uuid';
import { DB } from '../database/database.module.js';
import { AdminGuard } from './admin.guard.js';
import { CurrentUser } from '../auth/current-user.decorator.js';
import type {
  CreateTokenDto,
  TokenCreatedDto,
  TokenDto,
  TokenListDto,
} from './admin-tokens.dto.js';

function hashToken(plaintext: string): string {
  return createHash('sha256').update(plaintext).digest('hex');
}

function toTokenDto(row: typeof adminTokens.$inferSelect): TokenDto {
  return {
    id: row.id,
    label: row.label,
    scope: row.scope,
    expiresAt: row.expiresAt?.toISOString() ?? null,
    lastUsedAt: row.lastUsedAt?.toISOString() ?? null,
    createdAt: row.createdAt.toISOString(),
  };
}
@Controller('api/admin/tokens')
@UseGuards(AdminGuard)
export class AdminTokensController {
  constructor(@Inject(DB) private readonly db: Db) {}

  @Post()
  async create(
    @Body() dto: CreateTokenDto,
    @CurrentUser() user: { id: string },
  ): Promise<TokenCreatedDto> {
    const plaintext = randomBytes(32).toString('hex');
    const tokenHash = hashToken(plaintext);
    const id = uuid();

    const expiresAt = dto.expiresInDays
      ? new Date(Date.now() + dto.expiresInDays * 24 * 60 * 60 * 1000)
      : null;

    const [row] = await this.db
      .insert(adminTokens)
      .values({
        id,
        userId: user.id,
        tokenHash,
        label: dto.label ?? 'CLI token',
        scope: dto.scope ?? 'admin',
        expiresAt,
      })
      .returning();

    return { ...toTokenDto(row!), plaintext };
  }

  @Get()
  async list(@CurrentUser() user: { id: string }): Promise<TokenListDto> {
    const rows = await this.db
      .select()
      .from(adminTokens)
      .where(eq(adminTokens.userId, user.id))
      .orderBy(adminTokens.createdAt);

    return { tokens: rows.map(toTokenDto), total: rows.length };
  }

  @Delete(':id')
  @HttpCode(HttpStatus.NO_CONTENT)
  async revoke(@Param('id') id: string, @CurrentUser() _user: { id: string }): Promise<void> {
    await this.db.delete(adminTokens).where(eq(adminTokens.id, id));
  }
}
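The token lifecycle in this controller hinges on one property: `create()` returns the plaintext exactly once, and only the SHA-256 hex digest is persisted in `adminTokens`. A self-contained sketch of that round trip, mirroring `hashToken()` above:

```typescript
import { createHash, randomBytes } from 'node:crypto';

// Mirrors hashToken() in admin-tokens.controller.ts: the plaintext goes to the
// caller once; only this digest is ever stored or compared against.
function hashToken(plaintext: string): string {
  return createHash('sha256').update(plaintext).digest('hex');
}

const plaintext = randomBytes(32).toString('hex'); // 64 hex chars, as in create()
const stored = hashToken(plaintext); // the value written to adminTokens.tokenHash
```

On later requests the guard re-hashes the presented token and looks up the digest, so a database leak exposes no usable credentials.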
33  apps/gateway/src/admin/admin-tokens.dto.ts  Normal file
@@ -0,0 +1,33 @@
import { IsString, IsOptional, IsInt, Min } from 'class-validator';

export class CreateTokenDto {
  @IsString()
  label!: string;

  @IsOptional()
  @IsString()
  scope?: string;

  @IsOptional()
  @IsInt()
  @Min(1)
  expiresInDays?: number;
}

export interface TokenDto {
  id: string;
  label: string;
  scope: string;
  expiresAt: string | null;
  lastUsedAt: string | null;
  createdAt: string;
}

export interface TokenCreatedDto extends TokenDto {
  plaintext: string;
}

export interface TokenListDto {
  tokens: TokenDto[];
  total: number;
}
@@ -13,8 +13,8 @@ import {
  Post,
  UseGuards,
} from '@nestjs/common';
import { eq, type Db, users as usersTable } from '@mosaic/db';
import type { Auth } from '@mosaic/auth';
import { eq, type Db, users as usersTable } from '@mosaicstack/db';
import type { Auth } from '@mosaicstack/auth';
import { AUTH } from '../auth/auth.tokens.js';
import { DB } from '../database/database.module.js';
import { AdminGuard } from './admin.guard.js';
@@ -6,10 +6,11 @@ import {
  Injectable,
  UnauthorizedException,
} from '@nestjs/common';
import { createHash } from 'node:crypto';
import { fromNodeHeaders } from 'better-auth/node';
import type { Auth } from '@mosaic/auth';
import type { Db } from '@mosaic/db';
import { eq, users as usersTable } from '@mosaic/db';
import type { Auth } from '@mosaicstack/auth';
import type { Db } from '@mosaicstack/db';
import { eq, adminTokens, users as usersTable } from '@mosaicstack/db';
import type { FastifyRequest } from 'fastify';
import { AUTH } from '../auth/auth.tokens.js';
import { DB } from '../database/database.module.js';
@@ -19,6 +20,8 @@ interface UserWithRole {
  role?: string;
}

type AuthenticatedRequest = FastifyRequest & { user: unknown; session: unknown };

@Injectable()
export class AdminGuard implements CanActivate {
  constructor(
@@ -28,8 +31,64 @@
  async canActivate(context: ExecutionContext): Promise<boolean> {
    const request = context.switchToHttp().getRequest<FastifyRequest>();
    const headers = fromNodeHeaders(request.raw.headers);

    // Try bearer token auth first
    const authHeader = request.raw.headers['authorization'];
    if (authHeader?.startsWith('Bearer ')) {
      return this.validateBearerToken(request, authHeader.slice(7));
    }

    // Fall back to BetterAuth session
    return this.validateSession(request);
  }
  private async validateBearerToken(request: FastifyRequest, plaintext: string): Promise<boolean> {
    const tokenHash = createHash('sha256').update(plaintext).digest('hex');

    const [row] = await this.db
      .select({
        tokenId: adminTokens.id,
        userId: adminTokens.userId,
        scope: adminTokens.scope,
        expiresAt: adminTokens.expiresAt,
        userName: usersTable.name,
        userEmail: usersTable.email,
        userRole: usersTable.role,
      })
      .from(adminTokens)
      .innerJoin(usersTable, eq(adminTokens.userId, usersTable.id))
      .where(eq(adminTokens.tokenHash, tokenHash))
      .limit(1);

    if (!row) {
      throw new UnauthorizedException('Invalid API token');
    }

    if (row.expiresAt && row.expiresAt < new Date()) {
      throw new UnauthorizedException('API token expired');
    }

    if (row.userRole !== 'admin') {
      throw new ForbiddenException('Admin access required');
    }

    // Update last-used timestamp (fire-and-forget)
    this.db
      .update(adminTokens)
      .set({ lastUsedAt: new Date() })
      .where(eq(adminTokens.id, row.tokenId))
      .then(() => {})
      .catch(() => {});

    const req = request as AuthenticatedRequest;
    req.user = { id: row.userId, name: row.userName, email: row.userEmail, role: row.userRole };
    req.session = { id: `token:${row.tokenId}`, userId: row.userId };

    return true;
  }
  private async validateSession(request: FastifyRequest): Promise<boolean> {
    const headers = fromNodeHeaders(request.raw.headers);
    const result = await this.auth.api.getSession({ headers });

    if (!result) {
@@ -38,8 +97,6 @@

    const user = result.user as UserWithRole;

    // Ensure the role field is populated. better-auth should include additionalFields
    // in the session, but as a fallback, fetch the role from the database if needed.
    let userRole = user.role;
    if (!userRole) {
      const [dbUser] = await this.db
@@ -48,7 +105,6 @@
        .where(eq(usersTable.id, user.id))
        .limit(1);
      userRole = dbUser?.role ?? 'member';
      // Update the session user object with the fetched role
      (user as UserWithRole).role = userRole;
    }

@@ -56,8 +112,9 @@
      throw new ForbiddenException('Admin access required');
    }

    (request as FastifyRequest & { user: unknown; session: unknown }).user = result.user;
    (request as FastifyRequest & { user: unknown; session: unknown }).session = result.session;
    const req = request as AuthenticatedRequest;
    req.user = result.user;
    req.session = result.session;

    return true;
  }
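The bearer-token path of AdminGuard reduces to three checks: the presented token's SHA-256 digest must match a stored row, the row must not be expired, and the owning user must be an admin. A dependency-free sketch of that decision, using a simplified stand-in for the `adminTokens` join row (the `TokenRow` shape and `isTokenValid` helper are illustrative, not the guard's actual API):

```typescript
import { createHash } from 'node:crypto';

// Simplified stand-in for the row validateBearerToken() selects.
interface TokenRow {
  tokenHash: string;
  expiresAt: Date | null;
  userRole: string;
}

function sha256Hex(s: string): string {
  return createHash('sha256').update(s).digest('hex');
}

// Collapses the guard's three throw sites into a boolean for illustration.
function isTokenValid(presented: string, row: TokenRow | undefined, now = new Date()): boolean {
  if (!row) return false; // UnauthorizedException('Invalid API token')
  if (row.tokenHash !== sha256Hex(presented)) return false;
  if (row.expiresAt && row.expiresAt < now) return false; // token expired
  if (row.userRole !== 'admin') return false; // ForbiddenException
  return true;
}
```

In the real guard the lookup is keyed by the digest itself, so a hash mismatch simply yields no row; the explicit comparison here just makes the check visible.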
@@ -1,10 +1,19 @@
import { Module } from '@nestjs/common';
import { AdminController } from './admin.controller.js';
import { AdminHealthController } from './admin-health.controller.js';
import { AdminJobsController } from './admin-jobs.controller.js';
import { AdminTokensController } from './admin-tokens.controller.js';
import { BootstrapController } from './bootstrap.controller.js';
import { AdminGuard } from './admin.guard.js';

@Module({
  controllers: [AdminController, AdminHealthController],
  controllers: [
    AdminController,
    AdminHealthController,
    AdminJobsController,
    AdminTokensController,
    BootstrapController,
  ],
  providers: [AdminGuard],
})
export class AdminModule {}
101  apps/gateway/src/admin/bootstrap.controller.ts  Normal file
@@ -0,0 +1,101 @@
import {
  Body,
  Controller,
  ForbiddenException,
  Get,
  Inject,
  InternalServerErrorException,
  Post,
} from '@nestjs/common';
import { randomBytes, createHash } from 'node:crypto';
import { count, eq, type Db, users as usersTable, adminTokens } from '@mosaicstack/db';
import type { Auth } from '@mosaicstack/auth';
import { v4 as uuid } from 'uuid';
import { AUTH } from '../auth/auth.tokens.js';
import { DB } from '../database/database.module.js';
import type { BootstrapSetupDto, BootstrapStatusDto, BootstrapResultDto } from './bootstrap.dto.js';

@Controller('api/bootstrap')
export class BootstrapController {
  constructor(
    @Inject(AUTH) private readonly auth: Auth,
    @Inject(DB) private readonly db: Db,
  ) {}

  @Get('status')
  async status(): Promise<BootstrapStatusDto> {
    const [result] = await this.db.select({ total: count() }).from(usersTable);
    return { needsSetup: (result?.total ?? 0) === 0 };
  }
@Post('setup')
|
||||
async setup(@Body() dto: BootstrapSetupDto): Promise<BootstrapResultDto> {
|
||||
// Only allow setup when zero users exist
|
||||
const [result] = await this.db.select({ total: count() }).from(usersTable);
|
||||
if ((result?.total ?? 0) > 0) {
|
||||
throw new ForbiddenException('Setup already completed — users exist');
|
||||
}
|
||||
|
||||
// Create admin user via BetterAuth API
|
||||
const authApi = this.auth.api as unknown as {
|
||||
createUser: (opts: {
|
||||
body: { name: string; email: string; password: string; role?: string };
|
||||
}) => Promise<{
|
||||
user: { id: string; name: string; email: string };
|
||||
}>;
|
||||
};
|
||||
|
||||
const created = await authApi.createUser({
|
||||
body: {
|
||||
name: dto.name,
|
||||
email: dto.email,
|
||||
password: dto.password,
|
||||
role: 'admin',
|
||||
},
|
||||
});
|
||||
|
||||
// Verify user was created
|
||||
const [user] = await this.db
|
||||
.select()
|
||||
.from(usersTable)
|
||||
.where(eq(usersTable.id, created.user.id))
|
||||
.limit(1);
|
||||
|
||||
if (!user) throw new InternalServerErrorException('User created but not found');
|
||||
|
||||
// Ensure role is admin (createUser may not set it via BetterAuth)
|
||||
if (user.role !== 'admin') {
|
||||
await this.db.update(usersTable).set({ role: 'admin' }).where(eq(usersTable.id, user.id));
|
||||
}
|
||||
|
||||
// Generate admin API token
|
||||
const plaintext = randomBytes(32).toString('hex');
|
||||
const tokenHash = createHash('sha256').update(plaintext).digest('hex');
|
||||
const tokenId = uuid();
|
||||
|
||||
const [token] = await this.db
|
||||
.insert(adminTokens)
|
||||
.values({
|
||||
id: tokenId,
|
||||
userId: user.id,
|
||||
tokenHash,
|
||||
label: 'Initial setup token',
|
||||
scope: 'admin',
|
||||
})
|
||||
.returning();
|
||||
|
||||
return {
|
||||
user: {
|
||||
id: user.id,
|
||||
name: user.name,
|
||||
email: user.email,
|
||||
role: 'admin',
|
||||
},
|
||||
token: {
|
||||
id: token!.id,
|
||||
plaintext,
|
||||
label: token!.label,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
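The setup endpoint stores only a SHA-256 digest of the generated token, so a leaked database row cannot be replayed directly; the plaintext is returned exactly once. A minimal standalone sketch of that hash-then-compare scheme (the `verify` helper is illustrative and not part of this diff; a real guard would also look the digest up in `adminTokens`):

```typescript
import { randomBytes, createHash } from 'node:crypto';

// Issue a token: the plaintext is shown to the user once; only the digest is stored.
const plaintext = randomBytes(32).toString('hex'); // 64 hex characters
const tokenHash = createHash('sha256').update(plaintext).digest('hex');

// Verify a presented token by re-hashing it and comparing digests.
function verify(candidate: string, storedHash: string): boolean {
  return createHash('sha256').update(candidate).digest('hex') === storedHash;
}
```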
31
apps/gateway/src/admin/bootstrap.dto.ts
Normal file
@@ -0,0 +1,31 @@
import { IsString, IsEmail, MinLength } from 'class-validator';

export class BootstrapSetupDto {
  @IsString()
  name!: string;

  @IsEmail()
  email!: string;

  @IsString()
  @MinLength(8)
  password!: string;
}

export interface BootstrapStatusDto {
  needsSetup: boolean;
}

export interface BootstrapResultDto {
  user: {
    id: string;
    name: string;
    email: string;
    role: string;
  };
  token: {
    id: string;
    plaintext: string;
    label: string;
  };
}
@@ -62,7 +62,7 @@ function restoreEnv(saved: Map<EnvKey, string | undefined>): void {
 }

 function makeRegistry(): ModelRegistry {
-  return new ModelRegistry(AuthStorage.inMemory());
+  return ModelRegistry.inMemory(AuthStorage.inMemory());
 }

 // ---------------------------------------------------------------------------
@@ -1,6 +1,6 @@
 import { describe, it, expect, beforeEach, vi } from 'vitest';
 import { RoutingService } from '../routing.service.js';
-import type { ModelInfo } from '@mosaic/types';
+import type { ModelInfo } from '@mosaicstack/types';

 const mockModels: ModelInfo[] = [
   {
@@ -7,7 +7,7 @@ import type {
   IProviderAdapter,
   ModelInfo,
   ProviderHealth,
-} from '@mosaic/types';
+} from '@mosaicstack/types';

 /**
  * Anthropic provider adapter.
@@ -6,7 +6,7 @@ import type {
   IProviderAdapter,
   ModelInfo,
   ProviderHealth,
-} from '@mosaic/types';
+} from '@mosaicstack/types';

 /** Embedding models that Ollama ships with out of the box */
 const OLLAMA_EMBEDDING_MODELS: ReadonlyArray<{
@@ -7,7 +7,7 @@ import type {
   IProviderAdapter,
   ModelInfo,
   ProviderHealth,
-} from '@mosaic/types';
+} from '@mosaicstack/types';

 /**
  * OpenAI provider adapter.
@@ -6,7 +6,7 @@ import type {
   IProviderAdapter,
   ModelInfo,
   ProviderHealth,
-} from '@mosaic/types';
+} from '@mosaicstack/types';

 const OPENROUTER_BASE_URL = 'https://openrouter.ai/api/v1';

@@ -6,7 +6,7 @@ import type {
   IProviderAdapter,
   ModelInfo,
   ProviderHealth,
-} from '@mosaic/types';
+} from '@mosaicstack/types';
 import { getModelCapability } from '../model-capabilities.js';

 /**
@@ -13,7 +13,7 @@ import {
   Post,
   UseGuards,
 } from '@nestjs/common';
-import type { Brain } from '@mosaic/brain';
+import type { Brain } from '@mosaicstack/brain';
 import { BRAIN } from '../brain/brain.tokens.js';
 import { AuthGuard } from '../auth/auth.guard.js';
 import { CurrentUser } from '../auth/current-user.decorator.js';
@@ -7,8 +7,8 @@ import {
   type AgentSessionEvent,
   type ToolDefinition,
 } from '@mariozechner/pi-coding-agent';
-import type { Brain } from '@mosaic/brain';
-import type { Memory } from '@mosaic/memory';
+import type { Brain } from '@mosaicstack/brain';
+import type { Memory } from '@mosaicstack/memory';
 import { BRAIN } from '../brain/brain.tokens.js';
 import { MEMORY } from '../memory/memory.tokens.js';
 import { EmbeddingService } from '../memory/embedding.service.js';
@@ -23,6 +23,7 @@ import { createFileTools } from './tools/file-tools.js';
 import { createGitTools } from './tools/git-tools.js';
 import { createShellTools } from './tools/shell-tools.js';
 import { createWebTools } from './tools/web-tools.js';
+import { createSearchTools } from './tools/search-tools.js';
 import type { SessionInfoDto, SessionMetrics } from './session.dto.js';
 import { SystemOverrideService } from '../preferences/system-override.service.js';
 import { PreferencesService } from '../preferences/preferences.service.js';
@@ -146,6 +147,7 @@ export class AgentService implements OnModuleDestroy {
       ...createGitTools(sandboxDir),
       ...createShellTools(sandboxDir),
       ...createWebTools(),
+      ...createSearchTools(),
     ];
   }

@@ -1,4 +1,4 @@
-import type { ModelCapability } from '@mosaic/types';
+import type { ModelCapability } from '@mosaicstack/types';

 /**
  * Comprehensive capability matrix for all target models.
@@ -1,7 +1,7 @@
 import { Inject, Injectable, Logger } from '@nestjs/common';
 import { createCipheriv, createDecipheriv, createHash, randomBytes } from 'node:crypto';
-import type { Db } from '@mosaic/db';
-import { providerCredentials, eq, and } from '@mosaic/db';
+import type { Db } from '@mosaicstack/db';
+import { providerCredentials, eq, and } from '@mosaicstack/db';
 import { DB } from '../database/database.module.js';
 import type { ProviderCredentialSummaryDto } from './provider-credentials.dto.js';

@@ -14,7 +14,7 @@ import type {
   ModelInfo,
   ProviderHealth,
   ProviderInfo,
-} from '@mosaic/types';
+} from '@mosaicstack/types';
 import {
   AnthropicAdapter,
   OllamaAdapter,
@@ -67,7 +67,7 @@ export class ProviderService implements OnModuleInit, OnModuleDestroy {

   async onModuleInit(): Promise<void> {
     const authStorage = AuthStorage.inMemory();
-    this.registry = new ModelRegistry(authStorage);
+    this.registry = ModelRegistry.inMemory(authStorage);

     // Build the default set of adapters that rely on the registry
     this.adapters = [
@@ -1,5 +1,5 @@
 import { Body, Controller, Delete, Get, Inject, Param, Post, UseGuards } from '@nestjs/common';
-import type { RoutingCriteria } from '@mosaic/types';
+import type { RoutingCriteria } from '@mosaicstack/types';
 import { AuthGuard } from '../auth/auth.guard.js';
 import { CurrentUser } from '../auth/current-user.decorator.js';
 import { ProviderService } from './provider.service.js';
@@ -1,6 +1,6 @@
 import { Inject, Injectable, Logger } from '@nestjs/common';
-import type { ModelInfo } from '@mosaic/types';
-import type { RoutingCriteria, RoutingResult, CostTier } from '@mosaic/types';
+import type { ModelInfo } from '@mosaicstack/types';
+import type { RoutingCriteria, RoutingResult, CostTier } from '@mosaicstack/types';
 import { ProviderService } from './provider.service.js';

 /** Per-million-token cost thresholds for tier classification */
@@ -1,5 +1,5 @@
 import { Inject, Injectable, Logger, type OnModuleInit } from '@nestjs/common';
-import { routingRules, type Db, sql } from '@mosaic/db';
+import { routingRules, type Db, sql } from '@mosaicstack/db';
 import { DB } from '../../database/database.module.js';
 import type { RoutingCondition, RoutingAction } from './routing.types.js';

@@ -1,5 +1,5 @@
 import { Inject, Injectable, Logger } from '@nestjs/common';
-import { routingRules, type Db, and, asc, eq, or } from '@mosaic/db';
+import { routingRules, type Db, and, asc, eq, or } from '@mosaicstack/db';
 import { DB } from '../../database/database.module.js';
 import { ProviderService } from '../provider.service.js';
 import { classifyTask } from './task-classifier.js';
@@ -13,7 +13,7 @@ import {
   Post,
   UseGuards,
 } from '@nestjs/common';
-import { routingRules, type Db, and, asc, eq, or, inArray } from '@mosaic/db';
+import { routingRules, type Db, and, asc, eq, or, inArray } from '@mosaicstack/db';
 import { DB } from '../../database/database.module.js';
 import { AuthGuard } from '../../auth/auth.guard.js';
 import { CurrentUser } from '../../auth/current-user.decorator.js';
@@ -1,7 +1,7 @@
 /**
  * Routing engine types — M4-002 (condition types) and M4-003 (action types).
  *
- * These types are re-exported from `@mosaic/types` for shared use across packages.
+ * These types are re-exported from `@mosaicstack/types` for shared use across packages.
  */

 // ─── Classification primitives ───────────────────────────────────────────────
@@ -23,7 +23,7 @@ export type Domain = 'frontend' | 'backend' | 'devops' | 'docs' | 'general';

 /**
  * Cost tier for model selection.
- * Extends the existing `CostTier` in `@mosaic/types` with `local` for self-hosted models.
+ * Extends the existing `CostTier` in `@mosaicstack/types` with `local` for self-hosted models.
  */
 export type CostTier = 'cheap' | 'standard' | 'premium' | 'local';

@@ -1,6 +1,6 @@
 import { Type } from '@sinclair/typebox';
 import type { ToolDefinition } from '@mariozechner/pi-coding-agent';
-import type { Brain } from '@mosaic/brain';
+import type { Brain } from '@mosaicstack/brain';

 export function createBrainTools(brain: Brain): ToolDefinition[] {
   const listProjects: ToolDefinition = {
@@ -190,5 +190,169 @@ export function createFileTools(baseDir: string): ToolDefinition[] {
     },
   };

-  return [readFileTool, writeFileTool, listDirectoryTool];
+  const editFileTool: ToolDefinition = {
+    name: 'fs_edit_file',
+    label: 'Edit File',
+    description:
+      'Make targeted text replacements in a file. Each edit replaces an exact match of oldText with newText. ' +
+      'All edits are matched against the original file content (not incrementally). ' +
+      'Each oldText must be unique in the file and edits must not overlap.',
+    parameters: Type.Object({
+      path: Type.String({
+        description: 'File path (relative to sandbox base or absolute within it)',
+      }),
+      edits: Type.Array(
+        Type.Object({
+          oldText: Type.String({
+            description: 'Exact text to find and replace (must be unique in the file)',
+          }),
+          newText: Type.String({ description: 'Replacement text' }),
+        }),
+        { description: 'One or more targeted replacements', minItems: 1 },
+      ),
+    }),
+    async execute(_toolCallId, params) {
+      const { path, edits } = params as {
+        path: string;
+        edits: Array<{ oldText: string; newText: string }>;
+      };
+
+      let safePath: string;
+      try {
+        safePath = guardPath(path, baseDir);
+      } catch (err) {
+        if (err instanceof SandboxEscapeError) {
+          return {
+            content: [{ type: 'text' as const, text: `Error: ${err.message}` }],
+            details: undefined,
+          };
+        }
+        return {
+          content: [{ type: 'text' as const, text: `Error: ${String(err)}` }],
+          details: undefined,
+        };
+      }
+
+      try {
+        const info = await stat(safePath);
+        if (!info.isFile()) {
+          return {
+            content: [{ type: 'text' as const, text: `Error: path is not a file: ${path}` }],
+            details: undefined,
+          };
+        }
+        if (info.size > MAX_READ_BYTES) {
+          return {
+            content: [
+              {
+                type: 'text' as const,
+                text: `Error: file too large for editing (${info.size} bytes, limit ${MAX_READ_BYTES} bytes)`,
+              },
+            ],
+            details: undefined,
+          };
+        }
+      } catch (err) {
+        return {
+          content: [{ type: 'text' as const, text: `Error reading file: ${String(err)}` }],
+          details: undefined,
+        };
+      }
+
+      let content: string;
+      try {
+        content = await readFile(safePath, { encoding: 'utf8' });
+      } catch (err) {
+        return {
+          content: [{ type: 'text' as const, text: `Error reading file: ${String(err)}` }],
+          details: undefined,
+        };
+      }
+
+      // Validate all edits before applying any
+      const errors: string[] = [];
+      for (let i = 0; i < edits.length; i++) {
+        const edit = edits[i]!;
+        const occurrences = content.split(edit.oldText).length - 1;
+        if (occurrences === 0) {
+          errors.push(`Edit ${i + 1}: oldText not found in file`);
+        } else if (occurrences > 1) {
+          errors.push(`Edit ${i + 1}: oldText matches ${occurrences} locations (must be unique)`);
+        }
+      }
+
+      // Check for overlapping edits
+      if (errors.length === 0) {
+        const positions = edits.map((edit, i) => ({
+          index: i,
+          start: content.indexOf(edit.oldText),
+          end: content.indexOf(edit.oldText) + edit.oldText.length,
+        }));
+        positions.sort((a, b) => a.start - b.start);
+        for (let i = 1; i < positions.length; i++) {
+          if (positions[i]!.start < positions[i - 1]!.end) {
+            errors.push(
+              `Edits ${positions[i - 1]!.index + 1} and ${positions[i]!.index + 1} overlap`,
+            );
+          }
+        }
+      }
+
+      if (errors.length > 0) {
+        return {
+          content: [
+            {
+              type: 'text' as const,
+              text: `Edit validation failed:\n${errors.join('\n')}`,
+            },
+          ],
+          details: undefined,
+        };
+      }
+
+      // Apply edits: process from end to start to preserve positions
+      const positions = edits.map((edit) => ({
+        edit,
+        start: content.indexOf(edit.oldText),
+      }));
+      positions.sort((a, b) => b.start - a.start); // reverse order
+
+      let result = content;
+      for (const { edit } of positions) {
+        result = result.replace(edit.oldText, edit.newText);
+      }
+
+      if (Buffer.byteLength(result, 'utf8') > MAX_WRITE_BYTES) {
+        return {
+          content: [
+            {
+              type: 'text' as const,
+              text: `Error: resulting file too large (limit ${MAX_WRITE_BYTES} bytes)`,
+            },
+          ],
+          details: undefined,
+        };
+      }
+
+      try {
+        await writeFile(safePath, result, { encoding: 'utf8' });
+        return {
+          content: [
+            {
+              type: 'text' as const,
+              text: `File edited successfully: ${path} (${edits.length} edit(s) applied)`,
+            },
+          ],
+          details: undefined,
+        };
+      } catch (err) {
+        return {
+          content: [{ type: 'text' as const, text: `Error writing file: ${String(err)}` }],
+          details: undefined,
+        };
+      }
+    },
+  };
+
+  return [readFileTool, writeFileTool, listDirectoryTool, editFileTool];
 }
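The `fs_edit_file` tool validates that every `oldText` matches exactly once, then applies replacements from the end of the file backwards so earlier match offsets are not shifted by later edits. A condensed standalone sketch of that core strategy (`applyEdits` is an illustrative helper, simplified to omit the overlap and size checks):

```typescript
// Validate-then-apply text edits: each oldText must match exactly once in the
// original content; replacements run last-match-first so offsets stay valid.
interface Edit {
  oldText: string;
  newText: string;
}

function applyEdits(content: string, edits: Edit[]): string {
  for (const e of edits) {
    const occurrences = content.split(e.oldText).length - 1;
    if (occurrences !== 1) {
      throw new Error(`"${e.oldText}" matched ${occurrences} times (must be exactly 1)`);
    }
  }
  return edits
    .map((e) => ({ e, start: content.indexOf(e.oldText) }))
    .sort((a, b) => b.start - a.start) // last match first
    .reduce((acc, { e }) => acc.replace(e.oldText, e.newText), content);
}
```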
@@ -2,6 +2,7 @@ export { createBrainTools } from './brain-tools.js';
 export { createCoordTools } from './coord-tools.js';
 export { createFileTools } from './file-tools.js';
 export { createGitTools } from './git-tools.js';
+export { createSearchTools } from './search-tools.js';
 export { createShellTools } from './shell-tools.js';
 export { createWebTools } from './web-tools.js';
 export { createSkillTools } from './skill-tools.js';
@@ -1,7 +1,7 @@
 import { Type } from '@sinclair/typebox';
 import type { ToolDefinition } from '@mariozechner/pi-coding-agent';
-import type { Memory } from '@mosaic/memory';
-import type { EmbeddingProvider } from '@mosaic/memory';
+import type { Memory } from '@mosaicstack/memory';
+import type { EmbeddingProvider } from '@mosaicstack/memory';

 /**
  * Create memory tools bound to the session's authenticated userId.
496
apps/gateway/src/agent/tools/search-tools.ts
Normal file
@@ -0,0 +1,496 @@
import { Type } from '@sinclair/typebox';
import type { ToolDefinition } from '@mariozechner/pi-coding-agent';

const DEFAULT_TIMEOUT_MS = 15_000;
const MAX_RESULTS = 10;
const MAX_RESPONSE_BYTES = 256 * 1024; // 256 KB

// ─── Provider helpers ────────────────────────────────────────────────────────

interface SearchResult {
  title: string;
  url: string;
  snippet: string;
}

interface SearchResponse {
  provider: string;
  query: string;
  results: SearchResult[];
  error?: string;
}

async function fetchWithTimeout(
  url: string,
  init: RequestInit,
  timeoutMs: number,
): Promise<Response> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    return await fetch(url, { ...init, signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}

async function readLimited(response: Response): Promise<string> {
  const reader = response.body?.getReader();
  if (!reader) return '';
  const chunks: Uint8Array[] = [];
  let total = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.length;
    if (total > MAX_RESPONSE_BYTES) {
      chunks.push(value.subarray(0, MAX_RESPONSE_BYTES - (total - value.length)));
      reader.cancel();
      break;
    }
    chunks.push(value);
  }
  const combined = new Uint8Array(chunks.reduce((a, c) => a + c.length, 0));
  let offset = 0;
  for (const chunk of chunks) {
    combined.set(chunk, offset);
    offset += chunk.length;
  }
  return new TextDecoder().decode(combined);
}

// ─── Brave Search ────────────────────────────────────────────────────────────

async function searchBrave(query: string, limit: number): Promise<SearchResponse> {
  const apiKey = process.env['BRAVE_API_KEY'];
  if (!apiKey) return { provider: 'brave', query, results: [], error: 'BRAVE_API_KEY not set' };

  try {
    const params = new URLSearchParams({
      q: query,
      count: String(Math.min(limit, 20)),
    });
    const res = await fetchWithTimeout(
      `https://api.search.brave.com/res/v1/web/search?${params}`,
      { headers: { 'X-Subscription-Token': apiKey, Accept: 'application/json' } },
      DEFAULT_TIMEOUT_MS,
    );
    if (!res.ok) {
      const body = await res.text().catch(() => '');
      return { provider: 'brave', query, results: [], error: `HTTP ${res.status}: ${body}` };
    }
    const data = (await res.json()) as {
      web?: { results?: Array<{ title: string; url: string; description: string }> };
    };
    const results: SearchResult[] = (data.web?.results ?? []).slice(0, limit).map((r) => ({
      title: r.title,
      url: r.url,
      snippet: r.description,
    }));
    return { provider: 'brave', query, results };
  } catch (err) {
    return {
      provider: 'brave',
      query,
      results: [],
      error: err instanceof Error ? err.message : String(err),
    };
  }
}

// ─── Tavily Search ───────────────────────────────────────────────────────────

async function searchTavily(query: string, limit: number): Promise<SearchResponse> {
  const apiKey = process.env['TAVILY_API_KEY'];
  if (!apiKey) return { provider: 'tavily', query, results: [], error: 'TAVILY_API_KEY not set' };

  try {
    const res = await fetchWithTimeout(
      'https://api.tavily.com/search',
      {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          api_key: apiKey,
          query,
          max_results: Math.min(limit, 10),
          include_answer: false,
        }),
      },
      DEFAULT_TIMEOUT_MS,
    );
    if (!res.ok) {
      const body = await res.text().catch(() => '');
      return { provider: 'tavily', query, results: [], error: `HTTP ${res.status}: ${body}` };
    }
    const data = (await res.json()) as {
      results?: Array<{ title: string; url: string; content: string }>;
    };
    const results: SearchResult[] = (data.results ?? []).slice(0, limit).map((r) => ({
      title: r.title,
      url: r.url,
      snippet: r.content,
    }));
    return { provider: 'tavily', query, results };
  } catch (err) {
    return {
      provider: 'tavily',
      query,
      results: [],
      error: err instanceof Error ? err.message : String(err),
    };
  }
}

// ─── SearXNG (self-hosted) ───────────────────────────────────────────────────

async function searchSearxng(query: string, limit: number): Promise<SearchResponse> {
  const baseUrl = process.env['SEARXNG_URL'];
  if (!baseUrl) return { provider: 'searxng', query, results: [], error: 'SEARXNG_URL not set' };

  try {
    const params = new URLSearchParams({
      q: query,
      format: 'json',
      pageno: '1',
    });
    const res = await fetchWithTimeout(
      `${baseUrl.replace(/\/$/, '')}/search?${params}`,
      { headers: { Accept: 'application/json' } },
      DEFAULT_TIMEOUT_MS,
    );
    if (!res.ok) {
      const body = await res.text().catch(() => '');
      return { provider: 'searxng', query, results: [], error: `HTTP ${res.status}: ${body}` };
    }
    const data = (await res.json()) as {
      results?: Array<{ title: string; url: string; content: string }>;
    };
    const results: SearchResult[] = (data.results ?? []).slice(0, limit).map((r) => ({
      title: r.title,
      url: r.url,
      snippet: r.content,
    }));
    return { provider: 'searxng', query, results };
  } catch (err) {
    return {
      provider: 'searxng',
      query,
      results: [],
      error: err instanceof Error ? err.message : String(err),
    };
  }
}

// ─── DuckDuckGo (Instant Answer API) ─────────────────────────────────────────

async function searchDuckDuckGo(query: string, limit: number): Promise<SearchResponse> {
  try {
    // Use the DuckDuckGo Instant Answer API (JSON, free, no key)
    const params = new URLSearchParams({
      q: query,
      format: 'json',
      no_html: '1',
      skip_disambig: '1',
    });
    const res = await fetchWithTimeout(
      `https://api.duckduckgo.com/?${params}`,
      { headers: { Accept: 'application/json' } },
      DEFAULT_TIMEOUT_MS,
    );
    if (!res.ok) {
      return {
        provider: 'duckduckgo',
        query,
        results: [],
        error: `HTTP ${res.status}`,
      };
    }
    const text = await readLimited(res);
    const data = JSON.parse(text) as {
      AbstractText?: string;
      AbstractURL?: string;
      AbstractSource?: string;
      RelatedTopics?: Array<{
        Text?: string;
        FirstURL?: string;
        Result?: string;
        Topics?: Array<{ Text?: string; FirstURL?: string }>;
      }>;
    };

    const results: SearchResult[] = [];

    // Main abstract result
    if (data.AbstractText && data.AbstractURL) {
      results.push({
        title: data.AbstractSource ?? 'DuckDuckGo Abstract',
        url: data.AbstractURL,
        snippet: data.AbstractText,
      });
    }

    // Related topics
    for (const topic of data.RelatedTopics ?? []) {
      if (results.length >= limit) break;
      if (topic.Text && topic.FirstURL) {
        results.push({
          title: topic.Text.slice(0, 120),
          url: topic.FirstURL,
          snippet: topic.Text,
        });
      }
      // Sub-topics
      for (const sub of topic.Topics ?? []) {
        if (results.length >= limit) break;
        if (sub.Text && sub.FirstURL) {
          results.push({
            title: sub.Text.slice(0, 120),
            url: sub.FirstURL,
            snippet: sub.Text,
          });
        }
      }
    }

    return { provider: 'duckduckgo', query, results: results.slice(0, limit) };
  } catch (err) {
    return {
      provider: 'duckduckgo',
      query,
      results: [],
      error: err instanceof Error ? err.message : String(err),
    };
  }
}

// ─── Provider resolution ─────────────────────────────────────────────────────

type SearchProvider = 'brave' | 'tavily' | 'searxng' | 'duckduckgo' | 'auto';

function getAvailableProviders(): SearchProvider[] {
  const available: SearchProvider[] = [];
  if (process.env['BRAVE_API_KEY']) available.push('brave');
  if (process.env['TAVILY_API_KEY']) available.push('tavily');
  if (process.env['SEARXNG_URL']) available.push('searxng');
  // DuckDuckGo is always available (no API key needed)
  available.push('duckduckgo');
  return available;
}

async function executeSearch(
  provider: SearchProvider,
  query: string,
  limit: number,
): Promise<SearchResponse> {
  switch (provider) {
    case 'brave':
      return searchBrave(query, limit);
    case 'tavily':
      return searchTavily(query, limit);
    case 'searxng':
      return searchSearxng(query, limit);
    case 'duckduckgo':
      return searchDuckDuckGo(query, limit);
    case 'auto': {
      // Try providers in priority order: Brave > Tavily > SearXNG > DuckDuckGo
      const available = getAvailableProviders();
      for (const p of available) {
        const result = await executeSearch(p, query, limit);
        if (!result.error && result.results.length > 0) return result;
      }
      // Fall back to DuckDuckGo if everything failed
      return searchDuckDuckGo(query, limit);
    }
  }
}

function formatSearchResults(response: SearchResponse): string {
  const lines: string[] = [];
  lines.push(`Search provider: ${response.provider}`);
  lines.push(`Query: "${response.query}"`);

  if (response.error) {
    lines.push(`Error: ${response.error}`);
  }

  if (response.results.length === 0) {
    lines.push('No results found.');
  } else {
    lines.push(`Results (${response.results.length}):\n`);
    for (let i = 0; i < response.results.length; i++) {
      const r = response.results[i]!;
      lines.push(`${i + 1}. ${r.title}`);
      lines.push(`   URL: ${r.url}`);
      lines.push(`   ${r.snippet}`);
      lines.push('');
    }
  }
  return lines.join('\n');
}
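The `auto` branch of `executeSearch` tries each configured provider in priority order and accepts the first error-free, non-empty response, falling back to DuckDuckGo otherwise. The same pattern in isolation (types and names here are illustrative, not from the diff):

```typescript
// First-healthy-provider fallback: accept the first error-free, non-empty
// result; otherwise use the designated fallback provider.
interface Result {
  provider: string;
  results: string[];
  error?: string;
}
type Search = (query: string) => Promise<Result>;

async function searchAuto(query: string, providers: Search[], fallback: Search): Promise<Result> {
  for (const p of providers) {
    const r = await p(query);
    if (!r.error && r.results.length > 0) return r;
  }
  return fallback(query);
}
```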
|
||||
// ─── Tool exports ────────────────────────────────────────────────────────────
|
||||
|
||||
export function createSearchTools(): ToolDefinition[] {
|
||||
const webSearch: ToolDefinition = {
|
||||
name: 'web_search',
|
||||
label: 'Web Search',
|
||||
description:
|
||||
'Search the web using configured search providers. ' +
|
||||
'Supports Brave, Tavily, SearXNG, and DuckDuckGo. ' +
|
||||
'Use "auto" provider to pick the best available. ' +
|
||||
'DuckDuckGo is always available as a fallback (no API key needed).',
|
||||
parameters: Type.Object({
|
||||
query: Type.String({ description: 'Search query' }),
|
||||
provider: Type.Optional(
|
||||
Type.String({
|
||||
description:
|
||||
'Search provider: "auto" (default), "brave", "tavily", "searxng", or "duckduckgo"',
|
||||
}),
|
||||
),
|
||||
limit: Type.Optional(
|
||||
Type.Number({ description: `Max results to return (default 5, max ${MAX_RESULTS})` }),
|
||||
),
|
||||
}),
|
||||
async execute(_toolCallId, params) {
|
||||
const { query, provider, limit } = params as {
|
||||
query: string;
|
||||
provider?: string;
|
||||
limit?: number;
|
||||
};
|
||||
|
||||
const effectiveProvider = (provider ?? 'auto') as SearchProvider;
|
||||
const validProviders = ['auto', 'brave', 'tavily', 'searxng', 'duckduckgo'];
|
||||
if (!validProviders.includes(effectiveProvider)) {
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: 'text' as const,
|
||||
text: `Invalid provider "${provider}". Valid: ${validProviders.join(', ')}`,
|
||||
},
|
||||
],
|
||||
details: undefined,
|
||||
};
|
||||
}
|
||||
|
||||
const effectiveLimit = Math.min(Math.max(limit ?? 5, 1), MAX_RESULTS);
|
||||
|
||||
try {
|
||||
const response = await executeSearch(effectiveProvider, query, effectiveLimit);
|
||||
return {
|
||||
content: [{ type: 'text' as const, text: formatSearchResults(response) }],
|
||||
details: undefined,
|
||||
};
|
||||
} catch (err) {
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: 'text' as const,
|
||||
text: `Search failed: ${err instanceof Error ? err.message : String(err)}`,
|
||||
},
|
||||
],
|
||||
details: undefined,
|
||||
};
|
||||
}
|
||||
},
|
||||
};
|
||||
|
||||
const webSearchNews: ToolDefinition = {
  name: 'web_search_news',
  label: 'Web Search (News)',
  description:
    'Search for recent news articles. Uses Brave News API if available, falls back to standard search with news keywords.',
  parameters: Type.Object({
    query: Type.String({ description: 'News search query' }),
    limit: Type.Optional(
      Type.Number({ description: `Max results (default 5, max ${MAX_RESULTS})` }),
    ),
  }),
  async execute(_toolCallId, params) {
    const { query, limit } = params as { query: string; limit?: number };
    const effectiveLimit = Math.min(Math.max(limit ?? 5, 1), MAX_RESULTS);

    // Try Brave News API first (dedicated news endpoint)
    const braveKey = process.env['BRAVE_API_KEY'];
    if (braveKey) {
      try {
        const newsParams = new URLSearchParams({
          q: query,
          count: String(effectiveLimit),
        });
        const res = await fetchWithTimeout(
          `https://api.search.brave.com/res/v1/news/search?${newsParams}`,
          {
            headers: {
              'X-Subscription-Token': braveKey,
              Accept: 'application/json',
            },
          },
          DEFAULT_TIMEOUT_MS,
        );
        if (res.ok) {
          const data = (await res.json()) as {
            results?: Array<{
              title: string;
              url: string;
              description: string;
              age?: string;
            }>;
          };
          const results: SearchResult[] = (data.results ?? [])
            .slice(0, effectiveLimit)
            .map((r) => ({
              title: r.title + (r.age ? ` (${r.age})` : ''),
              url: r.url,
              snippet: r.description,
            }));
          const response: SearchResponse = { provider: 'brave-news', query, results };
          return {
            content: [{ type: 'text' as const, text: formatSearchResults(response) }],
            details: undefined,
          };
        }
      } catch {
        // Fall through to generic search
      }
    }

    // Fallback: standard search with "news" appended
    const newsQuery = `${query} news latest`;
    const response = await executeSearch('auto', newsQuery, effectiveLimit);
    return {
      content: [{ type: 'text' as const, text: formatSearchResults(response) }],
      details: undefined,
    };
  },
};

const searchProviders: ToolDefinition = {
  name: 'web_search_providers',
  label: 'List Search Providers',
  description: 'List the currently available and configured web search providers.',
  parameters: Type.Object({}),
  async execute() {
    const available = getAvailableProviders();
    const allProviders = [
      { name: 'brave', configured: !!process.env['BRAVE_API_KEY'], envVar: 'BRAVE_API_KEY' },
      { name: 'tavily', configured: !!process.env['TAVILY_API_KEY'], envVar: 'TAVILY_API_KEY' },
      { name: 'searxng', configured: !!process.env['SEARXNG_URL'], envVar: 'SEARXNG_URL' },
      { name: 'duckduckgo', configured: true, envVar: '(none — always available)' },
    ];

    const lines = ['Search providers:\n'];
    for (const p of allProviders) {
      const status = p.configured ? '✓ configured' : '✗ not configured';
      lines.push(` ${p.name}: ${status} (${p.envVar})`);
    }
    lines.push(`\nActive providers for "auto" mode: ${available.join(', ')}`);
    return {
      content: [{ type: 'text' as const, text: lines.join('\n') }],
      details: undefined,
    };
  },
};

return [webSearch, webSearchNews, searchProviders];
}
@@ -1,6 +1,7 @@
import { Module } from '@nestjs/common';
import { APP_GUARD } from '@nestjs/core';
import { HealthController } from './health/health.controller.js';
import { ConfigModule } from './config/config.module.js';
import { DatabaseModule } from './database/database.module.js';
import { AuthModule } from './auth/auth.module.js';
import { BrainModule } from './brain/brain.module.js';
@@ -22,11 +23,13 @@ import { PreferencesModule } from './preferences/preferences.module.js';
import { GCModule } from './gc/gc.module.js';
import { ReloadModule } from './reload/reload.module.js';
import { WorkspaceModule } from './workspace/workspace.module.js';
import { QueueModule } from './queue/queue.module.js';
import { ThrottlerGuard, ThrottlerModule } from '@nestjs/throttler';

@Module({
  imports: [
    ThrottlerModule.forRoot([{ name: 'default', ttl: 60_000, limit: 60 }]),
    ConfigModule,
    DatabaseModule,
    AuthModule,
    BrainModule,
@@ -46,6 +49,7 @@ import { ThrottlerGuard, ThrottlerModule } from '@nestjs/throttler';
    PreferencesModule,
    CommandsModule,
    GCModule,
    QueueModule,
    ReloadModule,
    WorkspaceModule,
  ],

@@ -1,6 +1,6 @@
import type { IncomingMessage, ServerResponse } from 'node:http';
import { toNodeHandler } from 'better-auth/node';
import type { Auth } from '@mosaic/auth';
import type { Auth } from '@mosaicstack/auth';
import type { NestFastifyApplication } from '@nestjs/platform-fastify';
import { AUTH } from './auth.tokens.js';

@@ -6,7 +6,7 @@ import {
  UnauthorizedException,
} from '@nestjs/common';
import { fromNodeHeaders } from 'better-auth/node';
import type { Auth } from '@mosaic/auth';
import type { Auth } from '@mosaicstack/auth';
import type { FastifyRequest } from 'fastify';
import { AUTH } from './auth.tokens.js';

@@ -1,6 +1,6 @@
import { Global, Module } from '@nestjs/common';
import { createAuth, type Auth } from '@mosaic/auth';
import type { Db } from '@mosaic/db';
import { createAuth, type Auth } from '@mosaicstack/auth';
import type { Db } from '@mosaicstack/db';
import { DB } from '../database/database.module.js';
import { AUTH } from './auth.tokens.js';
import { SsoController } from './sso.controller.js';
@@ -14,7 +14,7 @@ import { SsoController } from './sso.controller.js';
      useFactory: (db: Db): Auth =>
        createAuth({
          db,
          baseURL: process.env['BETTER_AUTH_URL'] ?? 'http://localhost:4000',
          baseURL: process.env['BETTER_AUTH_URL'] ?? 'http://localhost:14242',
          secret: process.env['BETTER_AUTH_SECRET'],
        }),
      inject: [DB],

@@ -1,5 +1,5 @@
import { Controller, Get } from '@nestjs/common';
import { buildSsoDiscovery, type SsoProviderDiscovery } from '@mosaic/auth';
import { buildSsoDiscovery, type SsoProviderDiscovery } from '@mosaicstack/auth';

@Controller('api/sso/providers')
export class SsoController {

@@ -1,6 +1,6 @@
import { Global, Module } from '@nestjs/common';
import { createBrain, type Brain } from '@mosaic/brain';
import type { Db } from '@mosaic/db';
import { createBrain, type Brain } from '@mosaicstack/brain';
import type { Db } from '@mosaicstack/db';
import { DB } from '../database/database.module.js';
import { BRAIN } from './brain.tokens.js';

@@ -11,14 +11,15 @@
} from '@nestjs/websockets';
import { Server, Socket } from 'socket.io';
import type { AgentSessionEvent } from '@mariozechner/pi-coding-agent';
import type { Auth } from '@mosaic/auth';
import type { Brain } from '@mosaic/brain';
import type { Auth } from '@mosaicstack/auth';
import type { Brain } from '@mosaicstack/brain';
import type {
  SetThinkingPayload,
  SlashCommandPayload,
  SystemReloadPayload,
  RoutingDecisionInfo,
} from '@mosaic/types';
  AbortPayload,
} from '@mosaicstack/types';
import { AgentService, type ConversationHistoryMessage } from '../agent/agent.service.js';
import { AUTH } from '../auth/auth.tokens.js';
import { BRAIN } from '../brain/brain.tokens.js';
@@ -325,6 +326,38 @@ export class ChatGateway implements OnGatewayInit, OnGatewayConnection, OnGatewa
    });
  }

  @SubscribeMessage('abort')
  async handleAbort(
    @ConnectedSocket() client: Socket,
    @MessageBody() data: AbortPayload,
  ): Promise<void> {
    const conversationId = data.conversationId;
    this.logger.log(`Abort requested by ${client.id} for conversation ${conversationId}`);

    const session = this.agentService.getSession(conversationId);
    if (!session) {
      client.emit('error', {
        conversationId,
        error: 'No active session to abort.',
      });
      return;
    }

    try {
      await session.piSession.abort();
      this.logger.log(`Agent session ${conversationId} aborted successfully`);
    } catch (err) {
      this.logger.error(
        `Failed to abort session ${conversationId}`,
        err instanceof Error ? err.stack : String(err),
      );
      client.emit('error', {
        conversationId,
        error: 'Failed to abort the agent operation.',
      });
    }
  }

  @SubscribeMessage('command:execute')
  async handleCommandExecute(
    @ConnectedSocket() client: Socket,

@@ -1,6 +1,6 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { CommandExecutorService } from './command-executor.service.js';
import type { SlashCommandPayload } from '@mosaic/types';
import type { SlashCommandPayload } from '@mosaicstack/types';

// Minimal mock implementations
const mockRegistry = {
@@ -82,6 +82,7 @@ function buildService(): CommandExecutorService {
    mockBrain as never,
    null,
    mockChatGateway as never,
    null,
  );
}

@@ -1,12 +1,13 @@
import { forwardRef, Inject, Injectable, Logger, Optional } from '@nestjs/common';
import type { QueueHandle } from '@mosaic/queue';
import type { Brain } from '@mosaic/brain';
import type { SlashCommandPayload, SlashCommandResultPayload } from '@mosaic/types';
import type { QueueHandle } from '@mosaicstack/queue';
import type { Brain } from '@mosaicstack/brain';
import type { SlashCommandPayload, SlashCommandResultPayload } from '@mosaicstack/types';
import { AgentService } from '../agent/agent.service.js';
import { ChatGateway } from '../chat/chat.gateway.js';
import { SessionGCService } from '../gc/session-gc.service.js';
import { SystemOverrideService } from '../preferences/system-override.service.js';
import { ReloadService } from '../reload/reload.service.js';
import { McpClientService } from '../mcp-client/mcp-client.service.js';
import { BRAIN } from '../brain/brain.tokens.js';
import { COMMANDS_REDIS } from './commands.tokens.js';
import { CommandRegistryService } from './command-registry.service.js';
@@ -28,6 +29,9 @@ export class CommandExecutorService {
    @Optional()
    @Inject(forwardRef(() => ChatGateway))
    private readonly chatGateway: ChatGateway | null,
    @Optional()
    @Inject(McpClientService)
    private readonly mcpClient: McpClientService | null,
  ) {}

  async execute(payload: SlashCommandPayload, userId: string): Promise<SlashCommandResultPayload> {
@@ -105,6 +109,8 @@ export class CommandExecutorService {
        };
      case 'tools':
        return await this.handleTools(conversationId, userId);
      case 'mcp':
        return await this.handleMcp(args ?? null, conversationId);
      case 'reload': {
        if (!this.reloadService) {
          return {
@@ -489,4 +495,92 @@ export class CommandExecutorService {
      conversationId,
    };
  }

  private async handleMcp(
    args: string | null,
    conversationId: string,
  ): Promise<SlashCommandResultPayload> {
    if (!this.mcpClient) {
      return {
        command: 'mcp',
        conversationId,
        success: false,
        message: 'MCP client service is not available.',
      };
    }

    const action = args?.trim().split(/\s+/)[0] ?? 'status';

    switch (action) {
      case 'status':
      case 'servers': {
        const statuses = this.mcpClient.getServerStatuses();
        if (statuses.length === 0) {
          return {
            command: 'mcp',
            conversationId,
            success: true,
            message:
              'No MCP servers configured. Set MCP_SERVERS env var to connect external tool servers.',
          };
        }
        const lines = ['MCP Server Status:\n'];
        for (const s of statuses) {
          const status = s.connected ? '✓ connected' : '✗ disconnected';
          lines.push(` ${s.name}: ${status}`);
          lines.push(` URL: ${s.url}`);
          lines.push(` Tools: ${s.toolCount}`);
          if (s.error) lines.push(` Error: ${s.error}`);
          lines.push('');
        }
        const tools = this.mcpClient.getToolDefinitions();
        if (tools.length > 0) {
          lines.push(`Total bridged tools: ${tools.length}`);
          lines.push(`Tool names: ${tools.map((t) => t.name).join(', ')}`);
        }
        return {
          command: 'mcp',
          conversationId,
          success: true,
          message: lines.join('\n'),
        };
      }

      case 'reconnect': {
        const serverName = args?.trim().split(/\s+/).slice(1).join(' ');
        if (!serverName) {
          return {
            command: 'mcp',
            conversationId,
            success: false,
            message: 'Usage: /mcp reconnect <server-name>',
          };
        }
        try {
          await this.mcpClient.reconnectServer(serverName);
          return {
            command: 'mcp',
            conversationId,
            success: true,
            message: `MCP server "${serverName}" reconnected successfully.`,
          };
        } catch (err) {
          return {
            command: 'mcp',
            conversationId,
            success: false,
            message: `Failed to reconnect MCP server "${serverName}": ${err instanceof Error ? err.message : String(err)}`,
          };
        }
      }

      default:
        return {
          command: 'mcp',
          conversationId,
          success: false,
          message: `Unknown MCP action: "${action}". Use: /mcp status, /mcp servers, /mcp reconnect <name>`,
        };
    }
  }
}

@@ -1,6 +1,6 @@
import { describe, it, expect, beforeEach } from 'vitest';
import { CommandRegistryService } from './command-registry.service.js';
import type { CommandDef } from '@mosaic/types';
import type { CommandDef } from '@mosaicstack/types';

const mockCmd: CommandDef = {
  name: 'test',

@@ -1,5 +1,5 @@
import { Injectable, type OnModuleInit } from '@nestjs/common';
import type { CommandDef, CommandManifest } from '@mosaic/types';
import type { CommandDef, CommandManifest } from '@mosaicstack/types';

@Injectable()
export class CommandRegistryService implements OnModuleInit {
@@ -260,6 +260,23 @@ export class CommandRegistryService implements OnModuleInit {
      execution: 'socket',
      available: true,
    },
    {
      name: 'mcp',
      description: 'Manage MCP server connections (status/reconnect/servers)',
      aliases: [],
      args: [
        {
          name: 'action',
          type: 'enum',
          optional: true,
          values: ['status', 'reconnect', 'servers'],
          description: 'Action: status (default), reconnect <name>, servers',
        },
      ],
      scope: 'agent',
      execution: 'socket',
      available: true,
    },
    {
      name: 'reload',
      description: 'Soft-reload gateway plugins and command manifest (admin)',

@@ -13,7 +13,7 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { CommandRegistryService } from './command-registry.service.js';
import { CommandExecutorService } from './command-executor.service.js';
import type { SlashCommandPayload } from '@mosaic/types';
import type { SlashCommandPayload } from '@mosaicstack/types';

// ─── Mocks ───────────────────────────────────────────────────────────────────

@@ -65,6 +65,7 @@ function buildExecutor(registry: CommandRegistryService): CommandExecutorService
    mockBrain as never,
    null, // reloadService (optional)
    null, // chatGateway (optional)
    null, // mcpClient (optional)
  );
}

@@ -1,5 +1,5 @@
import { forwardRef, Inject, Module, type OnApplicationShutdown } from '@nestjs/common';
import { createQueue, type QueueHandle } from '@mosaic/queue';
import { createQueue, type QueueHandle } from '@mosaicstack/queue';
import { ChatModule } from '../chat/chat.module.js';
import { GCModule } from '../gc/gc.module.js';
import { ReloadModule } from '../reload/reload.module.js';

apps/gateway/src/config/config.module.ts (Normal file, 16 lines added)
@@ -0,0 +1,16 @@
import { Global, Module } from '@nestjs/common';
import { loadConfig, type MosaicConfig } from '@mosaicstack/config';

export const MOSAIC_CONFIG = 'MOSAIC_CONFIG';

@Global()
@Module({
  providers: [
    {
      provide: MOSAIC_CONFIG,
      useFactory: (): MosaicConfig => loadConfig(),
    },
  ],
  exports: [MOSAIC_CONFIG],
})
export class ConfigModule {}
@@ -15,7 +15,7 @@
  Query,
  UseGuards,
} from '@nestjs/common';
import type { Brain } from '@mosaic/brain';
import type { Brain } from '@mosaicstack/brain';
import { BRAIN } from '../brain/brain.tokens.js';
import { AuthGuard } from '../auth/auth.guard.js';
import { CurrentUser } from '../auth/current-user.decorator.js';

@@ -8,7 +8,7 @@ import {
  type MissionStatusSummary,
  type MissionTask,
  type TaskDetail,
} from '@mosaic/coord';
} from '@mosaicstack/coord';
import { promises as fs } from 'node:fs';
import path from 'node:path';

@@ -1,28 +1,51 @@
import { mkdirSync } from 'node:fs';
import { homedir } from 'node:os';
import { join } from 'node:path';
import { Global, Inject, Module, type OnApplicationShutdown } from '@nestjs/common';
import { createDb, type Db, type DbHandle } from '@mosaic/db';
import { createDb, createPgliteDb, type Db, type DbHandle } from '@mosaicstack/db';
import { createStorageAdapter, type StorageAdapter } from '@mosaicstack/storage';
import type { MosaicConfig } from '@mosaicstack/config';
import { MOSAIC_CONFIG } from '../config/config.module.js';

export const DB_HANDLE = 'DB_HANDLE';
export const DB = 'DB';
export const STORAGE_ADAPTER = 'STORAGE_ADAPTER';

@Global()
@Module({
  providers: [
    {
      provide: DB_HANDLE,
      useFactory: (): DbHandle => createDb(),
      useFactory: (config: MosaicConfig): DbHandle => {
        if (config.tier === 'local') {
          const dataDir = join(homedir(), '.config', 'mosaic', 'gateway', 'pglite');
          mkdirSync(dataDir, { recursive: true });
          return createPgliteDb(dataDir);
        }
        return createDb(config.storage.type === 'postgres' ? config.storage.url : undefined);
      },
      inject: [MOSAIC_CONFIG],
    },
    {
      provide: DB,
      useFactory: (handle: DbHandle): Db => handle.db,
      inject: [DB_HANDLE],
    },
    {
      provide: STORAGE_ADAPTER,
      useFactory: (config: MosaicConfig): StorageAdapter => createStorageAdapter(config.storage),
      inject: [MOSAIC_CONFIG],
    },
  ],
  exports: [DB],
  exports: [DB, STORAGE_ADAPTER],
})
export class DatabaseModule implements OnApplicationShutdown {
  constructor(@Inject(DB_HANDLE) private readonly handle: DbHandle) {}
  constructor(
    @Inject(DB_HANDLE) private readonly handle: DbHandle,
    @Inject(STORAGE_ADAPTER) private readonly storageAdapter: StorageAdapter,
  ) {}

  async onApplicationShutdown(): Promise<void> {
    await this.handle.close();
    await Promise.all([this.handle.close(), this.storageAdapter.close()]);
  }
}

@@ -1,5 +1,5 @@
import { Module, type OnApplicationShutdown, Inject } from '@nestjs/common';
import { createQueue, type QueueHandle } from '@mosaic/queue';
import { createQueue, type QueueHandle } from '@mosaicstack/queue';
import { SessionGCService } from './session-gc.service.js';
import { REDIS } from './gc.tokens.js';

@@ -1,7 +1,7 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { Logger } from '@nestjs/common';
import type { QueueHandle } from '@mosaic/queue';
import type { LogService } from '@mosaic/log';
import type { QueueHandle } from '@mosaicstack/queue';
import type { LogService } from '@mosaicstack/log';
import { SessionGCService } from './session-gc.service.js';

type MockRedis = {

@@ -1,6 +1,6 @@
import { Inject, Injectable, Logger, type OnModuleInit } from '@nestjs/common';
import type { QueueHandle } from '@mosaic/queue';
import type { LogService } from '@mosaic/log';
import type { QueueHandle } from '@mosaicstack/queue';
import type { LogService } from '@mosaicstack/log';
import { LOG_SERVICE } from '../log/log.tokens.js';
import { REDIS } from './gc.tokens.js';

@@ -5,59 +5,72 @@ import {
  type OnModuleInit,
  type OnModuleDestroy,
} from '@nestjs/common';
import cron from 'node-cron';
import { SummarizationService } from './summarization.service.js';
import { SessionGCService } from '../gc/session-gc.service.js';
import {
  QueueService,
  QUEUE_SUMMARIZATION,
  QUEUE_GC,
  QUEUE_TIER_MANAGEMENT,
} from '../queue/queue.service.js';
import type { Worker } from 'bullmq';
import type { MosaicJobData } from '../queue/queue.service.js';

@Injectable()
export class CronService implements OnModuleInit, OnModuleDestroy {
  private readonly logger = new Logger(CronService.name);
  private readonly tasks: cron.ScheduledTask[] = [];
  private readonly registeredWorkers: Worker<MosaicJobData>[] = [];

  constructor(
    @Inject(SummarizationService) private readonly summarization: SummarizationService,
    @Inject(SessionGCService) private readonly sessionGC: SessionGCService,
    @Inject(QueueService) private readonly queueService: QueueService,
  ) {}

  onModuleInit(): void {
  async onModuleInit(): Promise<void> {
    const summarizationSchedule = process.env['SUMMARIZATION_CRON'] ?? '0 */6 * * *'; // every 6 hours
    const tierManagementSchedule = process.env['TIER_MANAGEMENT_CRON'] ?? '0 3 * * *'; // daily at 3am
    const gcSchedule = process.env['SESSION_GC_CRON'] ?? '0 4 * * *'; // daily at 4am

    this.tasks.push(
      cron.schedule(summarizationSchedule, () => {
        this.summarization.runSummarization().catch((err) => {
          this.logger.error(`Scheduled summarization failed: ${err}`);
        });
      }),
    // M6-003: Summarization repeatable job
    await this.queueService.addRepeatableJob(
      QUEUE_SUMMARIZATION,
      'summarization',
      {},
      summarizationSchedule,
    );
    const summarizationWorker = this.queueService.registerWorker(QUEUE_SUMMARIZATION, async () => {
      await this.summarization.runSummarization();
    });
    this.registeredWorkers.push(summarizationWorker);

    this.tasks.push(
      cron.schedule(tierManagementSchedule, () => {
        this.summarization.runTierManagement().catch((err) => {
          this.logger.error(`Scheduled tier management failed: ${err}`);
        });
      }),
    // M6-005: Tier management repeatable job
    await this.queueService.addRepeatableJob(
      QUEUE_TIER_MANAGEMENT,
      'tier-management',
      {},
      tierManagementSchedule,
    );
    const tierWorker = this.queueService.registerWorker(QUEUE_TIER_MANAGEMENT, async () => {
      await this.summarization.runTierManagement();
    });
    this.registeredWorkers.push(tierWorker);

    this.tasks.push(
      cron.schedule(gcSchedule, () => {
        this.sessionGC.sweepOrphans().catch((err) => {
          this.logger.error(`Session GC sweep failed: ${err}`);
        });
      }),
    );
    // M6-004: GC repeatable job
    await this.queueService.addRepeatableJob(QUEUE_GC, 'session-gc', {}, gcSchedule);
    const gcWorker = this.queueService.registerWorker(QUEUE_GC, async () => {
      await this.sessionGC.sweepOrphans();
    });
    this.registeredWorkers.push(gcWorker);

    this.logger.log(
      `Cron scheduled: summarization="${summarizationSchedule}", tier="${tierManagementSchedule}", gc="${gcSchedule}"`,
      `BullMQ jobs scheduled: summarization="${summarizationSchedule}", tier="${tierManagementSchedule}", gc="${gcSchedule}"`,
    );
  }

  onModuleDestroy(): void {
    for (const task of this.tasks) {
      task.stop();
    }
    this.tasks.length = 0;
    this.logger.log('Cron tasks stopped');
  async onModuleDestroy(): Promise<void> {
    // Workers are closed by QueueService.onModuleDestroy — nothing extra needed here.
    this.registeredWorkers.length = 0;
    this.logger.log('CronService destroyed (workers managed by QueueService)');
  }
}

@@ -1,5 +1,5 @@
import { Body, Controller, Get, Inject, Param, Post, Query, UseGuards } from '@nestjs/common';
import type { LogService } from '@mosaic/log';
import type { LogService } from '@mosaicstack/log';
import { LOG_SERVICE } from './log.tokens.js';
import { AuthGuard } from '../auth/auth.guard.js';
import type { IngestLogDto, QueryLogsDto } from './log.dto.js';

@@ -1,16 +1,17 @@
import { Global, Module } from '@nestjs/common';
import { createLogService, type LogService } from '@mosaic/log';
import type { Db } from '@mosaic/db';
import { createLogService, type LogService } from '@mosaicstack/log';
import type { Db } from '@mosaicstack/db';
import { DB } from '../database/database.module.js';
import { LOG_SERVICE } from './log.tokens.js';
import { LogController } from './log.controller.js';
import { SummarizationService } from './summarization.service.js';
import { CronService } from './cron.service.js';
import { GCModule } from '../gc/gc.module.js';
import { QueueModule } from '../queue/queue.module.js';

@Global()
@Module({
  imports: [GCModule],
  imports: [GCModule, QueueModule],
  providers: [
    {
      provide: LOG_SERVICE,

@@ -1,11 +1,11 @@
import { Inject, Injectable, Logger } from '@nestjs/common';
import type { LogService } from '@mosaic/log';
import type { Memory } from '@mosaic/memory';
import type { LogService } from '@mosaicstack/log';
import type { Memory } from '@mosaicstack/memory';
import { LOG_SERVICE } from './log.tokens.js';
import { MEMORY } from '../memory/memory.tokens.js';
import { EmbeddingService } from '../memory/embedding.service.js';
import type { Db } from '@mosaic/db';
import { sql, summarizationJobs } from '@mosaic/db';
import type { Db } from '@mosaicstack/db';
import { sql, summarizationJobs } from '@mosaicstack/db';
import { DB } from '../database/database.module.js';

const SUMMARIZATION_PROMPT = `You are a knowledge extraction assistant. Given the following agent interaction logs, extract the key decisions, learnings, and patterns. Output a concise summary (2-4 sentences) that captures the most important information for future reference. Focus on actionable insights, not raw events.

@@ -1,5 +1,13 @@
#!/usr/bin/env node
import { config } from 'dotenv';
import { resolve } from 'node:path';
import { existsSync } from 'node:fs';
import { resolve, join } from 'node:path';
import { homedir } from 'node:os';

// Load .env from daemon config dir (global install / daemon mode).
// Loaded first so monorepo .env can override for local dev.
const daemonEnv = join(homedir(), '.config', 'mosaic', 'gateway', '.env');
if (existsSync(daemonEnv)) config({ path: daemonEnv });

// Load .env from monorepo root (cwd is apps/gateway when run via pnpm filter)
config({ path: resolve(process.cwd(), '../../.env') });
@@ -11,7 +19,7 @@ import { NestFactory } from '@nestjs/core';
import { Logger, ValidationPipe } from '@nestjs/common';
import { FastifyAdapter, type NestFastifyApplication } from '@nestjs/platform-fastify';
import helmet from '@fastify/helmet';
import { listSsoStartupWarnings } from '@mosaic/auth';
import { listSsoStartupWarnings } from '@mosaicstack/auth';
import { AppModule } from './app.module.js';
import { mountAuthHandler } from './auth/auth.controller.js';
import { mountMcpHandler } from './mcp/mcp.controller.js';
@@ -51,7 +59,7 @@ async function bootstrap(): Promise<void> {
  mountAuthHandler(app);
  mountMcpHandler(app, app.get(McpService));

  const port = Number(process.env['GATEWAY_PORT'] ?? 4000);
  const port = Number(process.env['GATEWAY_PORT'] ?? 14242);
  await app.listen(port, '0.0.0.0');
  logger.log(`Gateway listening on port ${port}`);
}

@@ -1,7 +1,7 @@
import type { IncomingMessage, ServerResponse } from 'node:http';
import { Logger } from '@nestjs/common';
import { fromNodeHeaders } from 'better-auth/node';
import type { Auth } from '@mosaic/auth';
import type { Auth } from '@mosaicstack/auth';
import type { NestFastifyApplication } from '@nestjs/platform-fastify';
import type { McpService } from './mcp.service.js';
import { AUTH } from '../auth/auth.tokens.js';

@@ -3,8 +3,8 @@ import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import { randomUUID } from 'node:crypto';
import { z } from 'zod';
import type { Brain } from '@mosaic/brain';
import type { Memory } from '@mosaic/memory';
import type { Brain } from '@mosaicstack/brain';
import type { Memory } from '@mosaicstack/memory';
import { BRAIN } from '../brain/brain.tokens.js';
import { MEMORY } from '../memory/memory.tokens.js';
import { EmbeddingService } from '../memory/embedding.service.js';

@@ -1,5 +1,5 @@
import { Injectable, Logger } from '@nestjs/common';
import type { EmbeddingProvider } from '@mosaic/memory';
import type { EmbeddingProvider } from '@mosaicstack/memory';

// ---------------------------------------------------------------------------
// Environment-driven configuration

@@ -12,7 +12,7 @@ import {
  Query,
  UseGuards,
} from '@nestjs/common';
import type { Memory } from '@mosaic/memory';
import type { Memory } from '@mosaicstack/memory';
import { MEMORY } from './memory.tokens.js';
import { AuthGuard } from '../auth/auth.guard.js';
import { CurrentUser } from '../auth/current-user.decorator.js';

@@ -1,11 +1,29 @@
 import { Global, Module } from '@nestjs/common';
-import { createMemory, type Memory } from '@mosaic/memory';
-import type { Db } from '@mosaic/db';
-import { DB } from '../database/database.module.js';
+import {
+  createMemory,
+  type Memory,
+  createMemoryAdapter,
+  type MemoryAdapter,
+  type MemoryConfig,
+} from '@mosaicstack/memory';
+import type { Db } from '@mosaicstack/db';
+import type { StorageAdapter } from '@mosaicstack/storage';
+import type { MosaicConfig } from '@mosaicstack/config';
+import { MOSAIC_CONFIG } from '../config/config.module.js';
+import { DB, STORAGE_ADAPTER } from '../database/database.module.js';
 import { MEMORY } from './memory.tokens.js';
 import { MemoryController } from './memory.controller.js';
 import { EmbeddingService } from './embedding.service.js';

+export const MEMORY_ADAPTER = 'MEMORY_ADAPTER';
+
+function buildMemoryConfig(config: MosaicConfig, storageAdapter: StorageAdapter): MemoryConfig {
+  if (config.memory.type === 'keyword') {
+    return { type: 'keyword', storage: storageAdapter };
+  }
+  return { type: config.memory.type };
+}
+
 @Global()
 @Module({
   providers: [
@@ -14,9 +32,15 @@ import { EmbeddingService } from './embedding.service.js';
       useFactory: (db: Db): Memory => createMemory(db),
       inject: [DB],
     },
+    {
+      provide: MEMORY_ADAPTER,
+      useFactory: (config: MosaicConfig, storageAdapter: StorageAdapter): MemoryAdapter =>
+        createMemoryAdapter(buildMemoryConfig(config, storageAdapter)),
+      inject: [MOSAIC_CONFIG, STORAGE_ADAPTER],
+    },
     EmbeddingService,
   ],
   controllers: [MemoryController],
-  exports: [MEMORY, EmbeddingService],
+  exports: [MEMORY, MEMORY_ADAPTER, EmbeddingService],
 })
 export class MemoryModule {}
@@ -12,7 +12,7 @@ import {
   Post,
   UseGuards,
 } from '@nestjs/common';
-import type { Brain } from '@mosaic/brain';
+import type { Brain } from '@mosaicstack/brain';
 import { BRAIN } from '../brain/brain.tokens.js';
 import { AuthGuard } from '../auth/auth.guard.js';
 import { CurrentUser } from '../auth/current-user.decorator.js';
@@ -6,8 +6,8 @@ import {
   type OnModuleDestroy,
   type OnModuleInit,
 } from '@nestjs/common';
-import { DiscordPlugin } from '@mosaic/discord-plugin';
-import { TelegramPlugin } from '@mosaic/telegram-plugin';
+import { DiscordPlugin } from '@mosaicstack/discord-plugin';
+import { TelegramPlugin } from '@mosaicstack/telegram-plugin';
 import { PluginService } from './plugin.service.js';
 import type { IChannelPlugin } from './plugin.interface.js';
 import { PLUGIN_REGISTRY } from './plugin.tokens.js';
@@ -48,7 +48,7 @@ class TelegramChannelPluginAdapter implements IChannelPlugin {
   }
 }

-const DEFAULT_GATEWAY_URL = 'http://localhost:4000';
+const DEFAULT_GATEWAY_URL = 'http://localhost:14242';

 function createPluginRegistry(): IChannelPlugin[] {
   const plugins: IChannelPlugin[] = [];
@@ -1,6 +1,6 @@
 import { describe, it, expect, vi } from 'vitest';
 import { PreferencesService, PLATFORM_DEFAULTS, IMMUTABLE_KEYS } from './preferences.service.js';
-import type { Db } from '@mosaic/db';
+import type { Db } from '@mosaicstack/db';

 /**
  * Build a mock Drizzle DB where the select chain supports:
@@ -1,5 +1,5 @@
 import { Inject, Injectable, Logger } from '@nestjs/common';
-import { eq, and, sql, type Db, preferences as preferencesTable } from '@mosaic/db';
+import { eq, and, sql, type Db, preferences as preferencesTable } from '@mosaicstack/db';
 import { DB } from '../database/database.module.js';

 export const PLATFORM_DEFAULTS: Record<string, unknown> = {
@@ -1,5 +1,5 @@
 import { Injectable, Logger } from '@nestjs/common';
-import { createQueue, type QueueHandle } from '@mosaic/queue';
+import { createQueue, type QueueHandle } from '@mosaicstack/queue';

 const SESSION_SYSTEM_KEY = (sessionId: string) => `mosaic:session:${sessionId}:system`;
 const SESSION_SYSTEM_FRAGMENTS_KEY = (sessionId: string) =>
@@ -13,7 +13,7 @@ import {
   Post,
   UseGuards,
 } from '@nestjs/common';
-import type { Brain } from '@mosaic/brain';
+import type { Brain } from '@mosaicstack/brain';
 import { BRAIN } from '../brain/brain.tokens.js';
 import { AuthGuard } from '../auth/auth.guard.js';
 import { CurrentUser } from '../auth/current-user.decorator.js';
34 apps/gateway/src/queue/queue-admin.dto.ts Normal file
@@ -0,0 +1,34 @@
export type JobStatus = 'active' | 'completed' | 'failed' | 'waiting' | 'delayed';

export interface JobDto {
  id: string;
  name: string;
  queue: string;
  status: JobStatus;
  attempts: number;
  maxAttempts: number;
  createdAt?: string;
  processedAt?: string;
  finishedAt?: string;
  failedReason?: string;
  data: Record<string, unknown>;
}

export interface JobListDto {
  jobs: JobDto[];
  total: number;
}

export interface QueueStatusDto {
  name: string;
  waiting: number;
  active: number;
  completed: number;
  failed: number;
  delayed: number;
  paused: boolean;
}

export interface QueueListDto {
  queues: QueueStatusDto[];
}
21 apps/gateway/src/queue/queue.module.ts Normal file
@@ -0,0 +1,21 @@
import { Global, Module } from '@nestjs/common';
import { createQueueAdapter, type QueueAdapter } from '@mosaicstack/queue';
import type { MosaicConfig } from '@mosaicstack/config';
import { MOSAIC_CONFIG } from '../config/config.module.js';
import { QueueService } from './queue.service.js';

export const QUEUE_ADAPTER = 'QUEUE_ADAPTER';

@Global()
@Module({
  providers: [
    QueueService,
    {
      provide: QUEUE_ADAPTER,
      useFactory: (config: MosaicConfig): QueueAdapter => createQueueAdapter(config.queue),
      inject: [MOSAIC_CONFIG],
    },
  ],
  exports: [QueueService, QUEUE_ADAPTER],
})
export class QueueModule {}
412 apps/gateway/src/queue/queue.service.ts Normal file
@@ -0,0 +1,412 @@
import {
  Inject,
  Injectable,
  Logger,
  Optional,
  type OnModuleInit,
  type OnModuleDestroy,
} from '@nestjs/common';
import { Queue, Worker, type Job, type ConnectionOptions } from 'bullmq';
import type { LogService } from '@mosaicstack/log';
import { LOG_SERVICE } from '../log/log.tokens.js';
import type { JobDto, JobStatus } from './queue-admin.dto.js';

// ---------------------------------------------------------------------------
// Typed job definitions
// ---------------------------------------------------------------------------

export interface SummarizationJobData {
  triggeredBy?: string;
}

export interface GCJobData {
  triggeredBy?: string;
}

export interface TierManagementJobData {
  triggeredBy?: string;
}

export type MosaicJobData = SummarizationJobData | GCJobData | TierManagementJobData;

// ---------------------------------------------------------------------------
// Queue health status
// ---------------------------------------------------------------------------

export interface QueueHealthStatus {
  queues: Record<
    string,
    {
      waiting: number;
      active: number;
      failed: number;
      completed: number;
      paused: boolean;
    }
  >;
  healthy: boolean;
}

// ---------------------------------------------------------------------------
// Constants
// ---------------------------------------------------------------------------

export const QUEUE_SUMMARIZATION = 'mosaic-summarization';
export const QUEUE_GC = 'mosaic-gc';
export const QUEUE_TIER_MANAGEMENT = 'mosaic-tier-management';

const DEFAULT_VALKEY_URL = 'redis://localhost:6380';

/**
 * Parse a Redis URL string into a BullMQ-compatible ConnectionOptions object.
 *
 * BullMQ v5 does `Object.assign({ port: 6379, host: '127.0.0.1' }, opts)` in
 * its RedisConnection constructor. If opts is a URL string, Object.assign only
 * copies character-index properties and the defaults survive — so 6379 wins.
 * We must parse the URL ourselves and return a plain RedisOptions object.
 */
function getConnection(): ConnectionOptions {
  const url = process.env['VALKEY_URL'] ?? DEFAULT_VALKEY_URL;
  try {
    const parsed = new URL(url);
    const opts: ConnectionOptions = {
      host: parsed.hostname || '127.0.0.1',
      port: parsed.port ? parseInt(parsed.port, 10) : 6380,
    };
    if (parsed.password) {
      (opts as Record<string, unknown>)['password'] = decodeURIComponent(parsed.password);
    }
    if (parsed.pathname && parsed.pathname.length > 1) {
      const db = parseInt(parsed.pathname.slice(1), 10);
      if (!isNaN(db)) {
        (opts as Record<string, unknown>)['db'] = db;
      }
    }
    return opts;
  } catch {
    // Fallback: hope the value is already a host string ioredis understands
    return { host: '127.0.0.1', port: 6380 } as ConnectionOptions;
  }
}

// ---------------------------------------------------------------------------
// Job handler type
// ---------------------------------------------------------------------------

export type JobHandler<T = MosaicJobData> = (job: Job<T>) => Promise<void>;

/** System session ID used for job-event log entries (no real user session). */
const SYSTEM_SESSION_ID = 'system';

// ---------------------------------------------------------------------------
// QueueService
// ---------------------------------------------------------------------------

@Injectable()
export class QueueService implements OnModuleInit, OnModuleDestroy {
  private readonly logger = new Logger(QueueService.name);
  private readonly connection: ConnectionOptions;
  private readonly queues = new Map<string, Queue<MosaicJobData>>();
  private readonly workers = new Map<string, Worker<MosaicJobData>>();

  constructor(
    @Optional()
    @Inject(LOG_SERVICE)
    private readonly logService: LogService | null,
  ) {
    this.connection = getConnection();
  }

  onModuleInit(): void {
    this.logger.log('QueueService initialised (BullMQ)');
  }

  async onModuleDestroy(): Promise<void> {
    await this.closeAll();
  }

  // -------------------------------------------------------------------------
  // Queue helpers
  // -------------------------------------------------------------------------

  /**
   * Get or create a BullMQ Queue for the given queue name.
   */
  getQueue<T extends MosaicJobData = MosaicJobData>(name: string): Queue<T> {
    let queue = this.queues.get(name) as Queue<T> | undefined;
    if (!queue) {
      queue = new Queue<T>(name, { connection: this.connection });
      this.queues.set(name, queue as unknown as Queue<MosaicJobData>);
    }
    return queue;
  }

  /**
   * Add a BullMQ repeatable job (cron-style).
   * Uses `jobId` as a deterministic key so duplicate registrations are idempotent.
   */
  async addRepeatableJob<T extends MosaicJobData>(
    queueName: string,
    jobName: string,
    data: T,
    cronExpression: string,
  ): Promise<void> {
    const queue = this.getQueue<T>(queueName);
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    await (queue as Queue<any>).add(jobName, data, {
      repeat: { pattern: cronExpression },
      jobId: `${queueName}:${jobName}:repeatable`,
    });
    this.logger.log(
      `Repeatable job "${jobName}" registered on "${queueName}" (cron: ${cronExpression})`,
    );
  }

  /**
   * Register a Worker for the given queue name with error handling and
   * exponential backoff.
   */
  registerWorker<T extends MosaicJobData>(queueName: string, handler: JobHandler<T>): Worker<T> {
    const worker = new Worker<T>(
      queueName,
      async (job) => {
        this.logger.debug(`Processing job "${job.name}" (id=${job.id}) on queue "${queueName}"`);
        await this.logJobEvent(
          queueName,
          job.name,
          job.id ?? 'unknown',
          'started',
          job.attemptsMade + 1,
        );
        await handler(job);
      },
      {
        connection: this.connection,
        // Exponential backoff: base 5s, factor 2, capped at 60s
        settings: {
          backoffStrategy: (attemptsMade: number) => {
            return Math.min(5000 * Math.pow(2, attemptsMade - 1), 60_000);
          },
        },
      },
    );

    worker.on('completed', (job) => {
      this.logger.log(`Job "${job.name}" (id=${job.id}) completed on queue "${queueName}"`);
      this.logJobEvent(
        queueName,
        job.name,
        job.id ?? 'unknown',
        'completed',
        job.attemptsMade,
      ).catch((err) => this.logger.warn(`Failed to write completed job log: ${String(err)}`));
    });

    worker.on('failed', (job, err) => {
      const errMsg = err instanceof Error ? err.message : String(err);
      this.logger.error(
        `Job "${job?.name ?? 'unknown'}" (id=${job?.id ?? 'unknown'}) failed on queue "${queueName}": ${errMsg}`,
      );
      this.logJobEvent(
        queueName,
        job?.name ?? 'unknown',
        job?.id ?? 'unknown',
        'failed',
        job?.attemptsMade ?? 0,
        errMsg,
      ).catch((e) => this.logger.warn(`Failed to write failed job log: ${String(e)}`));
    });

    this.workers.set(queueName, worker as unknown as Worker<MosaicJobData>);
    return worker;
  }

  /**
   * Return queue health statistics for all managed queues.
   */
  async getHealthStatus(): Promise<QueueHealthStatus> {
    const queues: QueueHealthStatus['queues'] = {};
    let healthy = true;

    for (const [name, queue] of this.queues) {
      try {
        const [waiting, active, failed, completed, paused] = await Promise.all([
          queue.getWaitingCount(),
          queue.getActiveCount(),
          queue.getFailedCount(),
          queue.getCompletedCount(),
          queue.isPaused(),
        ]);
        queues[name] = { waiting, active, failed, completed, paused };
      } catch (err) {
        this.logger.error(`Failed to fetch health for queue "${name}": ${err}`);
        healthy = false;
        queues[name] = { waiting: 0, active: 0, failed: 0, completed: 0, paused: false };
      }
    }

    return { queues, healthy };
  }

  // -------------------------------------------------------------------------
  // Admin API helpers (M6-006)
  // -------------------------------------------------------------------------

  /**
   * List jobs across all managed queues, optionally filtered by status.
   * BullMQ jobs are fetched by state type from each queue.
   */
  async listJobs(status?: JobStatus): Promise<JobDto[]> {
    const jobs: JobDto[] = [];
    const states: JobStatus[] = status
      ? [status]
      : ['active', 'completed', 'failed', 'waiting', 'delayed'];

    for (const [queueName, queue] of this.queues) {
      try {
        for (const state of states) {
          // eslint-disable-next-line @typescript-eslint/no-explicit-any
          const raw = await (queue as Queue<any>).getJobs([state as any]);
          for (const j of raw) {
            jobs.push(this.toJobDto(queueName, j, state));
          }
        }
      } catch (err) {
        this.logger.warn(`Failed to list jobs for queue "${queueName}": ${String(err)}`);
      }
    }

    return jobs;
  }

  /**
   * Retry a specific failed job by its composite ID (format: "<queueName>__<jobId>").
   * The caller passes a composite ID because BullMQ job IDs are not globally
   * unique — they are scoped to their queue.
   */
  async retryJob(compositeId: string): Promise<{ ok: boolean; message: string }> {
    const sep = compositeId.lastIndexOf('__');
    if (sep === -1) {
      return { ok: false, message: 'Invalid job id format. Expected "<queue>__<jobId>".' };
    }
    const queueName = compositeId.slice(0, sep);
    const jobId = compositeId.slice(sep + 2);

    const queue = this.queues.get(queueName);
    if (!queue) {
      return { ok: false, message: `Queue "${queueName}" not found.` };
    }

    const job = await queue.getJob(jobId);
    if (!job) {
      return { ok: false, message: `Job "${jobId}" not found in queue "${queueName}".` };
    }

    const state = await job.getState();
    if (state !== 'failed') {
      return { ok: false, message: `Job "${jobId}" is not in failed state (current: ${state}).` };
    }

    await job.retry('failed');
    await this.logJobEvent(queueName, job.name, jobId, 'retried', (job.attemptsMade ?? 0) + 1);
    return { ok: true, message: `Job "${jobId}" on queue "${queueName}" queued for retry.` };
  }

  /**
   * Pause a queue by name.
   */
  async pauseQueue(name: string): Promise<{ ok: boolean; message: string }> {
    const queue = this.queues.get(name);
    if (!queue) return { ok: false, message: `Queue "${name}" not found.` };
    await queue.pause();
    this.logger.log(`Queue paused: ${name}`);
    return { ok: true, message: `Queue "${name}" paused.` };
  }

  /**
   * Resume a paused queue by name.
   */
  async resumeQueue(name: string): Promise<{ ok: boolean; message: string }> {
    const queue = this.queues.get(name);
    if (!queue) return { ok: false, message: `Queue "${name}" not found.` };
    await queue.resume();
    this.logger.log(`Queue resumed: ${name}`);
    return { ok: true, message: `Queue "${name}" resumed.` };
  }

  private toJobDto(queueName: string, job: Job<MosaicJobData>, status: JobStatus): JobDto {
    return {
      id: `${queueName}__${job.id ?? 'unknown'}`,
      name: job.name,
      queue: queueName,
      status,
      attempts: job.attemptsMade,
      maxAttempts: job.opts?.attempts ?? 1,
      createdAt: job.timestamp ? new Date(job.timestamp).toISOString() : undefined,
      processedAt: job.processedOn ? new Date(job.processedOn).toISOString() : undefined,
      finishedAt: job.finishedOn ? new Date(job.finishedOn).toISOString() : undefined,
      failedReason: job.failedReason,
      data: (job.data as Record<string, unknown>) ?? {},
    };
  }

  // -------------------------------------------------------------------------
  // Job event logging (M6-007)
  // -------------------------------------------------------------------------

  /** Write a log entry to agent_logs for BullMQ job lifecycle events. */
  private async logJobEvent(
    queueName: string,
    jobName: string,
    jobId: string,
    event: 'started' | 'completed' | 'retried' | 'failed',
    attempts: number,
    errorMessage?: string,
  ): Promise<void> {
    if (!this.logService) return;

    const level = event === 'failed' ? ('error' as const) : ('info' as const);
    const content =
      event === 'failed'
        ? `Job "${jobName}" (${jobId}) on queue "${queueName}" failed: ${errorMessage ?? 'unknown error'}`
        : `Job "${jobName}" (${jobId}) on queue "${queueName}" ${event} (attempt ${attempts})`;

    try {
      await this.logService.logs.ingest({
        sessionId: SYSTEM_SESSION_ID,
        userId: 'system',
        level,
        category: 'general',
        content,
        metadata: {
          jobId,
          jobName,
          queue: queueName,
          event,
          attempts,
          ...(errorMessage ? { errorMessage } : {}),
        },
      });
    } catch (err) {
      // Log errors must never crash job execution
      this.logger.warn(`Failed to write job event log for job ${jobId}: ${String(err)}`);
    }
  }

  // -------------------------------------------------------------------------
  // Lifecycle
  // -------------------------------------------------------------------------

  private async closeAll(): Promise<void> {
    const workerCloses = Array.from(this.workers.values()).map((w) =>
      w.close().catch((err) => this.logger.error(`Worker close error: ${err}`)),
    );
    const queueCloses = Array.from(this.queues.values()).map((q) =>
      q.close().catch((err) => this.logger.error(`Queue close error: ${err}`)),
    );
    await Promise.all([...workerCloses, ...queueCloses]);
    this.workers.clear();
    this.queues.clear();
    this.logger.log('QueueService shut down');
  }
}
2 apps/gateway/src/queue/queue.tokens.ts Normal file
@@ -0,0 +1,2 @@
export const QUEUE_REDIS = 'QUEUE_REDIS';
export const QUEUE_SERVICE = 'QUEUE_SERVICE';
@@ -1,5 +1,5 @@
 import { Controller, HttpCode, HttpStatus, Inject, Post, UseGuards } from '@nestjs/common';
-import type { SystemReloadPayload } from '@mosaic/types';
+import type { SystemReloadPayload } from '@mosaicstack/types';
 import { AdminGuard } from '../admin/admin.guard.js';
 import { ChatGateway } from '../chat/chat.gateway.js';
 import { ReloadService } from './reload.service.js';
@@ -5,7 +5,7 @@ import {
   type OnApplicationBootstrap,
   type OnApplicationShutdown,
 } from '@nestjs/common';
-import type { SystemReloadPayload } from '@mosaic/types';
+import type { SystemReloadPayload } from '@mosaicstack/types';
 import { CommandRegistryService } from '../commands/command-registry.service.js';
 import { isMosaicPlugin } from './mosaic-plugin.interface.js';
@@ -1,5 +1,5 @@
 import { Inject, Injectable } from '@nestjs/common';
-import { eq, type Db, skills } from '@mosaic/db';
+import { eq, type Db, skills } from '@mosaicstack/db';
 import { DB } from '../database/database.module.js';

 type Skill = typeof skills.$inferSelect;
@@ -14,7 +14,7 @@ import {
   Query,
   UseGuards,
 } from '@nestjs/common';
-import type { Brain } from '@mosaic/brain';
+import type { Brain } from '@mosaicstack/brain';
 import { BRAIN } from '../brain/brain.tokens.js';
 import { AuthGuard } from '../auth/auth.guard.js';
 import { CurrentUser } from '../auth/current-user.decorator.js';
@@ -1,5 +1,5 @@
 import { Inject, Injectable, Logger } from '@nestjs/common';
-import type { Brain } from '@mosaic/brain';
+import type { Brain } from '@mosaicstack/brain';
 import { BRAIN } from '../brain/brain.tokens.js';
 import { PluginService } from '../plugin/plugin.service.js';
 import { WorkspaceService } from './workspace.service.js';
@@ -1,5 +1,5 @@
 import { Inject, Injectable, Logger } from '@nestjs/common';
-import { eq, and, type Db, teams, teamMembers, projects } from '@mosaic/db';
+import { eq, and, type Db, teams, teamMembers, projects } from '@mosaicstack/db';
 import { DB } from '../database/database.module.js';

 @Injectable()
@@ -4,15 +4,15 @@
     "rootDir": "../..",
     "baseUrl": ".",
     "paths": {
-      "@mosaic/auth": ["../../packages/auth/src/index.ts"],
-      "@mosaic/brain": ["../../packages/brain/src/index.ts"],
-      "@mosaic/coord": ["../../packages/coord/src/index.ts"],
-      "@mosaic/db": ["../../packages/db/src/index.ts"],
-      "@mosaic/log": ["../../packages/log/src/index.ts"],
-      "@mosaic/memory": ["../../packages/memory/src/index.ts"],
-      "@mosaic/types": ["../../packages/types/src/index.ts"],
-      "@mosaic/discord-plugin": ["../../plugins/discord/src/index.ts"],
-      "@mosaic/telegram-plugin": ["../../plugins/telegram/src/index.ts"]
+      "@mosaicstack/auth": ["../../packages/auth/src/index.ts"],
+      "@mosaicstack/brain": ["../../packages/brain/src/index.ts"],
+      "@mosaicstack/coord": ["../../packages/coord/src/index.ts"],
+      "@mosaicstack/db": ["../../packages/db/src/index.ts"],
+      "@mosaicstack/log": ["../../packages/log/src/index.ts"],
+      "@mosaicstack/memory": ["../../packages/memory/src/index.ts"],
+      "@mosaicstack/types": ["../../packages/types/src/index.ts"],
+      "@mosaicstack/discord-plugin": ["../../plugins/discord/src/index.ts"],
+      "@mosaicstack/telegram-plugin": ["../../plugins/telegram/src/index.ts"]
     }
   }
 }
@@ -2,7 +2,7 @@ import type { NextConfig } from 'next';

 const nextConfig: NextConfig = {
   output: 'standalone',
-  transpilePackages: ['@mosaic/design-tokens'],
+  transpilePackages: ['@mosaicstack/design-tokens'],

   // Enable gzip/brotli compression for all responses.
   compress: true,
@@ -1,6 +1,6 @@
 {
-  "name": "@mosaic/web",
-  "version": "0.0.0",
+  "name": "@mosaicstack/web",
+  "version": "0.0.2",
   "private": true,
   "scripts": {
     "build": "next build",
@@ -12,7 +12,7 @@
     "start": "next start"
   },
   "dependencies": {
-    "@mosaic/design-tokens": "workspace:^",
+    "@mosaicstack/design-tokens": "workspace:^",
     "better-auth": "^1.5.5",
     "clsx": "^2.1.0",
     "next": "^16.0.0",
@@ -5,9 +5,9 @@ import { defineConfig, devices } from '@playwright/test';
  *
  * Assumes:
  * - Next.js web app running on http://localhost:3000
- * - NestJS gateway running on http://localhost:4000
+ * - NestJS gateway running on http://localhost:14242
  *
- * Run with: pnpm --filter @mosaic/web test:e2e
+ * Run with: pnpm --filter @mosaicstack/web test:e2e
  */
 export default defineConfig({
   testDir: './e2e',
0 apps/web/public/.gitkeep Normal file
@@ -1,4 +1,4 @@
-const GATEWAY_URL = process.env['NEXT_PUBLIC_GATEWAY_URL'] ?? 'http://localhost:4000';
+const GATEWAY_URL = process.env['NEXT_PUBLIC_GATEWAY_URL'] ?? 'http://localhost:14242';

 export interface ApiRequestInit extends Omit<RequestInit, 'body'> {
   body?: unknown;
@@ -2,7 +2,7 @@ import { createAuthClient } from 'better-auth/react';
 import { adminClient, genericOAuthClient } from 'better-auth/client/plugins';

 export const authClient = createAuthClient({
-  baseURL: process.env['NEXT_PUBLIC_GATEWAY_URL'] ?? 'http://localhost:4000',
+  baseURL: process.env['NEXT_PUBLIC_GATEWAY_URL'] ?? 'http://localhost:14242',
   plugins: [adminClient(), genericOAuthClient()],
 });
@@ -1,6 +1,6 @@
 import { io, type Socket } from 'socket.io-client';

-const GATEWAY_URL = process.env['NEXT_PUBLIC_GATEWAY_URL'] ?? 'http://localhost:4000';
+const GATEWAY_URL = process.env['NEXT_PUBLIC_GATEWAY_URL'] ?? 'http://localhost:14242';

 let socket: Socket | null = null;
231 briefs/monorepo-consolidation.md Normal file
@@ -0,0 +1,231 @@
# Brief: Monorepo Consolidation — mosaic/stack → mosaic/mosaic-stack

## Source

Architecture consolidation — merge the mosaic/stack repo (Forge pipeline, MACP protocol, framework tools) into mosaic/mosaic-stack (Harness Foundation platform). Two repos doing related work that need to converge.

## Context

**mosaic/stack** (OLD) contains:

- Forge progressive refinement pipeline (stages, agents, personas, rails, debate protocol, brief classification)
- MACP protocol (JSON schemas, deterministic Python controller, dispatcher, event system, gate runner)
- Credential resolver (Python — OC config, mosaic files, ambient env, JSON5 parser)
- OC framework plugin (injects Mosaic rails into all agent sessions)
- Profiles (runtime-neutral context packs for tech stacks and domains)
- Stage adapter (Forge→MACP bridge)
- Board tasks (multi-agent board evaluation)
- OpenBrain specialist memory (learning capture/recall)
- 17 guides, 5 universal skills

**mosaic/mosaic-stack** (NEW) contains:

- Harness Foundation platform (NestJS gateway, Next.js web, Drizzle ORM, Pi SDK runtime)
- 5 provider adapters, task classifier, routing rules, model capability matrix
- MACP OC plugin (ACP runtime backend with Pi bridge)
- TS coord package (mission runner, tasks file manager, status tracker — 1635 lines)
- BullMQ job queue, OTEL telemetry, channel plugins (Discord, Telegram)
- CLI with TUI, 65/65 tasks done, v0.2.0

**Decision:** NEW repo is the base. All unique work from OLD gets ported into NEW as packages.

## Scope

### Work Package 1: Forge Pipeline Package (`packages/forge`)

Port the entire Forge progressive refinement pipeline as a TypeScript package.

**From OLD:**

- `forge/pipeline/stages/*.md` — 11 stage definitions
- `forge/pipeline/agents/{board,generalists,specialists,cross-cutting}/*.md` — all persona definitions
- `forge/pipeline/rails/*.md` — debate protocol, dynamic composition, worker rails
- `forge/pipeline/gates/` — gate reviewer definitions
- `forge/pipeline/orchestrator/run-structure.md` — file-based observability spec
- `forge/templates/` — brief and PRD templates
- `forge/pipeline/orchestrator/board_tasks.py` → rewrite in TS
- `forge/pipeline/orchestrator/stage_adapter.py` → rewrite in TS
- `forge/pipeline/orchestrator/pipeline_runner.py` → rewrite in TS
- `forge/forge` CLI (Python) → rewrite in TS, integrate with `packages/cli`

**Package structure:**

```
packages/forge/
├── src/
│   ├── index.ts            # Public API
│   ├── pipeline-runner.ts  # Orchestrates full pipeline run
│   ├── stage-adapter.ts    # Maps stages to MACP/coord tasks
│   ├── board-tasks.ts      # Multi-agent board evaluation task generator
│   ├── brief-classifier.ts # strategic/technical/hotfix classification
│   ├── types.ts            # Stage specs, run manifest, gate results
│   └── constants.ts        # Stage sequence, timeouts, labels
├── pipeline/
│   ├── stages/             # .md stage definitions (copied)
│   ├── agents/             # .md persona definitions (copied)
│   │   ├── board/
│   │   ├── cross-cutting/
│   │   ├── generalists/
│   │   └── specialists/
│   │       ├── language/
│   │       └── domain/
│   ├── rails/              # .md rails (copied)
│   ├── gates/              # .md gate definitions (copied)
│   └── templates/          # brief + PRD templates (copied)
└── package.json
```

**Key design decisions:**

- Pipeline markdown assets are runtime data, not compiled — ship as-is in the package
- `pipeline-runner.ts` calls into `packages/coord` for task execution (not a separate controller)
- Stage adapter generates coord-compatible tasks, not MACP JSON directly
- Board tasks use `depends_on_policy: "all_terminal"` for synthesis
- Per-stage timeouts from `STAGE_TIMEOUTS` map
- Brief classifier supports CLI flag, YAML frontmatter, and keyword auto-detection
- Run output goes to project-scoped `.forge/runs/{run-id}/` (not inside the Forge package)
||||

**Persona override system (new):**

- Base personas ship with the package (read-only)
- Project-level overrides in `.forge/personas/{role}.md` extend (not replace) base personas
- Board composition configurable via `.forge/config.yaml`:
```yaml
board:
  additional_members:
    - compliance-officer.md
  skip_members: []
specialists:
  always_include:
    - proxmox-expert
```
- OpenBrain integration for cross-run specialist memory (when enabled)

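The extend-not-replace override rule can be sketched as a simple merge, assuming overrides are plain markdown appended under a marked section. `resolvePersona` and the section heading are illustrative names, not the real loader API:

```typescript
// Sketch of persona resolution: the packaged base persona is read-only;
// a project override from .forge/personas/{role}.md is appended, never
// allowed to redefine the base by replacement.

function resolvePersona(baseMd: string, overrideMd?: string): string {
  if (!overrideMd) return baseMd;
  return `${baseMd.trimEnd()}\n\n## Project Overrides\n\n${overrideMd.trim()}\n`;
}

const base = "# board/architect\n\nYou review system design.\n";
const override = "Always consider Proxmox constraints.";
const merged = resolvePersona(base, override);
```

Appending (rather than replacing) keeps the base persona's guarantees intact while letting a project add constraints on top.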
### Work Package 2: MACP Protocol Package (`packages/macp`)

Port the MACP protocol layer, event system, and gate runner as a TypeScript package.

**From OLD:**

- `tools/macp/protocol/task.schema.json` — task JSON schema
- `tools/macp/protocol/` — event schemas
- `tools/macp/controller/gate_runner.py` → rewrite in TS as `gate-runner.ts`
- `tools/macp/events/` — event watcher, webhook adapter, Discord formatter → rewrite in TS
- `tools/macp/dispatcher/credential_resolver.py` → rewrite in TS as `credential-resolver.ts`
- `tools/macp/memory/learning_capture.py` + `learning_recall.py` → rewrite in TS

**Package structure:**

```
packages/macp/
├── src/
│   ├── index.ts                # Public API
│   ├── types.ts                # Task, event, result, gate types
│   ├── schemas/                # JSON schemas (copied)
│   ├── gate-runner.ts          # Mechanical + AI review quality gates
│   ├── credential-resolver.ts  # Provider credential resolution (mosaic files, OC config, ambient)
│   ├── event-emitter.ts        # Append events to ndjson, structured event types
│   ├── event-watcher.ts        # Poll events.ndjson with cursor persistence
│   ├── webhook-adapter.ts      # POST events to configurable URL
│   ├── discord-formatter.ts    # Human-readable event messages
│   └── learning.ts             # OpenBrain capture + recall
└── package.json
```
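The emitter/watcher pair above can be sketched as an append-only ndjson log polled with a persisted byte-offset cursor. The file path, event fields, and function names here are assumptions for illustration, not the real schema:

```typescript
// Minimal sketch of the ndjson event log: event-emitter appends one JSON
// object per line; event-watcher polls with a cursor (byte offset) so a
// restart never re-delivers lines it has already seen.
import { appendFileSync, readFileSync, existsSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Unique demo path so repeated runs don't see each other's events.
const LOG = join(tmpdir(), `macp-events-demo-${process.pid}-${Date.now()}.ndjson`);

function emit(event: { type: string; taskId: string }): void {
  appendFileSync(LOG, JSON.stringify({ ...event, ts: new Date().toISOString() }) + "\n");
}

// Read everything after `cursor`, return the new events plus the advanced
// cursor; the caller persists the cursor between polls.
function poll(cursor: number): { events: unknown[]; cursor: number } {
  if (!existsSync(LOG)) return { events: [], cursor };
  const buf = readFileSync(LOG);
  const chunk = buf.subarray(cursor).toString("utf8");
  const events = chunk.split("\n").filter(Boolean).map((l) => JSON.parse(l));
  return { events, cursor: buf.length };
}
```

Append-only ndjson keeps emission cheap and crash-safe; cursor persistence is what makes the watcher resumable.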

**Integration with existing packages:**

- `packages/coord` uses `packages/macp` for event emission, gate running, and credential resolution
- `plugins/macp` uses `packages/macp` for protocol types and credential resolution
- `packages/forge` uses `packages/macp` gate types for stage gates

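The shared credential resolution can be sketched as a first-match-wins chain. Only the ordering (mosaic files → OC config → ambient environment) comes from this document; the source shapes and names are illustrative:

```typescript
// Sketch of the credential resolution order the resolver must preserve.
// Each source is a thunk so expensive lookups only run when needed.

type CredSource = () => string | undefined;

function resolveCredential(provider: string, sources: {
  mosaicFile: CredSource;  // e.g. a mosaic credential file for this provider
  ocConfig: CredSource;    // JSON5 OC config lookup
  ambient: CredSource;     // e.g. a provider API-key environment variable
}): string | undefined {
  // First defined value wins; later sources never override earlier ones.
  return sources.mosaicFile() ?? sources.ocConfig() ?? sources.ambient();
}
```

Keeping the chain in one function means `packages/coord` and `plugins/macp` cannot drift apart on resolution order.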
### Work Package 3: OC Framework Plugin (`plugins/mosaic-framework`)

Port the OC framework plugin that injects Mosaic rails into all agent sessions.

**From OLD:**

- `oc-plugins/mosaic-framework/index.ts` — `before_agent_start` + `subagent_spawning` hooks
- `oc-plugins/mosaic-framework/openclaw.plugin.json`

**Structure:**

```
plugins/mosaic-framework/
├── src/
│   └── index.ts    # Plugin hooks
└── package.json
```

**This is separate from `plugins/macp`:**

- `mosaic-framework` = injects Mosaic rails/contracts into every OC session (passive enforcement)
- `macp` = provides an ACP runtime backend for MACP task execution (active runtime)

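The passive-enforcement shape can be sketched as below. The hook names (`before_agent_start`, `subagent_spawning`) come from the OLD plugin; the context and return shapes are assumptions, since the OC plugin API is not reproduced in this document:

```typescript
// Sketch of rails injection: both hooks prepend the Mosaic rails to the
// session's system prompt, so subagents inherit them too.

interface SessionContext { systemPrompt: string; }

const MOSAIC_RAILS = "<!-- mosaic rails: contracts, coding standards, gates -->";

function injectRails(ctx: SessionContext): SessionContext {
  // Idempotent: don't stack rails if a parent session already injected them.
  if (ctx.systemPrompt.includes(MOSAIC_RAILS)) return ctx;
  return { systemPrompt: `${MOSAIC_RAILS}\n${ctx.systemPrompt}` };
}

const hooks = {
  before_agent_start: injectRails,
  subagent_spawning: injectRails,
};
```

Routing both hooks through one idempotent function is what makes the enforcement "passive": no session, parent or spawned, can run without the rails present exactly once.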
### Work Package 4: Profiles + Guides + Skills

Port reference content as a documentation/config package or top-level directories.

**From OLD:**

- `profiles/domains/*.json` — HIPAA, fintech, crypto context packs
- `profiles/tech-stacks/*.json` — NestJS, Next.js, FastAPI, React conventions
- `profiles/workflows/*.json` — API development, frontend component, testing workflows
- `guides/*.md` — 17 guides (auth, backend, QA, orchestrator, PRD, etc.)
- `skills-universal/` — jarvis, macp, mosaic-standards, prd, setup-cicd skills

**Destination:**

```
profiles/    # Top-level (same as OLD)
guides/      # Top-level (same as OLD)
skills/      # Top-level (renamed from skills-universal)
```

These are runtime-neutral assets consumed by any agent or profile loader — they don't belong in a compiled package.

## Out of Scope

- Rewriting the NestJS orchestrator app from OLD (`apps/orchestrator/`) — its functionality is subsumed by `packages/coord` + `apps/gateway`
- Porting the FastAPI coordinator from OLD (`apps/coordinator/`) — its functionality (webhook receiver, issue parser, quality orchestrator) is handled by `packages/coord` + `apps/gateway` in the new architecture
- Porting the Prisma schema or OLD's `apps/api` — Drizzle migration is complete
- Old Docker Compose configs (Traefik, Matrix, OpenBao) — NEW has its own infra setup

## Success Criteria

1. `packages/forge` exists with all 11 stage definitions, all persona markdowns, all rails, and TS implementations of pipeline-runner, stage-adapter, board-tasks, and brief-classifier
2. `packages/macp` exists with gate-runner, credential-resolver, event system, and learning capture/recall — all in TypeScript
3. `plugins/mosaic-framework` exists and registers OC hooks for rails injection
4. Profiles, guides, and skills are present at top-level
5. `packages/forge` integrates with `packages/coord` for task execution
6. `packages/macp` credential-resolver is used by `plugins/macp` Pi bridge
7. All existing tests pass (no regressions)
8. New packages have test coverage ≥85%
9. `pnpm lint && pnpm typecheck && pnpm build` passes
10. `.forge/runs/` project-scoped output directory works for at least one test run

## Technical Constraints

- All new code is ESM with NodeNext module resolution
- No Python in the new repo — everything rewrites to TypeScript
- Pipeline markdown assets (stages, personas, rails) are shipped as package data, not compiled
- Credential resolver must support: mosaic credential files, OC config (JSON5), ambient environment — same resolution order as the Python version
- Must preserve `depends_on_policy` semantics (all, any, all_terminal)
- Per-stage timeouts must be preserved
- JSON5 stripping must use the placeholder-extraction approach (not naive regex on string content)

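The `depends_on_policy` semantics that must survive the port can be sketched as a small predicate. The status vocabulary is an assumption; "terminal" is read here as finished in any state (done or failed), which is what lets board synthesis proceed even when some members fail:

```typescript
// Sketch of the three dependency policies (all, any, all_terminal).

type Status = "pending" | "running" | "done" | "failed";
type Policy = "all" | "any" | "all_terminal";

function depsSatisfied(policy: Policy, deps: Status[]): boolean {
  switch (policy) {
    case "all":          return deps.every((s) => s === "done");
    case "any":          return deps.some((s) => s === "done");
    // Board synthesis uses all_terminal: run once every dependency has
    // finished, even if some board members failed.
    case "all_terminal": return deps.every((s) => s === "done" || s === "failed");
  }
}
```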
## Estimated Complexity

High — crosses 4 work packages with protocol porting, TS rewrites, and integration wiring. Each work package is independently shippable.

**Suggested execution order:**

1. WP4 (profiles/guides/skills) — pure copy, no code, fast win
2. WP2 (packages/macp) — protocol foundation, needed by WP1 and WP3
3. WP1 (packages/forge) — the big one, depends on WP2
4. WP3 (plugins/mosaic-framework) — OC integration, can parallel with WP1

## Dependencies

- `packages/coord` must be stable (it is — WP1 integrates with it)
- `plugins/macp` must be stable (it is — WP2 provides types/credentials to it)
- Pi SDK (`@mariozechner/pi-agent-core`) already in the dependency tree

@@ -1,70 +1,73 @@
# Mission Manifest — Harness Foundation
# Mission Manifest — Install UX v2

> Persistent document tracking full mission scope, status, and session history.
> Updated by the orchestrator at each phase transition and milestone completion.

## Mission

**ID:** harness-20260321
**Statement:** Transform Mosaic Stack from a functional demo into a real multi-provider, task-routing AI harness. Persist all conversations, integrate frontier LLM providers (Anthropic, OpenAI, OpenRouter, Z.ai, Ollama), build granular task-aware agent routing, harden agent sessions, replace cron with BullMQ, and design the channel protocol for future Matrix/remote integration.
**Phase:** Execution
**Current Milestone:** M3: Provider Integration
**Progress:** 2 / 7 milestones
**ID:** install-ux-v2-20260405
**Statement:** The install-ux-hardening mission shipped the plumbing (uninstall, masked password, hooks consent, unified flow, headless path), but the first real end-to-end run surfaced a critical regression and a collection of UX failings that make the wizard feel neither quick nor intelligent. This mission closes the bootstrap regression as a hotfix, then rethinks the first-run experience around a provider-first, intent-driven flow with a drill-down main menu and a genuinely fast quick-start.
**Phase:** Planning
**Current Milestone:** IUV-M01
**Progress:** 0 / 3 milestones
**Status:** active
**Last Updated:** 2026-03-21 UTC
**Last Updated:** 2026-04-05
**Parent Mission:** [install-ux-hardening-20260405](./archive/missions/install-ux-hardening-20260405/MISSION-MANIFEST.md) (complete — `mosaic-v0.0.25`)

## Context

Real-run testing of `@mosaicstack/mosaic@0.0.25` uncovered:

1. **Critical:** admin bootstrap fails with HTTP 400 `property email should not exist` — `bootstrap.controller.ts` uses `import type { BootstrapSetupDto }`, erasing the class at runtime. Nest's `@Body()` falls back to plain `Object` metatype, and ValidationPipe with `forbidNonWhitelisted` rejects every property. One-character fix (drop the `type` keyword), but it blocks the happy path of the release that just shipped.
2. The wizard reports `✔ Wizard complete` and `✔ Done` _after_ the bootstrap 400 — failure only propagates in headless mode (`wizard.ts:147`).
3. The gateway port prompt does not prefill `14242` in the input buffer.
4. `"What is Mosaic?"` intro copy does not mention Pi SDK (the actual agent runtime behind Claude/Codex/OpenCode).
5. CORS origin prompt is confusing — the user should be able to supply an FQDN/hostname and have the system derive the CORS value.
6. Skill / additional feature install section is unusable in practice.
7. Quick-start asks far too many questions to be meaningfully "quick".
8. No drill-down main menu — everything is a linear interrogation.
9. Provider setup happens late and without intelligence. An OpenClaw-style provider-first flow would let the user describe what they want in natural language, have the agent expound on it, and have the agent choose its own name based on that intent.

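The class-erasure failure in item 1 can be sketched without Nest. This is an illustrative stand-in for ValidationPipe, not the real wiring — `validate` and the DTO body here are hypothetical:

```typescript
// Why `import type { BootstrapSetupDto }` breaks whitelist validation:
// a forbidNonWhitelisted pipe can only allow properties it can discover
// on a runtime class. A type-only import erases the class from the
// emitted JS, so the metatype degrades to plain Object.

class BootstrapSetupDto {
  email = "";
  password = "";
}

// Stand-in for ValidationPipe with forbidNonWhitelisted: reject any
// body property not declared on the metatype.
function validate(metatype: new () => object, body: Record<string, unknown>): string[] {
  const allowed = new Set(Object.keys(new metatype()));
  return Object.keys(body)
    .filter((k) => !allowed.has(k))
    .map((k) => `property ${k} should not exist`);
}

// Value import: the class exists at runtime, email is whitelisted.
const okErrors = validate(BootstrapSetupDto, { email: "a@b.c" }); // []

// Type-only import: the metatype falls back to Object, which declares
// no properties — every body key is rejected with a 400.
const badErrors = validate(Object as unknown as new () => object, { email: "a@b.c" });
// [ 'property email should not exist' ]
```

This is why the fix needs a test that binds the real DTO end-to-end: a controller unit test with a mocked pipe would never see the erasure.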
## Success Criteria
- [ ] AC-1: Send messages in TUI → restart TUI → resume conversation → agent has full history and context
- [ ] AC-2: Route a coding task to Claude Opus 4.6, a simple question to Haiku, a summarization to GLM-5 — all via granular routing rules
- [ ] AC-3: Two users exist, User A's memory searches never return User B's data
- [ ] AC-4: `/model claude-sonnet-4-6` in TUI switches the active model for subsequent messages
- [ ] AC-5: `/agent coding-agent` in TUI switches to a different agent with different system prompt and tools
- [ ] AC-6: BullMQ jobs execute on schedule, failures retry with backoff, admin can inspect via `/api/admin/jobs`
- [ ] AC-7: Channel protocol document exists with Matrix integration points defined, reviewed, and approved
- [ ] AC-8: Embeddings run on Ollama local models (no external API dependency for vector operations)
- [ ] AC-9: All five providers (Anthropic, OpenAI, OpenRouter, Z.ai, Ollama) connect, list models, and complete chat requests
- [ ] AC-10: Routing transparency — TUI displays which model was selected and the routing reason for each response
- [ ] AC-1: Admin bootstrap completes successfully end-to-end on a fresh install (DTO value import, no forbidNonWhitelisted regression); covered by an integration or e2e test that exercises the real DTO binding.
- [ ] AC-2: Wizard fails loudly (non-zero exit, clear error) when the bootstrap stage returns `completed: false`, in both interactive and headless modes. No more silent `✔ Wizard complete` after a 400.
- [ ] AC-3: Gateway port prompt prefills `14242` in the input field (user can press Enter to accept).
- [ ] AC-4: `"What is Mosaic?"` intro copy mentions Pi SDK as the underlying agent runtime.
- [ ] AC-5: Release `mosaic-v0.0.26` tagged and published to the Gitea npm registry, unblocking the 0.0.25 happy path.
- [ ] AC-6: CORS origin prompt replaced with FQDN/hostname input; CORS string is derived from that.
- [ ] AC-7: Skill / additional feature install section is reworked until it is actually usable end-to-end (worker defines the concrete failure modes during diagnosis).
- [ ] AC-8: First-run flow has a drill-down main menu with at least `Plugins` (Recommended / Custom), `Providers`, and the other top-level configuration groups. Linear interrogation is gone.
- [ ] AC-9: `Quick Start` path completes with a minimal, curated set of questions (target: under 90 seconds for a returning user; define the exact baseline during design).
- [ ] AC-10: Provider setup happens first, driven by a natural-language intake prompt. The agent expounds on the user's intent and chooses its own name based on that intent (OpenClaw-style). Naming is confirmable / overridable.
- [ ] AC-11: All milestones ship as merged PRs with green CI and closed issues.

## Milestones

| # | ID | Name | Status | Branch | Issue | Started | Completed |
| --- | ------ | ---------------------------------- | ----------- | ------ | --------- | ---------- | ---------- |
| 1 | ms-166 | Conversation Persistence & Context | done | — | #224–#231 | 2026-03-21 | 2026-03-21 |
| 2 | ms-167 | Security & Isolation | done | — | #232–#239 | 2026-03-21 | 2026-03-21 |
| 3 | ms-168 | Provider Integration | in-progress | — | #240–#251 | 2026-03-21 | — |
| 4 | ms-169 | Agent Routing Engine | not-started | — | #252–#264 | — | — |
| 5 | ms-170 | Agent Session Hardening | not-started | — | #265–#272 | — | — |
| 6 | ms-171 | Job Queue Foundation | not-started | — | #273–#280 | — | — |
| 7 | ms-172 | Channel Protocol Design | not-started | — | #281–#288 | — | — |
| # | ID | Name | Status | Branch | Issue | Started | Completed |
| --- | ------- | ------------------------------------------------------------ | ----------- | ---------------------- | ----- | ---------- | --------- |
| 1 | IUV-M01 | Hotfix: bootstrap DTO + wizard failure + port prefill + copy | in-progress | fix/bootstrap-hotfix | #436 | 2026-04-05 | — |
| 2 | IUV-M02 | UX polish: CORS/FQDN, skill installer rework | not-started | feat/install-ux-polish | #437 | — | — |
| 3 | IUV-M03 | Provider-first intelligent flow + drill-down main menu | not-started | feat/install-ux-intent | #438 | — | — |

## Deployment
## Subagent Delegation Plan

| Target | URL | Method |
| -------------------- | --------- | -------------------------- |
| Docker Compose (dev) | localhost | docker compose up |
| Production | TBD | Docker Swarm via Portainer |
| Milestone | Recommended Tier | Rationale |
| --------- | ---------------- | --------------------------------------------------------------------- |
| IUV-M01 | sonnet | Tight bug cluster with known fix sites + small release cycle |
| IUV-M02 | sonnet | UX rework, moderate surface, diagnostic-heavy for the skill installer |
| IUV-M03 | opus | Architectural redesign of first-run flow, state machine + LLM intake |

## Coordination
## Risks

- **Primary Agent:** claude-opus-4-6
- **Sibling Agents:** codex (for pure coding tasks), sonnet (for review/standard work)
- **Shared Contracts:** docs/PRD-Harness_Foundation.md, docs/TASKS.md
- **Hotfix regression surface** — the `import type` → `import` fix on the DTO class is one character but needs an integration test that binds the real DTO, not just a controller unit test, to prevent the same class-erasure regression from sneaking back in.
- **LLM-driven intake latency / offline** — M03's provider-first intent flow assumes an available LLM call to expound on user input and choose a name. Offline installs need a deterministic fallback.
- **Menu vs. linear back-compat** — M03 changes the top-level flow shape; existing `tools/install.sh --yes` + env-var headless path must continue to work.
- **Scope creep in M03** — "redesign the wizard" can absorb arbitrary work. Keep it bounded with explicit non-goals.

## Token Budget
## Out of Scope

| Metric | Value |
| ------ | ------ |
| Budget | — |
| Used | 0 |
| Mode | normal |

## Session History

| Session | Runtime | Started | Duration | Ended Reason | Last Task |
| ------- | --------------- | ---------- | -------- | ------------ | ------------- |
| 1 | claude-opus-4-6 | 2026-03-21 | — | — | Planning gate |

## Scratchpad

Path: `docs/scratchpads/harness-20260321.md`
- Migrating the wizard to a GUI / web UI (still terminal-first)
- Replacing the Gitea registry or the Woodpecker publish pipeline
- Multi-tenant / multi-user onboarding (still single-admin bootstrap)
- Reworking `mosaic uninstall` (M01 of the parent mission — stable)

@@ -153,7 +153,7 @@ for any `<Image>` components added in the future.

```bash
# Run the DB migration (requires a live DB)
pnpm --filter @mosaic/db exec drizzle-kit migrate
pnpm --filter @mosaicstack/db exec drizzle-kit migrate

# Or, in Docker/Swarm — migrations run automatically on gateway startup
# via runMigrations() in packages/db/src/migrate.ts