All baseline gates pass (typecheck 18/18, lint 18/18, format clean),
the build succeeds across all packages, and all 10 tests pass.
Phase 4 complete.
Memory + log + skills system fully implemented.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Add SkillsService with CRUD operations against the skills table,
toggle enable/disable, and findByName lookup. Wire SkillsController
with REST endpoints at /api/skills (list, get, create, update,
toggle, delete). Skills support builtin/community/custom sources
with JSON config storage.
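As a rough sketch of the service surface (the in-memory Map stands in
for the skills table, and the field names are assumptions):

```typescript
// Hypothetical sketch of SkillsService; the real service persists to the
// skills table, this version uses an in-memory Map for illustration.
type SkillSource = "builtin" | "community" | "custom";

interface Skill {
  id: number;
  name: string;
  source: SkillSource;
  enabled: boolean;
  config: Record<string, unknown>; // stored as JSON in the skills table
}

class SkillsService {
  private skills = new Map<number, Skill>();
  private nextId = 1;

  create(name: string, source: SkillSource, config: Record<string, unknown> = {}): Skill {
    const skill: Skill = { id: this.nextId++, name, source, enabled: true, config };
    this.skills.set(skill.id, skill);
    return skill;
  }

  findByName(name: string): Skill | undefined {
    for (const skill of this.skills.values()) {
      if (skill.name === name) return skill;
    }
    return undefined;
  }

  // Flip enabled/disabled, as exposed by the toggle endpoint.
  toggle(id: number): Skill {
    const skill = this.skills.get(id);
    if (!skill) throw new Error(`skill ${id} not found`);
    skill.enabled = !skill.enabled;
    return skill;
  }
}
```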
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Add memory tools (search, get_preferences, save_preference,
save_insight) to agent sessions via Pi SDK custom tools. Agent
sessions now have access to semantic memory search, preference
storage, and insight capture. EmbeddingService injected into
AgentService for embedding generation during tool execution.
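The wiring can be sketched as below. The tool-descriptor shape is an
assumption standing in for the Pi SDK's custom-tool API, and the embed
function is a stub for EmbeddingService:

```typescript
// Assumed tool shape; the Pi SDK's actual descriptor may differ.
interface ToolDef {
  name: string;
  description: string;
  execute: (args: Record<string, unknown>) => Promise<unknown>;
}

// Stub for EmbeddingService.embed(); the real one calls the embeddings API.
async function embed(text: string): Promise<number[]> {
  return Array.from(text).map((c) => c.charCodeAt(0) / 255);
}

// Build the memory tools handed to an agent session; `search` is the
// semantic lookup backed by pgvector in the real service.
function makeMemoryTools(search: (v: number[]) => Promise<string[]>): ToolDef[] {
  return [
    {
      name: "search",
      description: "Semantic search over stored insights and preferences",
      execute: async (args) => {
        const vector = await embed(String(args.query));
        return search(vector);
      },
    },
  ];
}
```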
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Add SummarizationService that reads hot-tier agent logs older than 24h,
groups them by session, calls a cheap LLM (gpt-4o-mini by default,
configurable via SUMMARIZATION_MODEL) to extract key insights, stores
them with embeddings in the insights table, and transitions processed
logs to the warm tier. Add CronService with node-cron for scheduled
execution (summarization every 6h, tier management daily at 3am). Tier
management promotes warm→cold (30d) and purges cold logs (90d).
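The tier transitions above can be condensed into a single decision
function. This is a sketch, not the service's actual code; the
thresholds are the ones stated in the commit:

```typescript
type Tier = "hot" | "warm" | "cold";

const DAY_MS = 24 * 60 * 60 * 1000;

// Decide a log's next tier: summarized hot logs move to warm, warm logs
// go cold after 30 days, and cold logs are purged after 90 days.
function nextTier(tier: Tier, ageMs: number, summarized: boolean): Tier | "purge" {
  if (tier === "hot" && summarized) return "warm";
  if (tier === "warm" && ageMs >= 30 * DAY_MS) return "cold";
  if (tier === "cold" && ageMs >= 90 * DAY_MS) return "purge";
  return tier; // no transition due
}
```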
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Implement AgentLogsRepo with structured log ingest (single + batch),
flexible query builder (filter by session, level, category, tier,
date range), and tiered storage management (hot→warm→cold→purge).
Add getLogsForSummarization() for the summarization pipeline.
Wire LogModule into gateway with REST endpoints at /api/logs.
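The query builder's shape can be sketched as follows; column and filter
names here are assumptions, and each provided filter appends one
parameterized WHERE clause:

```typescript
interface LogQuery {
  sessionId?: string;
  level?: string;
  category?: string;
  tier?: string;
  from?: Date;
  to?: Date;
}

// Build a parameterized SELECT over agent_logs from optional filters.
function buildLogsQuery(q: LogQuery): { sql: string; params: unknown[] } {
  const where: string[] = [];
  const params: unknown[] = [];
  const add = (clause: string, value: unknown) => {
    params.push(value);
    where.push(`${clause} $${params.length}`);
  };
  if (q.sessionId) add("session_id =", q.sessionId);
  if (q.level) add("level =", q.level);
  if (q.category) add("category =", q.category);
  if (q.tier) add("tier =", q.tier);
  if (q.from) add("created_at >=", q.from);
  if (q.to) add("created_at <=", q.to);
  const sql =
    "SELECT * FROM agent_logs" + (where.length ? ` WHERE ${where.join(" AND ")}` : "");
  return { sql, params };
}
```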
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Add EmbeddingService using OpenAI-compatible embeddings API (supports
text-embedding-3-small, configurable via EMBEDDING_MODEL and
EMBEDDING_API_URL env vars). Wire embedding generation into insight
creation and semantic search endpoint. POST /api/memory/search now
generates a query embedding and performs cosine distance search via
pgvector when OPENAI_API_KEY is configured.
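For reference, pgvector's cosine distance (its `<=>` operator) is
1 minus cosine similarity; a plain-TS equivalent of what the search
endpoint computes in the database:

```typescript
// Cosine distance between two equal-length vectors: 0 = identical
// direction, 1 = orthogonal, 2 = opposite.
function cosineDistance(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return 1 - dot / (Math.sqrt(na) * Math.sqrt(nb));
}
```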
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Add memory tables to DB schema (preferences, insights with pgvector
embedding column, agent_logs, skills, summarization_jobs). Implement
PreferencesRepo (CRUD + upsert) and InsightsRepo (CRUD + semantic
search + relevance decay). Define VectorStore and EmbeddingProvider
interfaces for future provider abstraction. Wire MemoryModule into
gateway with REST endpoints at /api/memory/*.
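The commit doesn't specify the decay curve, so the sketch below assumes
exponential decay with a configurable half-life as one plausible
implementation of InsightsRepo's relevance decay:

```typescript
const DAY_MS = 24 * 60 * 60 * 1000;

// Assumed decay: an insight's score halves every `halfLifeDays` days.
function decayedRelevance(baseScore: number, ageMs: number, halfLifeDays = 30): number {
  return baseScore * Math.pow(0.5, ageMs / (halfLifeDays * DAY_MS));
}
```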
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>