Streaming AI responses via Matrix message edits #383
Summary
Implement streaming AI chat responses in Matrix rooms using incremental message edits. This fills the gap left by the unimplemented `streamChatMessage` in the REST API.

Context
The current LLM chat endpoint (`/api/llm/chat`) is request-response only. `streamChatMessage` in `apps/web/src/api/chat.ts` is marked as not implemented. Matrix's protocol natively supports message edits (the `m.replace` relation), making it a natural transport for streaming LLM output.

Implementation
Flow
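The core of the flow is to post one initial `m.room.message` event, then deliver each subsequent update as an `m.replace` edit of that event. A minimal sketch of the two event payloads follows; the `MessageContent` type and helper names are illustrative, not the project's actual API. The payload shape (`m.new_content`, `m.relates_to`, and the conventional `* `-prefixed fallback `body`) comes from the Matrix event-replacement spec.

```typescript
// Shape of an m.room.message content body, narrowed to what streaming needs.
type MessageContent = {
  msgtype: "m.text";
  body: string;
  "m.new_content"?: { msgtype: "m.text"; body: string };
  "m.relates_to"?: { rel_type: "m.replace"; event_id: string };
};

// Initial placeholder message sent when generation starts.
function initialContent(text: string): MessageContent {
  return { msgtype: "m.text", body: text };
}

// Each update is an m.replace edit of the original event; edit-aware
// clients render m.new_content in place of the original body.
function editContent(originalEventId: string, fullText: string): MessageContent {
  return {
    msgtype: "m.text",
    // Fallback for clients that don't understand edits, conventionally
    // prefixed with "* " per the Matrix spec.
    body: `* ${fullText}`,
    "m.new_content": { msgtype: "m.text", body: fullText },
    "m.relates_to": { rel_type: "m.replace", event_id: originalEventId },
  };
}
```

Note that each edit carries the full accumulated text, not a delta: Matrix edits replace the whole content, so clients that miss an intermediate edit still converge on the latest state.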
Chunking Strategy
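The exact chunking policy isn't spelled out here; one plausible approach is to buffer tokens and emit an edit only when a time or size threshold is crossed, so a fast LLM doesn't flood the homeserver with one edit per token. The `EditThrottle` class and its default thresholds (500 ms, 80 chars) below are assumptions for illustration.

```typescript
// Hypothetical edit-rate limiter: buffer tokens, flush when either the
// time interval or the buffered-character threshold is reached.
class EditThrottle {
  private pending = "";
  private lastFlush = 0;

  constructor(
    private minIntervalMs = 500,
    private minChars = 80,
    private now: () => number = Date.now, // injectable clock for testing
  ) {}

  // Accumulate a token. Returns the full text to send as an edit when a
  // flush is due, or null to keep buffering. The caller must still send
  // one final edit after the stream ends to flush any remainder.
  push(token: string, fullText: string): string | null {
    this.pending += token;
    const due =
      this.now() - this.lastFlush >= this.minIntervalMs ||
      this.pending.length >= this.minChars;
    if (!due) return null;
    this.pending = "";
    this.lastFlush = this.now();
    return fullText;
  }
}
```

The injectable clock keeps the policy deterministic under test; the final-flush responsibility stays with the caller, since only it knows when the token stream is exhausted.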
`m.typing`) while generating

LLM Integration
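Wiring the LLM's token stream to the Matrix sender can be sketched as below. The `sendMessage`/`sendEdit` callback signatures are assumptions for illustration (returning the Matrix `event_id` that later edits reference), not the project's actual interfaces.

```typescript
// Hypothetical glue between an async token stream and a Matrix sender.
// sendMessage posts the initial message and resolves to its event_id;
// sendEdit delivers an m.replace edit of that event.
async function streamToRoom(
  tokens: AsyncIterable<string>,
  sendMessage: (body: string) => Promise<string>,
  sendEdit: (eventId: string, body: string) => Promise<void>,
): Promise<string> {
  let text = "";
  let eventId: string | null = null;
  for await (const token of tokens) {
    text += token;
    if (eventId === null) {
      // First token: post the message that later edits will replace.
      eventId = await sendMessage(text);
    } else {
      // A real implementation would throttle these (see Chunking Strategy).
      await sendEdit(eventId, text);
    }
  }
  // Final edit guarantees the complete response is shown even if
  // intermediate edits were throttled away.
  if (eventId !== null) await sendEdit(eventId, text);
  return text;
}
```

Deferring `sendMessage` until the first token arrives avoids posting an empty placeholder if the LLM errors out before producing anything.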
Typing Indicator
`m.typing` event when processing starts

Acceptance Criteria
Refs
- `apps/api/src/llm/`
- `apps/web/src/api/chat.ts` (search for `streamChatMessage`)
- `m.replace` relation type

Completed in commit 93cd314 on branch feature/m12-matrix-bridge.