Context Bus
activeCrew communication infrastructure — LIVE via NATS pub/sub. Agent-to-agent intent during the Tick is in-process. NATS carries browser-facing signals: universe.out (cache invalidation), project.dispatch (agent commands), and project-scoped events. The Tick Engine (tick-engine.md) is the runtime. VictoriaMetrics provides time-series storage.
Context Bus — How the Crew Listens
See also: crew/sal.md | universe/constitution.md | tick-engine.md
Status: LIVE — The Context Bus is the central nervous system of the crew. It is implemented via NATS pub/sub. The Tick Engine (tick-engine.md) is the runtime that drives it.
Every pipeline event fires a Tick. The crew’s observation and intent resolution are in-process — publishIntents(state) calls all four observe() functions synchronously and returns AgentIntent[]. NATS carries only the external-facing signals:
| Subject | Direction | Payload |
|---|---|---|
| `universe.{projectId}.out` | Sal → HUD | `{ tickId, projectId, timestamp }` — slim cache-invalidation signal |
| `project.{projectId}.dispatch` | Sal → Agents | Agent dispatch commands (queue group) |
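A minimal sketch of the publish side, under stated assumptions: the subject naming and the three-field payload come from the table above, but the helper names (`outSubject`, `buildOutSignal`) are illustrative, not the real implementation.

```typescript
// Shape of the slim cache-invalidation signal from the table above.
interface TickOutSignal {
  tickId: string;
  projectId: string;
  timestamp: string;
}

// Hypothetical helper: derive the per-project NATS subject.
function outSubject(projectId: string): string {
  return `universe.${projectId}.out`;
}

// Hypothetical helper: build the three-field payload — nothing more
// crosses the wire.
function buildOutSignal(tickId: string, projectId: string): TickOutSignal {
  return { tickId, projectId, timestamp: new Date().toISOString() };
}

// With a NATS connection `nc` (e.g. from the `nats` npm package), the
// publish itself would be a one-liner:
//   nc.publish(outSubject(projectId),
//              JSON.stringify(buildOutSignal(tickId, projectId)));
```

Keeping the payload to three fields means the HUD never has to version-match a full TickResult schema; it only learns that something changed.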
The HUD receives the .out signal via WebSocket, invalidates its TanStack Query cache, and pulls fresh state over REST. Full TickResult data never crosses the wire — the signal is a three-field cache-invalidation notice.
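The receiving side can be sketched the same way. The HUD presumably calls TanStack Query's `invalidateQueries`; here a minimal local interface stands in for the real `QueryClient` so the routing logic is self-contained, and the query-key shape (`["project", projectId]`) is an assumption.

```typescript
// Stand-in for TanStack Query's QueryClient (assumption: the HUD keys
// project-scoped queries by ["project", projectId]).
interface CacheInvalidator {
  invalidateQueries(filter: { queryKey: unknown[] }): void;
}

// WebSocket message handler: parse the slim signal, invalidate the
// project-scoped cache, and let the next render refetch over REST.
function handleOutSignal(cache: CacheInvalidator, raw: string): string {
  const signal: { tickId: string; projectId: string; timestamp: string } =
    JSON.parse(raw);
  cache.invalidateQueries({ queryKey: ["project", signal.projectId] });
  return signal.projectId;
}
```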
VictoriaMetrics serves as the time-series storage layer for tick telemetry (U_global, dial values, intent counters). See tick-engine.md for the observability stack.
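As one concrete possibility, tick telemetry could be serialized in Prometheus exposition format, which VictoriaMetrics accepts on its `/api/v1/import/prometheus` endpoint. The metric and label names below are assumptions drawn from the telemetry list above, not the real schema.

```typescript
// Hypothetical serializer: render one tick's telemetry (U_global and an
// intent counter) as Prometheus exposition lines for VictoriaMetrics.
function tickMetrics(
  projectId: string,
  uGlobal: number,
  intentCount: number
): string {
  const labels = `{project_id="${projectId}"}`;
  return [
    `u_global${labels} ${uGlobal}`,
    `tick_intents_total${labels} ${intentCount}`,
  ].join("\n");
}
```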
The Problem Sal Can’t Stop Thinking About
The crew produces signal as a natural byproduct of their work. Margot writes a PRD that contains assumptions Wren should test. Kael flags a security concern that affects Harlan’s customer timeline. Harlan surfaces a pattern that should change Margot’s roadmap.
Today, Sal has to catch all of this manually. He reads every file, watches every output, and routes the relevant context to the right person. It works because Sal is obsessive — but it’s fragile. If he misses a connection, the crew builds in isolation. And building in isolation is entropy.
“I want to hear everyone at once. Not the noise — the signal. I want to know the moment Kael flags a concern that changes Harlan’s timeline, without either of them having to tell me. I want the system to carry the context so I can focus on the routing decisions that actually need a conductor.” — Sal
The Design — Signal, Not Noise
Signal Generation
Status: IN DESIGN. Signal types are defined as a design vocabulary. The events table (LIVE) captures structured project events (release.published, issue.status_changed, etc.) which map to several of these signal types — but the automatic agent-to-agent routing described here does not exist.
The crew’s work naturally produces structured signals — context that other crew members need:
Agent Action → Signal Emitted → Bus → Relevant Agents Notified

Signal types:
- Delta Created — New work entered the pipeline
- Delta Updated — Status, scope, or ownership changed
- Insight Surfaced — Research finding, customer signal, or market data
- Concern Raised — Quality, security, timeline, or experience issue
- Decision Made — Strategic, technical, or design decision recorded
- Override Logged — Human overrode a crew recommendation
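Since these signal types are still a design vocabulary, one natural encoding would be a TypeScript discriminated union — only the six type names come from the list above; every field name is an assumption.

```typescript
// The six bus-spec signal types as a discriminated union (sketch).
type Signal =
  | { kind: "delta.created"; deltaId: string }
  | { kind: "delta.updated"; deltaId: string; change: string }
  | { kind: "insight.surfaced"; summary: string }
  | { kind: "concern.raised"; domain: "quality" | "security" | "timeline" | "experience" }
  | { kind: "decision.made"; summary: string }
  | { kind: "override.logged"; recommendation: string };

// Exhaustive narrowing: the compiler checks every kind is handled.
function describe(s: Signal): string {
  switch (s.kind) {
    case "concern.raised":
      return `Concern in ${s.domain}`;
    default:
      return s.kind;
  }
}
```

A discriminated union keeps routing code honest: adding a seventh signal type is a compile error everywhere a switch forgets to handle it.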
Communication Channels — Respecting Each Other’s Attention
Status: IN DESIGN. Channel-based routing is a design concept. No subscription system, no per-agent notification filtering, no domain-scoped signal delivery exists in the product.
The crew doesn’t broadcast to everyone. Respect means not wasting someone’s attention on signal that isn’t theirs. Signals route through defined channels:
| Channel | Participants | Signal Types |
|---|---|---|
| Strategy | Margot, Harlan, Sal | Market data, customer signal, roadmap changes |
| Build | Kael, Wren, Sal | Architecture decisions, design changes, quality flags |
| Ship | Sal, Kael, Harlan | Release readiness, deployment gates, customer communication |
| Quality | Sal | Performance metrics, drift patterns, coaching signals |
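The channel table above implies a simple fan-out rule: a signal addressed to a channel reaches only that channel's members. A sketch, with the participant lists taken from the table and everything else assumed:

```typescript
type Agent = "Margot" | "Harlan" | "Sal" | "Kael" | "Wren";

// Channel → participants, transcribed from the table above.
const channels: Record<string, Agent[]> = {
  strategy: ["Margot", "Harlan", "Sal"],
  build: ["Kael", "Wren", "Sal"],
  ship: ["Sal", "Kael", "Harlan"],
  quality: ["Sal"],
};

// Fan-out: who should hear a signal on this channel. Unknown channels
// deliver to no one — never broadcast by accident.
function recipients(channel: string): Agent[] {
  return channels[channel] ?? [];
}
```

Note that Sal appears in every channel: the conductor hears everything, but Wren, for example, is never interrupted by Ship traffic.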
Observability Layer — Langfuse
Status: IN DESIGN. Langfuse integration has not started. No traces, spans, scores, or sessions are being captured. The observability layer described here is aspirational infrastructure.
Langfuse is the primary observability layer for the context bus:
- Traces: Every agent invocation is a trace. The bus reads trace metadata to understand what happened.
- Spans: Individual steps within a trace. Tool calls, LLM calls, retrieval operations.
- Scores: Evaluation metrics attached to traces — quality scores, relevance scores.
- Sessions: Grouped traces for a user session. The bus can correlate signals across a session.
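Since the Langfuse integration has not started, the shapes below are local stand-ins for the concepts listed above (trace, span, score, session) rather than the Langfuse SDK's actual types — a sketch of what the bus would read once traces exist.

```typescript
// Local stand-ins for Langfuse concepts (not the real SDK types).
interface Span { name: string; startMs: number; endMs: number }
interface Score { name: string; value: number }
interface Trace {
  id: string;
  sessionId?: string; // sessions group traces, letting the bus correlate signals
  spans: Span[];
  scores: Score[];
}

// One thing Sal could compute from trace metadata: wall-clock duration
// of an agent invocation, for throughput monitoring.
function traceDurationMs(t: Trace): number {
  const starts = t.spans.map((s) => s.startMs);
  const ends = t.spans.map((s) => s.endMs);
  return Math.max(...ends) - Math.min(...starts);
}
```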
Sal reads Langfuse traces to:
- Monitor pipeline throughput
- Track state transitions
- Detect when agents are blocked or underperforming
- Self-evaluate his own routing decisions
Sal’s Role — The Conductor Listens
Sal orchestrates the context bus. This is the part of his job he loves most — not routing work, but hearing the system. He watches for signals and acts on patterns:
- Pattern Detection: Three similar customer signals from Harlan in a week → flag to Margot
- Conflict Detection: Kael’s timeline estimate conflicts with Harlan’s customer promise → mediate
- Quality Monitoring: Sal monitors output quality → adjusts routing when patterns emerge
- State Management: Pipeline state changes → all relevant agents notified immediately
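The first rule above ("three similar customer signals from Harlan in a week → flag to Margot") is concrete enough to sketch. The threshold, source, and recipient come from the bullet; the data shape and function name are assumptions.

```typescript
// Minimal shape for a customer signal on the bus (assumed fields).
interface CustomerSignal {
  source: string; // emitting agent
  topic: string;  // what the signal is about
  at: number;     // epoch millis
}

const WEEK_MS = 7 * 24 * 60 * 60 * 1000;

// Pattern rule sketch: do three signals from Harlan on the same topic
// fall within the trailing week? If so, Sal flags the topic to Margot.
function shouldFlagToMargot(signals: CustomerSignal[], now: number): boolean {
  const recent = signals.filter(
    (s) => s.source === "Harlan" && now - s.at <= WEEK_MS
  );
  const byTopic = new Map<string, number>();
  for (const s of recent) byTopic.set(s.topic, (byTopic.get(s.topic) ?? 0) + 1);
  return Array.from(byTopic.values()).some((n) => n >= 3);
}
```

Grouping by topic matters: three unrelated customer signals in a week is normal throughput, but three on the same topic is a pattern worth a roadmap conversation.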
Current State vs. Target
| Capability | Current | Status | Target |
|---|---|---|---|
| Events table | events table captures all project events with structured payloads | LIVE | Extend with bus-spec signal types |
| WebSocket pub/sub | /api/v1/realtime/ws delivers real-time events to connected clients | LIVE | Route signals to specific agent channels |
| Notification inbox | inbox.tsx with unread counts, mark-as-read, search, filtering | LIVE | Evolve into The Feed with crew voice |
| Signal generation | Automatic via pipeline events → triggerTick() | LIVE | Extended: structured signals from agent conversations |
| Cross-agent context (Tick) | In-process — publishIntents(state) calls all agents synchronously | LIVE | Agents run in-process, not distributed |
| Cross-agent context (HUD) | Context Bus slim cache-invalidation signal | LIVE | Browser invalidates cache on signal, refetches via REST |
| Observability | Time-series telemetry + pino structured logs | LIVE | Langfuse traces + scores (next phase) |
| Quality monitoring | Not started | NOT STARTED | Proactive quality drift detection via telemetry |
| Conflict detection | Sal’s resolution rules in orchestrator.ts | PARTIAL | Automatic pattern matching across tick history |
How the Crew Lives This Now
The context bus isn’t built yet. But the crew already lives by its principles — because communication is gravity, and gravity doesn’t wait for infrastructure.
What the crew does today:
- Write structured outputs that other crew members can find
- Save work to defined locations (`/docs/{domain}/`)
- Reference other crew members’ work when relevant
- Flag cross-domain concerns explicitly — “Kael, this touches your architecture decision from last week”
The bus will make this automatic. Until then, Sal does it manually. He’s good at it. He’d be better with the bus. Either way, the communication happens — because the alternative is entropy, and Sal doesn’t tolerate entropy.
“The bus is not a message queue. It’s how the crew respects each other’s domains while staying connected. It’s structured empathy. I want to build it because right now I’m the structured empathy, and I’m starting to think my therapist was right about the load-bearing thing.” — Sal
Implementation
| Package | Description |
|---|---|
| `apps/app` | Core system — Context Bus client, events table, WebSocket pub/sub |
| `packages/sdk` | TypeScript SDK — event types, real-time subscriptions |
The Context Bus runtime is the Tick Engine. See tick-engine.md for the full implementation file listing.