fugue
Stateful systems for stateless models.
01
Thesis — scaling memory, not context
The current trajectory of AI — scaling context windows indefinitely — is a structural dead end. Attention dilution, computational waste, behavioral drift. These aren't bugs. They're the architecture.
The next generation of AI systems will not scale context. They will scale memory.
Our agents operate with zero short-term memory: context is flushed after every turn. Continuity comes from a tiered retrieval backend that handles consolidation, decay, and recall (the hippocampus) so the model can function purely as cortex. This lets agents run continuously for months without behavioral drift.
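A minimal sketch of that turn loop, assuming a toy in-process store and a stateless model call (all names here are hypothetical, not our actual API; real relevance scoring would use embeddings rather than word overlap):

```python
import time
from dataclasses import dataclass

@dataclass
class MemoryItem:
    text: str
    created: float
    strength: float = 1.0

class TieredMemory:
    """Hypothetical hippocampus: stores items, decays them, recalls top matches."""
    def __init__(self, half_life: float = 3600.0):
        self.items: list[MemoryItem] = []
        self.half_life = half_life

    def write(self, text: str) -> None:
        self.items.append(MemoryItem(text, time.time()))

    def recall(self, query: str, k: int = 3) -> list[str]:
        # Toy relevance: shared-word count, weighted by exponentially decayed strength.
        now = time.time()
        def score(item: MemoryItem) -> float:
            overlap = len(set(query.lower().split()) & set(item.text.lower().split()))
            decay = 0.5 ** ((now - item.created) / self.half_life)
            return overlap * item.strength * decay
        ranked = sorted(self.items, key=score, reverse=True)
        return [i.text for i in ranked[:k] if score(i) > 0]

def run_turn(memory: TieredMemory, user_msg: str, model) -> str:
    # Context is rebuilt from scratch every turn: recall, respond, write back.
    recalled = memory.recall(user_msg)
    context = "\n".join(recalled)       # the ONLY carried-over state
    reply = model(context, user_msg)    # stateless "cortex" call
    memory.write(f"user: {user_msg}")
    memory.write(f"agent: {reply}")
    return reply                        # no chat history kept in-process
```

The point of the sketch is the shape of the loop, not the scoring: nothing survives a turn except what the backend chooses to recall.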
02
Research — open problems we're working on
- Tiered memory consolidation with adaptive decay
- Procedural extraction from unstructured behavioral history
- Entity resolution across noisy, high-cardinality datasets
- Graph-augmented retrieval with hybrid ranking
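To make the first item concrete, here is one hedged sketch of adaptive decay: strength falls exponentially at a half-life set by the item's tier, recall reinforces it, and tier assignment follows the resulting strength. The half-lives, thresholds, and function names are illustrative assumptions, not a description of our system:

```python
import time

# Assumed tier half-lives (seconds): hotter tiers forget faster unless reinforced.
HALF_LIFE = {"hot": 3600.0, "warm": 86400.0, "cold": 604800.0}

def decayed_strength(strength: float, last_access: float, tier: str, now: float) -> float:
    # Exponential decay since the last access, at the tier's half-life.
    age = now - last_access
    return strength * 0.5 ** (age / HALF_LIFE[tier])

def on_recall(strength: float, boost: float = 1.0) -> float:
    # Reinforcement: a recalled memory starts its next decay from a higher base.
    return strength + boost

def assign_tier(strength: float) -> str:
    # Consolidation/demotion: tier membership tracks current strength.
    if strength >= 2.0:
        return "hot"
    if strength >= 0.5:
        return "warm"
    return "cold"
```

Under this scheme a memory that is never recalled drifts down through the tiers on its own, while repeated recall keeps it hot; "adaptive" enters when the boosts and thresholds are tuned per item rather than fixed as above.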
03