The Husk

In nature, a husk is what remains after the living thing is gone — the shell that protects what matters. In AI, the Husk is the geometric memory that persists after the context window closes. The model forgets. The Husk doesn’t.

Every concept that passes through the 82D pipeline leaves behind a coordinate — a permanent address in the shared meaning space. Those coordinates accumulate. They become searchable. And because they live in the lingua franca, any model can read them, not just the one that wrote them.

Your existing memory solutions — RAG, vector stores, conversation history — keep working. The Husk sits underneath them: a geometric substrate that survives what they don’t.

The forgetting problem

Every AI system today has amnesia. A model processes your data, does useful work, and then the context window closes. Everything evaporates. Next session starts from zero.

The industry’s workaround is RAG — retrieve relevant documents and stuff them into each new context window. It works, but you’re re-embedding the same knowledge over and over, paying for the same inference repeatedly, and losing anything that doesn’t fit in the retrieval window. Tuesday’s work doesn’t build on Monday’s. Each session is an island.

What happens        | Traditional context    | The Husk
--------------------|------------------------|-----------------------
Token limit reached | Oldest context dropped | All geometry preserved
Session ends        | Everything lost        | Coordinates persist
Model upgraded      | Re-embed everything    | Same 82D space
Provider switched   | Start over             | Still readable
System restarted    | Gone                   | Persisted to disk

The Husk survives all five because it stores geometry, not model-specific representations. The coordinates are in the shared space — they don’t depend on any particular model being alive to read them.
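A minimal sketch of what "persisted to disk" can look like, assuming coordinates are stored as flat float32 rows; the file layout and function names here are illustrative, not the product's actual format:

```python
# Sketch: persisting 82D coordinates to disk and reading them back.
# Assumes float32 coordinates; layout and names are illustrative only.
import numpy as np

DIMS = 82  # shared meaning-space dimensionality

def save_husk(path: str, coords: np.ndarray) -> None:
    """Append-only store: one 328-byte row (82 x float32) per concept."""
    assert coords.dtype == np.float32 and coords.shape[1] == DIMS
    with open(path, "ab") as f:
        f.write(coords.tobytes())

def load_husk(path: str) -> np.ndarray:
    """Memory-map the file: survives restarts, no model needed to read it."""
    return np.memmap(path, dtype=np.float32, mode="r").reshape(-1, DIMS)
```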

What a Husk contains

A Husk isn’t a conversation transcript or a document store. It’s a geometric index of everything a system has learned, compressed to the essentials and permanently addressable.

Layer       | What it stores                  | Why it matters
------------|---------------------------------|----------------------------------------
Coordinates | 82D position of every concept   | Permanent address in meaning space
Clusters    | Which concepts group together   | Structure emerges from geometry
Consensus   | Where multiple models agreed    | Strongest signal in the space
Drift       | How coordinates shift over time | Catch changes before they break things
Provenance  | Which model, when, what context | Full audit trail for every coordinate

Each entry takes 328 bytes, so a million concepts fit in 328 MB (about 313 MiB). The entire knowledge history of most AI systems would fit on a phone.
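That figure is exactly 82 float32 values (82 × 4 = 328 bytes), so one plausible entry layout is the raw coordinate vector itself, with the metadata layers keyed off to side tables. A sketch with hypothetical names:

```python
# Sketch of a single Husk entry: 82 x float32 = 328 bytes.
# Names are hypothetical; metadata layers (clusters, consensus, drift,
# provenance) would live in side tables keyed by the same concept id.
import struct

DIMS = 82
ENTRY_FMT = f"<{DIMS}f"                  # little-endian, 82 x float32
ENTRY_SIZE = struct.calcsize(ENTRY_FMT)  # 328 bytes

def pack_entry(coord: list[float]) -> bytes:
    return struct.pack(ENTRY_FMT, *coord)

def unpack_entry(raw: bytes) -> tuple[float, ...]:
    return struct.unpack(ENTRY_FMT, raw)
```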

Retrieval without inference

Traditional memory systems re-embed the query through a neural model, search a vector store, and hand documents to yet another model to re-process. Inference on the way in, inference on the way out, just to consult memory. The Husk needs neither.

Step          | Traditional RAG                | Husk retrieval
--------------|--------------------------------|-------------------------------
Encode query  | Neural model (~75 µs)          | Matrix multiply (~0.004 µs)
Search memory | ANN search on full-dim vectors | Geometric search on 82D
Return format | Documents to re-process        | Coordinates, ready to use
Cross-model?  | Re-embed per model             | Universal: any model reads it

Because the Husk lives in 82D, the same search index works for every model. One memory, readable by anything that speaks the lingua franca.
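A minimal sketch of that retrieval path, assuming a fixed per-model projection maps native embeddings into the shared 82D space; the matrix, dimensions, and names are illustrative:

```python
# Sketch: query encoding is one matrix multiply, search is exact
# nearest-neighbor over the stored 82D coordinates. The projection
# matrix and names are stand-ins, not the product's actual API.
import numpy as np

DIMS = 82
rng = np.random.default_rng(0)

# Stand-in for a learned per-model projection (native dim -> 82D).
projection = rng.standard_normal((1536, DIMS)).astype(np.float32)
husk = rng.standard_normal((100_000, DIMS)).astype(np.float32)  # stored coords

def retrieve(query_embedding: np.ndarray, k: int = 5) -> np.ndarray:
    """Project once, then search geometry; no neural inference involved."""
    q = query_embedding @ projection           # the 'encode' step
    dists = np.linalg.norm(husk - q, axis=1)   # exact search on 82D
    return np.argsort(dists)[:k]               # indices of nearest concepts
```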

What persistent geometric memory enables

Cumulative Learning

Each session builds on the last. What a system learned on Monday is geometrically available on Tuesday — without re-embedding, without context stuffing, without paying for inference again.

Cross-Agent Knowledge

What one agent learns, every agent knows. Because the Husk lives in the shared 82D space, an agent using one model can retrieve knowledge stored by a completely different model.

Institutional Memory

Organizations accumulate AI-derived insights permanently. Employee turnover, model deprecation, provider switches — the geometric knowledge survives all of it.

Drift Detection

Compare today’s geometry to last month’s. When coordinates shift, something changed in the underlying models or data. The Husk shows you where and by how much.
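A minimal sketch of that comparison, with hypothetical names, measuring drift as Euclidean movement per concept:

```python
# Sketch: drift = per-concept distance between last month's coordinates
# and today's. Threshold and names are illustrative.
import numpy as np

def drift_report(old: np.ndarray, new: np.ndarray, threshold: float = 0.1):
    """Return indices of concepts whose 82D position moved past threshold,
    plus the full per-concept shift for inspection."""
    shift = np.linalg.norm(new - old, axis=1)  # how far each concept moved
    return np.nonzero(shift > threshold)[0], shift
```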

- 328 bytes per concept
- <1 ms retrieval latency
- 18.7× smaller than source embeddings
- $0 inference cost per read

What the Husk doesn’t do

It stores geometry, not text. The Husk records where concepts live in 82D space, not the original tokens. If you need verbatim recall, you still need a document store alongside it.
Lossy by design. 82 dimensions capture the relationships that matter across models, not every nuance of a specific model’s internal state. That trade-off is what makes the memory portable.
Not a replacement for RAG. RAG retrieves documents. The Husk retrieves meaning. They’re complementary — the Husk can tell RAG where to look, faster and cheaper than a neural search.

Your data. Your geometry. Your memory.

Project your embeddings, keep the coordinates, own what your models know.