
How It Works

Memory that thinks like you do. The Memory Module mimics human cognitive processes to create AI memory that feels natural, not mechanical.
Want practical usage? See Workflows and Best Practices. This page explains the science behind it.

The Problem

  • Vector Databases
  • Note-Taking Apps
  • RAG Systems
How they work: store embeddings and return similar vectors.
The problem: everything is treated equally - your lunch menu carries the same weight as your wedding day.
Search “important meetings”:
  • Yesterday’s standup ⚖️
  • Last year’s strategy session ⚖️
  • Random coffee chat ⚖️
All ranked equally. You figure out what matters.

Our Solution: Cognitive Science

We studied how human memory actually works and built AI memory based on cognitive psychology research.

1. Natural Forgetting

Science: Ebbinghaus forgetting curve (1885) - memory decays exponentially.
Implementation: every memory has a salience score (0.0-1.0) that decreases over time.
Formula:
Salience(t) = Initial × e^(-decay_rate × time)

Fast Decay

Emotional memories: 35-day half-life. “Customer frustrated yesterday” fades quickly - old sentiment is less relevant. Week 1: 1.0 → Week 12: 0.15 → pruned.

Slow Decay

Reflective insights: 693-day half-life. “Why we chose microservices” persists for years - strategic decisions remain relevant. Month 6: still 0.95 salience.
Benefit: Knowledge base auto-focuses on what’s currently relevant. No manual cleanup.
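Under standard half-life math, the decay rate in the formula above falls out of the sector's half-life. A minimal sketch in Python (assuming decay_rate = ln 2 / half-life; the page's figures are illustrative, so the module's actual constants may differ):

```python
import math

def salience(initial, half_life_days, elapsed_days):
    """Salience(t) = Initial * e^(-decay_rate * t), with the decay
    rate derived from a half-life: decay_rate = ln(2) / half_life.
    (Assumed parameterization; the module's constants may differ.)"""
    decay_rate = math.log(2) / half_life_days
    return initial * math.exp(-decay_rate * elapsed_days)

# Emotional memory, 35-day half-life: fades within weeks.
print(round(salience(1.0, 35, 7), 2))     # week 1  -> 0.87
print(round(salience(1.0, 35, 84), 2))    # week 12 -> 0.19

# Reflective insight, 693-day half-life: barely moves in 6 months.
print(round(salience(1.0, 693, 180), 2))  # month 6 -> 0.84
```

At t = one half-life, salience is exactly half its initial value regardless of sector - that is what makes half-life a convenient knob.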

2. Strengthening Through Use

Science: spaced repetition - reviewing information at intervals improves retention.
Implementation: every access strengthens the memory.
Bug investigation note:
- Access 1: Salience 0.50 → 0.75 (reinforced)
- Slight decay between accesses: 0.75 → 0.70
- Access 2: Salience 0.70 → 0.90 (reinforced again)
- No access for 6 weeks: decays back to 0.30
Benefit: Frequently used info stays strong, unused info fades. Automatic prioritization.
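One plausible reinforcement rule - closing half the remaining gap to full salience on each access - reproduces the shape above. The actual boost function is not specified on this page, so treat this as a sketch:

```python
def reinforce(s, cap=1.0):
    """Each access pushes salience back up the curve.
    Hypothetical rule: close half the remaining gap to the cap."""
    return min(cap, s + (cap - s) * 0.5)

s = 0.50
s = reinforce(s)   # access 1: 0.50 -> 0.75
print(s)
s = reinforce(s)   # access 2: 0.75 -> 0.875
print(s)
# With no further access, normal decay takes over and s drifts back down.
```

A gap-halving rule has a nice property: repeated access asymptotically approaches 1.0 without ever overshooting it.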

3. Five Cognitive Sectors

Science: human memory uses different systems for different information types.
Implementation: five sectors mirror human cognitive architecture.

Episodic

Events & experiences - “Fixed auth bug on March 15”
Decay: fast (35 days). Why: events lose relevance quickly.

Semantic

Facts & knowledge - “JWT tokens expire in 1 hour”
Decay: moderate (173 days). Why: facts stay relevant longer.

Procedural

How-to knowledge - “Deploy steps: build → test → push”
Decay: slow (346 days). Why: procedures rarely change.

Emotional

Feelings & sentiments - “Team morale high after launch”
Decay: fast (35 days). Why: sentiment changes quickly.

Reflective

Strategic insights - “Why we chose microservices”
Decay: very slow (693 days). Why: architecture decisions endure.
Benefit: Information decays at appropriate rates based on type. Bug notes fade fast, architecture docs persist.
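The five sectors and the half-lives quoted above can be collected into a lookup table. The table values come from this page; the dictionary keys and the ln 2 / half-life conversion are assumptions for illustration:

```python
import math

# Per-sector half-lives, taken from the figures quoted above.
SECTOR_HALF_LIFE_DAYS = {
    "episodic": 35,
    "semantic": 173,
    "procedural": 346,
    "emotional": 35,
    "reflective": 693,
}

def decay_rate(sector: str) -> float:
    """Daily decay rate for a sector (assumes rate = ln(2) / half-life)."""
    return math.log(2) / SECTOR_HALF_LIFE_DAYS[sector]

# Episodic memories decay roughly 20x faster than reflective insights.
print(round(decay_rate("episodic") / decay_rate("reflective"), 1))  # -> 19.8
```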

4. Semantic Waypoints

Science: memory retrieval uses cues and associations.
Implementation: AI-generated waypoints (markers) on the timeline enable “magic search”.
Example:
March 2024:
├─ "auth refactor" (waypoint)
├─ "payment bug fix" (waypoint)
└─ "team restructure" (waypoint)

Query: "When did we refactor auth?"
→ Finds "auth refactor" waypoint instantly
→ Returns all memories near that point
Benefit: Search feels like asking a colleague who was there. “Remember when we…?” → Instant recall.
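A toy version of waypoint recall - match the query against waypoint labels, then gather the memories near that point on the timeline - might look like this. The real module generates waypoints with AI and matches semantically, not by substring; all of the data here is invented:

```python
from datetime import date

# Hypothetical in-memory waypoint index (timestamp, label).
waypoints = [
    (date(2024, 3, 5), "auth refactor"),
    (date(2024, 3, 12), "payment bug fix"),
    (date(2024, 3, 20), "team restructure"),
]

# Hypothetical stored memories (timestamp, text).
memories = [
    (date(2024, 3, 4), "Spiked token rotation options"),
    (date(2024, 3, 6), "Merged auth refactor PR"),
    (date(2024, 3, 19), "Planned new team topology"),
]

def recall_near(query_term, window_days=3):
    """Find a waypoint matching the query, then return memories near it."""
    for when, label in waypoints:
        if query_term in label:
            return [text for day, text in memories
                    if abs((day - when).days) <= window_days]
    return []

print(recall_near("auth refactor"))
# -> ['Spiked token rotation options', 'Merged auth refactor PR']
```

The waypoint acts as the retrieval cue; the date window plays the role of "memories near that point".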

5. Contextual Decay

Science: memory relevance depends on context, not just time.
Implementation: factors beyond time affect salience:
  • Time-Based - standard exponential decay: fresh memories (high salience) → old memories (low salience)
  • Context-Based - project status adjusts salience (e.g., closed projects decay faster)
  • Supersession - a newer memory lowers the salience of the one it replaces
  • Access-Based - recent access reinforces salience
Benefit: Memory relevance adapts to your work patterns automatically.
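These factors compose naturally as multipliers on the time-decayed salience. Only the 0.5 project-closed factor appears on this page (in the walkthrough below); the other weights are illustrative guesses:

```python
def effective_salience(base, *, project_closed=False, superseded=False,
                       recently_accessed=False):
    """Combine contextual factors as multipliers on time-decayed salience.
    Only the 0.5 project-closed factor comes from this page; the other
    weights are invented for illustration."""
    s = base
    if project_closed:
        s *= 0.5               # closed work decays faster
    if superseded:
        s *= 0.3               # a newer memory replaced this one
    if recently_accessed:
        s = min(1.0, s * 1.5)  # recent use keeps it warm
    return s

print(effective_salience(0.8, project_closed=True))  # -> 0.4
```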

The Full System

How it all works together:

Step 1: Store Memory

You save: “Fixed auth bug - token expiration wasn’t handled”
The system classifies it:
  • Sector: Episodic (event)
  • Initial salience: 1.0
  • Decay rate: Fast (35-day half-life)
  • Waypoint: Creates “auth bug fix March 2024”

Step 2: Memory Decays Naturally

Over weeks, salience drops:
  • Week 1: 1.0 (fresh, highly accessible)
  • Week 4: 0.75 (still relevant)
  • Week 12: 0.30 (fading)
  • Week 24: 0.05 (ready to prune)

Step 3: Access Reinforces

You access it during code review:
  • Salience 0.30 → 0.60 (reinforced)
  • Pushed back “up” the curve
  • Stays accessible longer

Step 4: Context Adjusts

Project marked complete:
  • All project memories salience × 0.5
  • Faster decay for closed work
  • Keep only essential learnings

Step 5: Search Uses Waypoints

Query: “When did we fix that auth bug?”
Waypoint search finds the “auth bug fix March 2024” waypoint instantly and returns all memories near it, highest salience first.
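The five steps above can be strung together in one small lifecycle sketch. The class name, constants, and reinforcement rule are all illustrative, not the module's actual API:

```python
import math

class Memory:
    """Sketch of the lifecycle: store -> decay -> reinforce -> adjust."""

    def __init__(self, text, half_life_days):
        self.text = text
        self.salience = 1.0                       # step 1: stored fresh
        self.rate = math.log(2) / half_life_days  # assumed rate = ln(2) / half-life

    def decay(self, days):
        self.salience *= math.exp(-self.rate * days)   # step 2: natural decay

    def access(self):
        # step 3: reinforcement (hypothetical rule: halve the gap to 1.0)
        self.salience += (1.0 - self.salience) * 0.5

    def close_project(self):
        self.salience *= 0.5                           # step 4: context adjust

m = Memory("Fixed auth bug - token expiration wasn't handled", half_life_days=35)
m.decay(84)         # ~12 weeks pass
m.access()          # pulled up during code review
m.close_project()   # project marked complete
print(round(m.salience, 2))  # -> 0.3
```

Each operation only touches a single scalar, which is why the whole pipeline stays cheap: store sets it, time shrinks it, access boosts it, context rescales it.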

Why This Beats Alternatives

| Feature              | Vector DB     | Note Apps    | RAG         | Memory Module           |
|----------------------|---------------|--------------|-------------|-------------------------|
| Auto-prioritization  | ❌            | ❌           | ❌          | ✅ Natural decay        |
| Importance awareness | ❌ Equal      | ❌ Manual    | ❌ Keywords | ✅ Cognitive sectors    |
| Self-cleaning        | ❌            | ❌           | ❌          | ✅ Auto-pruning         |
| Time awareness       | ❌            | ❌           | ❌          | ✅ Temporal search      |
| Context sensitivity  | ❌            | ❌           | Partial     | ✅ Multi-factor         |
| Natural search       | Semantic only | Keyword/tags | Keyword     | ✅ Waypoints + semantic |

Real-World Impact

Bug investigation note (Episodic, fast decay):
Day 1: Salience 1.0 - Fresh, top of mind
Week 1: Salience 0.90 - Still working on it
Week 4: Salience 0.60 - Bug fixed, fading
Week 12: Salience 0.15 - Mostly forgotten
Week 24: Salience 0.05 - Auto-pruned
Architecture decision (Reflective, slow decay):
Day 1: Salience 1.0 - Just decided
Month 6: Salience 0.95 - Still highly relevant
Year 1: Salience 0.85 - Core knowledge
Year 2: Salience 0.70 - Enduring wisdom
Frequently accessed note (reinforcement):
Start: Salience 0.40 (decaying)
Access 1: → 0.65 (reinforced)
Access 2: → 0.85 (reinforced again)
Access 3: → 0.95 (high priority)
Result: Your knowledge base stays focused on what matters now, while preserving strategic insights that endure.

Next Steps

See Workflows and Best Practices for practical usage.

Memory that thinks like you do. Built on cognitive science, not brute force.