Learn Agentic AI

Temporal Memory Decay: Building Agents That Forget Irrelevant Information Naturally

Implement memory decay functions that let AI agents naturally forget stale information while preserving important memories, using importance scoring, refresh-on-access, and automated cleanup.

The Problem with Perfect Recall

An agent that never forgets accumulates noise. Old preferences that the user has since changed, outdated facts, stale task context — all of it clutters retrieval results and wastes context window tokens. Human memory fades naturally, and that forgetting is a feature, not a bug. It surfaces what matters and lets irrelevant details dissolve.

Temporal memory decay gives agents the same advantage. Memories lose strength over time unless they are reinforced through access or marked as permanently important.

Decay Functions

The simplest decay model is exponential decay, borrowed from the Ebbinghaus forgetting curve. Each memory starts with a strength of 1.0 and decays toward 0.0 based on time elapsed.

Decay governs the persistent tiers of a typical agent memory pipeline — the episodic and semantic stores in the flowchart below — while the rolling working-memory window is simply replaced:

flowchart TD
    MSG(["New message"])
    WORKING["Working memory<br/>rolling window"]
    EPISODIC[("Episodic memory<br/>past sessions")]
    SEMANTIC[("Semantic memory<br/>facts and preferences")]
    SUM["Summarizer<br/>compresses old turns"]
    ROUTER{"Retrieve<br/>needed memories"}
    PROMPT["Assembled context"]
    LLM["LLM"]
    UPD["Memory updater<br/>writes new facts"]
    MSG --> WORKING --> ROUTER
    ROUTER -->|Past sessions| EPISODIC
    ROUTER -->|User facts| SEMANTIC
    EPISODIC --> SUM --> PROMPT
    SEMANTIC --> PROMPT
    WORKING --> PROMPT --> LLM --> UPD
    UPD --> EPISODIC
    UPD --> SEMANTIC
    style ROUTER fill:#4f46e5,stroke:#4338ca,color:#fff
    style LLM fill:#f59e0b,stroke:#d97706,color:#1f2937
    style EPISODIC fill:#ede9fe,stroke:#7c3aed,color:#1e1b4b
    style SEMANTIC fill:#ede9fe,stroke:#7c3aed,color:#1e1b4b
The decay model translates into a small dataclass:

import math
from datetime import datetime
from dataclasses import dataclass

@dataclass
class DecayingMemory:
    content: str
    created_at: datetime
    last_accessed: datetime
    base_importance: float = 0.5
    access_count: int = 0
    decay_rate: float = 0.01  # higher = faster decay
    pinned: bool = False

    def strength(self, now: datetime | None = None) -> float:
        if self.pinned:
            return 1.0
        now = now or datetime.now()
        hours_since_access = (
            (now - self.last_accessed).total_seconds() / 3600
        )
        time_decay = math.exp(-self.decay_rate * hours_since_access)
        importance_boost = min(self.base_importance * 1.5, 1.0)
        access_boost = min(self.access_count * 0.05, 0.3)
        return min(time_decay + access_boost, 1.0) * importance_boost

The decay rate parameter controls how fast memories fade. A rate of 0.01 means a memory retains about 79 percent of its strength after 24 hours. A rate of 0.1 means it drops to about 9 percent in the same period.
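With no access boosts in play, retention after a given idle period is just `exp(-rate * hours)`. A quick standalone check of the figures quoted above:

```python
import math

def retention(decay_rate: float, hours: float) -> float:
    """Fraction of strength remaining after `hours` with no access."""
    return math.exp(-decay_rate * hours)

print(round(retention(0.01, 24), 2))  # slow decay: ~0.79 after a day
print(round(retention(0.1, 24), 2))   # fast decay: ~0.09 after a day
```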


Importance Scoring

Not all memories should decay at the same rate. A user's stated preference ("I prefer concise answers") should persist far longer than an intermediate calculation from a task that finished yesterday.

Importance scoring assigns a base importance when the memory is created. The score is determined by the type of information.

IMPORTANCE_RULES = {
    "user_preference": 0.95,
    "explicit_instruction": 0.9,
    "task_result": 0.6,
    "observation": 0.4,
    "intermediate_step": 0.2,
}

def assign_importance(content: str, memory_type: str) -> float:
    base = IMPORTANCE_RULES.get(memory_type, 0.5)
    # Boost if content contains keywords suggesting permanence
    permanent_keywords = ["always", "never", "prefer", "remember"]
    for kw in permanent_keywords:
        if kw in content.lower():
            base = min(base + 0.1, 1.0)
    return base

Memories with high importance decay much more slowly in effect: the importance multiplier scales their strength at every point in time, so they stay above the cleanup threshold long after low-importance memories have been pruned.
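To see the effect, compare the effective strength of a low- and a high-importance memory after the same idle period. This standalone sketch reuses the formula from `DecayingMemory.strength`, with access boosts omitted for clarity:

```python
import math

def effective_strength(base_importance: float, hours_idle: float,
                       decay_rate: float = 0.01) -> float:
    """Same shape as DecayingMemory.strength, without access boosts."""
    time_decay = math.exp(-decay_rate * hours_idle)
    importance_boost = min(base_importance * 1.5, 1.0)
    return min(time_decay, 1.0) * importance_boost

# After 48 idle hours:
print(round(effective_strength(0.95, 48), 2))  # user_preference: ~0.62
print(round(effective_strength(0.2, 48), 2))   # intermediate_step: ~0.19
```

At a cleanup threshold of 0.2, the intermediate step is already on the chopping block while the preference still has days of headroom.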

Refresh on Access

Every time the agent retrieves a memory, its last_accessed timestamp resets and its access count increments. This implements the spacing effect — memories that are used regularly stay strong.

class DecayingMemoryStore:
    def __init__(self, decay_rate: float = 0.01):
        self.memories: list[DecayingMemory] = []
        self.decay_rate = decay_rate

    def add(
        self,
        content: str,
        memory_type: str = "observation",
        pinned: bool = False,
    ):
        importance = assign_importance(content, memory_type)
        now = datetime.now()
        mem = DecayingMemory(
            content=content,
            created_at=now,
            last_accessed=now,
            base_importance=importance,
            decay_rate=self.decay_rate,
            pinned=pinned,
        )
        self.memories.append(mem)

    def retrieve(self, query: str, top_k: int = 5) -> list[DecayingMemory]:
        now = datetime.now()
        scored = []
        for mem in self.memories:
            if query.lower() in mem.content.lower():
                relevance = mem.strength(now)
                scored.append((relevance, mem))
        scored.sort(key=lambda x: x[0], reverse=True)
        # Refresh accessed memories
        results = []
        for _, mem in scored[:top_k]:
            mem.last_accessed = now
            mem.access_count += 1
            results.append(mem)
        return results
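The refresh mechanic is what keeps regularly used memories alive. A standalone simulation of the same strength formula (the timestamps are illustrative) shows a memory retrieved daily staying well above one that was never touched:

```python
import math
from datetime import datetime, timedelta

def strength(last_accessed: datetime, now: datetime, decay_rate: float = 0.01,
             base_importance: float = 0.5, access_count: int = 0) -> float:
    """Unpinned strength, mirroring DecayingMemory.strength."""
    hours = (now - last_accessed).total_seconds() / 3600
    time_decay = math.exp(-decay_rate * hours)
    importance_boost = min(base_importance * 1.5, 1.0)
    access_boost = min(access_count * 0.05, 0.3)
    return min(time_decay + access_boost, 1.0) * importance_boost

t0 = datetime(2024, 1, 1)
t_now = t0 + timedelta(hours=72)

# Never accessed since creation, three days ago:
stale = strength(last_accessed=t0, now=t_now)

# Retrieved daily: last access 24h ago, access_count bumped to 3:
refreshed = strength(last_accessed=t_now - timedelta(hours=24),
                     now=t_now, access_count=3)

print(round(stale, 2))      # ~0.37
print(round(refreshed, 2))  # ~0.70
```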

Automated Cleanup

Even with decay, dead memories consume storage. A periodic cleanup job removes memories whose strength has dropped below a threshold.

    # Method of DecayingMemoryStore, continued from above
    def cleanup(self, threshold: float = 0.05) -> int:
        """Remove memories that have decayed below the threshold."""
        now = datetime.now()
        before_count = len(self.memories)
        self.memories = [
            m for m in self.memories
            if m.strength(now) >= threshold
        ]
        return before_count - len(self.memories)

Run cleanup on a schedule — every hour, every 100 interactions, or before each retrieval if the store is small. The threshold controls how aggressive the forgetting is: at the default decay rate of 0.01, a threshold of 0.05 keeps most memories for days, while pairing a 0.2 threshold with a faster rate like 0.1 prunes within hours.
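One low-ceremony way to wire up the every-N-interactions option is a small wrapper around the store. This is a sketch; the `CleanupScheduler` name and counter approach are illustrative choices, not part of the original design:

```python
class CleanupScheduler:
    """Runs store.cleanup() every N interactions — illustrative wrapper."""

    def __init__(self, store, every_n: int = 100, threshold: float = 0.05):
        self.store = store
        self.every_n = every_n
        self.threshold = threshold
        self._interactions = 0

    def tick(self) -> int:
        """Call once per agent interaction; returns memories removed."""
        self._interactions += 1
        if self._interactions % self.every_n == 0:
            return self.store.cleanup(threshold=self.threshold)
        return 0
```

Calling `scheduler.tick()` from the agent loop keeps pruning cost amortized and predictable.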


Combining Decay with Hierarchical Memory

Decay works well alongside hierarchical tiers. Working memory does not need decay because it is replaced per task. Short-term memory uses aggressive decay (high rate, low threshold). Long-term memory uses gentle decay so that established knowledge fades only after weeks of disuse.

short_term_store = DecayingMemoryStore(decay_rate=0.05)
long_term_store = DecayingMemoryStore(decay_rate=0.002)
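A thin router on top of the two stores can direct writes by memory type. This is a sketch under one possible policy — the type-to-tier mapping below is an assumption, not from the original:

```python
# Assumed policy: durable facts go to the slow-decay store
LONG_TERM_TYPES = {"user_preference", "explicit_instruction"}

def route_memory(short_term, long_term, content: str, memory_type: str):
    """Write to the long-term store for durable types, else short-term."""
    store = long_term if memory_type in LONG_TERM_TYPES else short_term
    store.add(content, memory_type=memory_type)
    return store
```

With the stores defined above, `route_memory(short_term_store, long_term_store, "I prefer concise answers", "user_preference")` lands the preference in the gentle-decay tier.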

FAQ

Won't important memories accidentally decay away?

That is what the pinned flag and importance scoring prevent. User preferences and explicit instructions receive high importance scores that keep their strength elevated. Critical memories can be pinned to never decay at all.

How do I tune the decay rate for my use case?

Start with 0.01 and observe how fast your agent forgets useful context. If users complain the agent lost track of something discussed yesterday, lower the rate. If retrieval returns too many stale results, raise it. Log the strength of retrieved memories to build intuition.

Should I use wall-clock time or interaction count for decay?

Wall-clock time works best for agents that run continuously. Interaction count is better for agents that are invoked sporadically — you do not want a memory to decay just because the user went on vacation. Some systems use a hybrid approach that counts both.
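One way to build the hybrid is to decay along both signals and keep the gentler result, so a vacation does not wipe memory but heavy use still ages it. The formula below is one possible shape, an assumption rather than a standard:

```python
import math

def hybrid_strength(hours_idle: float, interactions_since_access: int,
                    time_rate: float = 0.01, count_rate: float = 0.02) -> float:
    """Decay by whichever signal is gentler: idle time vs. interaction count."""
    by_time = math.exp(-time_rate * hours_idle)
    by_count = math.exp(-count_rate * interactions_since_access)
    return max(by_time, by_count)

# Two weeks idle, zero interactions (user on vacation): full strength
print(hybrid_strength(336, 0))  # 1.0
# Same idle time with 200 interactions elapsed: now time is the gentler signal
print(round(hybrid_strength(336, 200), 2))  # ~0.03
```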


#MemoryDecay #AgentMemory #Forgetting #Python #AgenticAI #LearnAI #AIEngineering
