Memory Systems

Enable agents to maintain conversation context and state across interactions with pluggable memory backends.

Stateless Agents, Persistent Memory

AA Kit follows a stateless design where agents themselves don't store state. Instead, conversation context is managed by external memory backends, enabling horizontal scaling and resilience.
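The stateless pattern can be sketched in a few lines. The `MemoryBackend` and `InMemoryBackend` names below are illustrative stand-ins, not AA Kit's actual API: the point is that the agent process holds no conversation state, so any instance can serve any session as long as it can reach the backend.

```python
# Illustrative sketch of the stateless pattern: the agent holds no state;
# all conversation context lives in a pluggable backend keyed by session_id.
# MemoryBackend / InMemoryBackend are hypothetical names, not AA Kit's API.

class MemoryBackend:
    def get(self, session_id): ...
    def append(self, session_id, message): ...

class InMemoryBackend(MemoryBackend):
    def __init__(self):
        self._sessions = {}  # session_id -> list of messages

    def get(self, session_id):
        return list(self._sessions.get(session_id, []))

    def append(self, session_id, message):
        self._sessions.setdefault(session_id, []).append(message)

# Any number of agent instances can share one backend: because the agent
# itself stores nothing, a request for a session can land on any instance.
backend = InMemoryBackend()
backend.append("user_123", {"role": "user", "content": "My name is Alice"})
print(backend.get("user_123"))
```

Swapping `InMemoryBackend` for a Redis- or SQLite-backed implementation changes durability and scaling, but not the agent code.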

Memory Backends

In-Memory

Development & Testing

Pros:

  • No setup required
  • Fast performance
  • Perfect for testing

Cons:

  • Data lost on restart
  • Not suitable for production
  • No persistence

memory="memory"

Redis

Production Applications

Pros:

  • High performance
  • Distributed support
  • TTL support
  • Scalable

Cons:

  • Requires Redis server
  • Additional infrastructure
  • Network overhead

memory="redis"

SQLite

Local Applications

Pros:

  • File-based storage
  • No server required
  • SQL queries
  • Portable

Cons:

  • Single machine only
  • File locking issues
  • Not for high concurrency

memory="sqlite"

PostgreSQL

Enterprise Applications

Pros:

  • ACID compliance
  • Advanced queries
  • Multi-user
  • Robust

Cons:

  • Complex setup
  • Requires database server
  • Higher latency

memory="postgres" # Coming soon

Implementation Examples

In-Memory Backend (Development)

```python
from aakit import Agent

# Default in-memory backend for development
agent = Agent(
    name="dev_agent",
    instruction="You remember our conversation",
    model="gpt-4",
    memory="memory"  # or omit for default
)

# Chat with session persistence
await agent.chat("My name is Alice", session_id="user_123")
await agent.chat("What's my name?", session_id="user_123")
# Response: "Your name is Alice"

# Different session = different context
await agent.chat("What's my name?", session_id="user_456")
# Response: "I don't know your name yet"
```

Redis Backend (Production)

```python
from aakit import Agent
import redis

# Configure Redis connection
redis_client = redis.Redis(
    host='localhost',
    port=6379,
    db=0,
    decode_responses=True
)

# Create agent with Redis memory
agent = Agent(
    name="prod_agent",
    instruction="You provide consistent support",
    model="gpt-4",
    memory="redis",
    memory_config={
        "client": redis_client,
        "ttl": 3600,  # 1 hour expiration
        "prefix": "agent:sessions:"
    }
)

# Conversations persist across restarts
await agent.chat("Remember this number: 42", session_id="user_123")
# ... application restarts ...
await agent.chat("What number did I tell you?", session_id="user_123")
# Response: "You told me the number 42"
```

SQLite Backend (Local Persistence)

```python
from aakit import Agent

# SQLite for local persistent storage
agent = Agent(
    name="local_agent",
    instruction="You maintain conversation history",
    model="gpt-4",
    memory="sqlite",
    memory_config={
        "db_path": "./conversations.db",
        "table_name": "agent_memory"
    }
)

# Conversations saved to local database
await agent.chat("Start project Alpha", session_id="project_123")
# Data persists between runs
```

Memory Management

```python
from aakit import Agent

agent = Agent(
    name="managed_agent",
    instruction="You help with various tasks",
    model="gpt-4",
    memory="redis"
)

# Clear specific session
await agent.clear_memory(session_id="user_123")

# Get session history
history = await agent.get_history(session_id="user_123")
for message in history:
    print(f"{message['role']}: {message['content']}")

# Update memory directly
await agent.add_to_memory(
    session_id="user_123",
    role="system",
    content="User prefers formal language"
)
```

Memory Architecture

How Memory Works

1. Session Identification: each conversation has a unique session_id that groups related messages.

2. Message Storage: user messages and agent responses are stored with role, content, and timestamp.

3. Context Retrieval: on each request, the conversation history for the session is loaded from the backend.

4. Context Window: only the recent messages that fit within the token limit are sent to the LLM.
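The context-window step above can be sketched as a simple trimming pass: walk the history newest-first and keep messages until the token budget is spent. The 4-characters-per-token estimate is a rough heuristic for illustration, not AA Kit's actual tokenizer.

```python
# Illustrative sketch of context-window trimming (step 4): keep only the
# most recent messages that fit a token budget. The chars/4 token estimate
# is a placeholder heuristic, not a real tokenizer.

def trim_to_context_window(messages, max_tokens=1000):
    def estimate_tokens(message):
        return max(1, len(message["content"]) // 4)

    kept, used = [], 0
    for message in reversed(messages):  # newest first
        cost = estimate_tokens(message)
        if used + cost > max_tokens:
            break
        kept.append(message)
        used += cost
    return list(reversed(kept))  # restore chronological order

history = [{"role": "user", "content": "x" * 400} for _ in range(20)]
window = trim_to_context_window(history, max_tokens=500)
print(len(window))  # → 5 (each message estimates to ~100 tokens)
```

Real implementations usually also preserve the system message and count tokens with the model's own tokenizer, but the shape of the logic is the same.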

Configuration Options

| Option | Type | Default | Description |
|---|---|---|---|
| max_messages | int | 100 | Maximum messages to store per session |
| ttl | int | 3600 | Time-to-live in seconds (Redis only) |
| context_window | int | 20 | Number of recent messages to include |
| compress | bool | False | Compress stored messages |
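Assuming these options are accepted through the same memory_config mapping used in the examples above (which backends honor each option is per the table), a configuration exercising all four might look like:

```python
# Hypothetical memory_config combining the documented options; whether a
# given backend accepts every key is an assumption based on the table above.
memory_config = {
    "max_messages": 200,   # keep up to 200 messages per session (default 100)
    "ttl": 86400,          # expire sessions after 24 hours (Redis only)
    "context_window": 10,  # include the 10 most recent messages (default 20)
    "compress": True,      # compress stored messages (default False)
}
```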

Best Practices

Performance Tips

  • Use session IDs consistently
  • Set appropriate TTL values
  • Limit context window size
  • Enable compression for long conversations
  • Use Redis for production workloads

Data Management

  • Implement data retention policies
  • Clean up old sessions regularly
  • Monitor memory usage
  • Back up critical conversations
  • Handle PII data carefully

Privacy Considerations

  • Encrypt sensitive conversation data
  • Implement proper access controls
  • Allow users to delete their data
  • Comply with data protection regulations (GDPR, CCPA)
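A user-initiated deletion flow (e.g. for GDPR's right to erasure) can build directly on the clear_memory call shown under Memory Management. The sketch below assumes one session per user; FakeAgent is a stand-in so the flow can run without a real backend.

```python
import asyncio

# Sketch of a deletion-request handler, assuming one session per user and
# the agent.clear_memory(session_id=...) API shown above. FakeAgent is a
# hypothetical stand-in, not part of AA Kit.

class FakeAgent:
    def __init__(self):
        self.sessions = {"user_123": ["...messages..."]}

    async def clear_memory(self, session_id):
        self.sessions.pop(session_id, None)

async def handle_deletion_request(agent, user_id):
    # If users map to multiple sessions, clear each of them here.
    await agent.clear_memory(session_id=user_id)
    return {"user_id": user_id, "status": "deleted"}

agent = FakeAgent()
receipt = asyncio.run(handle_deletion_request(agent, "user_123"))
print(receipt)  # → {'user_id': 'user_123', 'status': 'deleted'}
```

Logging a deletion receipt (without the deleted content) helps demonstrate compliance later.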

Advanced Patterns

Conversation Branching

Create multiple conversation branches from a single point:

```python
base_session = "user_123"
branch_a = f"{base_session}:scenario_a"
branch_b = f"{base_session}:scenario_b"
```
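To make branching useful, each branch usually needs a copy of the base session's history so the scenarios diverge from a common point. The dict-based store and copy_session helper below are illustrative, not AA Kit's API:

```python
# Sketch of seeding branches from a base session. The dict store and
# copy_session helper are hypothetical; a real backend would copy keys
# (e.g. Redis lists) instead.

store = {"user_123": [{"role": "user", "content": "Plan the launch"}]}

def copy_session(store, src, dst):
    store[dst] = list(store[src])  # shallow copy of the message list

base_session = "user_123"
for scenario in ("scenario_a", "scenario_b"):
    copy_session(store, base_session, f"{base_session}:{scenario}")

# Each branch shares the prefix but evolves independently afterwards.
store["user_123:scenario_a"].append({"role": "user", "content": "Assume budget X"})
print(len(store["user_123:scenario_a"]), len(store["user_123:scenario_b"]))  # → 2 1
```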

Memory Sharing

Share context between multiple agents:

```python
shared_memory = RedisMemory(prefix="shared:")
agent1 = Agent(name="researcher", instruction="You research topics", model="gpt-4", memory=shared_memory)
agent2 = Agent(name="summarizer", instruction="You summarize findings", model="gpt-4", memory=shared_memory)
```

Next Steps

Understand how AA Kit implements the Model Context Protocol for universal interoperability.

Continue to MCP Protocol →