Persistent memory for AI coding agents

Your AI agent has amnesia

Every new session starts from scratch: you re-debug the same bugs and re-explain decisions you already made. ContextPool gives your agent persistent memory across sessions.

Claude Code · Cursor · Windsurf · Kiro

How it works

step 1

Install

One curl command. Single static binary. No runtime dependencies. Runs on macOS, Linux, and Windows.

step 2

Initialize

cxp scans your past Cursor and Claude Code sessions and extracts engineering insights using an LLM.

step 3

Recall

Your agent automatically loads relevant past context via MCP at session start. No prompting needed.

What your agent remembers

Not conversation summaries. Actionable engineering knowledge distilled from your sessions.

Bugs & root causes

tokio runtime panics when calling block_on inside async context

Fixes & solutions

Add #[tokio::main] instead of manual Runtime::new()

Design decisions

Chose libsql over rusqlite for Turso cloud compatibility

Gotchas & patterns

macOS keychain blocks in MCP subprocess context

Works where you work

Zero config in Claude Code. One JSON entry everywhere else.

Claude Code

No API key needed. CXP uses your existing Claude Code authentication automatically. Just install and init.

Run cxp init claude-code after installing. That's it.

Built for reliable team memory

ContextPool keeps memory durable, portable, and safe by design.

Stable project IDs

Project IDs are derived from your git remote URL, so teammates resolve to consistent IDs automatically.
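One plausible scheme for such an ID, assuming SSH and HTTPS remote forms are normalized to a single canonical string before hashing (the exact derivation cxp uses isn't specified here):

```python
import hashlib

def project_id(remote_url: str) -> str:
    """Derive a stable project ID from a git remote URL.

    Hypothetical sketch: normalize SSH ("git@host:owner/repo") and HTTPS
    ("https://host/owner/repo") remotes to one canonical "host/owner/repo"
    form, then hash it, so teammates with different clone styles resolve
    to the same ID.
    """
    url = remote_url.strip().removesuffix(".git")
    if url.startswith("git@"):
        # git@github.com:acme/app -> ("github.com", "acme/app")
        host, _, path = url.removeprefix("git@").partition(":")
    else:
        # https://github.com/acme/app -> ("github.com", "acme/app")
        host, _, path = url.split("://", 1)[1].partition("/")
    canonical = f"{host}/{path}".lower()
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Both clone styles resolve to the same ID:
assert project_id("git@github.com:acme/app.git") == \
       project_id("https://github.com/acme/app")
```

Hashing a normalized remote rather than a local path is what makes the ID survive re-clones and different checkout locations.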

Multi-backend LLM routing

Provider fallback chain: Claude CLI → Anthropic API → OpenAI → NVIDIA for resilient extraction.
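A fallback chain like this can be sketched as an ordered loop over backends; the function and provider names below are illustrative stand-ins, not ContextPool's internals:

```python
def extract_insights(transcript, providers):
    """Try each LLM backend in order, falling back on failure.

    `providers` is an ordered list of (name, callable) pairs standing in
    for a chain like Claude CLI -> Anthropic API -> OpenAI -> NVIDIA.
    """
    errors = []
    for name, call in providers:
        try:
            return name, call(transcript)
        except Exception as exc:  # a real chain would catch narrower error types
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all backends failed: " + "; ".join(errors))

def claude_cli(_text):
    raise TimeoutError("CLI not available")  # simulated outage

def anthropic_api(text):
    return [f"insight from {len(text)} chars"]

# The first backend fails, so extraction falls through to the second:
name, insights = extract_insights(
    "session log", [("claude-cli", claude_cli), ("anthropic", anthropic_api)]
)
assert name == "anthropic"
```

Collecting per-provider errors before raising keeps the final failure message diagnosable when every backend is down.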

MCP protocol

Agents query memory via standard MCP tools. No custom integration glue is required.
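MCP tool calls are plain JSON-RPC, so "no custom glue" means the agent sends a standard tools/call request. A minimal sketch of what that might look like against a memory server; the recall_insights tool name and its arguments are hypothetical, not cxp's documented API:

```python
import json

# Hypothetical MCP "tools/call" request an agent might send to the cxp
# server over stdio. The envelope (jsonrpc/method/params) follows the MCP
# spec; the tool name and argument keys are illustrative assumptions.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "recall_insights",
        "arguments": {
            "project_id": "abc123",            # stable project ID
            "query": "tokio runtime panic",    # what the agent is working on
        },
    },
}
print(json.dumps(request))
```

Because this is the protocol's generic tool-call shape, any MCP-capable client (Claude Code, Cursor, etc.) can issue it without bespoke integration code.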

Keychain storage

API keys are stored in the system keychain, with a safe file fallback where keychain is unavailable.

Team memory

Your teammate debugged this last week. Your agent should know that. Push insights to a shared pool. Everyone on the team pulls the collective knowledge.


Pricing

Local mode is free forever. Cloud sync for teams.


Local

Full local functionality. No account needed.

Free forever
  • Unlimited local insights
  • All IDE integrations
  • 4 LLM backends
  • Secret redaction
  • MCP server

Install Now
Popular

Pro

Team access + cloud sync with a 7-day free trial.

$7.99/month
  • Everything in Local
  • Team access
  • Unlimited insights
  • Unlimited projects
  • 7-day free trial
Start 7-day free trial

Privacy first

Local-first by default: raw transcripts stay on your machine, while only extracted insights are synced when you opt in.

Stays local

Raw transcripts never leave your machine.

Redaction-first

Secrets are stripped before LLM processing and again before cloud sync.
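A minimal sketch of what such a redaction pass could look like, using illustrative patterns that are not ContextPool's actual rule set:

```python
import re

# Illustrative secret shapes to mask before text reaches an LLM or the
# cloud; a real redactor would use a broader, maintained pattern set.
PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),                           # API-key-like tokens
    re.compile(r"(?i)(?:api[_-]?key|token|password)\s*[:=]\s*\S+"),
    re.compile(r"AKIA[0-9A-Z]{16}"),                              # AWS access key IDs
]

def redact(text: str) -> str:
    """Replace anything matching a known secret shape with a placeholder."""
    for pat in PATTERNS:
        text = pat.sub("[REDACTED]", text)
    return text

out = redact("export OPENAI_API_KEY=sk-abcdefghijklmnopqrstu")
assert "sk-abcdef" not in out
```

Running redaction twice, as the copy describes (before LLM processing and again before sync), means a secret missed at one stage still gets a second chance to be caught.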

To cloud (opt-in)

Only extracted insights are synced, never full transcripts.

Stop re-discovering what you already know

Install in 30 seconds. Your agent starts remembering immediately.

Or create a team account to sync with your teammates.