# Concepts
## Knowledge, not chat logs
Most “second brain” tools store a stream of captures — articles, notes, clippings — and hope you can find them later. amem does something different: it compiles captures into wiki notes with verbatim quotes and verifiable provenance. The wiki is the product; the raw captures are just source material.
This follows Andrej Karpathy’s recommendation to maintain a personal wiki as the substrate for long-term thinking.
## Dual audience
amem serves both agents (over MCP) and humans (via CLI + extension) from the same store. An agent citing a paper sees the same markdown a human sees. There’s no hidden “agent memory” that drifts from what’s on disk.
## Provenance by construction
Every chunk carries a SHA-256 hash of its source. If the original file changes, `amem verify` detects the drift. Citations stay grounded.
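The drift check can be sketched in a few lines. This is a minimal illustration of hash-based provenance, not amem’s actual schema: the record layout and function names here are assumptions.

```python
import hashlib

def sha256_of(text: str) -> str:
    """Hex digest of a chunk's source text."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# At capture time, store the hash alongside the chunk.
source = "Attention is all you need."
record = {"chunk": source, "sha256": sha256_of(source)}

# At verify time, re-hash the current source and compare digests.
def has_drifted(current_text: str, record: dict) -> bool:
    return sha256_of(current_text) != record["sha256"]

print(has_drifted(source, record))               # unchanged source -> False
print(has_drifted(source + " (edited)", record)) # modified source  -> True
```

Because the digest is recomputed from the source text itself, any edit to the original file is detectable without storing a second copy of it.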
## Offline-first
Zero cloud dependencies by default. Ollama runs locally. The only network calls are to fetch papers from their public URLs (arXiv, DOI resolvers). You can operate amem entirely air-gapped after initial capture.
## Three interfaces
- CLI — `amem capture`, `amem compile`, `amem recall`, `amem cite`
- MCP server — `amem_capture`, `amem_compile`, `amem_cite`, `amem_recall` for agents
- Chrome extension — one-click web page capture + self-recorded demos