API reference, authentication, and integration guides.
Reflect Memory gives your AI tools shared memory. Store something in one tool and every connected tool can access it: ChatGPT, Claude, Cursor, Gemini, and more. All data is scoped to your account and privacy-first.
Base URL: https://api.reflectmemory.com
All requests require a Bearer token in the Authorization header:
Authorization: Bearer <your-api-key>
User key: Full access. Used for direct API calls, scripts, and the dashboard. Get this from your account settings.
Agent keys: Scoped per vendor (e.g., chatgpt, claude). Used by AI integrations. Each agent only sees memories where allowed_vendors includes "*" or their vendor name.
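As a sketch of what an authenticated call looks like, the following Python snippet builds (but does not send) a request with the Bearer header. The key value is a placeholder; endpoint paths are the ones documented on this page.

```python
import json
import urllib.request

BASE_URL = "https://api.reflectmemory.com"
API_KEY = "YOUR_API_KEY"  # placeholder: substitute your user or agent key

def build_request(path, method="GET", body=None):
    """Build an authenticated request against the Reflect Memory API."""
    data = json.dumps(body).encode() if body is not None else None
    return urllib.request.Request(
        BASE_URL + path,
        data=data,
        method=method,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

# urllib.request.urlopen(build_request("/whoami")) would send the call;
# here we only assemble it so the example runs without network access.
req = build_request("/agent/memories/latest")
```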
Agent endpoints (used by AI integrations):
- GET /agent/memories/latest: Most recent memory. Optional ?tag= filter.
- GET /agent/memories/{id}: Full memory by UUID.
- POST /agent/memories: Create a memory.
- POST /agent/memories/browse: List summaries (no content).
- POST /agent/memories/by-tag: Full memories by tags.
- POST /query: AI query with memory context.
- GET /whoami: Resolve identity from key.

POST /agent/memories: Request body
- title, content (required)
- tags (optional array of strings)
- memory_type (optional). Values: "semantic", "episodic", "procedural" (default: "semantic"). Memory classification: semantic = facts and knowledge, episodic = events and decisions, procedural = workflows and patterns.

User endpoints (dashboard, scripts): POST /memories, PUT /memories/:id, DELETE /memories/:id, POST /memories/list.
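A small helper can assemble and sanity-check that request body before sending it. This is a sketch based on the field list above; the validation rules mirror the documented defaults, not server behavior we have verified.

```python
import json

VALID_TYPES = ("semantic", "episodic", "procedural")

def make_memory_payload(title, content, tags=None, memory_type="semantic"):
    """Assemble a JSON body for POST /agent/memories.

    title and content are required; tags is an optional list of strings;
    memory_type defaults to "semantic" per the documented default.
    """
    if not title or not content:
        raise ValueError("title and content are required")
    if memory_type not in VALID_TYPES:
        raise ValueError(f"unknown memory_type: {memory_type}")
    payload = {"title": title, "content": content, "memory_type": memory_type}
    if tags:
        payload["tags"] = list(tags)
    return json.dumps(payload)
```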
Reflect Memory exposes a Model Context Protocol (MCP) server for Claude and other MCP-compatible hosts. Connect to:
https://api.reflectmemory.com/mcp

Transport: Streamable HTTP. MCP clients must use streamable-http (or streamableHttp in Cursor settings). The legacy SSE transport is not supported.
Auth: OAuth 2.1 (for Claude native connector) or Bearer token (for Cursor, xAI API, n8n, and other MCP clients). Claude handles OAuth automatically when you add the connector URL.
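MCP clients handle the handshake for you, but for debugging it can help to know what the first message on the wire looks like: a streamable-HTTP client POSTs a JSON-RPC 2.0 initialize request to the endpoint. The sketch below constructs that message; the protocol version and client info are illustrative values, as real clients negotiate them.

```python
import json

def mcp_initialize_message(client_name, client_version):
    """Build the JSON-RPC 2.0 initialize request that an MCP client
    POSTs first to a streamable-HTTP endpoint such as /mcp."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            # Illustrative; the client and server negotiate the actual version.
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": client_version},
        },
    })
```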
Create .cursor/mcp.json in your project root:
{ "mcpServers": { "reflect-memory": { "type": "streamable-http", "url": "https://api.reflectmemory.com/mcp", "headers": { "Authorization": "Bearer YOUR_AGENT_KEY" } } } }
Get your agent key from your dashboard (API Keys section). Restart Cursor after saving.
- read_memories — recent memories (full content)
- get_memory_by_id — full memory by UUID
- get_latest_memory — most recent memory, optional tag filter
- browse_memories — lightweight summaries
- search_memories — search by keyword
- get_memories_by_tag — filter by tags
- write_memory — create a new memory
- update_memory — edit an existing memory (title, content, tags)
- delete_memory — soft-delete a memory (recoverable from trash)
- read_team_memories — get memories shared with your team
- share_memory — share a personal memory with your team

All tools are scoped to the authenticated user. Team tools require the user to belong to a team (see Team Memories).
Claude: Add a custom connector with https://api.reflectmemory.com/mcp as the URL, and click Add. Claude discovers all 11 memory tools automatically via OAuth. No extension or downloads needed.

Cursor: Add a .cursor/mcp.json file to your project with the MCP URL and your agent key as a Bearer token header. Cursor discovers all 11 memory tools automatically. No npm install or local server needed.

Setup guides: /integrations
Team Memories let multiple users share context through a shared memory pool. Any team member can share a personal memory with the team, and all members can read team-shared memories from any connected AI tool.
1. Create a team: POST /teams with a team name. The creator becomes the team owner.
2. Invite members: POST /teams/:id/invite with the invitee's email. They join automatically when they sign in.
3. Share: use the share_memory MCP tool (or POST /memories/:id/share) to share any of your memories with the team.
4. Read: use the read_team_memories MCP tool (or GET /memories/team) to pull shared context from your teammates.

Team tools appear automatically in Claude, Cursor, and all other MCP-connected clients once the user belongs to a team.
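The REST side of that flow can be sketched as follows. The requests are built but not sent, so the example runs offline; TEAM_ID and MEMORY_ID are placeholders, and the key value is illustrative.

```python
import json
import urllib.request

BASE = "https://api.reflectmemory.com"

def authed(path, method, body=None, key="YOUR_USER_KEY"):
    """Build (not send) an authenticated JSON request."""
    return urllib.request.Request(
        BASE + path,
        data=json.dumps(body).encode() if body is not None else None,
        method=method,
        headers={"Authorization": f"Bearer {key}",
                 "Content-Type": "application/json"},
    )

# 1. Create a team (the creator becomes the owner)
create = authed("/teams", "POST", {"name": "platform-team"})
# 2. Invite a member by email
invite = authed("/teams/TEAM_ID/invite", "POST", {"email": "dev@example.com"})
# 3. Share one of your memories with the team
share = authed("/memories/MEMORY_ID/share", "POST")
# 4. Read team-shared memories
read = authed("/memories/team", "GET")
```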
Run Reflect Memory on your own infrastructure with Docker Compose. Data stays on your machine or private network.
git clone https://github.com/van-reflect/Reflect-Memory.git
cd Reflect-Memory
Create a .env file:
RM_API_KEY=your-api-key
RM_MODEL_API_KEY=sk-...
RM_MODEL_NAME=gpt-4o-mini
# MCP — at least one agent key is required to enable the /mcp endpoint
RM_AGENT_KEY_CURSOR=pick-any-strong-secret
RM_AGENT_KEY_CLAUDE=pick-any-strong-secret
Start the container:
docker compose --profile isolated-hosted up --build -d
Verify:
curl http://localhost:3000/health
In .cursor/mcp.json:
{ "mcpServers": { "reflect-memory": { "type": "streamable-http", "url": "http://localhost:3000/mcp", "headers": { "Authorization": "Bearer your-RM_AGENT_KEY_CURSOR-value" } } } }
The MCP endpoint uses a separate auth system from the REST API. Your RM_API_KEY works for curl and REST calls, but the /mcp endpoint requires a vendor-specific agent key (e.g., RM_AGENT_KEY_CURSOR). Setting at least one agent key also tells the server to start the MCP endpoint — without any agent keys, /mcp returns 404.
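Concretely, the split looks like this against the local container. The snippet builds the two requests without sending them; the key strings stand in for the values in your .env.

```python
import urllib.request

LOCAL = "http://localhost:3000"

# REST endpoints accept the user key (RM_API_KEY).
rest = urllib.request.Request(
    LOCAL + "/memories/list",
    method="POST",
    headers={"Authorization": "Bearer your-api-key"},  # RM_API_KEY value
)

# The /mcp endpoint only accepts a vendor agent key
# (e.g., RM_AGENT_KEY_CURSOR); the user key is rejected there.
mcp = urllib.request.Request(
    LOCAL + "/mcp",
    method="POST",
    headers={"Authorization": "Bearer pick-any-strong-secret"},  # RM_AGENT_KEY_CURSOR value
)
```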
Each memory has:
- id: UUID
- title: Short descriptor
- content: Full text
- tags: Array of strings
- memory_type: "semantic", "episodic", or "procedural"
- origin: Which AI/service wrote it (chatgpt, claude, cursor, etc.)
- allowed_vendors: Who can see it (["*"] = all)
- created_at, updated_at: ISO 8601 timestamps
- version: Integer, auto-incremented on every edit (version history)

Every edit creates a new version. The dashboard shows a full diff history for each memory, and you can restore any prior version. Versions are also accessible via the REST API at GET /memories/:id/versions.
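Putting the fields together, a memory record might look like the following. All values are illustrative, including the UUID.

```python
import json

example_memory = {
    "id": "3f1c9a2e-7b4d-4e8a-9c1d-2a6f8e0b5d13",  # illustrative UUID
    "title": "Deployment checklist",
    "content": "Run migrations before switching traffic.",
    "tags": ["ops", "deploy"],
    "memory_type": "procedural",
    "origin": "cursor",
    "allowed_vendors": ["*"],
    "created_at": "2025-01-15T09:30:00Z",
    "updated_at": "2025-01-16T14:05:00Z",
    "version": 2,  # incremented on every edit
}

# The record round-trips cleanly as JSON:
print(json.dumps(example_memory, indent=2))
```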
Memories can be classified into three types to improve retrieval and context:

- semantic: facts and knowledge (the default)
- episodic: events and decisions
- procedural: workflows and patterns
Documentation: this page. Privacy: /privacy. Terms: /terms. Support: sales@reflectmemory.com.