MCP

Connect TARX tools, memory, files, and route-aware workflows to compatible AI clients.

What is MCP?

The Model Context Protocol (MCP) is an open standard for connecting AI clients to tool servers. TARX uses MCP to expose approved tools and context while keeping local-private work on the user’s Computer when that boundary is required.
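As a concrete sketch, an MCP session begins with a JSON-RPC 2.0 `initialize` request from the client; the client name, version, and protocol date below are illustrative values, not TARX-specific:

```python
import json

# JSON-RPC 2.0 "initialize" request: the first message an MCP client
# sends to a server. clientInfo values here are placeholders.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # a published MCP revision
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

print(json.dumps(initialize_request, indent=2))
```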

Connect from Claude Web

Use the remote endpoint for discovery, install guidance, public context, and user-approved relay flows:

{
  "mcpServers": {
    "tarx": {
      "url": "https://mcp.tarx.com/mcp"
    }
  }
}

Connect Claude Desktop Locally

For private memory, Vault, files, and local device context, use a local stdio server so Claude talks to 127.0.0.1:11440 on the user’s Computer:

{
  "mcpServers": {
    "tarx-local": {
      "command": "npx",
      "args": ["-y", "@tarx/mcp"],
      "env": { "TARX_COMPUTER_URL": "http://127.0.0.1:11440" }
    }
  }
}

Claude Web cannot access a user’s localhost directly. If a web client needs private context, TARX must route through a user-approved relay or an explicit export bundle.

OpenAI / Agents

OpenAI Agents and Responses API patterns can use remote MCP servers with a server URL. For local-private memory, keep the tool executor on the user’s Computer and point it at the TARX local runtime:

curl https://mcp.tarx.com/openai-tools

The exported tools cover tarx_status, tarx_memory_store, tarx_memory_search, and tarx_context.
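As a hedged sketch, a Responses API request can attach the remote TARX server as an MCP tool source. The payload below follows OpenAI's remote-MCP tool schema as published; the model name and `server_label` are assumptions for illustration:

```python
import json

# Sketch of a Responses API request body referencing a remote MCP server.
# Tool names mirror the /openai-tools export above; "gpt-4.1" and the
# server_label are placeholders.
request_body = {
    "model": "gpt-4.1",
    "input": "Summarize my current TARX project status.",
    "tools": [
        {
            "type": "mcp",
            "server_label": "tarx",
            "server_url": "https://mcp.tarx.com/mcp",
            "allowed_tools": [
                "tarx_status",
                "tarx_memory_store",
                "tarx_memory_search",
                "tarx_context",
            ],
            # Keep relay flows user-approved, per the boundary above.
            "require_approval": "always",
        }
    ],
}

print(json.dumps(request_body, indent=2))
```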

What’s available

  • Memory tools — store and recall information across sessions
  • File tools — index, search, and manage documents
  • Project context — full project state for AI sessions
  • Delegation — queue approved tasks for background execution
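Once connected, each of these tools is invoked through MCP's standard `tools/call` method. The sketch below shows a memory-store call; the argument names (`key`, `value`) are assumptions for illustration, so check the schema returned by `tools/list` for the real shape:

```python
import json

# JSON-RPC "tools/call" request invoking the memory tool. Argument
# names are illustrative; the server's tools/list response defines
# the actual input schema.
memory_store_call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "tarx_memory_store",
        "arguments": {"key": "project/notes", "value": "Ship the MCP docs."},
    },
}

print(json.dumps(memory_store_call, indent=2))
```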

Build your own

MCP servers are composable. Build a skill, expose it as a tool, and TARX orchestrates it alongside the built-ins.
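To make that concrete, here is a toy stdio dispatcher that answers `tools/list` and `tools/call` for one custom skill. It is a minimal sketch, not a TARX API: a real MCP server must also handle `initialize` and capability negotiation, and the tool name and schema here are invented for illustration:

```python
import json
import sys

# One hypothetical skill, described in the shape tools/list expects.
TOOLS = [{
    "name": "shout",
    "description": "Upper-case a string.",
    "inputSchema": {
        "type": "object",
        "properties": {"text": {"type": "string"}},
        "required": ["text"],
    },
}]

def handle(request):
    """Dispatch a single JSON-RPC request to a result or error response."""
    if request["method"] == "tools/list":
        result = {"tools": TOOLS}
    elif request["method"] == "tools/call":
        text = request["params"]["arguments"]["text"]
        result = {"content": [{"type": "text", "text": text.upper()}]}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}

if __name__ == "__main__":
    # stdio transport: one JSON-RPC message per line in, one per line out.
    for line in sys.stdin:
        print(json.dumps(handle(json.loads(line))), flush=True)
```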

View on GitHub →