MCP Protocol  ·  30+ Clients  ·  10+ LLMs

MCP Integrations
Clients & LLM Providers

WHAT CONNECTS HERE

  • AI coding IDEs — Cursor, VS Code Copilot, Windsurf, Continue, Cline, Zed, JetBrains ACP
  • Desktop AI apps — Claude Desktop, Cherry Studio, AnythingLLM, Jan.ai, Msty
  • LLM providers — Claude, GPT-4o, Gemini 2.0 natively; Mistral, Llama, DeepSeek, Grok via bridge
  • Two transports — stdio for local tools, Streamable HTTP for cloud and remote
  • One config — same API key, same 7 tools, any client
30+ MCP Clients  ·  3 Native LLM APIs  ·  2 Transport Protocols  ·  1 Config File

AI Coding IDEs

Editors and coding tools with native MCP support — connect anonym.legal in minutes

VS Code (GitHub Copilot)
Streamable HTTP · stdio · Copilot required

Visual Studio Code with GitHub Copilot Chat. MCP tools available in Agent mode. Configure via .vscode/mcp.json or workspace settings. Agent mode required — Copilot Chat inline suggestions do not call MCP.

Config: .vscode/mcp.json
MCP since: April 2025 (v1.99)
LLMs: GPT-4o, Claude 3.5, Gemini 2.0
Windsurf (Codeium)
Streamable HTTP · stdio · Freemium

Codeium's AI code editor with Cascade agent. Windsurf uses MCP to extend the Cascade agentic flow. Configure in Windsurf Settings → MCP Servers. Supports project-level and global server config.

Config: Windsurf settings UI
MCP since: Late 2024
LLMs: Claude 3.5, GPT-4o, Codeium models
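A minimal Windsurf config sketch, using the same stdio shape as the Claude Desktop example below. The file path (~/.codeium/windsurf/mcp_config.json) is an assumption based on common client conventions; the Windsurf settings UI writes this file for you.

```json
{
  "mcpServers": {
    "anonym-legal": {
      "command": "npx",
      "args": ["-y", "@anthropic-ai/mcp-server-anonym-legal"],
      "env": {
        "ANONYM_LEGAL_API_KEY": "your-api-key-here"
      }
    }
  }
}
```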
Continue
Streamable HTTP · stdio · Free, Open Source

Open-source AI code assistant extension for VS Code and JetBrains. Highly configurable — supports any LLM provider. MCP tools available in agent mode via config.json. Connects to local Ollama models or cloud APIs.

Config: ~/.continue/config.json
MCP since: 2024
LLMs: Any — Claude, GPT, Gemini, Llama, Mistral, DeepSeek
Cline (formerly Claude Dev)
Streamable HTTP · stdio · Free, Open Source

Autonomous AI coding agent VS Code extension. One of the most MCP-capable open-source clients. Manages its own MCP server registry via the Cline settings panel. Supports full agentic loops with tool calling.

Config: Cline settings UI
MCP since: 2024
LLMs: Claude, GPT-4o, DeepSeek, Gemini, Llama
Zed
Streamable HTTP · stdio · Free, Open Source

High-performance editor written in Rust with native AI assistant. MCP support added in 2025 for the Zed AI assistant and agent features. Configure via settings.json.

Config: ~/.config/zed/settings.json
MCP since: 2025
LLMs: Claude 3.5, GPT-4o
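A sketch of the Zed settings entry. Zed configures MCP servers under a context_servers key in settings.json, but the exact nested shape has changed across Zed versions; verify the key names against the current Zed documentation.

```json
{
  "context_servers": {
    "anonym-legal": {
      "source": "custom",
      "command": "npx",
      "args": ["-y", "@anthropic-ai/mcp-server-anonym-legal"],
      "env": {
        "ANONYM_LEGAL_API_KEY": "your-api-key-here"
      }
    }
  }
}
```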
JetBrains IDEs (ACP)
Streamable HTTP · stdio · AI Pro required

IntelliJ IDEA, PyCharm, WebStorm, GoLand, Rider — all via JetBrains AI Assistant and the Agent Client Protocol (ACP). MCP servers are available in the AI Assistant tool window. Supports all JetBrains IDEs.

Config: JetBrains settings → AI Assistant → MCP
MCP since: January 28, 2026 (ACP GA)
LLMs: Claude, GPT-4o, Gemini
Replit
Streamable HTTP · Replit Core

Cloud-based IDE and development platform. Replit Agent supports MCP servers, allowing tools like anonym.legal to be called from within agentic workflows. Configured via project settings or the Replit MCP panel.

Config: Replit MCP settings panel
LLMs: GPT-4o, Claude 3.5

Desktop AI Applications

Standalone AI tools with MCP tool-calling support

Cherry Studio
stdio · HTTP · Free, Open Source

Desktop AI chat app supporting 20+ LLM providers and full MCP integration. Configure MCP servers in the Cherry Studio settings panel. Supports both stdio and HTTP transports. Popular in China for DeepSeek + MCP workflows.

Config: Cherry Studio → Settings → MCP
LLMs: Claude, GPT-4o, DeepSeek, Gemini, Qwen, Doubao, and 20+ more
AnythingLLM
HTTP · Free, Open Source

Self-hosted AI workspace with RAG, document chat, and MCP agent tools. Connect anonym.legal MCP via the Agent Skills section. Supports local Ollama models and cloud APIs. Deployable on-premise for full data control.

Config: AnythingLLM → Agent Skills → MCP
LLMs: Any — OpenAI, Anthropic, Ollama, LM Studio, Mistral
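A config sketch for AnythingLLM's MCP server file. The file name and location (storage/plugins/anythingllm_mcp_servers.json) are assumptions; the Agent Skills UI is the documented configuration path.

```json
{
  "mcpServers": {
    "anonym-legal": {
      "url": "https://anonym.legal/mcp",
      "headers": {
        "Authorization": "Bearer your-api-key-here"
      }
    }
  }
}
```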
Jan.ai
stdio · Free, Open Source

Offline-first AI assistant and local model runner. MCP support added for tool-calling in chat sessions. Configure via the Jan settings panel. Fully local — ideal for air-gapped environments.

Config: Jan → Settings → Extensions → MCP
LLMs: Local models (Llama, Mistral, Phi, Gemma), OpenAI API
Msty
stdio · HTTP · Free

Desktop AI chat app with MCP integration. Supports multiple LLM providers simultaneously. Add anonym.legal as an MCP tool server in the Msty MCP panel. Simple UI-based configuration.

LLMs: Claude, GPT-4o, Gemini, local models
Goose (Block)
stdio · HTTP · Free, Open Source

Open-source autonomous AI agent by Block (formerly Square). Built for agentic workflows with MCP extensions. Configure via ~/.config/goose/config.yaml. Strong support for developer toolchains.

Config: ~/.config/goose/config.yaml
LLMs: Claude 3.5, GPT-4o, Gemini

Cloud & Developer Platforms

SDKs, frameworks, and cloud platforms with MCP support

OpenAI Agents SDK
Streamable HTTP · stdio

OpenAI's official Python SDK for building agents. Native MCP support added March 2025. Connect any MCP server to GPT-4o, GPT-4.5, o1, or o3. Use the MCPServerStreamableHttp or MCPServerStdio classes.

Package: openai-agents
MCP since: March 2025
LLMs: GPT-4o, GPT-4.5, o1, o3, o3-mini
Google Vertex AI / Gemini API
Streamable HTTP

Google's enterprise AI platform and Gemini API support MCP via the Agent Builder and Google ADK (Agent Development Kit). Connect anonym.legal as an MCP tool server to Gemini 2.0 Flash, Gemini 2.5 Pro, or Gemini 1.5 agents.

Platform: Vertex AI Agent Builder, Google ADK
MCP since: February 2025
LLMs: Gemini 2.5 Pro, Gemini 2.0 Flash, Gemini 1.5 Pro
Amazon Q Developer
Streamable HTTP · stdio

AWS's AI-powered coding assistant in the IDE and CLI. Amazon Q Developer added MCP support for custom tool integration. Connect anonym.legal to protect PII in AWS documentation queries and code generation workflows.

Config: ~/.aws/amazonq/mcp.json
LLMs: AWS foundation models (Amazon Nova, Claude on Bedrock)
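A sketch of ~/.aws/amazonq/mcp.json, assuming Amazon Q uses the common mcpServers schema shared by Claude Desktop and Cursor; verify against the Amazon Q Developer documentation.

```json
{
  "mcpServers": {
    "anonym-legal": {
      "command": "npx",
      "args": ["-y", "@anthropic-ai/mcp-server-anonym-legal"],
      "env": {
        "ANONYM_LEGAL_API_KEY": "your-api-key-here"
      }
    }
  }
}
```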
LangChain / LangGraph
Streamable HTTP · stdio

Python/TypeScript framework for building LLM applications and agents. MCP adapter package bridges LangChain tools to any MCP server. Use langchain-mcp-adapters to connect anonym.legal tools to any LangChain agent.

Package: langchain-mcp-adapters
LLMs: Any LangChain-supported model
LlamaIndex
Streamable HTTP

Data framework for LLM applications with native MCP tool integration. LlamaIndex can call anonym.legal MCP tools as part of a RAG or agentic pipeline. The McpToolSpec class handles MCP server connections.

Package: llama-index-tools-mcp
LLMs: Any LlamaIndex-supported model
n8n
Streamable HTTP

Open-source workflow automation platform. n8n AI Agent nodes support MCP tool servers. Add anonym.legal as an MCP Tool node to automatically anonymize PII in documents flowing through n8n workflows.

Node: AI Agent → MCP Tool
LLMs: OpenAI, Anthropic, Google, local via Ollama

LLM Providers

Native MCP support in the API — no bridge layer needed

Native — MCP built into the API/SDK
Bridged — MCP via client application
Claude
Anthropic — claude.ai & Claude API
Native MCP

MCP originated at Anthropic in November 2024. Claude Desktop uses stdio MCP natively. The Claude API (Messages API with tool_use) is the protocol foundation. All MCP tool calls ultimately execute as Claude tool_use content blocks.

Models: Claude 3.5 Sonnet, Claude 3.5 Haiku, Claude 3 Opus
MCP via: Claude Desktop (stdio), any HTTP MCP client
API: tool_use content blocks in Messages API
GPT-4o / GPT-4.5 / o-series
OpenAI — OpenAI Agents SDK
Native MCP

OpenAI added first-class MCP support to the OpenAI Agents SDK in March 2025. Connect anonym.legal directly to GPT-4o, GPT-4.5, o1, or o3 agent workflows. Both stdio and Streamable HTTP transports supported.

Models: GPT-4o, GPT-4o-mini, GPT-4.5, o1, o3, o3-mini
MCP via: openai-agents Python SDK
Class: MCPServerStreamableHttp
Gemini 2.0 / 2.5
Google — Vertex AI & Gemini API
Native MCP

Google added MCP support to Vertex AI Agent Builder and the Google ADK in February 2025. Gemini 2.0 Flash and Gemini 2.5 Pro are the primary models. Connect anonym.legal via Streamable HTTP to Gemini-powered agents.

Models: Gemini 2.5 Pro, Gemini 2.0 Flash, Gemini 1.5 Pro
MCP via: Vertex AI Agent Builder, Google ADK
Also: VS Code Copilot, Cursor, Cline
Llama 3.x / Llama 4
Meta — via Ollama, Continue, Cline, AnythingLLM
Bridged

Meta's open-weight Llama models run locally via Ollama. Continue and Cline support calling MCP servers while using Llama as the backend. AnythingLLM connects local Llama models to MCP tool servers. Full local workflow — no data leaves your machine.

Models: Llama 3.1 70B, Llama 3.3 70B, Llama 4 Scout
Runtime: Ollama, LM Studio, llama.cpp
MCP via: Continue, Cline, AnythingLLM
Mistral / Mixtral
Mistral AI — via Continue, Cherry Studio, Cline
Bridged

Mistral Large 2, Mistral Medium, and the open-weight Mixtral 8x22B support tool calling via clients. Continue supports the Mistral API with MCP servers, and the open-weight models run locally via Ollama. Ideal for EU deployments, since Mistral's servers are EU-based.

Models: Mistral Large 2, Mistral Medium 3, Mixtral 8x22B
MCP via: Continue, Cherry Studio, Cline
DeepSeek
DeepSeek — via Cursor, Cherry Studio, Cline, Continue
Bridged

DeepSeek V3 and DeepSeek R1 are popular open-weight models with strong coding performance. Widely used in Cherry Studio and Cursor with MCP tool support. Anonym.legal is particularly valuable here — DeepSeek servers are outside the EU.

Models: DeepSeek V3, DeepSeek R1, DeepSeek Coder V2
MCP via: Cherry Studio, Cursor, Cline, Continue
Grok
xAI — via Cursor, VS Code, Continue
Bridged

xAI's Grok 2 and Grok 3 are accessible via Cursor and VS Code with custom API endpoints. MCP tools are available when using Grok as the model backend in clients that support it. xAI is working on native MCP support.

Models: Grok 2, Grok 3 Beta
MCP via: Cursor, VS Code (OpenAI-compatible endpoint)
Cohere Command
Cohere — via LangChain MCP adapters
Bridged

Cohere Command R and Command R+ support tool calling via the Cohere API. Connect anonym.legal using the LangChain MCP adapter with Cohere as the LLM backend. Useful for RAG pipelines with PII-bearing documents.

Models: Command R+, Command R
MCP via: LangChain MCP adapters

Transport Protocols

How your client connects to the MCP server — choose based on deployment

stdio — Subprocess IPC
Local only · Zero network exposure

The client launches the MCP server as a child process and communicates with it over standard input/output: JSON-RPC messages are written to stdin and read from stdout. No TCP port, no HTTP, no network.

WHEN TO USE

  • Claude Desktop (only supported transport)
  • Air-gapped environments
  • Local npm/npx servers
  • Maximum security — no network attack surface

USED BY

  • Claude Desktop
  • Continue (local servers)
  • Cline (local servers)
  • Cursor (local servers)
anonym.legal — stdio config (Claude Desktop)
{
  "mcpServers": {
    "anonym-legal": {
      "command": "npx",
      "args": ["@anthropic-ai/mcp-server-anonym-legal"],
      "env": {
        "ANONYM_LEGAL_API_KEY": "your-api-key"
      }
    }
  }
}
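Under the hood, the stdio transport is just JSON-RPC 2.0 messages written one per line to the child process's stdin. A minimal sketch of the framing (illustrative only; the real session begins with an initialize handshake):

```python
import json

def frame_request(req_id, method, params=None):
    # One JSON-RPC 2.0 request per line, as sent over the child's stdin
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg) + "\n"

# Hypothetical tool call; tool and argument names are illustrative
line = frame_request(1, "tools/call", {
    "name": "anonymize_text",
    "arguments": {"text": "John Smith, SSN 123-45-6789"},
})
print(line, end="")
```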
Streamable HTTP — Modern Standard
Remote & Local · MCP 1.0 spec

HTTP POST to a single endpoint with optional Server-Sent Events streaming for long-running operations. Defined in MCP Specification 2025-11-25 as the standard transport. anonym.legal's HTTP endpoint uses this protocol.

WHEN TO USE

  • Cursor, VS Code, Windsurf, Cline
  • Remote/cloud MCP servers
  • Multi-user shared instances
  • All non-Claude-Desktop clients

ENDPOINT

https://anonym.legal/mcp

Auth via Authorization header
Bearer token or API key

anonym.legal — HTTP config (Cursor / VS Code)
{
  "mcpServers": {
    "anonym-legal": {
      "url": "https://anonym.legal/mcp",
      "headers": {
        "Authorization": "Bearer your-api-key"
      }
    }
  }
}
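At the wire level, Streamable HTTP is a single POST of a JSON-RPC message to the endpoint, with an Accept header covering both JSON and SSE responses. A stdlib-only sketch that builds (but does not send) such a request:

```python
import json
import urllib.request

body = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}).encode()

req = urllib.request.Request(
    "https://anonym.legal/mcp",
    data=body,
    headers={
        "Content-Type": "application/json",
        # Streamable HTTP responses may be plain JSON or an SSE stream
        "Accept": "application/json, text/event-stream",
        "Authorization": "Bearer your-api-key-here",
    },
    method="POST",
)
print(req.get_method(), req.full_url)  # → POST https://anonym.legal/mcp
```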
SSE — Server-Sent Events (Legacy)
Deprecated in MCP 1.0

The original HTTP transport from MCP's early spec. Used two endpoints: /sse (server-to-client stream) and /messages (client-to-server POST). Replaced by Streamable HTTP in the MCP 1.0 spec (November 2025). Still supported by many clients for backwards compatibility but not recommended for new deployments. anonym.legal supports Streamable HTTP only.

MCP 1.0 Protocol Spec

Model Context Protocol specification — 2025-11-25 release. What anonym.legal implements.

Tools

Functions the LLM can call. anonym.legal exposes 7 tools: analyze_text, anonymize_text, detokenize_text, get_balance, estimate_cost, list_sessions, delete_session. Each tool has a JSON Schema definition for its parameters.

anonym.legal: ✓ 7 tools implemented
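As an illustration, here is how a tool definition might appear in a tools/list response. The inputSchema field is part of the MCP spec; the exact parameters of anonymize_text shown here are assumptions, so consult the MCP reference for the real schema.

```json
{
  "name": "anonymize_text",
  "description": "Detect and anonymize PII in the given text",
  "inputSchema": {
    "type": "object",
    "properties": {
      "text": { "type": "string" },
      "persistence": { "type": "string", "enum": ["session", "persistent"] }
    },
    "required": ["text"]
  }
}
```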
Resources

URI-addressed data sources the LLM can read (files, database rows, API responses). Clients like Claude Desktop can subscribe to resource changes. Not currently used by anonym.legal — data flows through tool calls.

anonym.legal: — not used
Prompts

Reusable prompt templates that clients can list and insert. Servers expose named prompt templates with arguments. anonym.legal exposes a default anonymization workflow prompt that configures the AI to automatically call the correct tools.

anonym.legal: ✓ Default workflow prompt
Sampling

MCP server requests the client to run an LLM inference. Enables server-side agentic loops where the MCP server itself can ask the AI to generate text. Requires explicit client opt-in. Not currently used by anonym.legal.

anonym.legal: — not used
Roots

File system scope boundaries. Clients tell servers which directories are accessible. Servers that read files (like file system MCP servers) use Roots to stay within safe bounds. Not applicable to anonym.legal.

anonym.legal: — not applicable
Elicitation

New in MCP 1.0 — allows the server to request user input mid-call via the client UI. The server can ask for missing parameters, confirmations, or clarifications without aborting the tool call. Currently experimental.

anonym.legal: ∿ Future consideration

PROTOCOL VERSION

The current MCP specification version is 2025-11-25, released November 25, 2025 as MCP 1.0. anonym.legal MCP Server is v2.2.0 and implements the full Tools capability set with Streamable HTTP transport and stdio transport. The protocol is maintained at spec.modelcontextprotocol.io.

Configuration Examples

Ready-to-use config snippets for every major client

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows). Restart Claude Desktop after saving.

claude_desktop_config.json
{
  "mcpServers": {
    "anonym-legal": {
      "command": "npx",
      "args": [
        "-y",
        "@anthropic-ai/mcp-server-anonym-legal"
      ],
      "env": {
        "ANONYM_LEGAL_API_KEY": "your-api-key-here"
      }
    }
  }
}

Transport: stdio (subprocess). The -y flag auto-confirms the npx install. Replace your-api-key-here with your API key from the anonym.legal dashboard.

Create or edit .cursor/mcp.json in your project root (project-level) or in ~/.cursor/mcp.json (global).

.cursor/mcp.json — HTTP (recommended)
{
  "mcpServers": {
    "anonym-legal": {
      "url": "https://anonym.legal/mcp",
      "headers": {
        "Authorization": "Bearer your-api-key-here"
      }
    }
  }
}
.cursor/mcp.json — stdio (local)
{
  "mcpServers": {
    "anonym-legal": {
      "command": "npx",
      "args": ["-y", "@anthropic-ai/mcp-server-anonym-legal"],
      "env": {
        "ANONYM_LEGAL_API_KEY": "your-api-key-here"
      }
    }
  }
}

Create .vscode/mcp.json in your workspace. Requires VS Code 1.99+ with GitHub Copilot Chat and Agent mode enabled.

.vscode/mcp.json
{
  "servers": {
    "anonym-legal": {
      "type": "http",
      "url": "https://anonym.legal/mcp",
      "headers": {
        "Authorization": "Bearer your-api-key-here"
      }
    }
  }
}

Open Copilot Chat → Agent mode → the anonym-legal tools will appear in the tool picker. VS Code uses the key "servers" (not "mcpServers").

Edit ~/.continue/config.json. Add to the mcpServers array.

~/.continue/config.json — add to mcpServers array
{
  "mcpServers": [
    {
      "name": "anonym-legal",
      "transport": {
        "type": "streamable-http",
        "url": "https://anonym.legal/mcp",
        "requestOptions": {
          "headers": {
            "Authorization": "Bearer your-api-key-here"
          }
        }
      }
    }
  ]
}

Cline stores MCP config in VS Code settings. Open Cline → MCP Servers tab → Add Server → paste this config.

Cline MCP config
{
  "anonym-legal": {
    "url": "https://anonym.legal/mcp",
    "disabled": false,
    "transportType": "streamable-http",
    "headers": {
      "Authorization": "Bearer your-api-key-here"
    }
  }
}

After saving, Cline will auto-discover all 7 anonym.legal tools. They appear in the tools list and can be called from any Cline conversation.

Using the OpenAI Agents SDK (Python). Install: pip install openai-agents

Python — OpenAI Agents SDK + anonym.legal MCP
from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp
import asyncio

async def main():
    # Connection parameters are passed as a single "params" dict
    async with MCPServerStreamableHttp(
        name="anonym-legal",
        params={
            "url": "https://anonym.legal/mcp",
            "headers": {"Authorization": "Bearer your-api-key-here"},
        },
    ) as mcp_server:
        agent = Agent(
            name="Privacy Agent",
            model="gpt-4o",
            instructions="Use anonym_legal_anonymize_text before sending any document to external services.",
            mcp_servers=[mcp_server],
        )
        result = await Runner.run(
            agent,
            "Anonymize this report before archiving: John Smith (SSN 123-45-6789)..."
        )
        print(result.final_output)

asyncio.run(main())

Compatibility Matrix

Quick reference — transport support and key features per client

Client               stdio   HTTP   Free tier
Claude Desktop         ✓      —     ✓
Cursor                 ✓      ✓     limited
VS Code (Copilot)      ✓      ✓     Copilot required
Windsurf               ✓      ✓     limited
Continue               ✓      ✓     ✓
Cline                  ✓      ✓     ✓
Zed                    ✓      ✓     partial
JetBrains (ACP)        ✓      ✓     AI Pro required
Cherry Studio          ✓      ✓     ✓
AnythingLLM            —      ✓     ✓
OpenAI Agents SDK      ✓      ✓     code-only
LangChain              ✓      ✓     code-only
n8n                    —      ✓     self-hosted

Free tier = client has a free usage tier; text notes mark restrictions or requirements.

Integration Patterns

How different teams connect anonym.legal to their AI stack

Legal & Compliance

Document Review with Claude Desktop

Lawyers drop contracts, depositions, and client files into Claude Desktop. anonym.legal MCP intercepts every prompt: client names are hashed to tokens, SSNs are redacted, financial data is masked. Claude analyzes the structure; the law firm's data stays protected.

Stack: Claude Desktop + stdio
Operators: hash (names), redact (SSN), encrypt (account numbers)
Software Development

Code Review with Real Data in Cursor

Developers paste production logs, database dumps, or API responses into Cursor for debugging. anonym.legal removes customer emails, IPs, and auth tokens before Cursor sends the context to any LLM — GPT-4o, Claude, or Gemini.

Stack: Cursor + HTTP
Operators: redact (API keys), replace (emails), keep (IPs for debugging)
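The per-entity operator choices above might translate into tool arguments like the following (a hypothetical shape: the entity-type and operator field names are illustrative, not the documented API):

```json
{
  "text": "Customer jane@example.com hit a 500 from 10.0.0.12 using key sk-abc123",
  "operators": {
    "EMAIL_ADDRESS": "replace",
    "API_KEY": "redact",
    "IP_ADDRESS": "keep"
  }
}
```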
Healthcare / HIPAA

Patient Data in AI Workflows

Healthcare teams use Continue or Cline to summarize clinical notes and discharge letters. anonym.legal's HEALTHCARE entity group detects names, dates-of-birth, diagnoses, and medical record numbers — all tokenized before any LLM sees the text.

Stack: Continue / Cline + HTTP
Entity group: HEALTHCARE + UNIVERSAL
Operators: replace (person), redact (MRN)
Data Engineering

LLM Pipelines on Production Data

Data teams run LlamaIndex or LangChain pipelines over real customer datasets. anonym.legal MCP is called programmatically via the OpenAI Agents SDK or LangChain adapter — anonymize before the LLM, detokenize after.

Stack: LangChain / LlamaIndex + HTTP
Entity group: FINANCIAL + UNIVERSAL
E2E mode: server never stores mapping
Enterprise Automation

n8n Workflow Automation

Operations teams route customer tickets, emails, and CRM data through n8n AI workflows. The anonym.legal MCP Tool node sits before every LLM call — customer names, phone numbers, and account data are tokenized in transit.

Stack: n8n AI Agent + HTTP
Entity group: UNIVERSAL + CORPORATE
Operators: replace (person), mask (phone)
Local / Offline

Llama + AnythingLLM On-Premise

Teams that run Llama or Mistral locally via Ollama + AnythingLLM can still protect PII before sending text to the local model. The anonym.legal API call goes to EU servers; the anonymized text stays local. Hybrid: cloud anonymization, local inference.

Stack: AnythingLLM + HTTP + Ollama (local LLM)
LLMs: Llama 3.3, Mistral, Phi-3

Security Checklist

Deploy anonym.legal MCP safely in regulated environments

✓ API Key Management

  • Store API key in environment variable, not in config file
  • Never commit claude_desktop_config.json to git
  • Add .cursor/mcp.json to .gitignore if it contains keys
  • Use project-level configs with placeholders; inject keys at runtime
  • Rotate API keys regularly via anonym.legal dashboard
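For VS Code, the inputs mechanism keeps the key out of the committed file: the user is prompted once and the value is referenced by placeholder. A sketch (see the VS Code MCP docs for the exact input options):

```json
{
  "inputs": [
    {
      "id": "anonym-api-key",
      "type": "promptString",
      "description": "anonym.legal API key",
      "password": true
    }
  ],
  "servers": {
    "anonym-legal": {
      "type": "http",
      "url": "https://anonym.legal/mcp",
      "headers": {
        "Authorization": "Bearer ${input:anonym-api-key}"
      }
    }
  }
}
```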

✓ Transport Security

  • HTTP transport: always HTTPS — never plain HTTP
  • Verify endpoint: https://anonym.legal/mcp (TLS 1.3)
  • stdio transport: process isolated, no network attack surface
  • E2E mode: use when the server must not store token mappings

✓ Session & Data Retention

  • Use persistence: "session" (24h) for short workflows
  • Use persistence: "persistent" (30d) only when detokenization is needed later
  • Call anonym_legal_delete_session after completing a task (GDPR Art. 17)
  • E2E mode: session token mapping never stored server-side

✓ Regulated Industries

  • Infrastructure: ISO 27001:2022 (Hetzner, Germany)
  • All data processing within EU (GDPR compliant)
  • Zero-knowledge auth option available
  • HIPAA: use HEALTHCARE entity group + redact/replace operators
  • PCI-DSS: use FINANCIAL group + encrypt or mask for card data
  • GDPR: use delete_session for right-to-erasure workflows

IMPORTANT NOTE ON CLIENT TRUST

anonym.legal MCP intercepts PII before the text reaches the AI model. However, the AI tool (Claude Desktop, Cursor, etc.) still receives the anonymized text and the AI's response. Your data is protected from the LLM provider — it is not protected from the AI tool client itself. Choose AI clients that comply with your organization's data processing agreements.

Connect Your AI Tool Now

One API key. One endpoint. Works with every MCP client listed here.

Get API Key MCP Reference Full Docs