Introduction to MCP and Its Growing Relevance
Model Context Protocol (MCP) is an open standard that provides a unified way for AI models (like large language models, or LLMs) to interact with external tools and data. In essence, MCP standardizes how AI tool interactions work (supergateway – npm). You can think of it as a kind of “USB-C port” for AI applications – just as USB-C offers a common interface for connecting all sorts of devices, MCP offers a common interface for connecting AI models to various data sources and tools (Introduction – Model Context Protocol). This means developers can build a tool once, and any AI that speaks MCP can use it.
MCP is rapidly gaining traction in the AI community. Notably, Anthropic – the company behind the Claude AI assistant – has embraced MCP in its workflows. For example, Claude Desktop (a local Claude AI interface) is designed as an MCP-compatible host, meaning Claude can plug into MCP tools out-of-the-box (Introduction – Model Context Protocol). More and more LLM-driven applications and frameworks are integrating MCP to let AI agents fetch information, perform calculations, or execute actions in a standardized way. This growing adoption makes MCP an important technology for anyone looking to extend AI capabilities with external services.
How does MCP work? Under the hood, MCP uses JSON-RPC 2.0 as the message format for communication between components (Specification – Model Context Protocol). But you don’t need to know the low-level details to use it in n8n. What’s important is understanding the three core roles in the MCP architecture: Host, Client, and Server.
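To make the wire format concrete, here is a minimal sketch of the JSON-RPC 2.0 envelope used for an MCP tool call. The "tools/call" method and the { name, arguments } parameter shape follow the MCP specification; the tool name and arguments are this tutorial's example values.

```javascript
// A minimal sketch of the JSON-RPC 2.0 envelope for an MCP tool call.
// "tools/call" and { name, arguments } follow the MCP specification;
// the tool name and argument values are illustrative.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "calculator",
    arguments: { a: 5, b: 7 },
  },
};

// A success response echoes the same id and carries the result payload:
const response = {
  jsonrpc: "2.0",
  id: 1,
  result: { content: [{ type: "text", text: "12" }] },
};

console.log(JSON.stringify(request));
```

Every request carries an `id`, and the matching response repeats it – that is how a client pairs answers with questions over a stream.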
MCP Basics: Host, Client, and Server
MCP follows a client-server model with clear responsibility for each role in an AI-tool integration (Introduction – Model Context Protocol):
- MCP Host: This is the application or AI agent that wants to use external tools. The host initiates the connection. Example: Claude Desktop, an AI IDE, or any chat app with an AI assistant can act as an MCP Host (Introduction – Model Context Protocol). The host is essentially the “brain” (the LLM) that says, “I need to calculate something, or retrieve some data – who can help me?”
- MCP Client: This is the connector that lives with the host and manages the communication with a tool server. It maintains a 1:1 connection to an MCP server on behalf of the host (Introduction – Model Context Protocol). You can think of the client as the host’s phone line to call the outside tool. In practice, the client handles sending requests (from the AI) to the server and receiving responses.
- MCP Server: This is the external tool or service that provides some capability, exposed in a standardized way. It’s a lightweight program (or workflow) that knows how to do one or more specific things – e.g. perform calculations, access a database, control an API – and it communicates via MCP (Introduction – Model Context Protocol). The server is like a service provider waiting for requests. In our context, an n8n workflow can act as an MCP Server, exposing any of n8n’s automation tasks to the AI.
In summary, the Host (AI) uses an MCP Client to talk to an MCP Server (tool). Next, we’ll dive into how you can use n8n, the popular automation tool, to set up both an MCP Server and an MCP Client. This will allow you to connect AI agents (like Claude) with virtually any automation workflow in n8n.
Part 1: Setting Up an MCP Server Trigger in n8n
In this first part, we’ll create a simple MCP Server in n8n. This server will act as a calculator tool that any MCP-compatible AI (our MCP Host, Claude) can use. By the end of Part 1, you’ll have an active n8n workflow exposing a “calculator” function to MCP hosts.
Creating an MCP Server Workflow in n8n
Let’s walk through building the workflow step by step:
- Create a New Workflow and Add MCP Server Trigger: In your n8n editor, create a new workflow. From the list of trigger nodes, add the MCP Server Trigger node. This special trigger node turns your workflow into an MCP-accessible service. You can give the MCP server a recognizable name (for example, “Calculator”) in the node’s settings. This name or ID will help identify the tool to the AI host.
- Implement the Tool’s Logic (Calculator): Next, add the nodes that will perform the calculator operation. For simplicity, you can use a Function node (JavaScript code) or a Set node to compute a result. Suppose we want our tool to add two numbers:
- Connect the MCP Server Trigger node output to a Function node.
- In the Function node, write code to extract the incoming parameters (e.g. two numbers) and return the sum. For example, in pseudocode:
```javascript
// Example: assume the MCP request includes { "a": 5, "b": 7 }
const a = $input.item.json.a;
const b = $input.item.json.b;
return { json: { result: a + b } };
```

In a real scenario, the structure of `$input.item.json` depends on how the AI calls the tool (the MCP trigger will provide any parameters sent by the host). The key idea is to process those inputs and produce an output. Our calculator simply adds two numbers and returns the result in a field (here we call it `result`).
- Automatically Return the Result: When the workflow finishes executing, the MCP Server Trigger will send the output back to the requesting client (the AI). In n8n, the output of the last node in the trigger path is typically what gets returned. In our calculator example, the Function node’s output (with the `result` field) will be sent as the response to the AI. You don’t need a separate “respond” node – the MCP Trigger handles the response for you, similar to how n8n’s Webhook trigger works.
- Save and Activate the Workflow: Save your workflow and activate it. Activation is important – it tells n8n to start listening for incoming MCP connections on this workflow. Once activated, your n8n instance is effectively hosting a live MCP server named “Calculator” (or whatever name you chose).

How do external clients reach this MCP server? When the workflow is active, n8n will have dedicated endpoints for MCP communication (one for receiving requests and one for sending events, following the MCP spec). If you’re running n8n locally on default settings (say, at `http://localhost:5678/`), the MCP Server Trigger might be listening at a URL derived from your workflow or node ID. For example, n8n might expose an SSE endpoint like:

`http://localhost:5678/mcp/<unique-server-id>/sse`

and a corresponding endpoint for messages:

`http://localhost:5678/mcp/<unique-server-id>/message`

(The exact URL or ID will be shown in the node’s UI or logs when activated.) These URLs adhere to MCP’s requirement for an event stream (Server-Sent Events for receiving responses/updates) and a message endpoint for requests. Keep note of the base URL (up to the unique ID) – we’ll need it to connect Claude to this tool.
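If you script your setup, the endpoint pattern just described can be derived with a tiny helper. This is a sketch: the `/sse` and `/message` suffixes follow the pattern above, the helper name is ours, and the server id in the example is a placeholder value.

```javascript
// Derive the two MCP endpoints from the base URL shown by the MCP Server
// Trigger node (a sketch; suffixes match the pattern described above).
function mcpEndpoints(baseUrl) {
  const base = baseUrl.replace(/\/+$/, ""); // tolerate a trailing slash
  return {
    sse: `${base}/sse`,         // event stream the client subscribes to
    message: `${base}/message`, // endpoint the client sends requests to
  };
}

console.log(mcpEndpoints("http://localhost:5678/mcp/abc123"));
// → an object with endpoints ending in /sse and /message
```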
Connecting Claude Desktop to the n8n MCP Server
Now that your n8n workflow is running an MCP server, the next step is to enable an AI agent (Claude) to use it. Claude Desktop, acting as an MCP Host, needs to know how to connect to our new “Calculator” service. We’ll configure Claude Desktop using SuperGateway, a utility that helps bridge Claude to MCP servers.
What is SuperGateway? It’s a command-line tool that can connect Claude (which expects tools via a standard input/output interface) to an MCP server over HTTP (SSE/WS) (supergateway – npm). In our case, Claude Desktop will use SuperGateway to talk to the n8n service.
Claude Desktop typically has a JSON configuration file (often named something like `config.json` or similar) where you can declare available MCP servers. We will add an entry for our n8n Calculator. Here’s a sample snippet to include in Claude Desktop’s config under the `mcpServers` section:
```json
{
  "mcpServers": {
    "n8nCalculator": {
      "command": "npx",
      "args": [
        "-y",
        "supergateway",
        "--sse",
        "http://localhost:5678/mcp/<your-server-id>"
      ]
    }
  }
}
```
Let’s break down what this does:
- `n8nCalculator` is a key we chose as the identifier for this server (you can name it anything descriptive). Claude will use this name to refer to the tool.
- We instruct Claude Desktop to run a command (`npx -y supergateway ...`) to connect to the MCP server:
  - `npx -y supergateway` launches SuperGateway without needing a separate install (the `-y` auto-confirms installing the package if not present).
  - `--sse "http://localhost:5678/mcp/<your-server-id>"` tells SuperGateway to connect to our n8n MCP Server via its SSE interface. Use the actual URL or IP for your n8n instance. If you’re running Claude Desktop on the same machine as n8n, `localhost:5678` works. Replace `<your-server-id>` with the identifier from your MCP trigger (as noted when you activated the workflow). This is essentially the base URL of your MCP server; SuperGateway will automatically append the required `/sse` and `/message` paths for the protocol.
Once this configuration is in place, restart Claude Desktop (or reload its config). Claude should now recognize a new tool named “n8nCalculator” (or the name you gave) as an available MCP server. In a conversation with Claude, you could now prompt it to use the “calculator” tool when needed (depending on how Claude’s agent reasoning is set up, it might automatically call the tool for math tasks or you might invoke it via a command). Claude Desktop, through SuperGateway, will spawn the connection to your n8n workflow when the tool is used.
Troubleshooting Tip: If Claude Desktop doesn’t seem to connect to the n8n MCP server, or you encounter an error (for example, on macOS you might get a permission error when it tries to run the `npx` command), you can debug by running the SuperGateway command manually in a terminal. For instance, open a terminal and run:

```shell
npx -y supergateway --sse "http://localhost:5678/mcp/<your-server-id>"
```

Watch the output for any errors or logs. This can reveal issues such as the MCP URL being incorrect, the `supergateway` package failing to install, or OS permission denials. If it connects successfully, you’ll see logs indicating it’s listening on a port or connected to the SSE stream. Once the manual test is working, you can try launching Claude Desktop again. Often, running it once manually (especially the first time, to install the package) resolves any initial setup quirks. Also ensure that your n8n instance is accessible to Claude (if they’re on different machines, you might need to adjust the URL to use an IP or hostname that Claude’s machine can reach, and open any necessary ports).
At this point, you have a fully functioning MCP Server in n8n and an AI agent (Claude) configured to use it. In a real use case, Claude can now call your calculator tool whenever it needs to perform a calculation as part of a conversation or task. Next, we’ll explore the reverse scenario: using n8n as an MCP client to call tools (which is useful for testing and chaining AI actions in workflows).
Part 2: Using the MCP Client Node in n8n
Thus far, we created a tool service that an AI can call. Now, in Part 2, we’ll simulate the AI agent within n8n itself using the MCP Client node. The MCP Client node lets n8n act as the caller – it’s like an AI agent node that can connect to any MCP server. We’ll use it to call our own “Calculator” service to verify everything end-to-end. This is also a great way to integrate AI-driven actions in your workflows: n8n could orchestrate when to invoke an AI tool via MCP.
Building a Workflow to Call the MCP Server (Acting as the AI)
Let’s set up a second workflow in n8n that will play the role of the AI agent calling our calculator:
- Create a New Workflow (MCP Client): In n8n, create another new workflow. Add a Manual Trigger node (or any trigger of your choice) as the start. This will allow us to execute the workflow on demand for testing.
- Add the MCP Client Node: From the nodes panel, add an MCP Client node to the workflow. This node is going to initiate a connection to an MCP Server (our calculator). Place it after the trigger. In essence, this MCP Client node will act like Claude (the host’s client) within our workflow.
- Configure the MCP Client Connection: In the MCP Client node’s settings, you’ll need to specify how to connect to the target server:
  - Provide the MCP Server URL/Host. This is the address of the MCP server we set up in Part 1. If your n8n is on the same instance, you can use the same base URL we discussed earlier (e.g. `http://localhost:5678/mcp/<your-server-id>`). The node likely has fields for the endpoint; choose the appropriate protocol (Server-Sent Events or WebSocket). For simplicity, select SSE (Server-Sent Events) if that’s an option, since our config in Claude used SSE as well. Typically, you might enter the base URL and the node will use the standard `/sse` and `/message` routes by default.
  - If the node supports it, you may also need to provide an identifier or name of the tool to call on that server. In many cases, MCP servers can have multiple methods. Our server is simple (essentially one function: add numbers). Some MCP client implementations first do a “discover/list tools” step. In n8n’s MCP Client node, there may be an option to fetch available actions. You can try using an operation like “List Tools” to see if the client can retrieve what the server offers. However, if you know what to call, you can proceed to the next step.
- Call the Calculator Tool via MCP: Now configure the MCP Client node to actually perform an action. We want it to send a request to our calculator for a sample calculation:
  - Specify the method or action to invoke. If you named the action in your MCP server, use that. For instance, perhaps our MCP server exposes a method named `"add"` or `"calculate"`. (If unsure, it might be something generic if not explicitly set. For our example, let’s say it’s `"calculate"`.)
  - Provide the parameters that the calculator expects. In our earlier example, we anticipated two inputs (a and b). The MCP Client node might let you define these as JSON or as separate fields. For example, set `a = 5` and `b = 7` as the inputs for the calculation. This mirrors what an AI like Claude might send when it wants to add 5 and 7.
- Execute the Workflow and Verify the Result: Run the MCP Client workflow (since we used a Manual Trigger, just click “Execute Workflow” in n8n). What should happen:
- The MCP Client node will initiate a connection to the MCP server (if not already connected) and send the “calculate” request with the parameters.
- Behind the scenes, this will trigger our first workflow (the MCP Server workflow). You should see that workflow execute (the MCP Server Trigger will fire, and the Function node will run to add 5 + 7).
- The result (12 in this case) will be sent back through MCP to the client node.
- The MCP Client node in the second workflow will receive the response and output it.
In n8n’s interface, once the execution finishes, click on the MCP Client node to inspect its output. You should see the data returned from the server – likely a JSON containing the result, for example:

```json
{ "result": 12 }
```

(plus possibly some metadata). This confirms that the MCP client successfully called the MCP server and got the correct response. We have essentially mimicked Claude asking the calculator tool for an answer, entirely within n8n.
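The whole exchange can be mimicked in plain code – a toy version of the two workflows with the transport stripped out. The method name `calculate` and the parameter names are this tutorial’s examples, not anything MCP mandates.

```javascript
// Toy round trip: serverHandle stands in for the MCP Server workflow's
// calculator logic, clientCall for the MCP Client node. In the real setup
// the call travels as a JSON-RPC message over SSE/HTTP.
function serverHandle(params) {
  if (typeof params.a !== "number" || typeof params.b !== "number") {
    return { error: "Expected numeric parameters 'a' and 'b'" };
  }
  return { result: params.a + params.b };
}

function clientCall(method, params) {
  if (method !== "calculate") throw new Error(`Unknown method: ${method}`);
  return serverHandle(params);
}

console.log(clientCall("calculate", { a: 5, b: 7 })); // { result: 12 }
```

The error field in the “server” is a design choice worth copying into your real workflow: returning a structured error lets the calling AI see what went wrong instead of silently failing.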
Tip: You can also check the execution of the first workflow to see the interaction. If you have both workflows open side by side (in two browser tabs), you’ll notice the MCP Server workflow executed when the client made the request. This real-time coupling shows the power of MCP in action: it linked two workflows as if an AI were making a tool request.
Extending This Setup
While our example is a simple calculator, this pattern can be extended enormously. You could create all sorts of MCP servers in n8n – for example, a workflow that looks up information from a database, sends an email, interacts with a third-party API, or controls IoT devices – and then allow an AI agent to call those functions. Conversely, using the MCP Client node, n8n can interface with any external MCP-compliant tool or data source out there (there’s a growing ecosystem of MCP servers, from web browsers to cloud services). Essentially, n8n can become both a provider and consumer of AI-driven capabilities.
Conclusion: Unlocking AI-Powered Workflows in n8n
By leveraging the new MCP Server and MCP Client nodes in n8n, we’ve shown how you can bridge AI agents with automation workflows. In Part 1, we turned an n8n workflow into an MCP-compatible tool that an AI (Claude) can use. In Part 2, we flipped the script and let n8n act as the AI agent calling an MCP tool. This two-way integration is incredibly powerful:
- For n8n users: You can now expose any of your n8n workflows as tools for AI. Your automations (whether it’s a complex data processing task or an action like creating a report) can be invoked by AI assistants in a controlled, structured manner. Imagine asking Claude to “trigger the daily report workflow” or to use a custom n8n tool to get info from your internal API – all possible with MCP and n8n.
- For AI enthusiasts: You can greatly expand what your AI assistant can do by hooking it into n8n. Instead of being limited to what the AI model knows or a fixed set of plugins, you can build your own MCP servers in n8n to give the AI new superpowers (as an example, any integration that exists in n8n’s palette can be made accessible to the AI). And because MCP is a standard, the same n8n tool could be used not just by Claude but potentially by other AI platforms that adopt MCP in the future.
Before you jump in, make sure your n8n instance is up-to-date to have these MCP nodes available. The MCP integration is relatively new, and improvements are likely coming as the community explores its possibilities. As you experiment, keep security in mind – only expose tools that you’re comfortable letting the AI use, and consider permission controls if needed (MCP is designed to be secure and work within your infrastructure (Introduction – Model Context Protocol), but it’s wise to stay mindful of what actions you allow).
Try it out! Build a simple MCP server in n8n, connect Claude or another agent, and see the magic happen. This tutorial’s calculator example is just a starting point. You could build MCP tools for anything – from a Gmail-sending tool to a database query tool – all within n8n’s no-code/low-code environment. With MCP, we’re entering an era where AI agents can interface with software and services as easily as a web browser accesses websites (supergateway – npm). The combination of n8n and MCP opens the door to endless creative automations.
In summary, n8n’s new MCP nodes allow a seamless connection between AI and automation. Whether you’re letting an AI control your n8n workflows or enabling n8n to use AI-driven tools, the integration is straightforward and powerful. Update n8n, give these features a spin, and start building your own AI-connected workflows. The ability to have AI agents interact with virtually any service (through n8n) in a standardized way is a huge step forward – and you’re now on the cutting edge of it. Happy automating (and happy conversing with your new AI-augmented workflows)!