
n8n-MCP

n8n-MCP is a Model Context Protocol (MCP) server that gives AI assistants structured access to n8n node documentation, properties, and operations. It enables AI-driven workflow building, validation, and template discovery for n8n automation nodes, with multiple deployment options (hosted, Docker, npx, self-host).

Introduction

n8n-MCP — Model Context Protocol server for n8n

n8n-MCP implements a Model Context Protocol (MCP) server that provides AI assistants with deep, structured access to n8n's workflow automation ecosystem. Its goal is to let AI systems (Claude Desktop / Claude Code / Windsurf / Cursor and others) query and use n8n node metadata, documentation, example configurations, and templates so they can reliably design, validate, and manage n8n workflows.

Core capabilities
  • Provides structured information for 1,084 n8n nodes (537 core + 547 community).
  • Node metadata coverage: property schemas for ~99% of nodes, with documented operations covering ~63.6% of actions.
  • Documentation coverage: ~87% coverage of official n8n docs including AI nodes.
  • Template & example support: 2,709 workflow templates indexed, 2,646 pre-extracted real-world configurations.
  • AI tools detection: 265 AI-capable tool variants detected and documented.
  • Fast responses via an optimized SQLite-backed database (average query time ~12ms in the project metrics).

Deployment & usage

n8n-MCP supports multiple deployment options so you can run it as a hosted service or self-host:

  • Hosted service (dashboard.n8n-mcp.com) — free tier with limited daily tool calls for quick evaluation.
  • npx (local quickstart) — run npx n8n-mcp for immediate local usage (recommended for quick testing).
  • Docker image — lightweight container (ghcr.io/czlonkowski/n8n-mcp:latest), smaller than typical n8n images because it contains no n8n runtime dependencies.
  • Self-host/development — clone the repo and run with Node.js, build steps and database initialization provided.
  • Railway one-click deploy and docker-compose examples are included for cloud deployment.
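
The self-hosting options above can be combined; for example, a minimal docker-compose sketch for running the container in HTTP mode might look like the following. The image name comes from the project README, but the environment variable names (MCP_MODE, AUTH_TOKEN) and the port are assumptions to verify against the repository's docker-compose examples:

```yaml
# Minimal docker-compose sketch for self-hosting n8n-MCP over HTTP.
# MCP_MODE, AUTH_TOKEN, and the port mapping are illustrative assumptions;
# check the repository's own docker-compose examples for the exact settings.
services:
  n8n-mcp:
    image: ghcr.io/czlonkowski/n8n-mcp:latest
    environment:
      MCP_MODE: http          # serve MCP over HTTP instead of stdio
      AUTH_TOKEN: change-me   # token MCP clients must present
    ports:
      - "3000:3000"
```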

The project includes clear examples and configuration snippets for integrating with AI clients (Claude Desktop config examples are provided) and optional n8n API credentials to enable workflow management (create/update/execute) from the MCP server.
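For stdio-based clients, the configuration typically follows the standard `mcpServers` shape used by Claude Desktop. The sketch below assumes the npx quickstart; the exact environment variable names (`N8N_API_URL`, `N8N_API_KEY`) should be checked against the README, and the URL and key shown are placeholders:

```json
{
  "mcpServers": {
    "n8n-mcp": {
      "command": "npx",
      "args": ["n8n-mcp"],
      "env": {
        "N8N_API_URL": "https://your-n8n-instance.example.com",
        "N8N_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

The `env` block is optional; without it the server still answers documentation and validation queries, but the workflow management tools remain disabled.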

Safety & best practices

n8n-MCP explicitly warns: NEVER edit production workflows directly with AI. Recommended precautions include:

  • Always copy workflows before AI edits.
  • Test changes in development environments first.
  • Export backups and validate changes before deployment.

These safety rules are emphasized in the README and in the provided Claude project instructions.

Integration & tooling

n8n-MCP exposes a set of MCP tools (search_nodes, get_node, validate_node, search_templates, validate_workflow, plus n8n management tools when N8N_API_URL and API key are configured). It provides validation profiles (minimal, runtime, ai-friendly, strict), node/property search, human-readable docs mode, template search modes (by_nodes, by_task, by_metadata), and workflow deployment helpers (n8n_create_workflow, n8n_update_partial_workflow, etc.).
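
Under the hood, MCP tools are invoked via JSON-RPC 2.0 `tools/call` requests. The sketch below shows the message shape an MCP client would send to one of the tools named above; the tool name `search_nodes` comes from the list, but the argument names (`query`, `limit`) are illustrative assumptions rather than the tool's exact schema:

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 'tools/call' request per the MCP specification."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Example: search n8n nodes by keyword (argument names are assumptions).
request = make_tool_call("search_nodes", {"query": "http request", "limit": 5})
print(json.dumps(request, indent=2))
```

The server replies with a `result` object containing the tool's output; a transport layer (stdio or HTTP, depending on deployment) carries these messages between the AI client and n8n-MCP.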

Project metrics & quality
  • Comprehensive test suite (thousands of tests reported in repository metrics).
  • Database size and query performance are optimized; two database adapters are supported (better-sqlite3 by default in Docker, with sql.js as a fallback).

Who maintains it

The project repo is maintained by the GitHub user czlonkowski. It is MIT licensed, open-source, and includes contribution and sponsor information in the README.

When to use n8n-MCP

Use n8n-MCP when you want an AI assistant to: discover n8n templates, inspect node documentation and properties, validate node configurations before deployment, or assist in building/iterating n8n workflows with stronger correctness guarantees derived from structured node metadata.

Information

  • Website: github.com
  • Author: czlonkowski
  • Published: 2025/06/07
