A new open-source tool called n8n-to-claw lets developers convert their n8n workflow JSON files into OpenClaw-compatible agent skills with a single CLI command. The tool, hosted on GitHub under just-claw-it, uses an LLM to handle the transpilation pipeline: parsing n8n workflow structure and generating both SKILL.md descriptors and skill.ts implementations.

How the Conversion Works

The tool runs a four-stage pipeline: input parsing (normalizing n8n JSON to an intermediate representation), LLM transpilation (generating SKILL.md + skill.ts from the IR), TypeScript validation, and packaging to disk. Developers point the CLI at a local workflow.json file or fetch directly from a running n8n instance using --n8n-url and --workflow-id flags. The LLM handles the heavy lifting of translating node logic into TypeScript.
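To make the first stage concrete, here is a minimal sketch of what normalizing n8n JSON into an intermediate representation could look like. The type and function names (`N8nWorkflow`, `WorkflowIr`, `normalizeWorkflow`) are illustrative assumptions, not the tool's actual internals:

```typescript
// Hypothetical sketch of stage 1 (input parsing): raw n8n workflow JSON
// is normalized into a small intermediate representation (IR) that the
// later LLM transpilation stage can consume.
interface N8nNode {
  name: string;
  type: string; // e.g. "n8n-nodes-base.httpRequest"
  parameters: Record<string, unknown>;
}

interface N8nWorkflow {
  name: string;
  nodes: N8nNode[];
  connections: Record<string, unknown>;
}

interface IrNode {
  id: string;
  kind: string; // node type with the "n8n-nodes-base." prefix stripped
  params: Record<string, unknown>;
}

interface WorkflowIr {
  skillName: string;
  nodes: IrNode[];
}

function normalizeWorkflow(raw: N8nWorkflow): WorkflowIr {
  return {
    // Slug-case the workflow name so it is safe as a skill directory name.
    skillName: raw.name.toLowerCase().replace(/[^a-z0-9]+/g, "-"),
    nodes: raw.nodes.map((n) => ({
      id: n.name,
      kind: n.type.replace(/^n8n-nodes-base\./, ""),
      params: n.parameters,
    })),
  };
}
```

A workflow exported as `{"name": "Send Slack Alert", ...}` would normalize to a skill named `send-slack-alert`, with each node reduced to an id, a kind, and its parameters.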

Setup and Requirements

The tool requires Node.js 20 or higher and any OpenAI-compatible LLM API (OpenAI, Groq, OpenRouter, Ollama). Users configure environment variables for LLM_BASE_URL, LLM_API_KEY, and LLM_MODEL. The minimum recommended model is GPT-4o or Claude Sonnet tier; smaller Ollama models (7B–13B) produce syntactically broken output. The tool retries once on TypeScript errors, but logical correctness isn't guaranteed.
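The three environment variables above are the whole LLM configuration surface. A loader for them might look like the following sketch; the `loadLlmConfig` function is hypothetical, though the variable names match the tool's documentation:

```typescript
// Illustrative config loader for the documented environment variables.
// The variable names are the tool's; the loader itself is an assumption.
interface LlmConfig {
  baseUrl: string;
  apiKey: string;
  model: string;
}

function loadLlmConfig(
  env: Record<string, string | undefined>
): LlmConfig {
  // Fail fast with a clear message if any required variable is unset.
  const required = ["LLM_BASE_URL", "LLM_API_KEY", "LLM_MODEL"] as const;
  for (const key of required) {
    if (!env[key]) {
      throw new Error(`Missing required environment variable: ${key}`);
    }
  }
  return {
    baseUrl: env.LLM_BASE_URL!,
    apiKey: env.LLM_API_KEY!,
    model: env.LLM_MODEL!,
  };
}
```

In practice you would call `loadLlmConfig(process.env)`; pointing `LLM_BASE_URL` at an Ollama or OpenRouter endpoint is what makes the "any OpenAI-compatible API" claim work.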

Node Coverage and Graceful Degradation

The tool explicitly maps 479 node types across categories including triggers (108), HTTP/integrations (230), transforms (61), database (30), file storage (18), email (16), flow control (12), and webhooks (4). Unknown node types emit stubs with TODO comments. Credential references generate placeholder .env files. Large workflows (50+ nodes) may exceed context limits on smaller models.
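The graceful-degradation behavior can be sketched as a lookup with a stub fallback. The template table and `emitNode` function below are assumptions for illustration, not the tool's real mapping code:

```typescript
// Hypothetical sketch of graceful degradation: known node kinds resolve
// to a code template, unknown kinds emit a TODO stub plus a warning
// (which could feed the warnings.json the tool writes alongside output).
const KNOWN_TEMPLATES: Record<string, string> = {
  webhook: "export async function trigger(req: unknown) { /* ... */ }",
  httpRequest: "export async function run(url: string) { /* ... */ }",
};

function emitNode(kind: string): { code: string; warning?: string } {
  const template = KNOWN_TEMPLATES[kind];
  if (template) {
    return { code: template };
  }
  // Unknown node type: emit a stub so the generated skill still compiles.
  return {
    code: `// TODO: unmapped n8n node type "${kind}" - implement manually\n`,
    warning: `No mapping for node type "${kind}"; emitted a stub.`,
  };
}
```

The payoff of this design is that a workflow with one exotic node still converts end to end; only that node's body needs manual work afterward.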

Output Structure

Generated skills land in ~/.openclaw/workspace/skills// with SKILL.md, skill.ts, credentials.example.env, and warnings.json. If TypeScript validation fails twice, output routes to a draft/ subdirectory for manual fixing.
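The path logic implied above is simple enough to sketch directly. The function names are illustrative, and whether `draft/` sits inside the skill directory is an assumption based on the article's wording:

```typescript
// Sketch of the packaging step's path logic. Directory layout and file
// names follow the article; the functions themselves are hypothetical,
// and draft/ is assumed to nest inside the skill's own directory.
function outputDir(
  skillName: string,
  validationPassed: boolean,
  root = "~/.openclaw/workspace/skills"
): string {
  const base = `${root}/${skillName}`;
  // Skills that failed TypeScript validation twice are parked in draft/.
  return validationPassed ? base : `${base}/draft`;
}

function outputFiles(skillName: string, validationPassed: boolean): string[] {
  const dir = outputDir(skillName, validationPassed);
  return ["SKILL.md", "skill.ts", "credentials.example.env", "warnings.json"]
    .map((f) => `${dir}/${f}`);
}
```

Keeping failed conversions in a `draft/` subdirectory rather than discarding them is a sensible choice: the LLM's near-miss output is usually a better starting point for manual fixing than a blank file.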

Web UI and Docker

Beyond the CLI, there's a web interface (Express + Vite) running on port 3847 for browser-based conversions. Users upload workflow JSON, preview parse results, and run transpilation with their own API key. A Docker container bundles everything for serverless deployment.
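The "preview parse results" step can run entirely before any LLM call, as a cheap structural check on the uploaded file. This helper is an assumption about how such a check might work, not the web UI's actual code:

```typescript
// Hypothetical pre-flight check for an uploaded workflow file: validate
// the JSON and report a node count before spending any LLM tokens.
function previewUpload(
  json: string
): { ok: boolean; nodeCount?: number; error?: string } {
  try {
    const wf = JSON.parse(json);
    if (!Array.isArray(wf.nodes)) {
      return { ok: false, error: "not an n8n workflow: missing nodes[]" };
    }
    return { ok: true, nodeCount: wf.nodes.length };
  } catch {
    return { ok: false, error: "invalid JSON" };
  }
}
```

A check like this also gives the UI an early chance to warn about the 50-plus-node workflows that may blow past smaller models' context limits.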

Key Takeaways

  • Bridges the n8n automation ecosystem to OpenClaw's agent framework
  • LLM-powered transpilation handles complex node logic translation
  • 479 pre-mapped node types with graceful degradation for unknowns
  • Requires GPT-4o or Claude Sonnet tier for reliable output
  • CLI-first design with optional web UI and Docker deployment

The Bottom Line

This is exactly the kind of glue infrastructure the OpenClaw ecosystem needs: n8n has a massive library of community workflows, and this tool lets developers port that existing work into reusable agent skills rather than rebuilding from scratch. The LLM dependency is a bit heavy for casual use, but if you're serious about OpenClaw automation, this saves hours of manual skill writing.