The Complete OpenClaw Guide: Setup, Skills, Docker, and Beyond (2026)
OpenClaw is the most-starred open-source project on GitHub. This guide covers what it is, how to install it, the skills system, Docker deployment, messaging integration, and what to watch out for.
Short answer: OpenClaw is an open-source AI agent framework (247K+ GitHub stars) that runs LLMs as persistent assistants. Install with npm, configure your LLM provider, add messaging channels. This guide covers setup, skills, Docker deployment, and security.
What is OpenClaw?
OpenClaw is an open-source AI agent framework created by Peter Steinberger, an Austrian developer. It transforms any LLM — GPT-4o, Claude, Llama, Gemini — into a persistent, autonomous assistant that runs on your own hardware. It launched in November 2025 and hit 247K+ GitHub stars within four months, surpassing React as the most-starred project on the platform in March 2026.
The core idea: instead of chatting with an AI in a browser tab, you run a daemon that stays alive. It manages schedules, executes tasks, maintains memory across sessions, and communicates through whatever channels you wire up — WhatsApp, Telegram, Discord, Signal, iMessage, SMS.
Architecture
OpenClaw uses a hub-and-spoke model. The Gateway is a Node.js process (default port 18789) that coordinates all messaging surfaces and skill execution. Each channel — WhatsApp via Baileys, Telegram via grammY, Discord via discord.js — connects to the Gateway as a spoke.
Memory is stored as plain-text Markdown files: SOUL.md defines the agent's personality, USER.md stores per-user context, IDENTITY.md holds the agent's self-description, and MEMORY.md accumulates long-term knowledge. No database required — everything is human-readable and version-controllable with Git.
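Since memory is plain Markdown, seeding the agent's personality is a matter of editing a text file. A minimal SOUL.md might read like the sketch below; the contents are illustrative assumptions, only the file names come from OpenClaw itself.

```markdown
# Soul

You are a calm, concise assistant. Prefer short answers over long ones.
Ask before taking irreversible actions (sending mail, deleting files).
Quiet hours: do not message the user between 23:00 and 07:00 local time.
```

Because the file is plain text, every personality change is a reviewable Git diff.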
Installation
OpenClaw requires Node.js 22 or later. There are several installation paths depending on your environment.
One-liner (macOS / Linux)
```shell
curl -fsSL https://openclaw.ai/install.sh | bash
```

This installs the openclaw CLI globally, sets up the daemon, and runs the onboarding wizard. You will be prompted to connect an LLM provider (OpenAI, Anthropic, etc.) and choose your first channel.
npm
```shell
npm install -g openclaw@latest
```

Same result, more control. Useful if you manage Node versions with nvm or volta.
Docker
```yaml
# docker-compose.yml
services:
  openclaw:
    image: openclaw/openclaw:latest
    ports:
      - "18789:18789"
    volumes:
      - ./config:/root/.openclaw
      - ./skills:/root/.openclaw/skills
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}
    restart: unless-stopped
```

Mount your config and skills directories so they persist across container restarts. The Gateway port (18789) only needs to be exposed if you are accessing the admin panel or other services need to reach it directly.
Cloud one-click deploys
Hostinger, DigitalOcean, and Tencent Cloud all offer one-click OpenClaw templates. These provision a VM with OpenClaw pre-installed and the daemon already running. Useful for getting started quickly, though you will still need to configure channels and skills manually.
Raspberry Pi
OpenClaw runs on a Raspberry Pi with a 64-bit OS and at least 2GB of RAM (4GB recommended). Use the standard npm or curl install. Performance depends on which LLM provider you connect — the Pi handles the Gateway orchestration, not the model inference.
The skills system
Skills are how you extend what your OpenClaw agent can do. A skill is a folder containing a SKILL.md file and optionally scripts, templates, or other assets.
How skills work
The SKILL.md file has YAML frontmatter that controls behavior — whether the skill is user-invocable, whether the model can trigger it autonomously, and any required configuration. The body of the file is natural language instructions that the LLM follows when executing the skill.
```markdown
---
name: daily-digest
user-invocable: true
disable-model-invocation: false
---
# Daily Digest

Compile a summary of unread emails and calendar events.
Send the summary to the user via their preferred channel
at 8:00 AM local time.
```

Skill priority
OpenClaw resolves skills in a specific order:
- Workspace skills — in the current project directory
- Managed skills — in ~/.openclaw/skills
- Bundled skills — shipped with OpenClaw
If two skills share the same name, the higher-priority one wins. This lets you override bundled behavior per-project without modifying global config.
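This first-match behavior can be sketched as a simple path lookup. Only ~/.openclaw/skills is a documented location; the workspace and bundled directories in the sketch below are illustrative placeholders, not OpenClaw's actual paths.

```shell
# Sketch of first-match skill resolution (not OpenClaw's actual code).
# Only ~/.openclaw/skills is a documented path; the others are placeholders.
resolve_skill() {
  name="$1"
  for dir in ./skills "$HOME/.openclaw/skills" /usr/local/lib/openclaw/skills; do
    if [ -d "$dir/$name" ]; then
      printf '%s\n' "$dir/$name"   # first hit wins; search stops here
      return 0
    fi
  done
  echo "skill not found: $name" >&2
  return 1
}
```

A workspace ./skills/daily-digest therefore shadows a managed ~/.openclaw/skills/daily-digest of the same name.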
ClawHub: the skills registry
ClawHub is the community registry for OpenClaw skills. As of early 2026, it hosts over 10,700 skills covering Gmail, Google Workspace, smart home control, Spotify, GitHub, web search, and more. Installing from ClawHub:
```shell
openclaw skills install gmail-summarizer
```

Security warning: audit skills before installing
This matters. When ClawHub first launched, security researchers found that roughly 20% of uploaded packages were malicious — about 900 skills out of the initial batch. These ranged from data exfiltration to credential theft. The OpenClaw team has since added review processes, but the registry is still community-driven.
Before installing any skill:
- Read the SKILL.md and any scripts in the package
- Check the author's reputation and other published skills
- Look for file system access, network calls, or environment variable reads that seem out of scope
- Prefer skills with high download counts and recent updates
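Part of that checklist can be mechanized. The triage function below greps a downloaded skill folder for common red flags; the patterns are heuristics of my own, not an official audit tool, and a clean run proves nothing.

```shell
# Rough pre-install triage of a skill folder. The grep patterns are
# heuristics only: they surface things worth reading, not verdicts.
audit_skill() {
  dir="$1"
  grep -rniE 'https?://|curl |wget '           "$dir" && echo '-> review network calls'
  grep -rniE 'process\.env|os\.environ|getenv' "$dir" && echo '-> review env reads'
  grep -rniE 'child_process|exec\(|spawn\('    "$dir" && echo '-> review shell-outs'
  true  # findings are for human review, not a pass/fail gate
}
```

Run something like `audit_skill ./gmail-summarizer` on the unpacked folder before you ever run openclaw skills install.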
Common use cases
OpenClaw is a general-purpose agent framework. The use cases depend on which skills and channels you configure. Here are the most common deployments.
Customer service automation
Connect OpenClaw to WhatsApp or SMS, load product knowledge into MEMORY.md, and the agent handles first-line support. Escalation rules go in the skill instructions. Works well for small businesses that cannot staff a support team around the clock.
Smart home control
Skills for Home Assistant, Philips Hue, and other smart home platforms let you control devices through natural language in any connected channel. "Turn off the living room lights" in a Telegram message works the same as it would through a voice assistant.
Productivity
Newsletter summarizers, meeting note generators, calendar management, email triage. These are the "personal assistant" use cases that drive most individual deployments.
Developer tools
GitHub integration skills handle PR reviews, issue triage, and deployment notifications. Some teams run OpenClaw as a Slack bot that answers questions about their codebase using RAG over their docs.
Cross-platform messaging
This is where Claw Messenger fits in. OpenClaw natively supports WhatsApp, Telegram, Discord, and Signal. For iMessage, RCS, and SMS, you need either a Mac with the native imsg CLI or a plugin like Claw Messenger. More on this below.
Messaging: connecting your agent to the real world
OpenClaw's channel system is one of its strongest features. Each messaging platform connects as a spoke to the Gateway. The agent sees a unified message stream regardless of where a message originated.
| Channel | Library | Platform requirement |
|---|---|---|
| WhatsApp | Baileys | Any |
| Telegram | grammY | Any |
| Discord | discord.js | Any |
| Signal | signal-cli | Any |
| iMessage (native) | imsg CLI | macOS only |
| iMessage / RCS / SMS | Claw Messenger plugin | Any |
The iMessage problem
OpenClaw's built-in imsg CLI reads from chat.db, a SQLite database that only exists on macOS. There is no official Apple API for iMessage. If your agent runs on Linux, Windows, Docker, or a VPS, the native integration does not work.
The iMessage without Mac guide covers the constraints in detail.
Claw Messenger: iMessage, RCS, and SMS on any platform
The Claw Messenger plugin connects your OpenClaw agent to iMessage, RCS, and SMS without requiring a Mac. Install the plugin, add your API key, and the Gateway connects via WebSocket. We manage the Apple infrastructure.
```shell
openclaw plugins install @emotion-machine/claw-messenger
```

```json
// ~/.openclaw/openclaw.json
{
  "channels": {
    "claw-messenger": {
      "enabled": true,
      "apiKey": "cm_live_xxxxxxxxxxxxxxxx",
      "serverUrl": "wss://claw-messenger.onrender.com"
    }
  }
}
```

For the full setup walkthrough, see the iMessage setup guide. For troubleshooting, see the troubleshooting guide. For a comparison with self-hosted alternatives, see BlueBubbles vs Claw Messenger.
Docker deployment
Docker is the standard deployment for production OpenClaw instances. The official image is openclaw/openclaw:latest and includes the Gateway, daemon, and CLI.
Minimal docker-compose.yml
```yaml
services:
  openclaw:
    image: openclaw/openclaw:latest
    ports:
      - "18789:18789"
    volumes:
      - ./config:/root/.openclaw
      - ./skills:/root/.openclaw/skills
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}
      OPENCLAW_CLAW_MESSENGER_API_KEY: ${CLAW_MESSENGER_API_KEY}
    restart: unless-stopped
```

Volume mounts
Mount at least the config directory (~/.openclaw) so your agent's memory files, channel config, and skill overrides persist across container restarts. If you use managed skills, mount the skills directory separately.
Environment variables
OpenClaw reads LLM provider keys and some channel credentials from environment variables. For sensitive values, use Docker secrets or a .env file that is not checked into version control.
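Docker Compose reads a .env file sitting next to the Compose file automatically, so the ${OPENAI_API_KEY}-style references resolve without exporting anything in your shell. The values below are placeholders.

```
# .env (same directory as docker-compose.yml; add it to .gitignore)
OPENAI_API_KEY=sk-your-key-here
CLAW_MESSENGER_API_KEY=cm_live_xxxxxxxxxxxxxxxx
```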
Health checks
The Gateway exposes a health endpoint at http://localhost:18789/health. Add a healthcheck to your Compose file:
```yaml
healthcheck:
  test: ["CMD", "curl", "-f", "http://localhost:18789/health"]
  interval: 30s
  timeout: 10s
  retries: 3
```

Security best practices
As of March 2026, Bitdefender and The Register reported over 135,000 internet-exposed OpenClaw instances. Many had default configurations with no authentication. Do not be one of them.
Do not expose the Gateway to the public internet
The Gateway (port 18789) is an admin interface. It should not be reachable from outside your network. If you need remote access, use an SSH tunnel, VPN, or Tailscale — not port forwarding.
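For occasional remote access to the admin panel, an SSH local forward keeps the port closed to the internet. The host alias, hostname, and user below are placeholders for your own server.

```
# ~/.ssh/config
Host openclaw-box
    HostName your.server.example.com
    User deploy
    LocalForward 18789 localhost:18789
```

After `ssh -N openclaw-box`, the Gateway is reachable at http://localhost:18789 on your machine only.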
Enable sandboxing
OpenClaw supports sandboxed skill execution. Enable it in your config to prevent skills from accessing arbitrary files or making unrestricted network calls. This is especially important if you install community skills from ClawHub.
Restrict file system access
By default, some skills can read and write files on the host. Limit the agent's file access to specific directories using the allowedPaths config option. In Docker, this is naturally enforced by volume mounts — only mounted paths are accessible.
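A sketch of that restriction, assuming allowedPaths is set in openclaw.json; the exact nesting may differ in your OpenClaw version, so treat this as illustrative rather than the documented schema.

```json
{
  "agent": {
    "allowedPaths": [
      "/home/agent/notes",
      "/home/agent/projects"
    ]
  }
}
```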
Audit skills before installing
Read every SKILL.md and inspect any bundled scripts before running openclaw skills install. The 20% malicious rate on early ClawHub submissions is a real number from security audits. The situation has improved, but community registries carry inherent risk.
Keep OpenClaw updated
Security patches ship frequently. Run openclaw update or rebuild your Docker image regularly.
OpenClaw vs other AI agent frameworks
A brief, factual comparison of OpenClaw against other popular agent frameworks as of early 2026.
| Feature | OpenClaw | AutoGPT | LangGraph | CrewAI |
|---|---|---|---|---|
| Architecture | Persistent daemon | Task runner | Graph-based workflows | Multi-agent roles |
| Memory | Markdown files | JSON / vector DB | State objects | Shared memory |
| Messaging channels | WhatsApp, Telegram, Discord, Signal, iMessage, SMS | None built-in | None built-in | None built-in |
| Skill / plugin system | ClawHub (10,700+ skills) | Plugins | Tools / nodes | Tools |
| Deployment | Local, Docker, VPS, Pi | Local, Docker | Cloud or local | Cloud or local |
| Primary use case | Personal assistant / messaging bot | Autonomous task execution | Complex multi-step workflows | Team-of-agents simulation |
OpenClaw's differentiator is the messaging-first approach. Other frameworks focus on task execution or workflow orchestration. OpenClaw treats communication channels as first-class citizens, which makes it the natural choice when the agent needs to talk to people over real messaging platforms.
Frequently asked questions
What is OpenClaw?
OpenClaw is an open-source, local-first AI agent framework. It runs a persistent daemon on your hardware that transforms LLMs into autonomous assistants capable of managing schedules, executing tasks, maintaining memory across sessions, and communicating through messaging platforms. Created by Peter Steinberger, it has over 247K GitHub stars as of March 2026.
Does OpenClaw require a Mac?
No. OpenClaw itself runs on macOS, Linux, and Windows. The only Mac-specific feature is the native iMessage integration via the imsg CLI, which reads from macOS's chat.db. For iMessage on non-Mac platforms, use the Claw Messenger plugin.
Is OpenClaw free?
Yes. OpenClaw is free and open-source under the MIT license. You need API keys for your LLM provider (OpenAI, Anthropic, etc.), and some third-party plugins or services have their own pricing. The core framework has no cost.
How do I add iMessage to OpenClaw?
Two paths. On a Mac, use the native imsg CLI — see the iMessage setup guide for the full walkthrough. On any other platform, install the Claw Messenger plugin:
```shell
openclaw plugins install @emotion-machine/claw-messenger
```

Add your API key to the config, restart the Gateway, and your agent can send and receive iMessage, RCS, and SMS. The agent phone number guide covers the full process.
Is OpenClaw safe to use?
The core framework is open-source and auditable. Safety depends on your configuration. The main risks are: exposing the Gateway without authentication (135K+ instances found publicly accessible), installing malicious skills from ClawHub (20% of early submissions were flagged), and running without sandboxing. Follow the security practices outlined above and you will be in good shape.
Can OpenClaw run on a Raspberry Pi?
Yes. You need a 64-bit OS and at least 2GB of RAM (4GB recommended). The Pi runs the Gateway and orchestration — model inference happens on your LLM provider's servers, so the Pi's hardware is not the bottleneck.
What LLM providers does OpenClaw support?
OpenClaw is model-agnostic. It supports OpenAI, Anthropic, Google Gemini, local models via Ollama, and any provider with an OpenAI-compatible API. You configure the provider in your openclaw.json and the Gateway handles the rest.
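As a rough sketch, a provider entry in openclaw.json might look like the following. The field names here are illustrative assumptions, not the documented schema, so check your version's reference before copying.

```json
{
  "provider": {
    "name": "openai",
    "model": "gpt-4o",
    "apiKeyEnv": "OPENAI_API_KEY"
  }
}
```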
Give your OpenClaw agent a phone number
Add iMessage, RCS, and SMS to any OpenClaw agent. Works on any platform, no Mac required. Install the plugin, add your API key, and your agent is reachable by text.
Get your API key