OpenClaw for your team, customers, or family
Ship your OpenClaw product faster with sandboxed, per-customer agents provisioned on demand.
How it works
Get started in seconds
Add your own AI provider keys through the settings page — no config files, no terminal.
Pick your AI provider and model
Anthropic — Connected
OpenAI — Connected
Bot walks you through setup
Reproducible environments
Agents install system packages via Nix. Environments persist across sessions and are fully reproducible.
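The manifest format the agent writes isn't shown on this page; purely as an illustration, a Nix-style declaration of an agent's environment could look like this (package names chosen arbitrarily):

```nix
# Illustrative only — the actual manifest the agent maintains is product-specific.
{ pkgs }:
pkgs.mkShell {
  # Packages the agent has requested so far; pinning nixpkgs to a fixed
  # revision is what makes the environment reproducible across sessions.
  packages = [ pkgs.ripgrep pkgs.jq pkgs.python3 ];
}
```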
Manage installed system packages
Agent requests tools it needs
Connect tools via MCP
Add MCP servers for Gmail, GitHub, and more. Agents authenticate via OAuth — you control what they access.
Add and authenticate integrations
Gmail — read & send emails
GitHub — repos, PRs, issues
Agent discovers and uses tools
Set reminders and recurring tasks
Agents can schedule one-off reminders or recurring cron jobs. They run autonomously at the specified time.
View and manage scheduled jobs
Check open PRs and summarize review queue — recurring · every Mon 9:00 AM (pending)
Review Q1 deck — one-off · tomorrow 2:00 PM (pending)
Schedule tasks in natural language
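For reference, "every Mon 9:00 AM" in the recurring job above maps to a standard five-field cron expression — assuming the scheduler accepts classic cron syntax:

```
# minute  hour  day-of-month  month  day-of-week (1 = Monday)
  0       9     *             *      1
```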
Fine-grained network access
Agents have zero internet by default. You allowlist specific domains — agents can't reach anything else.
Control which domains agents can reach
Agent asks for network access
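The default-deny behavior described above can be sketched in a few lines of shell. This is illustrative only, not the product's actual proxy code; the domains are arbitrary examples:

```shell
#!/bin/sh
# Sketch of a domain allowlist check with default-deny semantics.
ALLOWLIST="api.github.com api.anthropic.com"

is_allowed() {
  for domain in $ALLOWLIST; do
    [ "$1" = "$domain" ] && return 0
  done
  return 1  # default-deny: anything not listed is blocked
}

is_allowed api.github.com && echo "api.github.com: allowed"
is_allowed example.com    || echo "example.com: denied"
```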
Architecture
Security-first. Zero trust by default.
- Multi-user with per-agent context
- Deploy to any messaging platform
- HTTP proxy with domain allowlist
- MCP proxy with OAuth per user
- Secret swapping — workers never see keys
- BYO provider keys (Anthropic, OpenAI, etc.)
- No direct internet access
- Nix reproducible environments
- Per-thread persistent storage
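The secret-swapping bullet can be made concrete with a small sketch. The placeholder name, key value, and header shape below are all hypothetical; the point is that workers only ever handle a placeholder, and the gateway substitutes the real key at the network boundary:

```shell
#!/bin/sh
# Sketch of gateway-side secret swapping (all names hypothetical).
REAL_KEY="sk-ant-example-key"            # stored only on the gateway
worker_request='Authorization: Bearer {{ANTHROPIC_KEY}}'

# The gateway substitutes the placeholder just before forwarding upstream,
# so the worker process never sees the real credential.
outbound=$(printf '%s' "$worker_request" | sed "s/{{ANTHROPIC_KEY}}/$REAL_KEY/")
echo "$outbound"   # prints: Authorization: Bearer sk-ant-example-key
```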
Installation
Deploy with Docker Compose or Kubernetes. From zero to running in under a minute.
Docker Compose
One-command deployment on a single machine. Best for getting started or small teams.
- You want the fastest setup on one machine.
- You prefer minimal operational overhead.
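For orientation, a Compose file for this kind of single-machine deployment might look like the sketch below. The service name, image, port, and volume layout are assumptions — the scaffold command generates the real file:

```yaml
# Hypothetical sketch; `npx create-lobu` generates the actual compose file.
services:
  lobu:
    image: ghcr.io/lobu-ai/lobu:latest   # image name assumed
    ports:
      - "3000:3000"                      # port assumed
    env_file: .env                       # provider keys live here, not in the image
    volumes:
      - lobu-data:/data                  # per-thread persistent storage
volumes:
  lobu-data:
```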
$ npx create-lobu my-bot
$ cd my-bot && docker compose up -d

Kubernetes
Install via OCI Helm chart — no repo clone needed. Scales horizontally with your team.
- You need cluster scheduling and autoscaling.
- You need production-grade isolation controls.
$ helm install lobu oci://ghcr.io/lobu-ai/charts/lobu \
--namespace lobu \
  --create-namespace

Ready to deploy?
Get running in under a minute — or talk to us about your use case.