Moltworker on Cloudflare Workers: A New Era of AI Accessibility and Privacy

techminute · By peterKing

Moltworker is a proof-of-concept self-hosted AI agent from Cloudflare that runs Moltbot (formerly Clawdbot) on Cloudflare Workers, using AI Gateway for LLM access, Sandbox Containers for secure code execution, and R2 for storage. That combination avoids local hardware limits while keeping data private.[2] It pairs Moltbot's personal AI capabilities (messaging integrations, task automation) with Cloudflare's edge network for reliability and scalability.[1][2]

Why Choose Moltworker with Cloudflare?

Moltworker demonstrates running a full AI agent stack serverlessly: Cloudflare Workers host the core logic, AI Gateway proxies LLM requests (e.g., Anthropic's Claude) with cost tracking and fallbacks, Sandbox Containers (Docker-based, on Cloudflare's infrastructure) provide safe, isolated code execution, and R2 supplies persistent storage.[2] Unlike traditional self-hosting on a VPS or home server, this eliminates Docker management on your own machine and handles untrusted code securely via callbacks between Workers and sandboxes; no local Mac mini or VPS is needed.[2] It's ideal for privacy-focused users who want agent features like web research or Docker deployments via WhatsApp, backed by Cloudflare's global edge for low latency.[1][3]

Note: Moltworker is a proof-of-concept, not a production Cloudflare product. It requires a Workers Paid plan ($5/month minimum) because of Sandbox Containers; free tiers cover AI Gateway and the other components for testing.[2]
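
To make the moving parts concrete, here's a minimal sketch of the kind of bindings such a Worker declares. The names below (ANTHROPIC_BASE_URL, AGENT_BUCKET, SANDBOX) are illustrative assumptions, not Moltworker's actual interface; the repo's wrangler configuration defines the real ones.

```typescript
// Hypothetical Worker bindings for an agent like Moltworker. The names are
// assumptions for illustration; check the repo's wrangler config for the real ones.
export interface Env {
  // Points LLM traffic at AI Gateway instead of the provider's API directly.
  ANTHROPIC_BASE_URL: string;
  ANTHROPIC_API_KEY: string;
  // R2 bucket serving as the agent's persistent workspace.
  AGENT_BUCKET: R2Bucket;
  // Namespace backing Sandbox Containers for isolated code execution
  // (the exact type depends on the Sandbox SDK version in use).
  SANDBOX: DurableObjectNamespace;
}
```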

Prerequisites

  • Free Cloudflare account.
  • Workers Paid plan ($5 USD/month) for Sandbox Containers.
  • Domain or subdomain for DNS setup (optional but recommended).
  • Moltbot-compatible LLM API keys (e.g., Anthropic Claude, Google Gemini) or Cloudflare credits for AI Gateway billing.[1][2]
  • Basic familiarity with environment variables and Cloudflare dashboard.

Step-by-Step Starter Guide

Follow the official README (linked in Cloudflare's blog) for exact commands, as it's a hands-on Worker deployment.[2] Here's a synthesized guide based on available details:

  1. Sign Up and Prepare Cloudflare Resources
    Create a Cloudflare account if needed. Upgrade to Workers Paid ($5/month). Top up credits for AI Gateway if using pay-as-you-go (or bring your own API keys).[2] Enable products:

    • AI Gateway (free tier generous).
    • Workers and Sandbox Containers.
    • R2 bucket for agent storage (free tier available).[2]
  2. Deploy Moltworker on Cloudflare Workers

    • Fork/clone the Moltworker repo or use the provided template from Cloudflare's announcement.[2]
    • In Wrangler (Cloudflare's CLI): npx wrangler deploy. This spins up the Worker handling Moltbot logic, gateway, and callbacks to Sandbox Containers.[2]
    • Configure DNS: Point a subdomain (e.g., agent.yourdomain.com) to your Worker.
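      A deployed Worker of this kind boils down to a fetch handler; the sketch below is a hypothetical skeleton (reusing the illustrative Env interface above), not Moltworker's actual entry point:
    ```typescript
    // Minimal illustrative Worker entry point; not Moltworker's real code.
    export default {
      async fetch(request: Request, env: Env): Promise<Response> {
        const url = new URL(request.url);
        if (url.pathname === "/health") {
          return new Response("ok");
        }
        // The real Worker routes chat traffic to Moltbot, proxies LLM calls
        // through AI Gateway, and calls back into Sandbox Containers.
        return new Response("Moltworker placeholder");
      },
    } satisfies ExportedHandler<Env>;
    ```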
  3. Set Up AI Gateway for LLMs

    • Create a new AI Gateway instance in the dashboard.[2]
    • Enable providers (e.g., Anthropic): Add your Claude API key or use Cloudflare credits for unified billing.
    • Set the environment variable in your Worker: ANTHROPIC_BASE_URL=https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway}/anthropic (replace placeholders).[2]
      Pro Tip: Configure model fallbacks (e.g., Claude → Gemini) for reliability—no code redeploys needed when swapping providers.[2]
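      If you bring your own Anthropic key, the Worker simply sends Messages API requests to the gateway URL instead of api.anthropic.com. A hedged sketch (the model ID and response handling are illustrative; verify the path against the AI Gateway docs):
    ```typescript
    // Illustrative: one Anthropic Messages call routed through AI Gateway.
    // env.ANTHROPIC_BASE_URL is the gateway URL set in the step above.
    async function askClaude(env: Env, prompt: string): Promise<string> {
      const resp = await fetch(`${env.ANTHROPIC_BASE_URL}/v1/messages`, {
        method: "POST",
        headers: {
          "content-type": "application/json",
          "x-api-key": env.ANTHROPIC_API_KEY,
          "anthropic-version": "2023-06-01",
        },
        body: JSON.stringify({
          model: "claude-3-5-sonnet-20241022", // example model ID; use any Claude model your key can access
          max_tokens: 1024,
          messages: [{ role: "user", content: prompt }],
        }),
      });
      const data = (await resp.json()) as { content: { text: string }[] };
      return data.content[0].text;
    }
    ```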
  4. Configure Sandbox Containers and R2

    • In the Worker code, integrate Sandbox SDK to run Docker commands in isolated environments.[2]
    • Bind R2 bucket for persistent storage (e.g., agent workspace data).[2]
    • Test two-way communication: Worker issues commands to sandbox via callbacks.
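      The R2 calls below use the standard Workers binding API; the Sandbox SDK names (getSandbox, exec) are assumptions based on Cloudflare's published SDK and should be checked against the @cloudflare/sandbox docs:
    ```typescript
    // Illustrative: persist workspace data in R2 and run a command in an
    // isolated sandbox. Treat the Sandbox calls as a sketch, not a reference.
    import { getSandbox } from "@cloudflare/sandbox";

    async function runTask(env: Env, taskId: string): Promise<void> {
      // R2 keeps the agent's workspace outside the ephemeral Worker.
      await env.AGENT_BUCKET.put(`tasks/${taskId}.json`, JSON.stringify({ status: "started" }));

      // Execute untrusted code in an isolated container, then store its output.
      const sandbox = getSandbox(env.SANDBOX, taskId);
      const result = await sandbox.exec("echo hello from the sandbox");
      await env.AGENT_BUCKET.put(`tasks/${taskId}.log`, result.stdout ?? "");
    }
    ```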
  5. Install and Onboard Moltbot
    Moltworker runs Moltbot under the hood. Run the onboarding command, moltbot onboard --install-daemon, adapted for the Worker environment.[1]

    • Gateway: Set bind to your Worker endpoint ("lan" or public URL).[1][3]
    • Workspace: Point to R2 bucket.[2]
    • Channels: Connect Telegram, WhatsApp, etc. (e.g., JSON config for WhatsApp allowlists).[3]
    • Skills/Tools: Install safe profiles; enable web browsing or Docker tools.[1][3]
      Sample config snippet (adapt for Worker env vars):[3]
    ```json
    {
      "gateway": { "bind": "https://agent.yourdomain.workers.dev" },
      "tools": { "profile": "safe" }
    }
    ```
    
  6. Secure and Monitor

    • Restrict API keys with IP allowlisting and spending limits, and rotate them every 90 days (moltbot models auth setup-token --provider anthropic).[1]
    • Monitor via AI Gateway dashboard for costs, logs, and usage spikes.[2]
    • For remote/edge access, ensure Worker handles gateway traffic.[1]
  7. Test Your Agent

    • Message via connected app (e.g., Telegram: "Hello! Tell me about yourself"). Expect LLM-powered response.[1]
    • Advanced tests: "Deploy an Nginx container" (uses Sandbox) or "Browse Traefik docs" (web research).[3]
      Full setup: 15-30 minutes for basics, per user reports.[1][3]
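      Before relying on a chat channel, a quick HTTP request confirms the Worker itself is live. The /health path below matches the hypothetical handler sketched in step 2; adjust it to whatever routes the real Worker exposes:
    ```typescript
    // Quick sanity check that the deployed Worker responds at all.
    // Replace the URL with your own workers.dev or custom-domain endpoint.
    const resp = await fetch("https://agent.yourdomain.workers.dev/health");
    console.log(resp.status, await resp.text()); // expect 200 if the deployment succeeded
    ```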

Interesting Facts and Tips

  • Edge Advantages: Global distribution beats local VPS latency; auto-scales without server management.[2]
  • Moltbot Evolution: Formerly Clawdbot, it's privacy-first: no cloud subscriptions, and it runs on your own infrastructure (or Cloudflare's).[1][3]
  • Limitations: As a proof-of-concept it may need tweaks and isn't ready for high-scale production yet. Other self-hosting setups use Docker Compose on a VPS for full control.[3]
  • Alternatives: Pure Moltbot on Docker (no Cloudflare) or voice agents with LiveKit on VPS.[3][4]

This gets you a powerful, self-hosted agent. Experiment safely and check the Moltworker README for updates![2]

Sources
