

Open Beta

Dynamic Workers: V8 Isolates You Spin Up at Runtime

Cloudflare just shipped the lowest-level compute primitive on the platform. Create, execute, and destroy isolated JavaScript environments on demand — in milliseconds, not minutes.

Containers take seconds to start and hundreds of megabytes to run — Dynamic Workers give you isolated V8 environments in milliseconds at a fraction of the cost, and they are now in open beta.

TL;DR — What are Dynamic Workers?

  • What: the lowest-level compute primitive on Cloudflare, V8 isolates created, executed, and destroyed at runtime via API.
  • Speed: 100x faster than containers, 10–100x more memory-efficient, millisecond cold starts.
  • Code Mode: LLMs write executable code instead of sequential tool calls. 81% fewer tokens, faster results.
  • Price: 1,000 workers/month included; $0.002/worker/day overage (waived during beta).
  • D1 + AI: use MyD1 to browse, query, and manage the D1 databases your Dynamic Workers connect to, visually.

What Are Dynamic Workers?

Regular Cloudflare Workers are deployed ahead of time. You write code, run wrangler deploy, and it runs globally. Dynamic Workers flip that model: you create isolated V8 environments at runtime, feed them code, and execute it on demand.

Think of them as the lowest-level compute primitive on the Cloudflare platform. A Dynamic Worker is a sandboxed V8 isolate spun up programmatically from a parent Worker. No containers. No VMs. No cold-start penalty. Your code creates another piece of code and runs it in isolation — in milliseconds.

  • 100x faster than containers
  • 10–100x more memory-efficient
  • <1 ms cold starts
  • $0.002 per worker/day overage

The open beta announcement positions Dynamic Workers as the foundation for a new generation of AI-powered applications — agents that generate and execute code rather than calling predefined tools.

Two Loading Modes

Dynamic Workers offer two ways to load and execute code, each optimized for a different use case. Both are documented in the getting started guide.

load(code) — One-Time

Pass raw JavaScript as a string. The isolate boots, executes the code, returns the result, and is disposed. No caching, no persistence. Ideal for one-off LLM-generated scripts, data transformations, and throwaway compute.

get(id, callback) — Cached

Register a worker with a stable ID. On first call, the callback provisions the isolate. Subsequent calls reuse the cached instance. Ideal for per-user sandboxes, persistent agents, and multi-tenant environments.

load(code) — One-time execution
import { DynamicWorker } from "cloudflare:workers";

export default {
  async fetch(request, env) {
    const script = 'export default { async fetch(req) { return new Response("Hello from dynamic code!"); } }';
    const worker = await DynamicWorker.load(script);
    return worker.fetch(request);
  }
};
get(id, callback) — Cached instance
import { DynamicWorker } from "cloudflare:workers";

export default {
  async fetch(request, env) {
    const userId = new URL(request.url).searchParams.get("user");
    if (!userId) return new Response("Missing ?user parameter", { status: 400 });
    const worker = await DynamicWorker.get(userId, async () => {
      // This callback runs only on first creation
      const object = await env.R2.get(`workers/${userId}.js`);
      if (!object) throw new Error(`No stored script for user ${userId}`);
      return { code: await object.text() };
    });
    return worker.fetch(request);
  }
};

Code Mode: 81% Fewer Tokens for AI Agents

This is the headline feature for AI builders. Code Mode fundamentally changes how LLMs interact with tools. Instead of making sequential tool calls (ask the model, get a tool call, execute it, return the result, ask again), the LLM writes a single JavaScript program that orchestrates multiple operations in one shot.

📊 Code Mode vs Tool Calling — By the Numbers

81% token reduction — Code Mode uses dramatically fewer tokens compared to sequential tool calling, according to Cloudflare's benchmarks. Fewer tokens means lower cost, lower latency, and fewer points of failure.

TypeScript interfaces are 4x more token-efficient than OpenAPI — Instead of describing available tools with verbose OpenAPI schemas, you give the LLM TypeScript type definitions. The model produces strongly-typed code that your Dynamic Worker executes.

Single round-trip — An agent that previously needed 5–10 back-and-forth tool calls can now generate one JavaScript program that calls all the same APIs in sequence or in parallel. Fewer network hops, faster results.

Cloudflare provides three helper libraries to support Code Mode:

  • @cloudflare/codemode — Core library for Code Mode integration. Handles the LLM prompt engineering, code extraction, and execution lifecycle.
  • @cloudflare/worker-bundler — Bundles multi-file Dynamic Workers with dependencies, so LLM-generated code can import real npm packages.
  • @cloudflare/shell — Interactive shell environment for Dynamic Workers. Useful for building REPL-style developer tools and debugging.
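The difference is easiest to see in a sketch. The stub tools below are illustrative stand-ins, not real bindings, and the `new Function` call emulates execution so the flow is visible end to end; in a real parent Worker the generated string would be handed to `DynamicWorker.load` instead:

```typescript
// Stubbed tools standing in for real bindings (names are assumptions).
const tools = {
  async getUser(id: string) {
    return { id, name: "Ada", plan: "pro" };
  },
  async getInvoices(userId: string) {
    return [{ userId, amount: 42 }, { userId, amount: 8 }];
  },
};

// One program the LLM might generate: several operations in a single
// round-trip, instead of 5-10 sequential tool calls.
const generated = `
  const user = await tools.getUser("u_1");
  const invoices = await tools.getInvoices(user.id);
  const total = invoices.reduce((sum, inv) => sum + inv.amount, 0);
  return { name: user.name, total };
`;

// Emulate sandboxed execution locally; production code would pass the
// string to DynamicWorker.load() and call the resulting worker.
const run = new Function(
  "tools",
  `return (async () => { ${generated} })();`
) as (t: typeof tools) => Promise<{ name: string; total: number }>;

run(tools).then((result) => console.log(result)); // { name: "Ada", total: 50 }
```

The LLM sees only the tool signatures; the orchestration logic (the `reduce`, the chaining) lives in generated code rather than in extra model round-trips.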

Security Model

Running arbitrary code is dangerous — unless your runtime was designed for it. Dynamic Workers inherit the same V8 isolate sandboxing that powers all Cloudflare Workers. Each isolate gets its own memory space, cannot access the filesystem, and runs with strict resource limits.

🔒 Security Layers

V8 isolate sandboxing — Each Dynamic Worker runs in its own V8 isolate, the same technology that isolates browser tabs in Chrome. No shared memory, no filesystem access, no process spawning.

globalOutbound: null — You can create Dynamic Workers with no outbound network access at all. The isolate can only return data to the parent Worker. This prevents exfiltration of sensitive data by untrusted code.

Credential injection — Secrets and API keys live in the parent Worker's environment bindings. The Dynamic Worker never sees them directly. The parent controls what the child can access, keeping credentials out of LLM-generated code.

This security model is a significant advantage over container-based code execution. Containers share a kernel, require careful seccomp/AppArmor configuration, and have a much larger attack surface. V8 isolates were purpose-built for running untrusted code.
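The credential-injection pattern above can be sketched as follows. The stubs and names here are illustrative assumptions, not Cloudflare APIs, and `new Function` again stands in for the sandboxed child isolate; the point is that the child receives a capability, never the key:

```typescript
// The secret lives only in the parent Worker's environment (stubbed).
const env = { API_KEY: "sk-parent-only" };

// The parent performs the authenticated call itself...
async function fetchReport(apiKey: string) {
  return apiKey === "sk-parent-only" ? { rows: 3 } : null;
}

// ...and exposes only a narrow capability to the child code.
const caps = {
  getReport: () => fetchReport(env.API_KEY),
};

// Code an LLM might generate: it can use the capability,
// but the key itself is not reachable from its scope.
const childProgram = `
  const report = await caps.getReport();
  return { rows: report.rows, keyVisible: "API_KEY" in caps };
`;

const run = new Function(
  "caps",
  `return (async () => { ${childProgram} })();`
) as (c: typeof caps) => Promise<{ rows: number; keyVisible: boolean }>;

run(caps).then((out) => console.log(out)); // { rows: 3, keyVisible: false }
```

Combined with `globalOutbound: null`, this means LLM-generated code can neither read credentials nor phone home with anything it computes.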

Configuration

Enabling Dynamic Workers requires adding the dynamic_workers binding to your wrangler.jsonc configuration:

wrangler.jsonc
{
  "name": "my-dynamic-app",
  "main": "src/index.ts",
  "compatibility_date": "2026-03-15",
  "dynamic_workers": {
    "enabled": true
  }
}

That is the entire configuration. No provisioning, no capacity planning, no scaling rules. The parent Worker creates Dynamic Workers on demand, and Cloudflare handles the rest across its global network.

Use Cases

🤖 AI Agents & Code Generation

The primary use case. An LLM generates JavaScript to accomplish a task — query a database, call an API, transform data — and a Dynamic Worker executes it in a sandbox. Code Mode makes this dramatically more efficient than traditional tool calling. Zite, an early adopter, reports processing millions of execution requests daily with Dynamic Workers in production.

🚀 AI-Generated Applications

Build platforms where users describe an app and AI generates it. The generated code runs in a Dynamic Worker — fully isolated, instantly deployed, globally available. No build step, no deploy pipeline. Users go from prompt to running application in seconds.

Rapid Prototyping & Custom Automations

Spin up isolated environments for testing code snippets, running user-submitted scripts, or building per-tenant automation workflows. Each Dynamic Worker is its own sandbox — one user's buggy code cannot affect another's.

If your Dynamic Workers connect to Cloudflare D1 databases, MyD1's AI Agent helps you write advanced queries, optimize database performance, and discover insights you would never find without AI — all from a native macOS app.

Why JavaScript?

Dynamic Workers run JavaScript (and TypeScript) exclusively. This is a deliberate choice, not a limitation:

  • Natively sandboxable — JavaScript was designed to run untrusted code in browsers. V8 isolates inherit decades of security hardening from Chrome.
  • TypeScript interfaces are 4x more token-efficient than OpenAPI — When an LLM needs to understand available APIs, a TypeScript interface definition is dramatically more compact than the equivalent OpenAPI JSON schema. Fewer tokens in the prompt means more room for reasoning.
  • Universal language for LLMs — JavaScript is the most widely represented language in LLM training data. Models generate more accurate, idiomatic JavaScript than almost any other language.
  • Instant execution — No compilation step. Raw source code goes in, results come out. This is critical for sub-second agent workflows.
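The token-efficiency point is easy to demonstrate with a toy comparison. The fragment below describes the same single operation twice, once as a (hand-written, illustrative) OpenAPI snippet and once as a TypeScript interface; character counts stand in for token counts, and the exact ratio depends on the tokenizer and schema:

```typescript
// A minimal OpenAPI description of one GET endpoint.
const openApiFragment = JSON.stringify({
  paths: {
    "/users/{id}": {
      get: {
        operationId: "getUser",
        parameters: [
          { name: "id", in: "path", required: true, schema: { type: "string" } },
        ],
        responses: {
          "200": {
            description: "A user",
            content: {
              "application/json": {
                schema: {
                  type: "object",
                  properties: { id: { type: "string" }, name: { type: "string" } },
                },
              },
            },
          },
        },
      },
    },
  },
});

// The same capability as a TypeScript interface.
const tsInterface =
  `interface Api { getUser(id: string): Promise<{ id: string; name: string }>; }`;

console.log(openApiFragment.length, tsInterface.length);
```

Even in this tiny case the TypeScript form is several times shorter, and the gap widens as schemas grow, which is where the headroom for model reasoning comes from.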

Pricing

Dynamic Workers follow a simple pricing model:

Tier                 Included                       Overage
Workers Paid plan    1,000 Dynamic Workers/month    $0.002 per worker/day
Open Beta            1,000 Dynamic Workers/month    Waived (free during beta)

Standard Workers usage fees (CPU time, requests) still apply to both the parent Worker and each Dynamic Worker. But the Dynamic Worker creation cost itself is minimal — and free while the beta lasts.
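As a worked example of the overage math (assuming, as the table suggests, that each worker beyond the included 1,000 is billed $0.002 for every day it exists):

```typescript
// Pricing figures from the table above; the billing granularity
// (per worker per day, beyond the included quota) is an assumption.
const includedWorkers = 1_000;
const overageRatePerDay = 0.002; // USD per worker per day
const activeWorkers = 1_500;
const daysInMonth = 30;

const overageWorkers = Math.max(0, activeWorkers - includedWorkers);
const monthlyOverage = overageWorkers * overageRatePerDay * daysInMonth;
console.log(monthlyOverage); // 30 (USD)
```

So sustaining 1,500 Dynamic Workers all month would add about $30 on top of standard Workers usage fees, and nothing at all while the beta waiver lasts.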

Dynamic Workers vs Containers vs Lambda

                     Dynamic Workers      Docker / K8s             AWS Lambda
Cold start           <1 ms                1–30 s                   200 ms–1.5 s
Memory overhead      ~2 MB per isolate    50–500 MB                128–512 MB min
Sandbox model        V8 isolate           Linux kernel / cgroups   Firecracker microVM
Runtime creation     API call (ms)        API call (seconds)       Deploy required
Languages            JS / TS              Any                      Many (with runtimes)

Getting Started

The fastest path from zero to a running Dynamic Worker:

  1. Make sure you are on the Workers Paid plan ($5/month)
  2. Add "dynamic_workers": { "enabled": true } to your wrangler.jsonc
  3. Import DynamicWorker from "cloudflare:workers" in your parent Worker
  4. Call DynamicWorker.load(code) or DynamicWorker.get(id, callback)
  5. Deploy with npx wrangler deploy

Full walkthrough: Dynamic Workers — Getting Started.


Building on D1? Manage it visually.

If your Workers (dynamic or otherwise) use Cloudflare D1, download MyD1 to browse tables, run queries, and let the AI Agent optimize your database — all from a native macOS app. No terminal needed. See how it works.

Related: AWS EC2 vs Cloudflare Workers Stack · Build a Full-Stack App on Cloudflare for Free · Cloudflare's Edge Security Paradigm · Why AWS Is So Slow