
Dynamic Workflows: Powering Tenant-Specific Durable Execution on Cloudflare

Cloudflare Dynamic Workflows lets platforms run per‑tenant durable execution by supplying workflow code at runtime, combining Dynamic Workers, storage, and source control.

Bvoxro Stack · 2026-05-06 20:29:31 · Cloud Computing

Cloudflare's latest innovation, Dynamic Workflows, bridges the gap between durable execution and dynamic deployment. This powerful tool lets platforms run custom, per‑tenant workflow logic without pre‑defining every class. Below, we answer key questions about how it works, why it matters, and what you can build with it.

What are Dynamic Workflows?

Dynamic Workflows extend Cloudflare's Workflows engine so that the code defining a workflow can be supplied at runtime, rather than being baked into a deployment. Previously, Workflows required you to bind a single class in wrangler.jsonc — fine if you own all the code, but limiting for platforms. With Dynamic Workflows, you hand the runtime a piece of TypeScript (or JavaScript) that implements the run(event, step) function, and it becomes a durable, fault‑tolerant program. This program can sleep for hours, wait for external events, and resume exactly where it left off if the isolate is recycled. The real breakthrough is that each tenant, each agent, or even each session can have its own unique workflow, all running in isolated, sandboxed Workers on the same machine.
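To make the run(event, step) contract concrete, here is a minimal local simulation in TypeScript. This is not the Cloudflare runtime or the real Workflows API; it only mimics the durability property described above, in which completed steps are checkpointed so a resumed run replays saved results instead of re-executing them. All names here (Step, checkpoints, bodyRuns) are illustrative.

```typescript
// A tiny in-memory stand-in for the step API. In real Workflows the engine
// persists checkpoints durably; here it is just a Map.
type Step = { do<T>(name: string, fn: () => Promise<T>): Promise<T> };

const checkpoints = new Map<string, unknown>();

const step: Step = {
  async do<T>(name: string, fn: () => Promise<T>): Promise<T> {
    // Completed step: replay the saved result instead of re-running the body.
    if (checkpoints.has(name)) return checkpoints.get(name) as T;
    const result = await fn();
    checkpoints.set(name, result);
    return result;
  },
};

let bodyRuns = 0; // counts how many times the step body actually executes

// A tenant-supplied workflow body with the run(event, step) shape.
async function run(event: { payload: { email: string } }, step: Step) {
  const user = await step.do("create-user", async () => {
    bodyRuns++;
    return { email: event.payload.email };
  });
  return user.email;
}

const event = { payload: { email: "a@example.com" } };
const first = await run(event, step);  // executes the step, checkpoints it
const second = await run(event, step); // "resume": replays the checkpoint
console.log(first, second, bodyRuns);  // a@example.com a@example.com 1
```

In the real engine the checkpoint store survives isolate recycling, which is what lets a workflow sleep for hours and resume exactly where it left off.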

Dynamic Workflows: Powering Tenant-Specific Durable Execution on Cloudflare
Source: blog.cloudflare.com

How does this differ from regular Workflows?

Standard Workflows (V2) are designed for single‑application scenarios: you write one workflow class, deploy it, and all instances run that same code. Dynamic Workflows remove that assumption. Instead, the workflow code is supplied dynamically, just as Dynamic Workers let you inject compute code at runtime. A platform can therefore let each customer write (or have AI generate) their own durable execution logic without the platform having to redeploy or manage thousands of separate applications. You get the same durable execution guarantees (up to 50,000 concurrent instances and 300 new instances per second per account), but with per‑tenant flexibility.
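For context, the deploy-time binding being contrasted here looks roughly like this in wrangler.jsonc under standard Workflows. Field names follow Cloudflare's Workflows configuration documentation, though the exact shape may vary across versions:

```jsonc
{
  "workflows": [
    {
      // One workflow, one class: every instance runs this code.
      "name": "my-workflow",
      "binding": "MY_WORKFLOW",
      "class_name": "MyWorkflow"
    }
  ]
}
```

Dynamic Workflows drop this deploy-time coupling: the class is handed to the engine at runtime instead of being named in the configuration.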

What problems does Dynamic Workflows solve for platforms?

Consider a CI/CD product where every repository defines its own pipeline. Or an AI agent platform where each agent writes its own durable plan. In these cases, the workflow is different for every tenant, agent, or request. Previously, you'd have to either force all logic into one massive class or deploy thousands of separate Workers (which is slow and complex). Dynamic Workflows solve this by letting you hand the engine code at runtime, so each pipeline or agent gets its own isolated, durable execution environment. This matches the approach of Durable Object Facets for storage (per‑app SQLite) and Artifacts for source control (per‑agent filesystem). Now the compute side is equally flexible.

How does Dynamic Workflows integrate with other Cloudflare dynamic primitives?

Dynamic Workflows completes a trio of dynamic building blocks. First, Dynamic Workers gave you per‑request compute isolation — hand the runtime code, get a sandboxed Worker in milliseconds. Then Durable Object Facets gave each dynamic app its own SQLite database. Artifacts provided a Git‑native versioned filesystem you can create by the tens of millions. Dynamic Workflows ties all this together by adding durable execution. Now a platform can spin up a Worker, give it a database, a filesystem, and a workflow — all customised per tenant, per session, or per agent, running on the same infrastructure with Cloudflare as the supervisor.


What are the primary use cases?

  • Multi‑tenant SaaS: Every customer runs their own business logic as a durable workflow, with isolated state and compute.
  • AI‑generated code: An AI writes TypeScript for each user request, and that code executes as a workflow that can run for days.
  • Agentic systems: Agents that write and run their own tools need a durable plan; Dynamic Workflows lets each agent define and persist its own orchestration.
  • CI/CD pipelines: Each repository defines its own pipeline logic, and the platform dynamically deploys it as a workflow that survives failures.
  • Onboarding flows and video transcoding: Any multi‑step process that is unique per tenant and must “keep going” past a single request.
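Several of these cases (AI‑generated code, per‑repository pipelines) share one mechanic: workflow logic arrives as a string at runtime. The sketch below simulates that hand‑off locally with new Function, purely for illustration. The real engine compiles tenant code into an isolated, sandboxed Worker rather than evaluating it in‑process, and every identifier here is hypothetical.

```typescript
type Step = { do<T>(name: string, fn: () => Promise<T>): Promise<T> };

// Workflow source as it might arrive per tenant (or from an AI) at runtime.
const tenantSource = `
  return (async () => {
    const total = await step.do("sum", async () => event.payload.a + event.payload.b);
    return total * 2;
  })();
`;

// Compile the string into a run(event, step) function. Local stand-in only:
// new Function provides no isolation; the real engine uses sandboxed Workers.
const run = new Function("event", "step", tenantSource) as (
  event: { payload: { a: number; b: number } },
  step: Step
) => Promise<number>;

// A pass-through step implementation, just enough to execute the code.
const step: Step = {
  async do<T>(_name: string, fn: () => Promise<T>) {
    return fn();
  },
};

const result = await run({ payload: { a: 2, b: 3 } }, step);
console.log(result); // 10
```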

How do performance and isolation work?

Dynamic Workflows inherit the performance characteristics of the Workers runtime: code is compiled and run in isolated V8 isolates, with single‑digit‑millisecond startup once the code is warm. Each workflow instance gets its own sandbox, so one tenant's runaway loop cannot affect another tenant's execution. The platform sits in front as a supervisor, managing resource limits and lifecycle. Because the workflow code is supplied at runtime, you avoid the overhead of deploying a separate Worker bundle for each tenant; the dynamic engine handles it all on the same machine. This makes it practical to run thousands (or more) of simultaneous durable executions, each with its own code and state, without sacrificing latency or security.

What’s the next step for developers?

Dynamic Workflows are now available in open beta. To get started, you can use the Workflows API combined with the Dynamic Workers primitive: supply the workflow class code as part of your Worker's request handler. The engine will compile it, bind it to the durable execution runtime, and start the workflow. The documentation includes examples of AI‑generated workflows and multi‑tenant CI/CD pipelines. Over time, Cloudflare plans even tighter integration with Durable Object Facets and Artifacts, so you can provision an entire tenant environment (compute, storage, filesystem, and durable execution) in a single API call. Try it today and see how easy it is to make every tenant feel like they own the infrastructure.
