For developers building across multiple cloud environments, legacy infrastructure continues to present familiar hurdles: rigid networking setups, complex connectivity requirements, and steep data egress fees. Cloudflare’s latest announcements directly target these pain points with the release of Workers VPC and Workers VPC Private Link, two new offerings designed to help developers build and connect secure applications across cloud and on-premises systems.
“We constantly hear from developers that they want the freedom to build with Cloudflare Workers, but legacy cloud providers make it challenging,” said Cloudflare CEO Matthew Prince. “Developers deserve to be able to build with whatever tools they want, no matter where their data lives or what infrastructure they rely on.”
Unveiled during Cloudflare’s Developer Week, Workers VPC and Workers VPC Private Link bring virtual private cloud (VPC) functionality to the company’s developer platform, enabling engineers to isolate and connect workloads securely across distributed environments.
“Workers VPC is a modern approach to the traditional VPC model, built for network and compute workloads that are not tied to a single region,” Cloudflare noted in a blog post.
Workers VPC allows developers to create isolated environments where resources like Workers, Durable Objects, and storage can only communicate if they exist within the same VPC. The model mimics traditional cloud isolation practices but removes regional constraints and vendor lock-in.
Workers VPC Private Link, now in closed beta, enables secure, direct communication between a Cloudflare Workers VPC and external VPCs across AWS, Azure, GCP, or private data centers. It’s a critical offering for enterprises building hybrid applications that span multiple infrastructures. Developers can now link Workers directly to their backend systems without exposing traffic to the public internet.
This setup offers an end-to-end secure architecture: compute and storage remain within Cloudflare, while Private Link provides the bridge to external systems. Both features reduce architectural complexity and eliminate the need for manual workarounds that developers previously had to implement.
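The Private Link bindings are still in closed beta and have not been documented publicly, so the configuration details may well differ, but the basic shape of the pattern, a Worker reaching an internal backend over a binding that routes through the private link rather than the public internet, might look roughly like the sketch below. The `BACKEND` binding, its Fetcher-style interface, and the `orders.internal` hostname are all hypothetical.

```ts
/// <reference types="@cloudflare/workers-types" />
// Hypothetical sketch only: the Private Link beta bindings are not public.
// The BACKEND binding name, its Fetcher-style interface, and the internal
// hostname below are assumptions for illustration.

export interface Env {
  // Assumed to be declared in the Worker's configuration as a binding that
  // routes through Workers VPC Private Link instead of the public internet.
  BACKEND: Fetcher;
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    // Because traffic rides the private link, the origin can be an internal,
    // non-publicly-routable hostname inside the external VPC.
    const upstream = await env.BACKEND.fetch("https://orders.internal/api/v1/orders", {
      headers: { Accept: "application/json" },
    });

    if (!upstream.ok) {
      return new Response("Upstream error", { status: 502 });
    }
    return new Response(upstream.body, {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```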
While Cloudflare has been steadily evolving from a CDN and security vendor into a full-stack developer platform competing with AWS Lambda and Azure Functions, it’s also addressing the operational complexities of managing enterprise-scale deployments. That’s where Cloudy comes in.
Cloudy is an AI assistant now integrated into Cloudflare’s Web Application Firewall (WAF) and Gateway. It’s designed to help teams interpret, audit, and optimize their existing configurations.
The idea originated in 2023 when Cloudflare introduced natural language prompts to generate WAF rules. Customers could enter simple commands like “Block all POST requests with file uploads over 10MB,” and the system would produce precise firewall logic. The tool was well-received for lowering the barrier to writing rules.
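For illustration, a prompt like the one above might compile down to an expression along these lines in Cloudflare’s rules language; the exact field names (particularly the one for body size) are indicative rather than exact, and a production rule would likely also constrain the content type to multipart uploads.

```
http.request.method eq "POST" and http.request.body.size gt 10485760
```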
Cloudy now tackles the next challenge: helping customers manage vast and often outdated rule sets. In Cloudflare Gateway, where enterprises manage traffic policies across thousands of endpoints, Cloudy generates plain-language Policy Summaries, highlights conflicts or redundancies, and flags inactive but potentially important rules.
In the WAF, Cloudy reviews both custom and managed rules, identifies overlaps, flags deactivated rules, and offers suggestions based on live traffic patterns. These tasks, once reliant on teams manually parsing JSON or navigating long rule tables, are now largely automated.
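Before Cloudy, that audit typically meant pulling rules out of the API and eyeballing them. A rough TypeScript sketch of the manual workflow Cloudy replaces, using the public rulesets endpoints, is below; the zone ID and token are placeholders, and the response typing is simplified.

```ts
// Rough sketch of the manual audit Cloudy automates: pull a zone's custom WAF
// ruleset from the Cloudflare API, then flag disabled rules and rules whose
// expressions exactly duplicate an earlier one. Zone ID, token, and the
// simplified response shapes are placeholders/assumptions.

const ZONE_ID = "YOUR_ZONE_ID";
const API_TOKEN = "YOUR_API_TOKEN";
const API = "https://api.cloudflare.com/client/v4";

interface Rule {
  id: string;
  expression: string;
  action: string;
  enabled: boolean;
  description?: string;
}

async function cf(path: string): Promise<any> {
  const res = await fetch(`${API}${path}`, {
    headers: { Authorization: `Bearer ${API_TOKEN}` },
  });
  if (!res.ok) throw new Error(`Cloudflare API error: ${res.status}`);
  return (await res.json()).result;
}

async function auditCustomRules(): Promise<void> {
  // Locate the zone's custom-rules ruleset, then fetch its rules.
  const rulesets: Array<{ id: string; phase: string }> = await cf(`/zones/${ZONE_ID}/rulesets`);
  const custom = rulesets.find((r) => r.phase === "http_request_firewall_custom");
  if (!custom) return;

  const ruleset: { rules?: Rule[] } = await cf(`/zones/${ZONE_ID}/rulesets/${custom.id}`);
  const rules = ruleset.rules ?? [];

  // Flag rules that are switched off but still sitting in the configuration.
  for (const rule of rules.filter((r) => !r.enabled)) {
    console.log(`Disabled rule ${rule.id}: ${rule.description ?? rule.expression}`);
  }

  // Flag rules whose expressions exactly duplicate an earlier rule.
  const seen = new Map<string, string>();
  for (const rule of rules) {
    const prior = seen.get(rule.expression);
    if (prior) {
      console.log(`Rule ${rule.id} duplicates the expression of rule ${prior}`);
    } else {
      seen.set(rule.expression, rule.id);
    }
  }
}

auditCustomRules().catch(console.error);
```

This only catches exact duplicates and disabled rules; the harder part of the audit, spotting semantic overlaps and judging which dormant rules still matter, is the reasoning work Cloudy is meant to take on.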
Cloudflare reports that Cloudy has already saved thousands of hours internally, especially within support and engineering teams. Instead of requesting screenshots or logs, support engineers can now prompt Cloudy to summarize a customer’s setup and pinpoint likely issues.
Prince emphasized that the tool is designed to enhance human capabilities, not replace them. “AI has helped us not replace people, but help make people better,” he said. “No code would ever get released without significant human review. No human code would get released without AI review.”
Cloudy is part of a broader internal effort to integrate AI into Cloudflare’s infrastructure and operations. The company’s Trust & Safety team, for instance, now uses large language models to assist with policy enforcement, customer complaints, and threat detection.
Prince pointed to a broader industry shift toward human-machine collaboration, citing Google’s claim that 25% of its new code is AI-generated and Microsoft’s prediction that up to 95% of code could eventually be produced by AI. But at Cloudflare, security and accuracy remain the priority.
Others in the tech industry share this mindset. Cisco’s Liz Centoni, who began her career in coding, has said technical grounding helps her ask better questions. Google Research’s Yossi Matias continues to advocate for strong engineering fundamentals. And Okta CEO Todd McKinnon recently emphasized that AI won’t eliminate software engineering jobs, but that those jobs will evolve.
For Prince, the company’s technical direction is personal. A computer science graduate who later earned both a JD and MBA, he said his engineering background still informs how he leads.
“Even if you’re not the person who has your hands on the keyboard writing the code anymore, I think a basic understanding is helpful,” he told Business Insider. That understanding, he believes, makes him a better CEO, one able to speak fluently with both engineers and legal teams.
As for fears about job displacement from AI, Prince remains firm: “Especially in a field that is as security conscious as we are, no code would ever get released without significant human review.” And, as he reiterated, no human code would get released without AI review.
Internal pilots have shown that Cloudflare’s machine learning tools can uncover previously undetected threats, automate support tasks, and improve customer satisfaction scores. And it’s not fewer jobs, just better ones.