I’ve had many conversations over the years with technical leaders from our customers and prospects about how they can better connect and automate their organizations. While the specific applications change and technology trends evolve, the one constant has always been the need to secure and protect data. But in recent years, with the acceleration of AI projects, the tone has shifted from mere corporate policy to critical business imperative.
When I describe the unique architecture of the Boomi Enterprise Platform, with our cloud platform control plane combined with a distributed runtime data plane, someone usually says, “OK, so Boomi has a local agent to connect to on-premises systems. I get it.” That’s when I reply, “There’s more to it than that.”
My explanation goes something like this:
Our runtime is not just a data-fetcher that provides last-mile connectivity but still sends your data back to its mothership for processing. It’s a self-contained engine that does the actual processing and connects directly to your applications, so your data never goes where you don’t want it. It can scale up to elastic enterprise Kubernetes clusters or down to run on a point-of-sale device. You can deploy runtimes to private clouds, public clouds, on premises, in any combination, anywhere in the world, and remotely manage and monitor them from a single control plane.
If you want your AI initiatives to move from hype to meaningful business value, you need to equip them with your data and your tools to take action. You can’t have your “secret sauce” out in the cloud, co-mingling with the rest of the Internet in some public model. You need an architecture that allows you to keep your data locally, where it’s safe. Boomi does that with a distributed runtime that separates the control plane and the data plane. That’s why our customers see measurable results from AI.
Once I’ve connected those dots, the light bulb moment happens. Technical leaders, such as CIOs and CTOs, recognize that while everyone today is talking about the growing significance of digital sovereignty, very few platforms actually have the infrastructure to address it.
That’s why Boomi is perfectly positioned to handle an issue that’s fast becoming a priority challenge for every enterprise.
The Technologist’s Lens on Sovereignty
Digital sovereignty, which refers to how technology and assets, such as sensitive data, are subject to the laws of the country or region where they’re physically located, is a thorny dilemma in the current global environment. Governments are wary of sensitive data leaving their borders and have been erecting higher walls to prevent it. And it’s a concern for businesses, too.
As Boomi’s Principal Product Manager for Integration Services, I’ve always approached sovereignty from a technologist’s perspective. It’s about flexibility and control. Businesses should have the choice about where their data lives, where it’s processed, and which software uses it. When required, information should remain within networks under their physical control.
When you know exactly where your data is, who is accessing it, and how it’s used, you never lose custody of your most valuable asset.
That was important long before the explosion of AI or the rise of geopolitics as a factor. It has always been a guiding principle at Boomi.
The Distributed Runtime Story
I joined Boomi in 2006 as one of the first 20 employees. We offered an on-premises EDI solution at the time, but the cloud was beginning to gain mainstream traction. One of our co-founders had the inspired idea to pivot and become the first hybrid integration company.
Back then, there were a few fledgling cloud-only integration products. For any enterprise-grade integration involving on-premises endpoints, you had to install and manage a legacy product on a server in your data center.
But Boomi changed that game with a patented runtime design.
The magic lies in Boomi’s core architecture, which decouples the platform’s operating software into two separate components: a central control plane, managed in an easily accessible cloud, and a data plane runtime, a portable engine that performs the work. That runtime (originally called the “Atom”) could sit wherever the customer desired and “phone home” to receive instructions. But the actual executions — the mapping, transformations, system connectivity — occurred exclusively within that runtime, locally in the customer’s environment. Here’s the key point: The processing and the persistence of job data and logs remained local.
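The control-plane/data-plane split described above can be sketched in a few lines of code. This is a hypothetical illustration, not Boomi’s actual implementation or API: all names (`ControlPlane`, `LocalRuntime`, `atom-1`) are invented for the example. The point it demonstrates is the separation of concerns: the control plane distributes instructions and receives only status metadata, while payload data is processed and persisted entirely inside the local runtime.

```python
# Illustrative sketch of a control-plane / data-plane split.
# All class and field names are hypothetical, not Boomi APIs.

from dataclasses import dataclass, field

@dataclass
class ControlPlane:
    """Cloud side: holds deployment instructions, receives only status metadata."""
    deployments: dict = field(default_factory=dict)   # runtime_id -> list of jobs
    status_log: list = field(default_factory=list)    # metadata only, never payloads

    def fetch_instructions(self, runtime_id):
        # What a runtime receives when it "phones home."
        return self.deployments.get(runtime_id, [])

    def report_status(self, runtime_id, job_id, ok):
        # Only job outcomes travel back to the cloud.
        self.status_log.append({"runtime": runtime_id, "job": job_id, "ok": ok})

@dataclass
class LocalRuntime:
    """Data plane: executes transformations locally; payloads never leave it."""
    runtime_id: str
    plane: ControlPlane
    local_records: list = field(default_factory=list)  # persisted on premises

    def poll_and_run(self, payload):
        # Phone home for instructions, then process entirely locally.
        for job in self.plane.fetch_instructions(self.runtime_id):
            result = [job["transform"](rec) for rec in payload]
            self.local_records.append(result)           # data stays local
            self.plane.report_status(self.runtime_id, job["id"], ok=True)
        return self.local_records

# Usage: the control plane deploys a mapping; the runtime executes it in place.
plane = ControlPlane()
plane.deployments["atom-1"] = [
    {"id": "job-1", "transform": lambda rec: {**rec, "synced": True}},
]
runtime = LocalRuntime("atom-1", plane)
runtime.poll_and_run([{"customer": "Acme"}])
```

After the run, `plane.status_log` contains only job metadata, while the transformed customer record exists solely in `runtime.local_records` — the recipe went out, the ingredients never did.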
When we launched the platform in 2009, Boomi effectively created an entirely new business category: iPaaS (Integration Platform as a Service). It was elegant because it allowed for complete adaptability, connecting everything, anywhere, while data never left the business’s possession. Because the platform saw only anonymized metadata, Boomi provided the recipe but never saw the ingredients.
Over time, the Boomi platform has expanded to include a wealth of capabilities, including API and data management. It is also a trendsetter in the emerging field of AI agent management. But the needs surrounding data access and governance that we initially solved for integration have laid the perfect foundation for addressing those same sovereignty concerns for today’s APIs and AI agents.
The Infrastructure Landscape Pendulum
A confluence of issues is contributing to the sovereignty dilemma for enterprises. In an interconnected world, business is a global endeavor. Proprietary data can be here, there, everywhere. However, as many nations take action to protect sensitive data, such as citizens’ personally identifiable information (PII), the result is a complex web of competing mandates. It’s increasingly challenging for technology leaders to ensure they stay within the bounds of data rules.
But beyond government regulations, something else has been happening. There’s a fundamental shift in how businesses want to run their operations. Back when I first joined Boomi, it was still primarily an on-premises world. Then came a mad rush to the cloud to leverage the latest-and-greatest technologies and cut costs.
In fact, the initial motivation for our distributed architecture was a response to the emerging hybrid enterprise, with businesses rapidly adopting SaaS applications (such as Salesforce). Customers’ application landscapes — and therefore their businesses — were now fractured, and it was our job to reconnect them so they could operate seamlessly as a whole once again, wherever those workflows needed to run.
Today, with the emergence of AI, the pendulum is swinging back toward on-premises infrastructure, or at least private or in-region clouds, for critical operations. With contextual, proprietary data being so crucial to AI for training models and equipping agents with the tools to perform tasks, companies are re-evaluating the wisdom of storing so much of it in a public cloud. Is it more responsible to have it tucked away safely behind firewalls?
Every business is mulling over that question. There’s no right or wrong answer. In fact, the best answer “right now” constantly evolves. But that’s just the point: regardless of your application and infrastructure strategy at the moment, you need a platform that provides flexibility and governance to activate your data for responsible AI and automation. It’s a requirement for every business to be able to place integrations and APIs wherever needed — and still manage them all from a single central control plane.
That’s the Boomi distributed runtime story I’ve been telling for a long time. But those discussions have never been more relevant than they are now.
What’s Next: The third post in our series explores how Boomi Hosted Runtimes meet businesses wherever they are in their data management journey.