The conversation is changing. For the first time ever, the person or thing on the other side of an interaction isn’t always human. Every time I talk with other executives, the “agentic future” comes up. It’s a compelling idea: agents replacing aging systems and actually solving problems for us without constant hand-holding.

With more than a billion AI agents poised to handle everything from customer complaints to complex trades by 2029, the hurdle isn’t the tech itself. It’s whether we can actually trust it. The reality is that most businesses are stuck in the pilot stage. Not for lack of imagination, but because we don’t have the right tools to move from a cool demo to a smart system that works safely at scale.

The old plumbing, or legacy infrastructure, wasn’t built for an agentic future. Workflows break easily. Data is trapped in silos. Trust is bolted on rather than built in. The result: As we deploy more agents, complexity will turn into chaos.

What’s missing is a trusted, neutral middle ground, a Switzerland for the modern tech stack. As billions of these interactions happen, we need a layer that acts like a nervous system, connecting and coordinating every app and agent. Think of it as a conversational command center that fixes the trust gap by focusing on three things: identity, governance, and visibility.

Identity: Verify Who Is Doing What

Let’s say you task an agent with purchasing an expensive driver that’ll add 20 yards off the tee, or, in my case, one with enough AI to help me find the fairway more often. The retailer needs to know in real time that it was actually you who authorized the purchase, not some bad actor or rogue agent trying to improve its own handicap.

And as agents get more autonomy, the stakes get higher. A several-hundred-dollar golf club purchased without approval is a nuisance. An unsanctioned bank transfer or a leaked confidential email is a disaster.

This goes far beyond the traditional machine-to-machine logins and identity tools we’ve used for years. Unlike traditional machines that follow a fixed script, agents use “reasoning” that is fluid and responds to each situation differently. They are built to work around problems and develop new skills. Expecting old-school authentication, built for systems that react the same way every time, to police autonomous agents sets us up for disaster.

Forget one-time logins. Identity in the agentic era has to be alive, dynamic, and real-time, constantly checking user intent and behavior against specific rules. That’s how you make interactions secure, whether you’re talking about a person or a bot.
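A minimal sketch of what per-action verification could look like, assuming a hypothetical `verify_action` helper, a short-lived delegation token, and illustrative scope names; none of this reflects a specific product or standard. The point is that identity is re-checked at the moment of each action, not once at login:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class AgentAction:
    agent_id: str
    principal: str          # the human the agent is acting for
    action: str             # e.g. "purchase"
    amount: float
    timestamp: datetime

# Illustrative threshold: delegation tokens go stale quickly,
# forcing continuous re-verification instead of a one-time login.
MAX_TOKEN_AGE = timedelta(minutes=5)

def verify_action(action: AgentAction, token_issued_at: datetime,
                  delegated_scopes: set[str]) -> bool:
    """Re-verify identity and intent for each individual action."""
    now = datetime.now(timezone.utc)
    # 1. The delegation token must still be fresh.
    if now - token_issued_at > MAX_TOKEN_AGE:
        return False
    # 2. The action must fall inside what the human actually delegated.
    if action.action not in delegated_scopes:
        return False
    return True
```

A fresh token scoped to `"purchase"` lets the golf-club order through; a stale token, or a wire transfer the human never delegated, is refused at the point of action.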

Governance: Define What Is Happening

Agents are autonomous by design. They’re meant to go off and do things on their own. To do this safely, they need clear, defined guardrails and policies that say what systems, applications, or data they have permission to access, and for how long.

Let’s revisit the agent buying your driver. Instead of sticking to your budget, it orders a $1,200 model. Without proper governance, how would you know? How would you stop it? Governance isn’t just about setting rules—it’s about enforcing them in real time, ensuring agents operate within boundaries that align with business goals and compliance requirements.
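One way to picture real-time enforcement is a guardrail wrapped around the agent’s purchasing capability, so the policy is checked at the moment of action rather than discovered on next month’s statement. This is a sketch under assumed names (`GovernedPurchaser`, `BudgetExceeded`), not any particular governance framework:

```python
class BudgetExceeded(Exception):
    """Raised when an agent tries to spend past its delegated budget."""

class GovernedPurchaser:
    """Wraps an agent's purchase capability with a hard spending guardrail."""

    def __init__(self, budget: float):
        self.budget = budget
        self.spent = 0.0

    def purchase(self, item: str, price: float) -> str:
        # Enforce the policy before the action happens, not after the fact.
        if self.spent + price > self.budget:
            raise BudgetExceeded(
                f"{item} at ${price:.2f} would exceed the ${self.budget:.2f} budget"
            )
        self.spent += price
        return f"ordered {item}"
```

With a $500 budget, the mid-range driver goes through and the $1,200 model is blocked at the point of purchase, answering both questions at once: you know, and it stops.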

Visibility: See Everything, Control Everything

Visibility is the third pillar. In an agentic future, every action—every decision, every transaction—must be logged, monitored, and auditable. Without visibility, you’re flying blind. You can’t trust what you can’t see.

Imagine an agent processing a high-value trade. Without real-time visibility, you wouldn’t know if it’s acting on outdated instructions or if a security breach is underway. Visibility ensures that every interaction is transparent, accountable, and aligned with organizational policies.
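To make “logged, monitored, and auditable” concrete, here is one minimal sketch: an append-only log in which each entry hashes the one before it, so altering history is detectable. The class and field names are illustrative, and real deployments would use purpose-built audit infrastructure:

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditLog:
    """Append-only log: each entry carries a hash chained to the previous
    entry, so tampering with any recorded action is detectable."""

    def __init__(self):
        self.entries = []
        self._last_hash = "genesis"

    def record(self, agent_id: str, event: str, detail: dict) -> None:
        entry = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "agent": agent_id,
            "event": event,
            "detail": detail,
            "prev": self._last_hash,
        }
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = self._last_hash
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute the chain to confirm no entry was altered."""
        prev = "genesis"
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev"] != prev:
                return False
            h = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if h != e["hash"]:
                return False
            prev = h
        return True
```

An auditor replaying the chain can confirm every trade the agent submitted is exactly as recorded; a quietly edited quantity breaks the hash chain and surfaces immediately.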

The agentic future isn’t just about deploying more AI—it’s about deploying it safely, securely, and at scale. The tools we’ve relied on for decades won’t cut it anymore. We need a new layer, a neutral Switzerland for the tech stack, to bridge the gap between promise and reality.