From Risk to Reward: Our experts' top 4 tips on risk mitigation for AI

Recently we brought together some of the brightest minds in technology, risk and policy to tackle one of the most urgent questions facing business and government today: how do we navigate the evolving risk and compliance landscape in the era of AI?
The conversation was led by a powerhouse panel featuring Major General (Ret'd) Marcus Thompson AO, PhD, Australia's first Head of the Department of Defence's Information Warfare Division. Marcus is one of the country's most respected voices on cybersecurity, with deep experience across military operations, national security and critical infrastructure. His insights on AI, risk and sovereignty had the room leaning in. He was joined by Macquarie Technology Group's Head of Policy and Industry, Jamie Morse, and Macquarie Data Centres' VP of Sales, Gavin Dudley; together they gave insight into the risks and opportunities businesses need to be aware of when embarking on their AI journey.
So, what did we learn? Here are four big takeaways every business, especially those in critical infrastructure, should be paying close attention to.
1. Regulations Are Coming. Don't Wait for Canberra to Catch Up.
Regulations around AI and data are tightening, fast. The federal government may have moved cautiously in the past, but that won't be the case with AI. "The wheels of Government move slowly but grind deeply," explained Dudley.
"We're seeing the government steadily turn up the dial," said Jamie Morse.
Right now, the federal government is developing a National AI Capability Plan, a roadmap for how AI will be rolled out across public services. But this isn't just about government adoption. It has major flow-on effects for the private sector too, especially organisations deemed critical infrastructure, like finance, telco, health and energy.
As Jamie put it: "Government starts to see critical infrastructure organisations as an extension of government." Through emerging frameworks like the System of Government Significance, the lines between public and private sector responsibilities are blurring. If you're part of that critical infrastructure web, the policies coming out of Canberra won't just influence you; they'll apply to you.
Off the back of tightening Australian Federal Government policy, industry bodies will follow suit and develop industry-specific compliance requirements. We've already seen this with APRA's recent warning to financial institutions around AI.
Key Takeaway: Alignment with evolving Government policy will keep you ahead of the curve and reduce your risks down the track.
Take Action: Understand the changing policies that govern your industry and conduct an internal audit against frameworks like the PSPF (Protective Security Policy Framework), the ISM (Information Security Manual), SOCI (Security of Critical Infrastructure) and CPS 230.
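If you want somewhere concrete to start, the sketch below shows one way a team might track that internal audit in a simple structure. It is illustrative only: the frameworks are the ones named above, but the sample questions, owners and statuses are our own assumptions, not official control lists.

```python
# Illustrative only: a minimal internal audit tracker. The frameworks are
# the ones named above; the sample questions, owners and statuses are our
# own assumptions, not official control lists.
from dataclasses import dataclass

@dataclass
class AuditItem:
    framework: str               # e.g. "PSPF", "ISM", "SOCI", "CPS 230"
    question: str                # the control or obligation being checked
    owner: str                   # who is accountable internally
    status: str = "not_assessed" # not_assessed / compliant / gap
    notes: str = ""

audit = [
    AuditItem("SOCI", "Are our AI workloads hosted on assets captured by the SOCI Act?", "Risk"),
    AuditItem("CPS 230", "Do our AI vendors appear in our material service provider register?", "Procurement"),
    AuditItem("ISM", "Are AI training datasets classified and handled per our data policy?", "Security"),
    AuditItem("PSPF", "Do government-facing AI services meet protective security obligations?", "Compliance"),
]

outstanding = [item for item in audit if item.status != "compliant"]
print(f"{len(outstanding)} of {len(audit)} audit items still need assessment or remediation.")
```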
2. Sovereignty is No Longer Optional.
Where your AI systems live—physically and jurisdictionally—has real consequences. And sovereignty isn’t just about where your data sits. It’s about who owns the infrastructure, who controls the AI models, and importantly, what laws apply to it.
“We’re in a time of massive geopolitical uncertainty,” said Marcus. “This is not the time to leave your most critical infrastructure in someone else’s hands.”
Government policy is evolving too. Jamie pointed out that "sovereign capability" is something government departments are being given specific direction to prioritise. The government has also recently tightened the definition of what it means to be a sovereign Australian company; it's no longer just any business with an ABN.
From legal jurisdiction to foreign interference, the risks of holding data offshore are growing with rising geopolitical tensions.
Key Takeaway: Where your AI lives matters. Sovereign infrastructure gives you control over where your data sits, who runs it and which laws apply, and it reduces your risk.
Take Action: Assess where your data lives, who controls it, and what laws apply to it. Audit your AI and digital infrastructure footprint against sovereignty requirements.
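As a starting point, that assessment can be as simple as an inventory that records where each AI or data asset sits and which jurisdiction controls it. The sketch below is a hypothetical example; the asset names, fields and locations are assumptions for illustration.

```python
# Illustrative sketch: a simple inventory of AI and data assets that flags
# anything hosted offshore or controlled from another jurisdiction. The
# asset names, fields and locations are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    data_location: str           # physical jurisdiction where the data sits
    operator_jurisdiction: str   # who controls the infrastructure / whose laws apply
    contains_personal_data: bool

inventory = [
    Asset("customer-chatbot-model", "Australia", "Australia", True),
    Asset("training-data-lake", "United States", "United States", True),
    Asset("internal-rag-index", "Australia", "United States", False),
]

for asset in inventory:
    if asset.data_location != "Australia" or asset.operator_jurisdiction != "Australia":
        print(f"REVIEW: {asset.name} - data in {asset.data_location}, "
              f"operator subject to {asset.operator_jurisdiction} law")
```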

3. AI Risk Belongs on the Board Agenda.
One of the strongest messages of the night? AI risk isn’t just an IT problem.
"Risk is to be managed, not avoided. But it needs to be elevated—get it up to the board level," said Marcus.
Smart organisations are now treating AI as an enterprise risk, not a tech project. That means assessing it for operational, reputational, legal and compliance impacts. It means scenario planning. It means war-gaming what could go wrong before it does.
Marcus's main advice to enterprises exploring AI was to "quantify and recognise the risks and have them elevated into your organisation's risk register so it's getting attention where it needs to, with the right set of eyes on it, with the right frequency".
Whether it's a misfiring AI chatbot, a breach in your training data supply chain, or a compliance issue that forces you to unplug a live model, these aren't hypotheticals. They're fast becoming boardroom conversations.
Key Takeaway: Traditional risk frameworks need to evolve. You need to proactively forecast emerging risks. We recommend overlaying three critical dimensions:
- Reputational Risk – In the event of a breach, an AI misrepresentation such as bias or discrimination, or a poor customer experience, will your supply chain and demonstrable risk management processes stand up to scrutiny?
- Technology Risk – AI proofs of concept may succeed in isolation, but scaling them introduces a whole new class of long-term technology risks. These risks don't just sit in the model; they sit in the stack, the supply chain and the governance around them. Consider security vulnerabilities, data compliance and sovereignty, infrastructure scalability and supply chain vulnerabilities.
- Financial Risk – Build in the wrong environment now and you might be forced to move later, at great expense. Shifting data centres means downtime, replatforming, lost productivity and blown budgets. Add hidden costs like vendor lock-in and the risk multiplies fast.
Take Action: Involve the right people, create an AI risk register that tracks risks across all current and planned AI initiatives, and embed a mandatory risk review in every AI project's business case.
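To make that concrete, the sketch below shows one hypothetical shape a risk register entry could take, using the three dimensions above. The scoring scale, field names and example entry are assumptions, not a prescribed methodology.

```python
# Illustrative sketch of an AI risk register entry using the three dimensions
# above. The scoring scale, field names and the example entry are assumptions,
# not a prescribed methodology.
from dataclasses import dataclass
from datetime import date

DIMENSIONS = ("reputational", "technology", "financial")

@dataclass
class AIRisk:
    initiative: str      # the AI project or system the risk belongs to
    description: str
    dimension: str       # one of DIMENSIONS
    likelihood: int      # 1 (rare) to 5 (almost certain) - assumed scale
    impact: int          # 1 (minor) to 5 (severe) - assumed scale
    owner: str           # the executive accountable for the risk
    next_review: date    # review date agreed with the board

    def __post_init__(self):
        if self.dimension not in DIMENSIONS:
            raise ValueError(f"dimension must be one of {DIMENSIONS}")

    @property
    def rating(self) -> int:
        return self.likelihood * self.impact

register = [
    AIRisk("customer-service chatbot", "Model gives biased or misleading answers",
           "reputational", likelihood=3, impact=4, owner="Chief Risk Officer",
           next_review=date(2026, 1, 1)),
]

# Surface the highest-rated risks for the next board pack.
for risk in sorted(register, key=lambda r: r.rating, reverse=True):
    print(f"{risk.rating:>2}  {risk.dimension:<12} {risk.initiative}: {risk.description}")
```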
4. Build It Right From the Start.
One of the biggest risks facing organisations today isn’t just what AI they choose, but where they build it.
Deploying AI in a non-compliant or non-sovereign environment today could mean ripping it out tomorrow. Rebuilding from scratch. Losing productivity, IP and trust.
“There’s no premium for using a sovereign data centre,” said Gavin Dudley, VP of Sales at Macquarie Data Centres. “But there’s a massive cost if you get it wrong.”
Companies are already embedding AI in mission-critical functions and decision making. We're seeing it in everything from customer service to medical diagnostics to war logistics, as Marcus explained. Choosing the wrong infrastructure is a business risk that could stall innovation and leave you exposed. On the flip side, choosing a sovereign, compliant data centre puts you ahead of the curve: ready for regulation, trusted by customers and primed to win in an AI-driven economy.
Key Takeaway: Shortcuts today can mean setbacks tomorrow. Build AI systems in sovereign, compliant and scalable data centres from day one, or risk delays and costs down the track.
Take Action: Bake sovereignty and compliance into your procurement and build processes from day one. Ask your providers the hard questions now—before regulation or risk forces your hand.
Tame the Risk. Seize the Opportunity.
AI presents huge opportunities, but also huge responsibilities. Whether you’re in banking, healthcare, defence or enterprise tech, now’s the time to put sovereignty, compliance and risk at the centre of your AI strategy.
As Marcus said: “The better organisations treat AI as a genuine business risk—not one that lives in the IT department.”
If you’d like to explore what a sovereign, compliant AI-ready data centre looks like, we’d love to show you around. Book a tour or download our Sovereign AI eBook to learn more.