Posted by Andrew & Nada on 23rd July 2025
Boards don’t need AI experts – but they do need to step up

For boards, artificial intelligence (AI) continues to bring fresh risks, new opportunities, and a growing need for clarity of purpose.

AI is reshaping everything, from how we create content to how we analyse data and make decisions.

Tools like ChatGPT can generate entire reports or business plans at the push of a button. But they can also produce errors, invent facts, and leave organisations exposed to misinformation and liability.

The issue for boards isn’t about mastering the technical details. It’s about understanding their duty. AI is not just another tool; it’s a governance challenge.

Governance in the Age of AI
The use of generative AI continues to accelerate. From customer service to compliance, productivity gains are already being realised. But with great potential comes great complexity.

Too much of today’s guidance for boards on AI is vague, almost to the point of uselessness. It’s not enough to be told to ‘stay informed’ or ‘embrace digital transformation.’ Leaders need a practical framework for making informed decisions that deliver real value and manage real risks.

Where boards should focus
Firstly, boards must ensure the business has a clear, strategic approach to AI. This means understanding how AI can enhance service delivery, reduce costs, and help avoid human error.

It also means recognising AI’s unique ability to analyse large volumes of data and predict outcomes more accurately than any team of analysts.

But just as important is identifying where AI can go wrong. Generative AI can mislead users with confident-sounding but false information. That’s a reputational risk – and a governance one.

All too often, organisations focus on training employees to use new tools, but neglect the bigger question: How is this technology being integrated into the business strategy? Who’s responsible for ensuring it adds value? And where are the checks and balances?

A four-part framework
We propose a simple but powerful four-part lens through which boards should evaluate any AI initiative:

  • Competitive advantage: What edge does AI give us? Is it sustainable? Can we measure it?
  • Risk: What new vulnerabilities are we exposed to? Are we prepared?
  • Reputation: How will this affect how we’re seen by customers, regulators and investors?
  • Value: Taking it all together, does this technology actually deliver real, lasting value?

These principles aren’t new. But applying them rigorously to AI is vital.

Strategy is, after all, only 20% of the equation. The remaining 80% is execution, and that’s where things often fall apart.

Listen to the right people
Boards don’t need to become AI experts. But they do need to engage the right voices.

It’s not just about talking to the CEO or CTO. It’s about including the operational managers, the people who will actually implement the strategy. Their local knowledge is crucial. They know what will work in the real world, and what won’t.

If they’re not part of the conversation, strategy risks becoming abstract – or worse, completely disconnected from reality.

Too many boards create a culture where middle management is afraid to speak up. That’s a missed opportunity. Front-line insight can make or break the success of any AI initiative.

The stakes are high
Let’s be clear: The implications of AI are enormous. The UK’s AI sector is already contributing £9.1 billion to the economy, and that number is only set to grow.

At the same time, the global cost of cybercrime – increasingly AI-enabled – is projected to reach $10.5 trillion annually by 2025, which would make it the world’s third-largest ‘economy’ after the US and China.

The AI-related fraud prevention market is forecast to grow from $30 billion in 2021 to $250 billion by 2030. Yet the protection tools still lag behind the technologies they’re meant to govern.

Oversight, not expertise
Board directors don’t need to be coders or data scientists. But they do need to rigorously scrutinise how innovation affects the business.

Whether it’s AI, ESG, blockchain or climate-related advances, the duty of the board is always the same: Provide oversight that is informed, evidence-based, and aligned with the organisation’s strategic priorities.

By grounding every innovation in competitive advantage, risk, reputation, and value, boards can confidently lead through complexity without getting lost in technical jargon.

AI doesn’t demand a new type of director; it demands directors who are already doing their jobs well. And that means one thing above all: Clarity of duty.