
AI is Changing the Fraud Risks Facing Public Sector Organisations


January 2026, Tom Pepper, Partner
Published on: Public Finance




AI is powering both fraudsters and anti-fraud efforts, and each brings new risks for public sector organisations.


Public sector organisations are struggling to stay ahead of criminals using increasingly sophisticated methods as the use of AI to defraud the public purse ramps up.


Daniel Sibthorpe, director of cyber security and counter fraud at advisory firm Crowe, says the scale and pace of criminal activity means that public sector organisations are constantly playing catch-up. “Even if there are controls in place to prevent fraud happening, there are criminals halfway across the world who are spending all of their time trying to find weaknesses in those controls to exploit.”


A report by software company SAS highlighted a sharp rise in AI-powered fraud, including deepfake-driven social engineering: 77% of anti-fraud professionals reported an acceleration over the past two years and 83% expect these schemes to increase further, yet fewer than one in 10 feel well equipped to respond.


Sibthorpe says that, although AI makes it easier to detect fraud, novel AI-driven approaches are also making it easier for criminals to exploit weaknesses in public sector systems.

“What a lot of criminals are doing at the moment is purchasing leaked data on criminal forums and using generative AI tools to create very convincing fake passports, driver’s licences and other forms of ID that will pass a lot of KYC [know your customer] and background checks.


“If they have those identifiers, and then they’re supplementing it with a utility bill or council tax bill developed using AI, how are people on the front line possibly going to notice that?”

Sibthorpe says better data sharing across the public sector is helping to combat some of the risks.


“Tenancy fraud is an example where each council would have their own database, but if someone committed fraud in one local council, there was nothing to flag them up to others. In the past year or so, there have been developments in technologies to pick these things up much quicker.”
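
As a rough illustration of the kind of cross-council check Sibthorpe describes, the sketch below assumes a hypothetical shared register that a council could query before processing a tenancy application. The register, its field names and the data are invented for illustration and do not describe any real system.

```python
# Hypothetical sketch only: the shared register, identifiers and data below are
# invented to illustrate cross-council data sharing, not a real service.
from dataclasses import dataclass

@dataclass(frozen=True)
class FraudFlag:
    identifier: str   # hashed applicant identifier shared between councils
    council: str      # council that raised the flag
    reason: str       # short description of the earlier finding

# Stand-in for a shared, cross-council register of previous tenancy-fraud findings.
SHARED_REGISTER = {
    "a1b2c3": FraudFlag("a1b2c3", "Council A", "subletting a social tenancy"),
}

def check_applicant(hashed_id: str) -> FraudFlag | None:
    """Return any existing flag for this applicant, or None if nothing is recorded."""
    return SHARED_REGISTER.get(hashed_id)

if __name__ == "__main__":
    flag = check_applicant("a1b2c3")
    if flag:
        print(f"Review needed: previously flagged by {flag.council} ({flag.reason})")
    else:
        print("No prior flags found")
```

The point of the sketch is simply that a shared lookup turns information one council already holds into a signal another council can act on before an application is approved.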


Risks to manage


Meanwhile, the rush for public sector organisations to adopt AI could be increasing their risk of fraud, Sibthorpe warns. “We’ve already seen examples of cyber criminals inputting malicious code into chatbots to gain access to databases of information.”


Tom Pepper, partner at Avella Security and security lead at the government’s AI Security Institute, says local authorities are understandably drawn to AI by the promise of greater efficiency, deeper insights and relief for stretched teams. Yet adoption can introduce cyber security risks that cannot be ignored.


“AI systems depend on trustworthy data. If an attacker can alter a model’s data or feed it false or malicious information, they can influence outcomes in ways that support fraudulent claims or mask suspicious behaviour. There is also the risk of attackers probing how an AI system works, then shaping their submissions and prompts to avoid detection.

Generative AI adds another layer of complexity, allowing people to create seemingly genuine forged documents or synthetic identities that are harder for traditional checks to spot.”


To mitigate the risks, councils should understand where data comes from, how it is validated and who has access to it. “Regular assurance testing or red teaming can identify weak spots before they are exploited. Finally, strong audit trails and clear ownership ensure issues are picked up quickly.”
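
To make Pepper’s advice concrete, the sketch below shows one minimal way a council team might validate where a record came from and keep an audit trail of automated decisions. The source names, fields and file format are assumptions for illustration, not a description of any particular council’s systems.

```python
# Hypothetical sketch: validate the provenance of incoming records and keep a
# timestamped, attributable audit trail. All names and fields are invented.
import json
from datetime import datetime, timezone

# Sources the organisation has explicitly approved (an assumption for this sketch).
TRUSTED_SOURCES = {"council_records", "verified_upload_portal"}

def validate_source(record: dict) -> bool:
    """Accept records only if they come from an approved source."""
    return record.get("source") in TRUSTED_SOURCES

def log_decision(record: dict, decision: str, audit_file: str = "audit_log.jsonl") -> None:
    """Append an audit entry so issues can be traced back and owned later."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "record_id": record.get("id"),
        "source": record.get("source"),
        "decision": decision,
        "owner": "benefits_team",  # clear ownership of the decision
    }
    with open(audit_file, "a") as f:
        f.write(json.dumps(entry) + "\n")

if __name__ == "__main__":
    claim = {"id": "claim-001", "source": "verified_upload_portal"}
    decision = "accepted" if validate_source(claim) else "rejected_untrusted_source"
    log_decision(claim, decision)
    print(decision)
```

Simple controls like these will not stop a determined attacker, but they make it far easier to spot when untrusted data has entered a system and to trace how a decision was reached.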


Caroline Carruthers, chief executive of data consultancy Carruthers and Jackson, says a clear, robust strategy is the key to reaping the benefits of AI. “Developing a clear framework is vital to act as a ‘north star’ for navigating challenges around resources, knowledge gaps and ethical concerns, but company-wide AI policies are rarely fully polished when first installed. Re-evaluating and evolving a framework is a natural part of the process.”
