On 10th June 2025, the UK’s Information Commissioner’s Office (ICO) quietly issued an update that could have wide-reaching implications for financial services. Just days before the Data (Use and Access) Act 2025 received Royal Assent, the ICO confirmed it will publish a new statutory code of practice for AI and automated decision-making—aimed at clarifying transparency, fairness, and accountability when using personal data in AI models.
This is a development we welcome. Not because it adds another compliance hurdle, but because it provides much-needed clarity. And in financial services, clarity isn’t just compliance—it’s confidence. It empowers FS leaders to innovate responsibly and act decisively, without fear of crossing a regulatory line they didn’t see coming.
The Data (Use and Access) Act 2025 is the UK’s long-awaited post-Brexit update to data protection law. It aims to reduce red tape while maintaining EU adequacy and supporting innovation—particularly in sectors like FS where data use is complex, regulated, and high-stakes.
The ICO’s commitment to publish a statutory AI code of practice—expected within 6–12 months—signals a shift from theory to enforcement. And financial services companies are firmly in scope.
This isn’t just about getting ahead of AI regulation. It’s about protecting the trust you’ve built, reducing reputational and legal risk, and using data more intelligently across your business.
At Beyond: Putting Data To Work, we help FS businesses use data responsibly—ensuring compliance, driving growth, and reinforcing trust. Here’s what we recommend:
AI holds huge potential for FS firms—from hyper-personalisation to smarter risk modelling. But without the right safeguards, it also opens the door to unintended bias, opaque decisions, and customer mistrust.
The ICO’s upcoming code won’t remove those risks. But it will give you a roadmap to manage them—if you're ready.
We’ve spent years helping FS organisations use data and AI to gain competitive advantage—ethically, effectively, and with full regulatory awareness.
If you’re reviewing your AI governance, or want to benchmark your data readiness, get in touch. We’re here to help you turn uncertainty into action—and regulation into an edge.