UK financial services employees call for AI transparency and safeguards


The study, from communications data and intelligence provider Smarsh, found that over a third (37%) of financial services employees in the UK say they frequently use public AI tools such as ChatGPT or Microsoft 365 Copilot in their daily work. However, a majority (55%) report that they have never received formal training on how to use these technologies.

With AI use now widespread, transparency and compliance have become key concerns. Nearly 70 percent of respondents said they would feel more confident using AI tools if their outputs were monitored and captured for compliance. Yet 38 percent are unsure whether their organisation currently has systems in place to do this, and 21 percent say their employer definitely does not.

Compliance concerns over AI use and agent deployment

The report reveals that AI is not only being used to support internal productivity but is also being deployed in public-facing applications. More than two in five (43%) surveyed employees said their firm uses AI Agents – defined as autonomous systems capable of completing tasks without human oversight – for customer communications, including personalised financial advice. A further 22 percent reported the use of such agents in investment activities such as portfolio management or trade recommendations.


However, concerns about regulatory compliance persist. Nearly a third (31%) of employees expressed doubts about their organisation's ability to meet or apply the correct regulatory standards to AI Agents. In addition, 29 percent said they were unsure where potentially sensitive information was going when these tools were used.

Tom Padgett, President of Enterprise Business at Smarsh, said, “AI adoption in financial services has accelerated rapidly, with employees embracing these tools to boost productivity. But with innovation comes responsibility. Firms must establish the right guardrails to prevent data leaks and misconduct. The good news is that employees are on board – welcoming a safe, compliant AI environment that builds trust and unlocks long-term growth.”

AI growth outpacing oversight structures

The findings come as the Financial Conduct Authority (FCA) prepares to launch its AI live testing service, a programme intended to support the implementation of customer-facing AI tools within the sector. The regulatory development highlights the increasing focus on ensuring AI adoption aligns with consumer protection and compliance requirements.

Paul Taylor, Vice President of Product at Smarsh, raised concerns about uncontrolled use of public AI tools in regulated environments.

“Using public AI tools without controls is digital negligence,” he said. “You’re effectively feeding your crown jewels into a black box you don’t own, where the data can’t be deleted, and the logic can’t be explained. It’s reckless. Private tools like Microsoft 365 Copilot and ChatGPT Enterprise are a step in the right direction. Still, if companies aren’t actively capturing and auditing usage, they’re not securing innovation – they’re sleepwalking into a compliance nightmare.”

Alessandra Pacelli is a journalist and author contributing to HRreview, an HR news and opinion publication, where she covers topics including labour market trends, employment costs, and workplace issues. She is a journalism graduate and self-described lifelong dog lover who has also written for Dogs Today magazine since 2014.

