HRreview 20 Years

UK financial services employees call for AI transparency and safeguards


The study, from communications data and intelligence provider Smarsh, found that over a third (37%) of financial services employees in the UK say they frequently use public AI tools such as ChatGPT or Microsoft 365 Copilot in their daily work. However, a majority (55%) report that they have never received formal training on how to use these technologies.

With the widespread use of AI, transparency and compliance are now key concerns. Nearly 70 percent of respondents said they would feel more confident using AI tools if their outputs were monitored and captured for compliance. Yet 38 percent are unsure whether their organisation currently has systems in place to do this, and 21 percent say their employer definitively does not.

Compliance concerns over AI use and agent deployment

The report reveals that AI is not only being used to support internal productivity but is also being deployed in public-facing applications. More than two in five (43%) surveyed employees said their firm uses AI Agents – defined as autonomous systems capable of completing tasks without human oversight – for customer communications, including personalised financial advice. A further 22 percent reported the use of such agents in investment activities such as portfolio management or trade recommendations.


However, concerns about regulatory compliance persist. Almost a third (31%) of employees expressed doubts about their organisation’s ability to meet or apply the correct regulatory standards to AI Agents. In addition, 29 percent said they were unsure where potentially sensitive information was going when these tools were used.

Tom Padgett, President of Enterprise Business at Smarsh, said, “AI adoption in financial services has accelerated rapidly, with employees embracing these tools to boost productivity. But with innovation comes responsibility. Firms must establish the right guardrails to prevent data leaks and misconduct. The good news is that employees are on board – welcoming a safe, compliant AI environment that builds trust and unlocks long-term growth.”

AI growth outpacing oversight structures

The findings come as the Financial Conduct Authority (FCA) prepares to launch its AI live testing service, a programme intended to support the implementation of customer-facing AI tools within the sector. The regulatory development highlights the increasing focus on ensuring AI adoption aligns with consumer protection and compliance requirements.

Paul Taylor, Vice President of Product at Smarsh, raised concerns about uncontrolled use of public AI tools in regulated environments.

“Using public AI tools without controls is digital negligence,” he said. “You’re effectively feeding your crown jewels into a black box you don’t own, where the data can’t be deleted, and the logic can’t be explained. It’s reckless. Private tools like Microsoft 365 Copilot and ChatGPT Enterprise are a step in the right direction. Still, if companies aren’t actively capturing and auditing usage, they’re not securing innovation – they’re sleepwalking into a compliance nightmare.”

Alessandra Pacelli is a journalist and author contributing to HRreview, an HR news and opinion publication, where she covers topics including labour market trends, employment costs, and workplace issues. She is a journalism graduate and self-described lifelong dog lover who has also written for Dogs Today magazine since 2014.
