
UK financial services employees call for AI transparency and safeguards


The study, from communications data and intelligence provider Smarsh, found that over a third (37%) of financial services employees in the UK say they frequently use public AI tools such as ChatGPT or Microsoft 365 Copilot in their daily work. However, a majority (55%) report that they have never received formal training on how to use these technologies.

With the widespread use of AI, transparency and compliance are now key concerns. Nearly 70% of respondents said they would feel more confident using AI tools if their outputs were monitored and captured for compliance. Yet 38% are unsure whether their organisation currently has systems in place to do this, and 21% say their employer definitively does not.

Compliance concerns over AI use and agent deployment

The report reveals that AI is not only being used to support internal productivity but is also being deployed in public-facing applications. Over two-fifths (43%) of surveyed employees said their firm uses AI Agents – defined as autonomous systems capable of completing tasks without human oversight – for customer communications, including personalised financial advice. A further 22 percent reported the use of such agents in investment activities such as portfolio management or trade recommendations.

However, concerns about regulatory compliance persist. Almost a third (31%) of employees expressed doubts about their organisation’s ability to meet or apply the correct regulatory standards to AI Agents. In addition, 29 percent said they were unsure where potentially sensitive information was going when these tools were used.

Tom Padgett, President of Enterprise Business at Smarsh, said, “AI adoption in financial services has accelerated rapidly, with employees embracing these tools to boost productivity. But with innovation comes responsibility. Firms must establish the right guardrails to prevent data leaks and misconduct. The good news is that employees are on board – welcoming a safe, compliant AI environment that builds trust and unlocks long-term growth.”

AI growth outpacing oversight structures

The findings come as the Financial Conduct Authority (FCA) prepares to launch its AI live testing service, a programme intended to support the implementation of customer-facing AI tools within the sector. The regulatory development highlights the increasing focus on ensuring AI adoption aligns with consumer protection and compliance requirements.

Paul Taylor, Vice President of Product at Smarsh, raised concerns about uncontrolled use of public AI tools in regulated environments.

“Using public AI tools without controls is digital negligence,” he said. “You’re effectively feeding your crown jewels into a black box you don’t own, where the data can’t be deleted, and the logic can’t be explained. It’s reckless. Private tools like Microsoft 365 Copilot and ChatGPT Enterprise are a step in the right direction. Still, if companies aren’t actively capturing and auditing usage, they’re not securing innovation – they’re sleepwalking into a compliance nightmare.”
