Cyber attacks can cut to the core of any organisation, with the potential to severely impact the reputation, performance, and finances of those that experience an incident, says Parisa Bazl.

Cumulatively, the cost is truly enormous, with one widely cited estimate putting the global annual cost of cybercrime at $10.5 trillion by 2025.

Aside from the huge business impact these situations often generate, there is a worrying and often overlooked human element that can have serious personal consequences for those involved: in particular, employees targeted by cybersecurity threat actors and the cybersecurity professionals tasked with mitigating the impact of an attack. In either case, they can suffer a range of serious and long-lasting harms, which, according to a study from the Royal United Services Institute (RUSI), include everything from psychological, physical, and financial problems to reputational and social ones.

As the RUSI study puts it, ransomware can ruin lives, with incidents reported to “have caused individuals to lose their jobs, evoked feelings of shame and self-blame, extended to private and family life, and contributed to serious health issues.”

Data published last year, for example, showed that nearly two-thirds of cybersecurity incident responders seek out mental health assistance due to the demanding nature of responding to cyber attacks. Elsewhere, a 2022 study revealed that one in seven security staff experiences trauma symptoms months after an attack, with one in five considering a job change as a result.

Cultural learning

So, what needs to change to turn this situation around? Key to the whole process is a clear understanding that everyone within modern, digitally centric organisations can be vulnerable to the impact of a security incident. At the heart of this approach is building an organisational culture that actively embraces knowledge sharing and recognises the role of strong communication in both preventing attacks and mitigating their subsequent impact. Employees should not only share responsibility for effective cybersecurity but also play a supportive role in easing the fear and stigma victims often experience.

In this context, education and training play a crucial role in preventing breaches and limiting their psychological impact on those involved. Given the increasing sophistication of highly convincing AI-powered strategies by threat actors, the urgency is becoming greater than ever at all organisational levels. In practical terms, team training programs should incorporate real-world scenarios that not only detail cybersecurity incidents but also explore their psychological impacts, thereby fostering a deeper understanding and empathy among all employees.

Understandably, effective cybersecurity is highly geared towards deeply technical issues that require specialist knowledge, experience, and sophisticated tools. However, the growing scope for manipulating just about anyone, irrespective of their technical competence, is compounding the problem. The rapid development of AI is just one reason: it is now far easier for attackers to generate convincing phishing emails, create highly realistic deepfakes, and produce malware. It is therefore vital that processes are in place to prevent social engineering strategies from succeeding.

For example, implementing process checks and limitations in areas such as money transfers can help narrow the scope for attacks to succeed. Organisations that plan ahead and examine potential areas of vulnerability for non-technical employees are far more likely to defeat deepfake attacks and other sophisticated tactics than those that assume they won't be targeted.

This approach should also extend to the support provided to employees who are the victims of an attack or are part of the team responsible for mitigation and recovery. Here, internal support mechanisms play a vital role, giving employees access to the resources they need to protect their mental health. Rather than blaming individuals for mistakes that anyone, from the most junior employee to the CEO, could make, organisations should focus on learning collectively from their experiences.

What is the danger of unreported incidents?

Without this positive cultural system in place, organisations run the very real risk that employees simply will not report cybersecurity incidents to management, particularly out of fear of the repercussions they may face. Indeed, research published last year on the problem showed that over 40 percent of cyber attacks were not disclosed to internal management. Of those people who had failed to report an incident, three-quarters said they felt guilty as a result.

Collectively, it’s clear that employees who are involved in cybersecurity incidents, whether as unwitting victims or as part of the cybersecurity team, can find themselves under immense pressure. In an era where employers are focusing more energy on workplace wellbeing, leaving these issues unaddressed can represent a serious shortfall in care that can lead to devastating personal consequences.

Organisations that proactively implement measures to prevent security breaches while also fostering a supportive environment for those impacted by cyber incidents are not only more resilient but also demonstrate a commitment to comprehensive employee wellbeing. In facing the risks associated with cybersecurity and data protection, this is an extremely powerful organisational quality to demonstrate.


Parisa Bazl is the Head of User Experience at Commvault.
