Security Culture — From “Crime and Punishment” to “Sense and Sensibility”

Vincent Gilbert
Engineering at FundApps
Apr 13, 2023



Cyber security borrows many terms and concepts from the military world (e.g. red teaming, threat modelling, need-to-know). Cyber security teams include many people with military backgrounds, so it shouldn’t surprise anyone that the security culture in many companies has inherited some of the characteristics of a military organisation, such as chain of command, discipline and opacity.

However, is this security culture adapted to companies in 2023? What are the characteristics of a security culture which will allow you to manage risks in a Tech scale-up, a Financial Institution, or an online retail store?

What can security culture do for me?


Firstly, how important is your security culture in keeping your company secure?

According to the UK’s newly-formed National Protective Security Authority (NPSA), the benefits of an effective security culture include:

- A workforce that is more likely to be engaged with, and take responsibility for, security issues

- Increased compliance with protective security measures

- Reduced risk of insider incidents

- Awareness of the most relevant security threats

- Employees are more likely to think and act in a security conscious manner

That sounds like any security team’s dream!

If that wasn’t enough to convince you, according to Verizon’s 2022 Data Breach Investigations report, 82% of breaches involved the human element, including social attacks, errors and misuse.

So getting your security culture right should be top of your list.

First, we will examine why a command & control, opaque security culture is ill-suited for a modern-day business. We will then explore how an open and collaborative security culture could help you reap those benefits the NPSA listed above.

Command and Control security culture, or how to prepare to fail.

When I think of a Command & Control culture, I conjure images of Full Metal Jacket, with Sergeant Hartman shouting orders at recruits and subjecting them to company-wide punishments.

In a business context, C&C cultures are ones with the following characteristics:

  • Authoritarian,
  • Lacking transparency,
  • Lacking trust in people,
  • Motivating by carrot and stick.

The Equifax breach

In September 2017, Equifax revealed that private records belonging to 148 million Americans had been breached. This breach cost Equifax over $1 billion and led its CEO, CIO and CSO to step down.

A congressional report concluded the breach was due to a “culture of cybersecurity complacency”.

Equifax was criticised for its lack of transparency, as it delayed publicising the breach and made it difficult for its clients to know whether they were affected.

The congressional report also listed the lack of collaboration and communication between Security and IT teams as contributing to the breach.

Security incident at MyFruit

Let’s look at how these attributes could play out through a scenario.

Gwendolyn, the Head of InfoSec at MyFruit, an online shop for seasonal fruit, has a busy morning. Yesterday, a staff member dealing with the company’s fruit purchases leaked the company’s list of products, suppliers, and margins on each product. Several people spotted this and retweeted it, prompting clients and journalists to question the security practices of a company that fulfils orders for 10 million clients across 20 countries.

How did this happen? Was it a malicious insider trying to hurt the company they were leaving? Was it evil hackers trying to make a name for themselves?

Not really…

Joan, a Finance analyst at MyFruit, processes payments from numerous vendors every month. An issue with a specific supplier means Joan asks Loubna, in the Sourcing team, for the list of purchases made last month. Loubna is busy, so she decides it’s easier to give Joan access to the whole file. However, the Finance team doesn’t have access to it, and the file is too big to be shared through Slack or email. So Loubna shares it through an online service called File Bin…

The rest is history.

The scar on the first cut

To help out their CEO before the next board meeting, our well-intentioned security leader comes up with a plan, one which, she says, will fix that problem once and for all. Access to all file-sharing, copy-pasting, and data-mongering websites will be blocked from now on. All staff web traffic will be filtered through a proxy configured to block those categories of websites.

“Genius!” says the CEO. “Great job”, chimes in the board.

“Oh no”, sighs Loubna, who is told one hour later that she is being fired for gross negligence.

The conundrum

Because no one has been consulted, the inevitable happens. Ten minutes after the web proxy is installed and configured, Gwendolyn, our head of InfoSec, gets a panicked call from one of the Engineers.

“No one can access AWS anymore. We’ve got an incident going on, and we’re all locked out.”

Gwendolyn checks her new web proxy console, and sure enough, Amazon Web Services is in the file-sharing category… Well, that’s annoying. Gwendolyn is faced with her first conundrum. Telling the Engineering team they can’t connect to AWS anymore is not a viable option. So Gwendolyn whitelists AWS for all staff.

The process

Gwendolyn senses this is not going to be the last call she gets, so to make sure she has an answer for the next problem, she sets up a new process. Staff can request that new websites be unblocked, as long as their manager signs off on the request. As the requests pile up in her inbox, she decides to recruit a junior security engineer to help her out.
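The machinery Gwendolyn ends up maintaining can be sketched as a category blocklist plus a per-user exception list. This is a hypothetical illustration only; the site names, categories and function are made up, not any real proxy’s API:

```python
# Hypothetical sketch of the web proxy's decision logic: a category
# blocklist plus an ever-growing, manager-approved exception list.
BLOCKED_CATEGORIES = {"file-sharing", "streaming", "games", "news"}

# Vendor-supplied site categorisation (simplified). Note that AWS is
# lumped into "file-sharing" -- which is what locks the engineers out.
SITE_CATEGORIES = {
    "filebin.example": "file-sharing",
    "wetransfer.com": "file-sharing",
    "aws.amazon.com": "file-sharing",
}

# One entry per request in Gwendolyn's spreadsheet; "*" means all staff.
EXCEPTIONS = {("*", "aws.amazon.com"), ("joan", "wetransfer.com")}

def is_allowed(user: str, site: str) -> bool:
    """Allow unless the site's category is blocked and no exception applies."""
    if SITE_CATEGORIES.get(site) not in BLOCKED_CATEGORIES:
        return True
    return (user, site) in EXCEPTIONS or ("*", site) in EXCEPTIONS

print(is_allowed("joan", "wetransfer.com"))    # True: exception granted
print(is_allowed("loubna", "filebin.example")) # False: blocked category
```

Every new business need means another row in `EXCEPTIONS`, which is exactly how the spreadsheet grows to 483 websites.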

Thuan, a young graduate who has just finished a computer science degree specialising in penetration testing, is happy to land her first job after university. Little does she know that her job will consist of approving exceptions for WeTransfer or SharePoint all day long.

At the end of the line

Three years later, Thuan is long gone, as the appeal of a more exciting career found her approximately 26 minutes after she started.

The HR team has had other amazing ideas and has added multiple categories of websites (videos, streaming, games, news) to be blocked to ensure staff productivity. Wordle has stopped claiming minutes from everyone’s day at MyFruit.

Gwendolyn’s hunch that more requests would follow was spot on. She has now got a spreadsheet with 483 requests for as many websites, plus another sheet for 56 staff members and 18 service accounts which require full internet access.

MyFruit continues to make heavy use of tools which don’t allow collaboration, as most of the collaborative tools are in one of the forbidden categories.

In the meantime, Loubna’s incident and the reason for filtering web traffic are long forgotten. However, sharing information at MyFruit is still as difficult as before, and teams continue to send each other spreadsheets and documents by email/Slack/Teams.

Furthermore, MyFruit has missed out on many of the current collaboration tools. All of this is a source of dissatisfaction and frustration for MyFruit staff, whose rating of their workplace has gradually decreased over time.

And finally, one day, the same incident happens again, though this time the file is leaked not through File Bin but through WeTransfer, by a user who had been added to the exception list two years prior. The only difference is that staff now know better than to report the leak to security, as they know someone will have to be blamed.

Could another security culture have helped?

Photo by Tachina Lee on Unsplash

There are a few cultural characteristics that could have saved MyFruit from this painful experience.

  • Collaboration: If the problem at hand had been discussed with the company’s staff, a solution to the incident might have been to make information sharing easier rather than more difficult. The solution might have come from a different team than the security team.
  • Transparency: If the purpose of Gwendolyn’s web filtering control had been communicated, it would have been easier for her MyFruit colleagues to point out how unsuccessful it would be.
  • Trust: Rather than treating staff members like 11-year-old children who’ve been handed their first smartphone, Gwendolyn could have focused instead on educating staff on why protecting their information is important and what safe behaviours they can adopt.
  • Psychological Safety: Rather than punishing staff who make mistakes, Gwendolyn could have looked for solutions which allow her colleagues to go about their day and make it easy for them to adopt secure behaviours. She could also make it safe for people to raise their hands and let her know when they make a mistake.

Reaping the benefits of an open security culture at FundApps

I want to share a few examples of how embracing an open security culture based on the four traits above has strengthened FundApps’ security framework.

But before I begin, I’d like to make clear that this has been a journey, and we got things wrong before we got them right. The journey is far from over, and there are still many things we can improve on.

Collaboration

One of the aspects I’m most proud of at FundApps is how anyone can contribute to our security framework.

Some of our best security initiatives have been imagined, implemented or advocated by staff outside the security team.

At FundApps, we have a group of Engineers who have an interest in security. They gather once a month and call themselves Security Champions. One of the things this team wanted to work on was helping their colleagues improve their ability to write secure code, so they found a platform which lets developers learn to write secure code and compete against each other. One month later, we held our first secure coding tournament, and we now hold a monthly challenge which rewards the best developers and teams who take part.

Another example is how we tackled privileged access to our infrastructure. FundApps uses SymOps to provide just-in-time access to our AWS accounts. This means that should an Engineer’s computer be compromised, an attacker wouldn’t be able to get into FundApps’ infrastructure without convincing another Engineer to grant them access. This project was imagined and implemented by one individual outside the Security Team, out of interest and passion for improving security at FundApps (and playing with shiny new things). The security team was involved from start to finish and provided input throughout the project.

Transparency

Giving staff, clients and prospects easy access to our policies and security controls allows them to challenge us, by pointing out either security practices that don’t bring value or risks that aren’t being covered.

This culture of transparency led clients, a few years ago, to point out the lack of automated security testing in our CI/CD pipeline. We listened to them and implemented Static and Dynamic Application Security Testing.

Even better, three new joiners spotted small mistakes in our policies while reading them during their induction, and proposed changes through Git.

Trust

At FundApps, we work with adults, not children. Therefore we trust our staff to be responsible adults in how they use FundApps’ Information System. However, because we know things sometimes can and will go wrong, we make it easy to detect and respond when they do.

Let’s take an example: end-user device management.

With this approach in mind, we do everything we can to allow FundAppers to install and configure software on their devices in the way which best works for them. For example, when Windows 11 came out, we let our staff try it out and let them migrate to it when they felt ready or interested.

On the other hand, we make it easy for us to detect and respond when things go wrong so that we can block malicious programs, isolate computers from the network and remotely wipe them.

We also make it very easy for staff to keep their software up to date (automated updates), and conversely, make it hard for them if they don’t (prevent accessing resources if their devices do not satisfy our requirements).
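This kind of conditional access boils down to a small predicate over device posture. The sketch below is an illustration only, not FundApps’ actual tooling; the field names and thresholds are assumptions:

```python
# Minimal sketch of a conditional-access check: a device must satisfy
# basic hygiene requirements before it is allowed to reach company
# resources. Fields and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Device:
    os_version: str
    disk_encrypted: bool
    days_since_last_update: int

def is_compliant(device: Device, max_update_age_days: int = 14) -> bool:
    """Deny access if the device is unencrypted or behind on updates."""
    return (device.disk_encrypted
            and device.days_since_last_update <= max_update_age_days)

laptop = Device(os_version="Windows 11",
                disk_encrypted=True,
                days_since_last_update=3)
print(is_compliant(laptop))  # True: encrypted and recently updated
```

The point of the design is that staff choose their own software and update cadence, while the predicate quietly enforces the floor: fall behind, and access to resources is withheld until the device catches up.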

Psychological Safety

This has been the hardest aspect to tackle and the one we still most want to improve on.

Security teams simulate attacks to train the company to respond to life-like attacks. Practice makes perfect, right? However, it’s only practice if it feels safe to fail.

Phishing exercises run by the company’s security team tend to shame staff who fall for them or create the idea that they might get into trouble if they fall for phishing tests again.

Whilst training users to recognise and respond to phishing is important, I have a couple of issues with the impact this approach has. Firstly, the security team is perceived as trying to trick their colleagues into failing a test, which makes those colleagues look bad or gets them into trouble. This undermines trust and discourages collaboration. Secondly, if someone were to click on a real phishing email, they would be discouraged from reporting it, as this usually gets them into trouble.

The result is staff members who are less willing to engage with the security team and are disincentivised to report early signs of a security incident.

A way to continue running these phishing exercises in a safe environment has been for us to focus on the company’s collective response rather than individual performance. This means measuring and disclosing information on the performance of teams rather than individuals.

Another aspect of measuring the company’s performance is looking at how long it takes the company to respond to a phishing campaign: how quickly staff members make the security team aware of a phishing attempt, and how quickly the team can stop the sender from sending more emails or block access to the malicious website.
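As a sketch of what team-level measurement could look like (the data, team names and field names here are invented), this computes the median time-to-report per team, exposing no per-individual figures:

```python
# Illustrative team-level phishing metric: median minutes from a
# phishing email landing to the first report by each team's members.
# The dataset is made up for the example.
from collections import defaultdict
from statistics import median

# (team, minutes from email delivery to that team member's report)
reports = [
    ("Finance", 12), ("Finance", 45), ("Engineering", 4),
    ("Engineering", 9), ("Engineering", 30), ("Finance", 20),
]

def time_to_report_by_team(reports):
    """Aggregate report times per team so no individual is singled out."""
    by_team = defaultdict(list)
    for team, minutes in reports:
        by_team[team].append(minutes)
    return {team: median(times) for team, times in by_team.items()}

print(time_to_report_by_team(reports))
# {'Finance': 20, 'Engineering': 9}
```

Tracking how these medians move over successive exercises tells you whether the company’s collective reflexes are improving, without ever naming the person who clicked.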

This means we’re focusing on more meaningful metrics, which can help FundApps increase its ability to detect and respond to threats rather than pointing fingers at our peers.

Take Away


By now, I hope you’re convinced you should invest as much time and effort in your security culture as you do in your technical controls. I also hope that when it comes to shaping this security culture, you remember the four points we discussed: collaboration, transparency, trust and psychological safety.

By allowing everyone to understand, engage with, and contribute to your company’s security practices, you make everyone part of your security team; and the bigger your security team is, the easier your job will become.
