Frustrations of an AppSec Engineer Part 2: Lost in Translation
Alongside Osterman Research, we recently embarked on a mission to discover how and why imperfect people lead to vulnerable applications. We listened to the frank observations and experiences of 260 people in application development and security roles in large UK and US-based organizations. Over the coming weeks, we’ll be covering our findings in a series of blogs – so let’s get started.
When we set out to research the human elements of cyber risk in the Software Development Lifecycle, we expected the findings to support what we already knew: that security was, in most cases, an afterthought in the development process. We didn’t expect the extent of the issue to be so broad and pervasive, and we certainly didn’t think we’d uncover such a stark disconnect between those on the frontline and their managers.
It’s Us and Them
One of the most surprising areas affected by this disconnect is ownership of security in the SDLC. Only 27% of front-line developers view the security of the applications they’re coding as a critical part of their responsibilities, compared to 80% of their managers. This might seem reasonable at first glance; after all, it’s the devs’ job to build functioning applications, not to defend the organization from malicious intent.
Unfortunately, however, this is leaving organizations open to attack. This disconnect means that those writing the code feel they have little to no responsibility for ensuring it is as secure as it can be. Couple that with what’s happening on the security side – where 50% of on-the-ground security practitioners believe appsec is a critical part of their responsibilities – and it’s no wonder we’re now facing a world where 81% of developers are knowingly releasing vulnerable applications.
A Lesson from Economics: Why the trickle-down approach doesn't work
Looking at the stats, we can see that those at the top – the CISOs, Heads of DevOps, and Directors – feel responsible for security. This is encouraging, but clearly the experiences and expectations of their reports just don’t match up.
I want to be clear: this is not the fault of the devs or the security practitioners. This is a result of organizations pinning their hopes on the idea that if managers care about security, then their reports will too. In other words, organizations are relying on a culture of security ‘trickling down' to those on the ground – and this is leaving them vulnerable.
Basic economics offers the perfect analogy here. Nowadays, the term ‘trickle-down wealth’ – the notion that tax cuts for the richest individuals and largest corporations will eventually benefit everyone – is met with raised eyebrows and scathing memes, having repeatedly failed to deliver its promised prosperity for the masses. The primary flaw in the trickle-down system is that the wealth (or in our case, the sense of responsibility for security) gets lost, diverted, and mistranslated on its way down to the people doing the work and committing the code.
We see this in action over and over again in the report data. Where 72% of Heads of DevOps feel they have the time to learn from their mistakes to develop more secure applications, only half as many (36%) front-line developers feel the same. Similarly, 63% of Heads of DevOps have access to timely threat intelligence compared to just 36% of developers.
Strong foundations, stronger security
If we want to prevent insecure code from being released into the wild, we need to build a culture of security from the ground up. That means the people at the coal face need to understand the role they play in keeping the organization safe.
Often, a disconnect between managers and their reports arises from a lack of information flowing between the two. Closing this gap, therefore, must be based on consistent, two-way communication and collaboration. Managers must ensure their reports understand the importance of security in the apps they’re building, and developers must ensure their managers understand the challenges that prevent them from prioritizing security. They have to be on the same page.
How Immersive Labs can help
We’ve recently released a new AppSec module to address this issue. With 150 hands-on labs dedicated to application security, developers can gain real-life experience of vulnerabilities: how attackers can exploit them, and what can be done to prevent them.
Managers can assign bespoke objectives to their teams, guiding them to the labs that will be most beneficial. This offers a tangible way to create that all-important culture of security. Management can also generate reports to demonstrate that their team has the knowledge and skills to develop more secure applications.
All of this is specifically designed to help developers understand the impact they have on the overall security posture of the organization they work for – and build a culture of security from the bottom up. If we ever hope to develop secure, robust applications that can withstand the next SUNBURST, we must empower those writing the code.