The Linux Foundation and the Laboratory for Innovation Science at Harvard recently released the Report on the 2020 Free/Open-Source Software Contributor Survey. One of the report’s primary conclusions is that free/open-source software developers often have a very negative approach to security. They spend very little time resolving security issues (an average of 2.27% of their total time) and express no willingness to spend more.

Some of the quotes from the survey were simply disturbing. For example, “I find the enterprise of security a soul-withering chore and a subject best left for the lawyers and process freaks. I am an application developer.” Another example: “I find security an insufferably boring procedural hindrance.”

While the report contains the authors’ strategic recommendations, here are our thoughts about what this situation means for application security and what you can do about it.

How Far and How Wide?

The original report focuses only on free/open-source software (FOSS) but we believe it is important to consider whether this is only a FOSS problem or a problem with all developers.

Based on the survey, most FOSS developers (74.87%) are employed full-time and more than half (51.65%) are specifically paid to develop FOSS. This means that FOSS is often developed by the same people who develop commercial software. We do not believe that developers change their attitude depending on whether the software they work on is free or commercial. Therefore, we believe that this bad attitude towards security extends to all developers.

We also believe that the underlying cause of this attitude is the fact that developers are either taught badly or not taught at all. Most online resources that teach programming completely skip the issue of secure coding practices. Books about programming languages rarely even mention secure coding. Schools also often treat security as an optional subject instead of a core course that should be a prerequisite to all other programming classes.
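To make the education gap concrete, here is a minimal, hypothetical Python example of the kind of mistake that secure coding training aims to prevent: building an SQL query by string concatenation versus using a parameterized query. The table and function names are invented for illustration.

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable: user input is concatenated directly into the SQL string,
    # so input like "' OR '1'='1" changes the meaning of the query (SQL injection).
    query = "SELECT id FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Safe: the driver passes the value separately from the SQL text,
    # so it can never be interpreted as SQL code.
    return conn.execute("SELECT id FROM users WHERE name = ?", (username,)).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")
    payload = "' OR '1'='1"
    print(len(find_user_unsafe(conn, payload)))  # 2 -- the injection returns every row
    print(len(find_user_safe(conn, payload)))    # 0 -- no user has that literal name
```

A developer who has never been shown this pattern will happily write the first version; a single lesson on parameterized queries prevents an entire vulnerability class.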

Therefore, we conclude that the results of this survey may be assumed to apply to all software developers. While in the case of commercial software some security measures may be added by the presence of dedicated security teams, “the root is still rotten”.

Secure Development? Not Likely!

While 86.3% of the survey respondents received formal training in software development, only 39.8% stated that they have formal training in developing secure software. This means that roughly half of all developers were taught how to write software, but not how to write it securely.

Another shock comes from the responses to the following question: “When developing software, what are your main sources for security best practices?” It turns out that only 10.73% learned such best practices from formal classes and courses, and only 15.51% from corporate training. Roughly half of the developers rely on online articles/blogs (46.54%) and forums (50.66%) as their primary sources of information on best practices, which again shows the dire state of education and the lack of resources about secure coding. And while we at Acunetix pride ourselves on filling the gap and being the teachers (thanks to our articles that explain how vulnerabilities work and how to avoid them), we would much rather have developers learn first from sources that are more reliable than a search engine.

Last but not least, the survey results suggest that free/open-source software is usually released with little or no security testing. While 36.63% of respondents use a SAST tool to scan FOSS source code, only 15.87% use a DAST tool to test running applications. The situation is probably better in the case of commercial software, where security teams usually introduce SAST/DAST tools into the SDLC.

Why the Bad Attitude?

If your application developers have a bad attitude towards security, it is not only due to their education. It may also stem from the way your business is organized, which can make them feel that they are not involved in security at all.

Developers don’t feel responsible for security primarily because of the existence of dedicated security teams. If security personnel work in separate organizational units, developers assume that security is not their problem and expect the security team to take care of it instead.

Developers also don’t feel responsible because in a traditional organization they are rarely expected to fix their own security-related mistakes. A typical developer writes a piece of code, gets a code review from another developer (probably just as clueless about security), and then forgets about it. Later, a security researcher finds a vulnerability and creates a ticket to fix it. The ticket is then assigned to the first available developer – usually not the one who originally introduced the vulnerability.

Such an arrangement promotes a lack of responsibility for security and fuels negative feelings between developers and security teams. Each side may view the other as “the ones that cause problems” – and this is what you must aim to change first.

Automation as a Solution

Automating the process of finding and reporting security vulnerabilities as early as possible solves this problem. First, errors are reported by a piece of software, not a human, so there is no other person to blame. Second, the error is reported immediately – usually after the first build attempt – and the build fails, so the developer must fix their own mistake right away. Third, every time developers are forced to fix their own errors, they learn a little more about how to write secure code and why it matters.
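As a sketch of what such a build gate might look like, here is a hypothetical Python script for a CI pipeline. It reads a scanner’s findings from a JSON report (the `report.json` file name, the report format, and the severity labels are assumptions for illustration, not any specific tool’s output) and returns a non-zero exit code when any high-severity finding is present, which fails the build.

```python
import json
import sys

def build_should_fail(findings, blocking=("high", "critical")):
    # Fail the build if any finding is at or above the blocking severities.
    return any(f.get("severity", "").lower() in blocking for f in findings)

def main(report_path="report.json"):
    # Hypothetical report format: a JSON list of finding objects,
    # each with "severity", "title", and "location" keys.
    with open(report_path) as fh:
        findings = json.load(fh)
    if build_should_fail(findings):
        for f in findings:
            print(f"[{f['severity']}] {f['title']} in {f['location']}")
        sys.exit(1)  # a non-zero exit code makes the CI job fail immediately
    print("No blocking vulnerabilities found.")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "report.json")
```

Because the gate runs on every build, the report lands on the desk of the developer who just pushed the code – not on a backlog ticket assigned weeks later to someone else.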

The only problem that remains is finding software that can be trusted with this task. Unfortunately, the limited capabilities of SAST/DAST tools have caused many failures in the past, which is why many developers do not want to use a SAST or DAST tool.

SAST tools point to potential problems but they report quite a few false positives – the developer spends a lot of time researching something that turns out not to be a vulnerability at all. In the end, developers stop trusting the tool and start hating it. On the other hand, DAST tools report fewer false positives but often don’t provide enough information for the developer to be sure where the vulnerability is and what it can lead to.

Acunetix helps solve such problems. For the most serious vulnerabilities, Acunetix provides proof that the vulnerability is real. For example, a developer may receive a report showing that their code exposed sensitive files from the server – with the content of those files included as evidence.
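As an illustration of the kind of flaw such a report might describe, here is a minimal, hypothetical Python example of a path traversal bug that exposes files outside the intended directory, together with a hardened version. The `BASE_DIR` value and function names are invented for illustration.

```python
import os

# Hypothetical directory from which the application is meant to serve files
BASE_DIR = "/var/www/uploads"

def resolve_unsafe(filename):
    # Vulnerable: the user-supplied filename is joined blindly, so "../"
    # sequences escape BASE_DIR and point at arbitrary server files.
    return os.path.normpath(os.path.join(BASE_DIR, filename))

def resolve_safe(filename):
    # Safer: resolve the full path and verify it is still inside BASE_DIR
    # before letting the application read the file.
    base = os.path.realpath(BASE_DIR)
    full = os.path.realpath(os.path.join(BASE_DIR, filename))
    if full != base and not full.startswith(base + os.sep):
        raise ValueError("path traversal attempt blocked")
    return full
```

A request for `../../../etc/passwd` slips straight through the first function; a vulnerability report that includes the contents of the exposed file leaves the developer in no doubt that the finding is real.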

Conclusions

The most worrying conclusion from this article is that most free/open-source software is released without adequate security testing, so if you want to feel safe using it, you need to do regular security testing yourself.

Another worrying conclusion is that people who should be your first line of defense in IT security are not educated about security and have a bad attitude toward it. This is not something that is going to be easy or quick to change.

Long-term strategic resolutions are needed to solve these major problems, and simply implementing an automated solution is no magic wand. However, if you introduce a reliable automated testing solution such as Acunetix into your DevSecOps SDLC at the earliest possible stage, you will make your software much safer and teach your developers that they need to take responsibility for the security of their code.

THE AUTHOR
Tomasz Andrzej Nidecki
Principal Cybersecurity Writer
Tomasz Andrzej Nidecki (also known as tonid) is a Principal Cybersecurity Writer at Invicti, focusing on Acunetix. A journalist, translator, and technical writer with 25 years of IT experience, Tomasz was the Managing Editor of the hakin9 IT Security magazine in its early years and used to run a major technical blog dedicated to email security.