It makes sense for the Biden administration to focus on software security – but it’s up to the industry to make it happen  

The Biden administration’s National Cybersecurity Strategy published earlier this year contains a single, big idea that could shake the software business’s foundation: shifting liability for insecure software away from customers back onto the companies that make the products. 

Does it make sense to have the government meddling in one of the most vibrant parts of our economy? Short answer: yes. 

But doing it right requires focusing on the mindset of software product teams. I've worked in and around software product development for decades, but it wasn't until moving into cybersecurity a few years ago that I came to fully appreciate the problem: the modern software world functions like a teenager behind the wheel, nowhere near ready for the power and responsibilities it has.

The Biden administration is right to zero in on software quality. From airbags to Advil, we have regulatory oversight for products that are potentially dangerous and where it's not reasonable to expect customers to have the time, tools, or expertise to judge safety on their own. Given that software now powers modern life, more oversight makes sense.

The challenge: how much? Most software engineers are a mix of oblivious and overconfident when it comes to security; we need to shift the culture. We need a profession-wide consciousness-raising centered on two realizations:

First, bugs aren't just an "oopsie, let's release a patch" moment. Bugs create the cracks that hackers hunt for and often lead to catastrophic breaches.

Second, the size and sophistication of the hacking ecosystem targeting those bugs is way beyond what most imagine. 

We often assume it’s criminals doing the hacking, but over the past 30 years, virtually every country has built cyber espionage or cyber warfare capabilities. This has evolved into a global, government-backed hacking industrial complex on the scale of an Apple or Google dedicated to reverse engineering software and weaponizing security flaws. 

Software teams are unwitting participants in a digital war with nation-states exploiting their work. That should outrage developers, and it would if they were more aware of it. The faster we can get product teams to realize this, the more engineers will do what they do well: solve problems. We need to figure out how to avoid vulnerabilities in the first place, without crushing the pace of innovation. 

Just how buggy are our software products today? Remember the Yugo? Topping many worst-car lists, it was described in 1986 by Consumer Reports as a "barely assembled bag of nuts and bolts." That's today's software. And it's a hacker's dream. To cope, the security industry has created a standardized database of vulnerabilities: the MITRE-maintained Common Vulnerabilities and Exposures (CVE) system. "Vulnerability management" and "patch management" are entire product categories. Dozens of companies exist solely to help beleaguered security teams plaster over cracks in software.

What does this say about software? There are more than 180,000 known security vulnerabilities in the CVE database. Collectively, we have become numb to low-quality software; the endless stream of patches barely registers anymore.
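
The scale is easy to verify first-hand. Here is a minimal sketch that asks NIST's public National Vulnerability Database for its total CVE count; it assumes network access, and the endpoint and field names are taken from NVD's CVE API 2.0 (worth checking against the current docs):

    # Ask the National Vulnerability Database how many CVE records it holds.
    # Endpoint and response fields per NIST's public CVE API 2.0.
    import json
    import urllib.request

    NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

    def total_cve_count() -> int:
        # Request a single record; the response metadata still carries
        # the total number of CVEs in the database.
        with urllib.request.urlopen(NVD_API + "?resultsPerPage=1") as resp:
            return json.load(resp)["totalResults"]

    print(f"CVE records in the NVD: {total_cve_count():,}")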

And here's where we need the engineers to buy in. For decades, the industry has embraced iterative software development approaches that are buggy by design, biased toward agility and speed. So we're going to need to change the way we work. That's not going to happen through top-down directives. Product teams need to commit to security from the ground up.
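
What does ground-up commitment look like in practice? One small, concrete step is wiring security checks into the same iteration loop that ships features. Here is a sketch of a build gate, assuming the open-source pip-audit tool is available in the build environment (the exact flags and exit-code behavior are assumptions to verify against its docs):

    # A "shift left" build gate: fail CI when dependencies carry known CVEs.
    # Assumes the PyPA pip-audit tool is installed in the build environment.
    import subprocess
    import sys

    def audit_dependencies(requirements: str = "requirements.txt") -> int:
        # pip-audit checks pinned dependencies against vulnerability
        # databases and exits nonzero when it finds known-vulnerable
        # packages, so the build fails before the code ever ships.
        return subprocess.run(["pip-audit", "-r", requirements]).returncode

    if __name__ == "__main__":
        sys.exit(audit_dependencies())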

To get that commitment, it helps to make the destruction visible. For example, as I write this, a cybersecurity wildfire burns somewhere, started by software bugs.

Take the MOVEit case. On May 28, a customer of Progress Software, a large, trusted software vendor, reported anomalous behavior in the MOVEit managed file transfer product. Progress confirmed that there was a critical vulnerability, set out to fix it, and quickly released a patch. On June 1, threat intelligence organizations revealed that the vulnerability had been exploited and a mass data theft was in progress.

On June 2, researchers from the cyber firm Censys announced that 3,000 MOVEit servers had been accessible from the public internet before the vulnerability was disclosed. So, a lot of organizations had been exposed for a while. That same day, the vulnerability, tracked as CVE-2023-34362, was given a CVSS severity score of 9.8 out of 10. In other words, really bad.
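
The root flaw behind CVE-2023-34362 was SQL injection, one of the oldest and best-understood bug classes. To make the crack concrete, here is the generic pattern in miniature; this is illustrative Python against an in-memory database, not MOVEit's actual code:

    # Illustrative only: the generic SQL injection pattern, not MOVEit's code.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE files (owner TEXT, name TEXT)")
    conn.execute("INSERT INTO files VALUES ('alice', 'payroll.xlsx')")

    def list_files_vulnerable(owner: str):
        # DANGEROUS: attacker-controlled input is spliced into the query.
        # Passing "x' OR '1'='1" returns every row in the table.
        query = f"SELECT name FROM files WHERE owner = '{owner}'"
        return conn.execute(query).fetchall()

    def list_files_safe(owner: str):
        # Parameterized query: the driver treats input strictly as data.
        return conn.execute(
            "SELECT name FROM files WHERE owner = ?", (owner,)
        ).fetchall()

    print(list_files_vulnerable("x' OR '1'='1"))  # leaks all rows
    print(list_files_safe("x' OR '1'='1"))        # returns []

The defense has been known, and one line long, for decades. And yet the pattern keeps shipping.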

A few days later, Microsoft attributed the MOVEit data theft to the Russia-based Clop ransomware gang. Then, on June 5, the drumbeat of victim disclosures started. Reported breaches included the BBC and British Airways. Researchers discovered the vulnerability had been exploited as early as February, months before it was patched. Then, over the next two weeks, Progress announced more issues with MOVEit. In all, three vulnerabilities were disclosed.

MOVEit was a thousand hacks in one. The case may already have run up a bill of $9 billion, roughly twice the estimated damage caused by the recent wildfire in Maui.

The industry needs to do better.

Moving forward, the government can impose all the regulations it wants, but it will come to nothing if we don’t educate and deputize product teams in the mission to ship fewer bugs and to prioritize security. As an industry, we have to make it happen.  

John Funge, managing director, DataTribe
