
Why the 80-20 rule no longer works for cybersecurity


COMMENTARY: We’ve all heard about the Pareto Principle, the idea that approximately 80% of consequences result from 20% of causes. Organizations have long applied this "80-20 rule" to areas such as productivity, sales, quality assurance, and project management.

Cybersecurity is no exception. Over the last 25 years, cybersecurity leaders have leveraged this principle to manage and secure assets, asserting that adequately monitoring 80% of assets can effectively mitigate risks. Some have gone as far as to believe that closely monitoring the crown jewel assets alone, even if they constitute just 1% of the total exposure, might be "good enough."

[SC Media Perspectives columns are written by a trusted community of SC Media cybersecurity subject matter experts. Read more Perspectives here.]

But imagine a ship captain navigating through treacherous waters. The captain diligently checks and secures 80% of the vessel for leaks, ensuring that most of the hull is watertight. However, the captain neglects to inspect and secure the remaining 20% of the ship. Does this make any sense knowing that one single fault could sink the ship?

Now imagine this in the context of a large organization, with tens or even hundreds of thousands of assets across numerous business units and subsidiaries. Consider all the systems and networks powering a mid-sized company, the personally identifiable information (PII), web applications, and APIs. What does that unexamined 20% mean for these organizations? Most of the risk gets concentrated in that bracket, which now contains the most attractive and exploitable assets for attackers.

More than 90% of CISOs acknowledge they are more likely to experience a breach through an unknown or unmanaged asset than through one that's well-monitored. In other words, it's widely understood that the "last 10%-20%" of unmonitored assets create the majority of the risk.

So why do some security leaders rely on the Pareto Principle to manage external risk?

Breaking the status quo

Here’s the truth: security leaders have settled for the status quo because it’s become the industry standard. Even in highly sensitive industries, here’s the prevailing thought: “If it works for my peers, it’ll work for me.” While it’s an understandable mindset, it’s a slippery slope.

Today, many organizations are failing to discover, manage, and test their attack surface. To paint the picture: CISA says it takes attackers 48 hours to exploit a new vulnerability. It advises organizations to run an attack surface discovery process every 14 days and test all of their assets every seven days to close that window before an exploit lands. Yet, according to recent data from our research team, nearly 75% of organizations test their web applications monthly or less often.
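As a rough illustration of those cadences, a team could flag assets whose last test falls outside the recommended seven-day window. The interval comes from the CISA guidance cited above; the asset records and field names are hypothetical:

```python
from datetime import datetime, timedelta

# Testing cadence from the CISA guidance cited above.
TEST_INTERVAL = timedelta(days=7)  # test every exposed asset at least weekly

def stale_assets(assets, now):
    """Return assets whose last test is older than the recommended window."""
    return [a for a in assets if now - a["last_tested"] > TEST_INTERVAL]

now = datetime(2024, 6, 15)
inventory = [
    {"name": "billing-api", "last_tested": datetime(2024, 6, 12)},    # 3 days ago: OK
    {"name": "legacy-portal", "last_tested": datetime(2024, 5, 10)},  # 36 days ago: stale
]
print([a["name"] for a in stale_assets(inventory, now)])  # → ['legacy-portal']
```

The point of the sketch: a stale-asset check is trivial once the inventory exists; the hard part, as the article argues, is having a complete inventory to run it against.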

I’m not here to finger-point. In many ways, it’s a technology problem. Legacy tools and methods, such as scanners and manual penetration testing, require massive amounts of manual labor. Even for a Fortune 100 company, it’s a colossal task to continuously look for blind spots, update and contextualize asset inventories, and prioritize the top 5-50 issues out of thousands.

Also, legacy risk detection tools frequently overlook assets, as they're ineffective at finding "net new assets" that don't sit directly in known IP ranges. Prioritizing alerts demands a deep understanding of asset criticality, context, and exploitability. Calculating attack paths helps predict which threats pose substantial risk. The ability to prioritize risk allows for targeted remediation, yet the narrow focus of many security products makes the problem worse, merely taking up space in the security stack rather than solving the problem.
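The prioritization step described above can be sketched as a simple scoring pass that combines criticality, exploitability, and exposure. The weighting and field names here are illustrative assumptions, not any vendor's actual model:

```python
def risk_score(asset):
    """Illustrative score: criticality x exploitability, doubled for exposed assets."""
    exposure = 2 if asset["internet_facing"] else 1
    return asset["criticality"] * asset["exploitability"] * exposure

def top_issues(assets, n=5):
    """Return the n highest-risk assets for targeted remediation."""
    return sorted(assets, key=risk_score, reverse=True)[:n]

assets = [
    {"name": "crm",   "criticality": 5, "exploitability": 2, "internet_facing": True},   # 20
    {"name": "wiki",  "criticality": 1, "exploitability": 4, "internet_facing": True},   # 8
    {"name": "hr-db", "criticality": 5, "exploitability": 3, "internet_facing": False},  # 15
]
print([a["name"] for a in top_issues(assets, n=2)])  # → ['crm', 'hr-db']
```

Even this toy ranking shows why context matters: the wiki has the highest raw exploitability, but criticality and exposure push other assets to the top of the remediation queue.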

Secure what matters most

The Pareto Principle mindset in cybersecurity dates back to a time when the technology wasn't sophisticated enough to provide visibility into every asset and its risk. That's no longer the case. Organizations have access to a massive stack of products, some powered by AI, that can automate and scale vulnerability management and remediation end-to-end. Infrastructure costs are also much lower than 20 years ago, which has made these technologies far more accessible.

In short, there's no excuse to settle for just 80% coverage anymore. Organizations have far too many interconnected assets to take that chance. A single company can have multiple on-premises systems, cloud systems, networks, cloud and web-based applications — all connected by APIs and often containing PII. And any of these systems can represent a potential entry point for attackers, making it crucial for security teams to have a complete inventory of what they need to protect. Without this visibility, all security measures fall short and the team will never feel at ease.

The bottom line: it’s important for teams to know what assets they have so they can do the following:  

  • Define scopes: AI can classify 100% of company assets into meaningful business groups, such as business units and purposes. This lets teams align risk priorities and remediation workflows with what is critical to the business and management team.
  • Run continuous exposure assessment and validation: Continuous active risk assessment lets teams identify weak spots that attackers can exploit. Attackers can target any asset at any time, so exposure assessment has to cover 100% of what's exposed and run continuously, not monthly or quarterly.
  • Automate incident response: With precise asset and risk assessment and evidence, security teams can quickly determine the impact of incidents and respond more effectively, minimizing damage and recovery time.
  • Enforce policy compliance: Regulations often require strict controls over specific types of data, such as PII. Comprehensive visibility ensures that organizations meet these requirements and avoid costly penalties.
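A minimal sketch of the first and last points above: grouping assets into business scopes so risk can be reported per unit, and flagging PII-bearing assets that need the strictest controls. The field names and groupings are hypothetical:

```python
from collections import defaultdict

def build_scopes(assets):
    """Group assets by business unit so risk can be reported per scope."""
    scopes = defaultdict(list)
    for a in assets:
        scopes[a["business_unit"]].append(a["name"])
    return dict(scopes)

def pii_assets(assets):
    """Assets holding PII carry regulatory requirements and need strict controls."""
    return [a["name"] for a in assets if a["holds_pii"]]

assets = [
    {"name": "payroll-db",     "business_unit": "finance",    "holds_pii": True},
    {"name": "marketing-site", "business_unit": "marketing",  "holds_pii": False},
    {"name": "support-crm",    "business_unit": "operations", "holds_pii": True},
]
print(build_scopes(assets))   # → {'finance': ['payroll-db'], 'marketing': [...], ...}
print(pii_assets(assets))     # → ['payroll-db', 'support-crm']
```

Both functions are only as good as the inventory fed into them, which is the article's core argument: classification and compliance checks require 100% asset visibility first.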

Security pros may find it easy to lean on the Pareto Principle, until they suffer a breach and it’s too late. Invest in the means to close the coverage gap now to future-proof the organization’s security strategy for the coming years. Leave the 80-20 rule behind.

Rob Gurzeev, chief executive officer, CyCognito

