Six Reasons Why Vulnerabilities are Increasing, Despite Greater Cybersecurity Investments

At NetSPI we are often asked, “Will our cybersecurity spend plateau or decrease?” or “Our security testing quantity and frequency continue to rise year over year – shouldn’t our vulnerability findings decrease over time?” – or a variation of these questions.

Many assume that, over time, operational scale and efficiencies will generate this result. But we do not expect to see this correlation in the foreseeable future.

If you look at the macro numbers, we are in a never-ending race against sophisticated adversaries, and organizational attack surfaces are growing exponentially. As companies continue to grow, innovate, acquire, divest, and hire, the threat landscape evolves at the same rate – or faster.

Vulnerabilities, cybersecurity spending, and threats are all on the rise.

According to the NIST National Vulnerability Database, the total vulnerability count historically fluctuated – until 2017, when we saw a massive spike in total vulnerabilities. Since then, vulnerabilities have steadily increased year over year and are on track to trend upward through the remainder of 2022.

Data also shows that high/critical vulnerabilities slightly decreased in 2021. However, the most critical vulnerability category – people – is not included in this data, yet it would drastically affect the totals. According to a study by IBM, human error is the primary cause of 95 percent of cybersecurity breaches.

At the same time, cybersecurity remains a budgetary priority and spending continues to increase industry wide. Gartner predicts that end-user spending in the information security and risk management market will grow at a compound annual growth rate (CAGR) of 11 percent through 2026.
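As a rough illustration of what an 11 percent CAGR implies, here is a minimal sketch of the compounding arithmetic. The $100B base figure is hypothetical for illustration only, not a Gartner number:

```python
def project_spend(base: float, cagr: float, years: int) -> float:
    """Compound a base spending figure at a fixed annual growth rate."""
    return base * (1 + cagr) ** years

# Hypothetical $100B base compounding at an 11 percent CAGR:
# after 4 years the figure has grown by roughly half.
projected = project_spend(100.0, 0.11, 4)
print(round(projected, 1))  # ≈ 151.8
```

At that rate, spending roughly doubles in under seven years, which is why budget growth alone is not expected to outpace the threat landscape.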

Adversaries are no longer limited to individual actors; they include highly sophisticated organizations that leverage integrated tools and capabilities with artificial intelligence and machine learning. While sophistication increases, so does activity. McKinsey & Company observed an exponential increase in the number and types of threats over the past decade. Additional reports have validated this over the past few years: phishing increased 220 percent at the onset of COVID-19; governments and healthcare organizations worldwide saw ransomware attacks increase by 1,885 percent and 755 percent, respectively; three in five companies were targeted by supply chain attacks in 2021; the list goes on.

Traditional penetration testing is compliance driven – and most compliance frameworks are 2-5 years behind current threat vectors. By evolving from compliance-driven to risk- and compliance-driven testing, enterprises will identify more critical risks and vulnerabilities… but this does come at an increased cost.

It is highly unlikely that vulnerabilities will decrease year over year based on your security testing investments alone.

It is important to understand that penetration testing does not directly reduce vulnerabilities; it identifies exposures and security issues. Reducing vulnerabilities requires fingers on a keyboard: changing application code, reconfiguring an operating system, updating device configurations, among other activities. The number of vulnerabilities can also be impacted by external factors that organizations cannot control.

Let’s explore six core factors, beyond security testing investments, that can be attributed to the increasing number of vulnerability findings organizations globally are observing today.

New Attack Vectors and Increasing Sophistication
  • The volume of vulnerabilities is directly related to the increase in volume of attackers.
  • Threat actors are no longer individuals; they are well-funded enterprise organizations that can develop new technologies at a faster clip than ever before. Even our most sophisticated cyber controls, policies, and regulations will soon be obsolete.
  • Criminal enterprises have discovered that attacks like ransomware can be very profitable. As a result, they are focusing more than ever on developing and advancing those capabilities.
  • 99 percent of codebases contain at least one open-source component. While organizations do have control over where and how open-source code is used, this ubiquitous code can open the floodgates for future vulnerabilities, and the severity of the risk is highly variable (e.g., Log4j, SolarWinds).
  • Supply-chain and third-party risk has led to some of the largest breaches (e.g., Kaseya, Colonial Pipeline, Kronos, SaaS providers). The supply chain continues to expand with new features and functionality that introduce the potential for additional vulnerabilities. The quality of third-party talent, business decisions, and priorities directly impacts the number of potential vulnerabilities they could introduce to your organization.
Adoption of New Technologies
  • The volume of vulnerabilities is directly related to the adoption of new technologies.
  • The adoption of new technologies offers many advantages from a business perspective. But with new technologies comes new risks that may not be fully understood by the groups implementing them. For example, cloud technologies today represent a very wide and deep attack surface that is still not fully understood by much of the security community. Technology helps organizations innovate, but as you layer more systems onto your IT networks to enable business growth, new vulnerabilities may arise.
  • Organizations are seeing a substantial increase in the number of apps in use. On average, companies have 254 SaaS applications, 56 percent of which are managed outside of IT (shadow IT).
Infrastructure and Development Practices
  • An agile development lifecycle pushes smaller snippets of code into production. In agile, testing policies are often not triggered because no single release contains a large amount of code. These small snippets add up over time, and we have seen organizations discover more critical and high vulnerabilities in production.
  • Legacy code and systems add complexity and cause exceptions that further increase an organization’s risk.
  • System integrations are much more prevalent than before (e.g., vertical integrations that create silos, horizontal integrations via an enterprise service bus, and point-to-point or star integrations). In addition, organizations continue to increase their dependency on third parties for development practices.
  • Digital transformation can change an organization’s infrastructure and IT footprint drastically. A common example of this is cloud migrations.
  • Development and security teams often lack collaboration; security is not always at the table during application design processes.
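The agile point above can be sketched in code. This is a hypothetical, simplified testing-policy gate (the threshold and release sizes are illustrative, not any organization's actual policy): a per-release size check never fires on small agile releases, while tracking the cumulative untested change does.

```python
THRESHOLD = 500  # changed lines that should trigger a security review (illustrative)

def needs_review_per_release(diff_sizes):
    """Size-based policy applied release by release: small releases never trigger it."""
    return [size >= THRESHOLD for size in diff_sizes]

def needs_review_cumulative(diff_sizes):
    """Trigger once untested change accumulates past the threshold, then reset."""
    total, triggers = 0, []
    for size in diff_sizes:
        total += size
        if total >= THRESHOLD:
            triggers.append(True)
            total = 0  # reset the counter after a review
        else:
            triggers.append(False)
    return triggers

releases = [120, 80, 150, 200, 90]  # typical small agile releases
print(needs_review_per_release(releases))   # no release triggers the policy
print(needs_review_cumulative(releases))    # the fourth release crosses 500 cumulative lines
```

The per-release policy never fires even though 640 lines of untested code reach production, which is one way small snippets quietly accumulate into critical findings.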
Organizational Change
  • Major business changes across organizations may bring new vulnerabilities. Changes could include mergers, acquisitions, divestments, or eradication of legacy apps/systems.
  • Historically, security was not mature enough to be part of the change. This mindset has changed, and budgets may need to be realigned to efficiently mitigate risk.
  • Access changes continue to evolve (e.g., as companies shifted to work-from-home, cyber-attacks increased by 20 percent). Mobile platforms, remote work, and other shifts increasingly hinge on high-speed access to ubiquitous and large data sets, exacerbating the likelihood of a breach.
  • With organizational change comes new attack surfaces, such as cloud migrations and shadow IT. We’re also seeing significant changes to the existing attack surface in the form of new systems and applications.
Talent Shortage
  • The talent shortage amplifies all of the preexisting factors above.
  • Many organizations lack sufficient cybersecurity talent, knowledge, and expertise – and the shortfall is growing. The number of unfilled cybersecurity positions globally grew by 350 percent over the past eight years; in 2021, a survey reported 3.5 million unfilled roles. The software developer talent shortage is also concerning: by 2030, the number of software job vacancies is projected to rise by almost 22 percent, and the software engineer shortage in the USA is expected to hit 1.2 million by 2026, according to the Bureau of Labor Statistics.
  • Secure coding expertise in developers is highly variable.
  • Outsourcing development vs. in-house development can influence the number of vulnerabilities found in your software. Organizations generally have more visibility and control over in-house development practices.
  • With the talent shortage comes burnout, insufficient training, employee turnover, and difficulty keeping pace with workloads, all of which can influence vulnerability counts through recurring vulnerabilities, unpatched issues, and a higher likelihood of error.
Evolving Testing Approaches and Technologies
  • Pentesting is time-boxed, so breadth and depth are limited compared to an adversary that has unlimited time to achieve its objective. We must continue embracing technology as our force multiplier.
  • Shifting left is a great practice to catch vulnerabilities earlier in the SDLC, however, this methodology can alter the number of vulnerabilities that are found based on how far left your organization is testing and at what depth – and is dependent on the quality of the code output.
  • The capabilities, depth of access, and reporting output of automated testing tools continue to change. As testing technology advances, it uncovers more paths to manually discover critical vulnerabilities.

Discover how the NetSPI BAS solution helps organizations validate the efficacy of existing security controls and understand their security posture and readiness.