20 years of Vulnerability Management - Why we've failed and continue to do so.
Cyber Security: Keeping Pace with Change.
Getting breached can really ruin your day. Actually, it normally happens on a Friday evening just as you are about to chill for the weekend. The cause of most breaches is not rocket science; it's more to do with the poor approach we have accepted because we underestimate the threat actor. An attacker does not scan your website/network once a quarter with a commercial or open source scanner, or perform an annual penetration test against your systems to see if there is any low-hanging fruit, so how do we expect to defend against such an adversary using that approach?
Systems now change more frequently than ever due to the ease of cloud deployments and the speed of software delivery driven by iterative development techniques. This increased rate of change means exposures manifest quickly, often without the organisation even being aware of them. Many organisations don't know what they have exposed on the public Internet.
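One cheap way to start answering "what do we actually have exposed?" is certificate transparency. Below is a minimal sketch in Python using the public crt.sh search (the domain is a placeholder, and the crt.sh JSON output format should be verified before you rely on it) that lists hostnames seen in certificates issued for a domain:

```python
import json
import urllib.request

def ct_subdomains(domain: str) -> set[str]:
    """Query the public crt.sh certificate transparency search for
    hostnames that appear in certificates issued for the given domain."""
    url = f"https://crt.sh/?q=%25.{domain}&output=json"  # %25 is a URL-encoded '%' wildcard
    with urllib.request.urlopen(url, timeout=30) as resp:
        entries = json.load(resp)
    names = set()
    for entry in entries:
        # name_value can contain several newline-separated hostnames
        for name in entry.get("name_value", "").splitlines():
            names.add(name.strip().lower())
    return names

# example.com is a placeholder - substitute your own domain
for host in sorted(ct_subdomains("example.com")):
    print(host)
```

Running something like this is often the first time an organisation sees hostnames it had forgotten about.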
We need to keep pace with change, be it in a cloud environment, software deployed, a new feature, a network architecture change, etc.
The below applies across the full stack, from network and cloud environments to APIs, web applications and mobile apps - it's all software!
Let's talk about the root of all risk - change. Risk is the probability of loss or injury. If the world were static and nothing changed, we would not need to continuously assess risk. Change gives rise to risk...
The risk profile changes when:
A system does not change: Over time critical vulnerabilities are discovered & patches are released. "Yesterday I was secure, today I've a critical risk." - I did not change anything; the world around me did.
A system changes: New features deployed, new services exposed, a larger attack surface - more exposed, more to attack, more headaches (obviously).
We need to keep pace with change (keeping pace with potential risks).
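What does "keeping pace" look like in practice? At minimum, diffing today's view of the attack surface against yesterday's. A toy sketch follows (the inventory file format is hypothetical - substitute whatever your asset or attack surface tooling exports):

```python
import json
from pathlib import Path

def load_inventory(path: str) -> set[tuple[str, int]]:
    """Load an exposure inventory: a JSON list of {"host": ..., "port": ...}
    records. This format is hypothetical - adapt to your own tooling."""
    records = json.loads(Path(path).read_text())
    return {(r["host"], r["port"]) for r in records}

baseline = load_inventory("baseline.json")  # yesterday's snapshot
current = load_inventory("current.json")    # today's snapshot

# New exposures are the interesting ones - every one is a change to investigate
for host, port in sorted(current - baseline):
    print(f"NEW EXPOSURE: {host}:{port} - not in yesterday's baseline")
for host, port in sorted(baseline - current):
    print(f"REMOVED: {host}:{port} - gone since the baseline")
```

Run it daily and you have crude, but continuous, change detection.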
Traditional tool-based/consultant-based approaches have failed to keep pace due to a lack of depth/coverage or frequency of change detection. Scanners alone suffer from coverage and accuracy issues, and leave some "poor sod" spending their days in validation purgatory. False positives are the "white noise" of vulnerability management.
- Validation of severity and prioritization needs to be tasked somewhere in the management cycle. If not by the solution you are using, then somewhere else.
- Risk-based vulnerability intel is key for prioritization. Focus on what is actively exploited in the wild, not on all the vulnerabilities. All vulnerabilities are not created equal (see the sketch below).
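To make "focus on what's actively exploited" concrete: one practical signal is CISA's Known Exploited Vulnerabilities (KEV) catalog. Here is a minimal sketch that triages scanner CVEs against it (the feed URL and field names are as published by CISA at the time of writing - verify them before depending on this; the findings list is hypothetical):

```python
import json
import urllib.request

KEV_FEED = ("https://www.cisa.gov/sites/default/files/feeds/"
            "known_exploited_vulnerabilities.json")

def kev_cve_ids() -> set[str]:
    """Fetch CISA's KEV catalog and return the set of CVE IDs
    known to be exploited in the wild."""
    with urllib.request.urlopen(KEV_FEED, timeout=30) as resp:
        catalog = json.load(resp)
    return {v["cveID"] for v in catalog["vulnerabilities"]}

# Hypothetical scanner output - substitute your real findings
# (the last entry is an obviously fake placeholder ID)
findings = ["CVE-2021-44228", "CVE-2017-0144", "CVE-0000-00000"]

exploited = kev_cve_ids()
for cve in findings:
    print(f"{cve}: {'FIX NOW' if cve in exploited else 'backlog'}")
```

Ordering work this way shrinks the pile from "everything with a CVSS score" to "what attackers are actually using".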
So what's wrong? Why are we up the creek without a paddle? Systems are still being breached by "advanced attackers" (AKA finding exposed remote login services with default credentials, unpatched systems or insecure code!! 💀💀😀😎).
Let’s look at current ways to dynamically assess systems for cyber security.
Penetration Test
Manual assessment of a system, coupling the use of automated tools, scripts and expertise.
Strengths: Logical issues. Accurate / (should be) false-positive free. Complex exploits. Support.
Weaknesses: Not scalable. Expensive. Not on-demand. Does not fit with DevOps etc. Point-in-time assessment. No metrics?
Vulnerability Management
Automation - software testing software (scanners).
Strengths: Scale/volume. On-demand. Fits DevOps.
Weaknesses: Accuracy, risk rating, coverage, depth (logical vulnerabilities). Requires expertise to validate output. Metrics are poor and require multiple tools.
Hybrid/PTaaS (Penetration Testing as a Service)
Automation augmented with expertise, coupled with attack surface management.
Strengths: Complex issues. Logical exploits. False-positive free. Scale/volume. On-demand. Fits DevOps. Accuracy. Coverage. Metrics. Support. Scale via automation, depth via expertise.
Weaknesses: Potentially more costly up front than automation alone (but the return on investment is high due to receiving validated vulnerability data, fewer false positives and better coverage).
- Reliance on software to test software (scanners) alone is folly - scanners alone don't work.
- Automation accuracy is not as strong as human accuracy - our attackers are humans.
- Scale vs depth - scanners do scale, humans "do" depth. Our enemies do depth every time and are focused.
- Change is constant - consultant-based security does not keep pace with change. Our enemies love change.
What vulnerability management should look like…
- On-demand: Assurance of coverage & depth of testing on demand - for DevOps, the security team and the deployment process.
- Continuous & Accurate: Continuous assessments detecting and validating new vulnerabilities all the time.
- Good for: Metrics, risk lifecycle tracking, time-to-remediate (TTR) metrics, root cause analysis, etc.
- Integration: A continuous flow of validated vulnerability intelligence into your SOC/bug tracker/GRC systems - situational awareness. Cloud integrations to keep pace with systems spinning up and with infrastructure flux (see the sketch after this list).
- Full stack: “Hackers don’t give a S*#t”. Risk can be in web or hosting infrastructure, internal or external systems. Multiple tools for the same purpose? Multiple data sets? No complete picture of risk. We need risk convergence.
- Risk-based: We don't need to focus on all vulnerabilities. Even ordering by severity may not yield efficient results. Focus on what matters: vulnerabilities actively known to be exploited in the wild.
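As an illustration of the integration point above, here is a minimal sketch of pushing one validated finding into a ticketing system over a generic REST API. The endpoint, token and payload shape are all hypothetical - adapt them to your actual Jira/GRC/SOC tooling:

```python
import json
import urllib.request

TRACKER_URL = "https://tracker.example.com/api/issues"  # hypothetical endpoint
API_TOKEN = "REDACTED"                                  # hypothetical token

def raise_ticket(finding: dict) -> None:
    """Create one ticket for one validated finding."""
    payload = {
        "title": f"[{finding['severity']}] {finding['name']} on {finding['asset']}",
        "description": finding["evidence"],
        "labels": ["vulnerability", finding["severity"].lower()],
    }
    req = urllib.request.Request(
        TRACKER_URL,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {API_TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        print("ticket created, HTTP status:", resp.status)

raise_ticket({
    "severity": "Critical",
    "name": "Default credentials on exposed remote login service",
    "asset": "203.0.113.10:22",  # placeholder address (TEST-NET range)
    "evidence": "Validated manually; login succeeded with vendor defaults.",
})
```

The key property is that only validated findings cross this boundary - the bug tracker stays free of false-positive white noise.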
Shift Left?
Shift Left: Enable & assist developers to build and deploy secure code & systems. Prevention. Catch it early; don't deploy vulnerable systems (see the sketch below).
Shift Right: Detection and vigilance. Detect currently unknown vulnerabilities - "the next CVE", the next "Log4Shell"-style framework vulnerability - and also mop up anything we missed in pre-prod.
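A minimal sketch of a shift-left gate: a script your CI pipeline could run after a scan, failing the build when any validated critical finding is present. The findings.json format here is hypothetical - map it to whatever your scanner or PTaaS platform actually exports:

```python
import json
import sys
from pathlib import Path

# Hypothetical scanner/PTaaS export: a JSON list of findings with
# "severity", "validated" and "title" fields.
findings = json.loads(Path("findings.json").read_text())

blockers = [f for f in findings
            if f["severity"] == "critical" and f.get("validated", False)]

if blockers:
    for f in blockers:
        print(f"BLOCKER: {f['title']}", file=sys.stderr)
    sys.exit(1)  # non-zero exit fails the pipeline: don't deploy vulnerable systems
print("No validated critical findings - gate passed.")
```

Gating on validated findings (rather than raw scanner output) keeps the pipeline from being blocked by false positives.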
Even the risk profile of a static system can change. Today's secure environment is at risk tomorrow via a vulnerability we're not aware of yet. Fight the future.