Part 2. Pentesting - how to address the challenges… let’s start with the Process
In my previous article, I talked about what is wrong with the current state of pentesting.
First of all - I want to highlight that NO technology alone will save the day. Like with every other security problem, you need to get all three components right – process, technology and people.
This article is going to focus on what I believe is the first component of the solution – addressing the pentesting process and governance structure for sizeable pentesting programs (more than twenty pentests per year).
There will be another article on addressing the same challenges for small-to-medium enterprises.
So let’s begin with re-iterating the problems:
Pentesting is done too late;
Pentesting scope does not match business threat environment;
Pentesting lacks assurance on methodology and on overall coverage;
Pentesting lacks common vocabulary;
Reporting is static and remediation tracking is very hard; and
Lack of analytics.
It is often a vendor that scopes and delivers the actual pentest. And even if it is an internal team – how much time does that team have to understand the business context and implications of the solution they are about to pentest? Usually not very long – a few hours spent in scoping meetings at best.
As for methodology and vocabulary – it is often up to individual pentesters and their ability to cut and paste from previous engagements. At best, organizations require pentesters to follow something vaguely written around OWASP, PTES and other standards.
Currently the essential piece – the deliverable of a pentest – is a static report. There is no efficient way to track, aggregate and normalize the results. And reports are usually delivered well after vulnerabilities are found.
So now that we all understand the problems, let's look at what the governance structure and process should look like. What are the key features that should be considered?
From my experience, the only way to establish a relevant pentesting program scope, a consistent methodology, uniform terminology, and a capability for remediation tracking and analysis is to build a centralized team with responsibilities that include:
Defining the scope and focus of the pentesting program based on the organization's risk profile and risk appetite.
This process helps keep the scope of the entire program, and of each pentesting engagement, aligned with the organization's risk profile. It takes scoping away from vendors and gives it back to the organization's own specialists.
Developing and maintaining pentesting methodologies relevant to the organization and its risk profile, and making sure that the relevant security, DevOps and other engineering personnel have a voice in improving them.
This process helps ensure repeatable testing coverage and provides pentesters with relevant methodologies on every project, irrespective of whether they are internal or external.
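As an illustration only, a methodology can be maintained as structured data rather than prose, so every engagement starts from the same checklist and coverage becomes measurable. The test IDs below are real OWASP Web Security Testing Guide identifiers; the schema and function names are a hypothetical sketch, not a prescribed format:

```python
# Hypothetical sketch: a pentest methodology kept as structured data,
# so every engagement (internal or external) starts from the same checklist.
# Test IDs reference the OWASP Web Security Testing Guide (WSTG).

METHODOLOGY_WEB_APP = [
    {"id": "WSTG-ATHN-01", "area": "Authentication",
     "check": "Credentials transported over an encrypted channel"},
    {"id": "WSTG-ATHZ-02", "area": "Authorization",
     "check": "Testing for bypassing authorization schema"},
    {"id": "WSTG-INPV-05", "area": "Input Validation",
     "check": "Testing for SQL injection"},
]

def coverage(methodology, completed_ids):
    """Fraction of methodology checks an engagement actually covered."""
    done = sum(1 for item in methodology if item["id"] in completed_ids)
    return done / len(methodology)

# An engagement that completed two of the three checks has 2/3 coverage.
print(coverage(METHODOLOGY_WEB_APP, {"WSTG-ATHN-01", "WSTG-INPV-05"}))
```

A coverage number like this is exactly the kind of per-engagement assurance figure the central team can report, instead of trusting that a pentester "followed OWASP".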
Defining and maintaining a comprehensive list of relevant vulnerabilities (a vulnerability library). Make that library available to pentesters, DevOps and engineers. Use industry-standard definitions provided by organizations such as MITRE and OWASP.
This practice is already adopted in most vulnerability management processes – the concept of “normalizing the data”.
This process addresses the challenge of a repeatable language in the results. It also produces measurable data – something that can be analyzed and used to make informed decisions.
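To make the "normalizing the data" idea concrete, here is a minimal sketch of a vulnerability library entry keyed to MITRE CWE identifiers with OWASP Top 10 (2021) categories. The CWE/OWASP mappings are standard; the class and function names are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical sketch of a vulnerability library entry. Keying findings to
# industry-standard CWE IDs means two pentesters reporting the same issue
# produce the same normalized record instead of divergent free-text prose.

@dataclass(frozen=True)
class LibraryEntry:
    cwe_id: str          # MITRE CWE identifier, e.g. "CWE-89"
    name: str            # canonical name from the library
    owasp_category: str  # OWASP Top 10 2021 category

LIBRARY = {
    "CWE-89": LibraryEntry("CWE-89", "SQL Injection", "A03:2021-Injection"),
    "CWE-79": LibraryEntry("CWE-79", "Cross-site Scripting", "A03:2021-Injection"),
    "CWE-287": LibraryEntry("CWE-287", "Improper Authentication",
                            "A07:2021-Identification and Authentication Failures"),
}

def normalize(raw_title, cwe_id):
    """Map a pentester's free-text finding title onto a library entry."""
    entry = LIBRARY[cwe_id]  # raises KeyError if the library lacks the CWE
    return {"reported_as": raw_title, "cwe": entry.cwe_id,
            "canonical_name": entry.name, "owasp": entry.owasp_category}

finding = normalize("SQLi in login form", "CWE-89")
print(finding["canonical_name"])  # SQL Injection
```

However the finding was phrased in the field, the record that lands in the central store uses the library's canonical name – which is what makes aggregation and trend analysis possible later.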
Providing analysis of the vulnerability data collected through the pentesting program, and documenting recommendations on how to address the findings – including through automation and CI/CD processes.
Having one team as the custodian of pentesting results across the organization makes that team the natural place to analyze the results as a whole, to present them to executives and to make recommendations based on real recorded data.
The organizations I have seen that adopted the governance structure and processes listed above have improved the efficiency of their pentesting programs, gained a better understanding of their security posture, and have less trouble justifying their security budgets.
The next article will focus on what the technology that supports this process should look like.