Looking to the future

Each security assessment is only a step in your larger security journey. Minimizing the transactional nature of a point-in-time test maximizes its value. When closing out an assessment, take the opportunity to reflect and revise your strategic plan.

How’d it go?

How do you decide if you had a good experience with the vendor?

Generally, this question is easiest to answer when the outcome was obviously bad, such as poor communication or blatant false positives in the report.

If the assessment is successful on its face due to the quality of findings or vendor interactions, it can be harder to critique. Here are a few prompts to consider:

  1. How were their answers to the questions in the readout? Did they demonstrate a high level of diligence in testing? Did they show a strong understanding of your risk profile and business?

  2. How was the report quality? Was the narrative clear and digestible? Did the report show a sufficient depth of coverage? Were findings reproducible and contextualized? Were trends or areas of future discovery identified?

  3. Were there false positives or false negatives? How did the vendor handle false positives? What were the provided reasons for any issues? How egregious were the mistakes?

  4. Are there known open vulnerabilities within the scope? How did the vendor perform against them? If you had outstanding vulnerabilities, were they reported? How were these “canary bugs” rediscovered in the assessment? (A simple check is sketched after this list.)

  5. How do you feel about price for value given the category and quality of the vendor chosen? Was there a clear benefit (presuming a non-bargain vendor) over a vulnerability scan?

  6. How was the vendor’s communication? Did you feel prepared by the kickoff? Were you able to easily reach the testers? Were communication rules of engagement respected? Did the readout answer your questions?
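A lightweight way to run the “canary bug” check from prompt 4 is to diff your known open vulnerabilities against the IDs reported by the vendor. A minimal sketch, assuming both sides can be exported with a shared identifier (the IDs and field layout below are hypothetical):

```python
# Sketch: compare known open vulnerabilities ("canary bugs") against
# the findings reported in the assessment. IDs and titles are illustrative.

known_open = {
    "VULN-101": "Stored XSS in comment widget",
    "VULN-214": "IDOR on /api/invoices/{id}",
    "VULN-307": "Missing rate limiting on login",
}

reported_by_vendor = {"VULN-214", "VULN-999"}  # IDs pulled from the report

rediscovered = known_open.keys() & reported_by_vendor
missed = known_open.keys() - reported_by_vendor

print(f"Canary bugs rediscovered: {sorted(rediscovered)}")
for vuln_id in sorted(missed):
    print(f"Known issue not reported: {vuln_id} ({known_open[vuln_id]})")
```

A missed canary is not automatically a failure, but it is a concrete prompt for the readout discussion.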

Common testing cadences

Whether or not you felt your assessment was a success, the next element of planning focuses on scheduling. When creating an assessment calendar, the first question is frequency. For the average organization, annual assessments are a good starting point. Other common patterns include quarterly, development cycle aligned (e.g., per-release), and compliance aligned (e.g., pre-audit).
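If you track your portfolio in a machine-readable form, the cadence itself can be encoded as data so due dates are generated rather than maintained by hand. A minimal sketch, with asset names, intervals, and dates chosen purely for illustration:

```python
from datetime import date, timedelta

# Sketch: derive upcoming assessment due dates from a simple cadence policy.
# Asset names, intervals, and dates below are illustrative placeholders.
CADENCES = {
    "annual": timedelta(days=365),
    "quarterly": timedelta(days=91),
}

portfolio = [
    {"asset": "customer-portal", "cadence": "quarterly", "last_assessed": date(2024, 1, 15)},
    {"asset": "internal-billing", "cadence": "annual", "last_assessed": date(2023, 11, 1)},
]

for entry in portfolio:
    next_due = entry["last_assessed"] + CADENCES[entry["cadence"]]
    print(f"{entry['asset']}: next assessment due {next_due.isoformat()}")
```

Release-aligned and audit-aligned cadences are event-driven rather than interval-driven, so they would hook into your release pipeline or compliance calendar instead of a fixed interval.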

Scope and vendor

You also need to reconsider scope and vendors. For scoping, retro the most recent assessment on both breadth and depth, and consider any limitations listed in the assessment report. Retargeting a largely similar scope can deepen coverage and close out prior deficiencies, while targeting a more diverse scope ensures broad, baseline coverage across more of your portfolio.

There are competing beliefs on using the same or different vendors for your assessments. Vendor rotation for assessments traces back to at least a 2010 SANS whitepaper. Vendors including Atredis, SecureIdeas, and Triaxiom Security have blogs on this topic that conclude by recommending vendor consistency. A review of the tradeoffs of rotating vendors:

Pros of rotation:

  • Provides a comparison point for cross-vendor performance and value

  • Reasonable if you believe vendors work hardest for new clients and that firms are of roughly fungible quality

  • May be recommended or required by policy or auditors

  • Different firms may have different specialties and methodologies that surface different vulnerabilities

Pros of repetition:

  • Decreased ramp-up time on subsequent engagements

  • Improved communication and project management from long term relationship

  • Improved technical and business impact due to experience with target

  • Potential cost-savings for volume or relationship

  • Avoids the underperformance risk that comes with an unproven new vendor

A compromise approach also exists. If you’re happy with your vendor, you can stick with them but request that they rotate who performs the testing. This balances fresh eyes against historic context, and it works so long as the vendor is sufficiently staffed relative to your assessment frequency.

Scaling your program

There are special considerations for enterprise assessment programs. Large companies have more significant requirements, but they also tend to have additional budget and leverage with vendors. Focusing on a small set of vendors and using large contracts allows the enterprise to optimize its assessment program, applying that leverage to price, scheduling, and consultant selection. At scale, project management also becomes an outsized element of the program: how the vendor tracks and passes context, the range of skill sets on offer, and how much of the program the vendor can manage with minimal overhead.

Standardization will be essential as you scale. It ensures repeatability, consistency, and a positive internal customer experience. One of the most sizable decisions will be when to bring penetration testing in-house. Your goal should be to staff internally in a way that increases value through internal alignment and heavy utilization while decreasing costs.

You will also need to define ownership and standardize the intake process. This project management function can have all assessments scheduled and arranged by a dedicated team, or can provide utilities and guidelines for distributed management. Guidelines should include:

  1. What needs to be assessed and how frequently - per the Cobalt.io State of Penetration Testing 2021 report, the average respondent assessed “63% of their application portfolio.”

  2. Projects should be prioritized on a risk basis, and default scope boundaries, such as product, feature, or team, should be established.

  3. Standards for time and budget should be provided, such as “all application versions shall have a 1 week assessment, pre-approved under $15,000.”

  4. Unify the intake process and leverage your size to demand standard and automatable reporting formats from vendors. Develop internal triage guidelines on severity. Deploy internal guidance on remediation for common vulnerability classes using consistent mechanisms.

  5. Develop metrics for your security assessment program. You can bootstrap with metrics on coverage of your portfolio, finding characteristic measures (risk, impact, exploitability), and finding density (a minimal computation is sketched after this list). Your existing vulnerability management program should account for measurements such as mean time to resolution.

  6. Force-multiply the value of your assessments and 10x your security. Do not treat them as transactional bug hunts. Use the opportunity to identify trends in your security posture, build guardrails and secure wrapper libraries around differentiated weaknesses, and kill bug classes at scale.
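If vendors deliver findings in the standard, automatable format called for in item 4, the program metrics from item 5 fall out of a small amount of tooling. A minimal sketch, assuming a list of findings with hypothetical field names rather than any real vendor schema:

```python
from collections import Counter

# Sketch: compute basic program metrics from standardized vendor findings.
# The field names ("asset", "severity") and counts are assumptions about
# your own intake format, not a real reporting standard.

findings = [
    {"asset": "customer-portal", "severity": "high"},
    {"asset": "customer-portal", "severity": "medium"},
    {"asset": "internal-billing", "severity": "critical"},
]

portfolio_size = 40  # total assets in scope for the program
assessed_assets = {f["asset"] for f in findings} | {"payments-api"}  # include clean assessments

coverage = len(assessed_assets) / portfolio_size
severity_counts = Counter(f["severity"] for f in findings)
density = len(findings) / len(assessed_assets)

print(f"Portfolio coverage: {coverage:.0%}")
print(f"Findings by severity: {dict(severity_counts)}")
print(f"Finding density: {density:.1f} findings per assessed asset")
```

Trend these numbers across assessments rather than judging any single engagement on them.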

From the Survey: How are companies calculating return on investment?

  1. Many don’t

  2. Some try to referee quality:

     • “Look at the overall quality from the pentest provider over time (can’t do it for an individual assessment)”

     • “Depth of analysis and quality of analysis that goes beyond scanning tools.”

     • “Quality of findings, specifically those that are scalable across our company.”

     • “Quality of the assessment, quality of the findings”

  3. Others look at impact:

     • “identify and close critical or high bugs … a general sentiment from those who hear about it.”

     • “Risk reduction” / “Aggregate organizational risk identified”

     • “Value in contributing to sales success” / “$ business lost from potential risks”

     • “1. grading the visibility to areas needing improvement, 2. grading the efficacy of monitoring and our response capabilities”

Conclusion

Security service procurement is often done poorly. Security is a market for lemons, and assessments are emblematic of this problem. Buyers struggle to define their motivations, to find and distinguish vendors, to contract, to partner in delivery, and to close and evolve the assessment in a way that drives business value. Information asymmetry runs rampant, and much of the guidance is biased by its origin. Procuring a low-quality assessment results in buyer’s remorse, wasted budget, and residual unknown risks.

I hope this guide helps you build a successful security assessment program at your company. Constantly interrogate and evolve your approach, and good luck keeping pace with information security risk.