
What Makes a Good Penetration Test?



One of the hardest parts of working with any professional service is evaluating whether the work was performed well. Isn't it funny how every plumber complains about the work of the plumber who came before? Although some aspects are subjective, this article presents criteria to consider when evaluating a penetration test.


When you purchase a penetration test, the sole output is typically a series of written deliverables. Your team may then share these documents with customers, executives, or newly onboarded developers, so it is important that the documentation be as clear and accurate as possible.

Here are a few questions to consider regarding the deliverables:

  • Can your team understand these documents, reproduce the findings, and follow the remediation guidance?
  • Are there overarching strategic recommendations?
  • Does the executive summary accurately capture the contents of the report?
  • Is important information properly represented, such as dates or the assessment’s context, goals, and limitations?
  • Do the findings have actual security risk (as opposed to innocuous software bugs) and an appropriate risk rating?

An assessment report should clearly summarize the results of the test, as well as arm your team with remediation guidance and forward-thinking recommendations for improving your application’s security.


Assessments that contain only findings miss out on the experience and perspective of your consultants. Even if you decide not to pursue strategic recommendations, it is helpful to get the perspective of a team that was focused on breaking your application. Strategic (or overarching) recommendations can ground the findings with common themes or insecure design patterns, and implementing these changes can result in more systemic change.


There’s nothing worse than receiving an assessment report filled with findings and then being left to your own devices to determine how to solve them. Although a certain degree of implementation detail will be unique to your product and technologies and beyond the scope of a recommendation, you should be provided with sufficient detail to know how to proceed. The consultants you hire should be experts at not just breaking things, but also knowing how to put them back together.


Consultants should be available for follow-up questions that result from an assessment. These discussions should also be tailored to the product in question. For example, sanitizing user input for a Latin alphabet field is very different from doing so for a multi-language Unicode field.
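To make that example concrete, here is a minimal sketch (hypothetical validators, not from the original article) contrasting a Latin-alphabet-only check with a Unicode-aware one. The function names and character policies are illustrative assumptions:

```python
import re
import unicodedata

# Hypothetical Latin-only field: ASCII letters plus a few separators.
LATIN_NAME = re.compile(r"^[A-Za-z][A-Za-z' -]{0,63}$")

def is_valid_latin_name(value: str) -> bool:
    """Accept only ASCII letters, apostrophes, hyphens, and spaces."""
    return bool(LATIN_NAME.fullmatch(value))

def is_valid_unicode_name(value: str) -> bool:
    """Accept letters from any script, while still rejecting control
    characters and other non-printable code points that a Latin-only
    check would never encounter."""
    if not 1 <= len(value) <= 64:
        return False
    # Normalize so visually identical strings compare consistently.
    value = unicodedata.normalize("NFC", value)
    return all(
        unicodedata.category(ch)[0] in ("L", "M")  # letters, combining marks
        or ch in " '-"
        for ch in value
    )
```

A Latin-only regex silently rejects legitimate names like "Zoë" or "日本語", while a naive "allow everything" Unicode field admits control characters and homoglyph tricks; the Unicode version above threads between the two by filtering on character category after normalization.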

That said, although conversations can be easily accommodated by most consultancies, multi-hour discussions or requests for additional deliverables outside of the original statement of work may require additional billable hours.


Finally, the best and most subjective measure of a penetration test is your own gut feeling. After reviewing the report, do the findings, recommendations, and summary feel right to you based on your knowledge of the application?

A bad gut feeling most often arises from reports with limited or no meaningful findings. Here are some red flags:

  • Findings with seemingly no security implications

  • Inappropriate risk ratings (e.g., an insecure SSL configuration is rated high risk)

  • Findings that appear to be copied and pasted from a scanner (e.g., can you Google the findings’ contents and find hits in online scanner documentation?)

  • Excessive typos in both the narrative and technical details

  • Unprofessional tone

  • Numerous, unexplained out-of-scope findings (note: consultants sometimes stumble across these, but the focus should still be on the original scope)

Mistakes and misunderstandings do happen, especially when there is a breakdown in communication during the project. But other times, the work, the quality, or the approach may not be a good fit for your team.



About the author, Jake Miller

Security Researcher

Jake Miller (OSCE, OSCP) is a Bishop Fox alumnus and former lead researcher. While at Bishop Fox, Jake was responsible for overseeing firm-wide research initiatives. He also produced award-winning research in addition to several popular hacking tools like RMIScout and GitGot.

