
Lessons Learned from Years of Red Teaming in Cybersecurity



Red teaming can mean a lot of things to a lot of people. In the truest sense, and how we will define it here for the purposes of this article, red teaming is the practice of adopting the adversarial mindset: thinking like the bad guy, or about how a situation can go wrong. Understanding how and why things can go wrong is extremely important in everyday life. With random acts of violence and terror occurring seemingly every day, it's important to think critically and apply that thinking to your life. For red teams, this is paramount, as this type of applied critical thinking is a make-or-break skill set.

After years of building and running red team exercises within the U.S. government, and for many of the Fortune 500, I’ve learned some pertinent (and sometimes difficult) lessons that are critical to keep in mind as you’re running your own red teams. Here are a few key lessons to share with your team:

Don’t Overthink It and Don’t Fight Your Gut

Sometimes you just need to go with your initial instinct. If you start to think something might be amiss or just plain wrong for the environment you're in, chances are it is. There are times when you need to embrace that feeling of uncertainty and either get the hell out of the area or approach the situation in front of you with multiple strategies. If you are too reactive to what's around you, you will be one step behind. A primary goal of a red team is to dictate and develop the situation, not to let others do it for you. The less control you have over a situation, the more opportunities there are for failure. Good proactive actions will always win the day over bad reactions or freezing in place. One caveat: sometimes inaction is in fact an action itself; if you make the decision not to do anything, you are still dictating the tempo of events. If you're conducting a red team operation, drill and practice the plan over and over. Approach the problem set from multiple angles to better understand what could go wrong and what success actually looks like.

Accept That You Don’t Know Everything

A good red team is composed of unique members with their own specialties, abilities, and perceptions. To build a good team, you need to think about what skills and experience you'll need to get the job done, and you cannot settle for less. A team may be successful in one particular situation, but if the playing field changes, you should reassess and adjust team composition as necessary. The other benefit of having a variety of talent, skills, and abilities on the team is that you are less likely to fall into groupthink, which can lead to blind spots.

Good team members can adapt given enough time; however, if you have timelines that can’t be moved, it’s better to swap in teammates that can multiply your chances of success or call the operation off. The worst-case scenario would be to proceed with an inadequate team where you don’t achieve mission success or someone gets hurt or worse.

To provide an example of a diverse team, below is the notional structure that was leveraged on a real red team operation in the past:

  • Red Team Lead (RTL) – responsible for clarifying goals and objectives and ensuring success through strategy and tactics; also responsible for translating real-world risks into business impacts and communicating them to the client.
  • Project Manager/Analyst/Technical Writer – responsible for ensuring that the red team is executing against the operational order (OPORD) and tracking risks to the operation, and documenting results.
  • Business Analyst – partners with the client to understand business impacts, similar to the RTL, but also provides a feedback loop to the operators for situational context during the operation, which creates strategic opportunities to exploit processes, not just tangible assets.
  • Technical Analysts (Networking, Exploit Development) – serve as the point for assessing the target for technical vulnerabilities, identifying ingress opportunities, and establishing command and control through covert exploitation.
  • Non-Technical Analyst (Social) – performs open source intelligence (OSINT) gathering, analyzes relationships, and identifies opportunities for social engineering; also aids in analyzing the risk and impact of identified weaknesses.
  • Physical Security Specialist – performs on-site reconnaissance, wireless network sniffing (cross-trained), security control review and testing of deterrence and preventative physical countermeasures, and identifies opportunities for compromise.

Encourage Disagreement and Alternative Planning

Until planning has been completed and you're in an operational state, encourage feedback and dissenting opinions during the plan design phase. This is where it matters most. During the operation, ensure your operators are continually providing feedback on what is working or not, and why. If you have a red team leader (RTL) dictating every step of the plan and operation while ignoring the team, then you have a bad RTL. This doesn't happen too often; however, just as you don't know everything, neither does the RTL. This is why it's a good idea to shift RTLs based on the engagement. If the predominant activity will involve physical entry, then the RTL should be someone with a solid understanding of that area and strong critical thinking skills, not a computer expert with no experience in it.

A Good Red Team is a Thinking, Adapting Team

It might be obvious to most, but if you're not adapting to what I like to call the atmospherics around you, things can go very wrong, very fast. When creating your operational plan, it's important to have a backup plan (which you should also dry run). We like to follow the PACE principle of planning: having a primary, alternate, contingency, and emergency plan based on the original operations order (OPORD) is key. Just because your primary plan goes to shit doesn't mean you can't still be successful, but you have to plan for it to ensure success.
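For readers who think in code, the PACE fallback order can be modeled as a simple sketch: try each plan in priority order and stop at the first one that succeeds. This is purely illustrative; the names and structure below are my own invention, not any Bishop Fox tooling or methodology.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Plan:
    """One branch of a PACE plan, with a callable that attempts it."""
    name: str
    execute: Callable[[], bool]  # returns True if the plan succeeded

def run_pace(plans: List[Plan]) -> Optional[str]:
    """Attempt Primary, Alternate, Contingency, then Emergency in order.

    Returns the name of the first plan that succeeds, or None if all
    branches fail (the point at which you call the operation off).
    """
    for plan in plans:
        if plan.execute():
            return plan.name
    return None

# Hypothetical run: the primary fails, so the team falls back to the alternate.
pace = [
    Plan("Primary", lambda: False),      # e.g., badge-clone entry is blocked
    Plan("Alternate", lambda: True),     # e.g., tailgate through a side door
    Plan("Contingency", lambda: True),
    Plan("Emergency", lambda: True),
]
print(run_pace(pace))  # Alternate
```

The point of the sketch is the ordering: each branch is rehearsed in advance, so a failed primary triggers a planned fallback rather than improvisation.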

Know When You’ve Reached the Point of No Return or Failure

Often I’ve seen red teams continue to hammer away even though their chance of success is nil. You must be honest with yourself and your teammates when you’ve reached that point. Continuing to execute can bring further harm to your operation, team, or the target itself. Lastly, understand what the point of no return is within each stage of your plan. This is vital in case you need to shift or adapt your tactics or go to plan B. Otherwise it may be too late, and you’ll find yourself staring at failure.


In your red team efforts, you need to apply critical thinking to all aspects of planning and executing to make sure you’re not wasting time and effort, but instead achieving results. The most important thing you can do is reflect on the performance of the team as a whole and on the individual level; this includes self-reflection as well. Conducting an after-action report (AAR) and walking through how the operation was planned and executed will help you understand how each component was leveraged or mis-leveraged based on expectations and the achieved results or failures. These are especially useful for improving tactics and strategy as a whole for the team, but also opportunities for improvement at the individual level.



About the author, Daniel Wood

AVP of Consulting

Daniel Wood (CISSP, GPEN) is a Bishop Fox alumnus. Daniel was Associate Vice President of Consulting at Bishop Fox, where he led all service lines, developed strategic initiatives, and established the Applied Research and Development program. Daniel has over 15 years of experience in cybersecurity and is a subject matter expert in red teaming, insider threat, and counterintelligence. Daniel was previously the manager of security engineering and technology at Bridgewater Associates, where he shaped the strategic direction of technology for the firm and oversaw technical security assessments of Bridgewater's international office expansions.

Daniel has also served in roles supporting the U.S. government in security architecture, engineering, and offensive operations as a Security Engineer and Red Team Leader. He supported the U.S. Special Operations Command (USSOCOM) on red teaming and digital warfare operations, and the U.S. Army on the Wargaming Cyber Effects on Soldiers' Decision-Making project. Daniel is currently a member of the Ithaca College Cybersecurity Advisory Board. He holds a Bachelor of Science in Administration of Justice from George Mason University.
