The OODA loop (for observe, orient, decide, and act) is a concept originally applied to the combat operations process, often at the strategic level in military operations. It is now also often applied to understand commercial operations and learning processes. The concept was developed by military strategist and USAF Colonel John Boyd.
The OODA loop has become an important concept in both business and military strategy. According to Boyd, decision-making occurs in a recurring cycle of observe-orient-decide-act. An entity (whether an individual or an organization) that can process this cycle quickly, observing and reacting to unfolding events more rapidly than an opponent, can thereby “get inside” the opponent’s decision cycle and gain the advantage. Frans Osinga argues that Boyd’s own views on the OODA loop are much deeper, richer, and more comprehensive than the common interpretation of the ‘rapid OODA loop’ idea.
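The cycle described above can be sketched as four stages feeding one another, with the output of "act" becoming the input of the next "observe". Below is a minimal toy sketch in Python; the function names and the closing-speed threat scenario are invented for illustration and are not part of Boyd's model.

```python
def sense(environment):
    """Observe: sample raw events from the evolving situation."""
    return dict(environment)

def orient(observation, experience):
    """Orient: filter the observation through prior experience."""
    threat = observation.get("closing_speed", 0) > experience["threat_threshold"]
    return {"threat": threat}

def decide(assessment):
    """Decide: choose a response based on the oriented picture."""
    return "evade" if assessment["threat"] else "hold_course"

def act(decision, environment):
    """Act: feed the decision back into the environment, creating
    new circumstances for the next observation."""
    if decision == "evade":
        environment["closing_speed"] = 0
    return decision

def ooda_cycle(environment, experience):
    """One full pass of the observe-orient-decide-act loop."""
    observation = sense(environment)
    assessment = orient(observation, environment and experience)
    decision = decide(assessment)
    return act(decision, environment)

# A first pass neutralizes the threat; the next pass observes the
# changed situation produced by the previous action.
env = {"closing_speed": 400}
exp = {"threat_threshold": 300}
first = ooda_cycle(env, exp)   # "evade"
second = ooda_cycle(env, exp)  # "hold_course"
```

The only point of the sketch is structural: each stage consumes the previous stage's output, and acting changes the environment that the next observation samples, so in practice the loop never terminates.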
Boyd developed the concept to explain how to direct one’s energies to defeat an adversary and survive. Boyd emphasized that “the loop” is actually a set of interacting loops that are to be kept in continuous operation during combat. He also indicated that the phase of the battle has an important bearing on the ideal allocation of one’s energies.
Boyd’s diagram shows that all decisions are based on observations of the evolving situation, tempered with implicit filtering of the problem being addressed. These observations are the raw information on which decisions and actions are based. The observed information must be processed to orient the decision maker before a decision can be made. In notes from his talk “Organic Design for Command and Control”, Boyd said,
The second O, orientation – as the repository of our genetic heritage, cultural tradition, and previous experiences – is the most important part of the O-O-D-A loop since it shapes the way we observe, the way we decide, the way we act.
As stated by Boyd and shown in the “Orient” box, much of the information is filtered through our culture, genetics, ability to analyze and synthesize, and previous experience. Although the OODA loop was designed to describe a single decision maker, most business and technical decisions involve a team of people observing and orienting, each bringing their own cultural traditions, genetics, experience and other information, so the situation is usually far messier than the diagram suggests. It is here that decisions often get stuck, which does not lead to winning, since
In order to win, we should operate at a faster tempo or rhythm than our adversaries–or, better yet, get inside [the] adversary’s Observation-Orientation-Decision-Action time cycle or loop. … Such activity will make us appear ambiguous (unpredictable) thereby generate confusion and disorder among our adversaries–since our adversaries will be unable to generate mental images or pictures that agree with the menacing as well as faster transient rhythm or patterns they are competing against.
The OODA loop, which focuses on strategic military requirements, was adapted for business and public-sector operational continuity planning. Compare it with the Plan-Do-Check-Act (PDCA) cycle, or Shewhart cycle, which focuses on the operational or tactical level of projects.
The key is to obscure your intentions and make them unpredictable to your opponent while you simultaneously clarify his intentions. That is, operate at a faster tempo to generate rapidly changing conditions that inhibit your opponent from adapting or reacting to those changes and that suppress or destroy his awareness. A hodgepodge of confusion and disorder thus occurs, causing him to over- or under-react to conditions or activities that appear uncertain, ambiguous, or incomprehensible.
Writer Robert Greene wrote in an article called “OODA and You” that
the proper mindset is to let go a little, to allow some of the chaos to become part of his mental system, and to use it to his advantage by simply creating more chaos and confusion for the opponent. He funnels the inevitable chaos of the battlefield in the direction of the enemy.
Applicability of the OODA loop
Consider a fighter pilot being scrambled to shoot down an enemy aircraft.
Before the enemy airplane is even within visual contact range, the pilot will consider any available information about the likely identity of the enemy pilot: his nationality, level of training, and cultural traditions that may come into play.
When the enemy aircraft comes into radar contact, more direct information about the speed, size, and maneuverability of the enemy plane becomes available; unfolding circumstances take priority over radio chatter. A first decision is made based on the available information so far: the pilot decides to “get into the sun” above his opponent, and acts by applying control inputs to climb. Back to observation: is the attacker reacting to the change of altitude? Then to orient: is the enemy reacting characteristically, or perhaps acting like a noncombatant? Is his plane exhibiting better-than-expected performance?
As the dogfight begins, little time is devoted to orienting unless some new information pertaining to the actual identity or intent of the attacker comes into play. Information cascades in real time, and the pilot does not have time to process it consciously; the pilot reacts as he is trained to, and conscious thought is directed to supervising the flow of action and reaction, continuously repeating the OODA cycle. Simultaneously, the opponent is going through the same cycle.
How does one interfere with an opponent’s OODA cycle? One of John Boyd’s primary insights in fighter combat was that it is vital to change speed and direction faster than the opponent. This is not necessarily a function of the plane’s ability to maneuver, rather the pilot must think and act faster than the opponent can think and act. Getting “inside” the cycle—short-circuiting the opponent’s thinking processes—produces opportunities for the opponent to react inappropriately.
Another tactical-level example can be found on the basketball court, where a player takes possession of the ball and must get past an opponent who is taller or faster. A straight dribble or pass is unlikely to succeed. Instead, the player may engage in a rapid and elaborate series of body movements designed to befuddle the opponent and deny him the ability to take advantage of his superior size or speed. At a basic level of play this may be merely a series of fakes, in the hope that the opponent will make a mistake or an opening will occur, but practice and mental focus may allow one to accelerate the tempo, get inside the opponent’s OODA loop and take control of the situation—to cause the opponent to move in a particular way, and generate an advantage rather than merely react to an accident. Taking control of the situation is key: it is not enough to merely cycle through OODA faster, which results in flailing.
The same cycle operates over a longer timescale in a competitive business landscape, and the same logic applies. Decision makers gather information (observe), form hypotheses about customer activity and the intentions of competitors (orient), make decisions, and act on them. The cycle is repeated continuously. The aggressive and conscious application of the process gives a business advantage over a competitor who is merely reacting to conditions as they occur, or has poor awareness of the situation.
The approach favors agility over raw power in dealing with human opponents in any endeavor. John Boyd put this ethos into practice with his work for the USAF. He was an advocate of maneuverable fighter aircraft, in contrast to the heavy, powerful jet fighters that were prevalent in the 1960s, such as the F-4 Phantom II and General Dynamics F-111. Boyd inspired the Lightweight Fighter program that produced the successful F-16 Fighting Falcon and F/A-18 Hornet, which are still in use by the United States and several other military powers into the 21st century.
An article published in the McKinsey Quarterly triggered my thinking a couple of months ago. The author, Lowell Bryan, was highlighting the need for “just in time” decision making in companies.
“Much of the art of decision making under uncertainty is getting the timing right. If you delay too much, investment costs may escalate, and losses can accumulate. However, making critical decisions too early can lead to bad choices or excessive risks”
90% of pilot training includes decision making. Professional pilots spend hundreds of hours in simulators and in the cockpit preparing for one key challenge: the three or four minutes, one day in their career, when they will have to make the one decision that determines the life or death of hundreds of passengers.
The environment a pilot navigates is not that dissimilar from the corporate environment, with its many uncertainties and disruptions: a bird striking your engines, ice accumulating on the wings, an unexpected delay resulting in an aircraft being in your flight path… As those happen, the hundreds of hours of training come into play.
This is how we train pilots to make the right decision.
Understand the biases induced by the way our brain, and more specifically our senses, are built
As a pilot feels the skin of his/her back pressing against the seat, he/she will assume the plane is accelerating. However, it can also be the result of a plane whose nose has unexpectedly pitched up, and which is climbing sharply while losing speed. Pilots are also well aware of the graveyard spiral: an observed loss of altitude during a coordinated constant-rate turn that has ceased stimulating the motion-sensing system can create the illusion of being in a descent with the wings level. The disoriented pilot will pull back on the controls, tightening the spiral and increasing the loss of altitude. Mistakes like this have led to terrible accidents in the past, one of them being the decisions made by the Air France pilots in the Rio–Paris crash.
The BBC produced a documentary on the Air France Rio–Paris crash, which was largely due to pilot error.
Most significantly, we know from training pilots that biases are accentuated in the following situations:
Expectations based on experience: pilots compare the information they get with what they have learned from past experiences. For example, the image of the runway as they are about to land is compared with the image of runways they are used to. If a runway is narrower or wider than what they are used to, it can result in an inaccurate estimate of airplane height.
The recent decision by HP’s CEO to exit the computer business and focus on developing software is partly driven by Léo Apotheker’s experience at SAP. This logic can have unexpected consequences: it implies that the most experienced CEOs might be the ones most likely to make the wrong decisions.
Expectations based on anticipation: as a pilot is getting ready to take off, he/she is waiting for the final green light from the tower. Accidents have happened when an instruction such as “cleared into takeoff position” was understood as “cleared for take-off” (as in the 1977 collision of two Boeing 747s at Tenerife). Similarly, we have observed CEOs who, after presenting a strategic plan to a lukewarm board of directors, have taken the board’s quiet reserve or lack of decision for a green light.
Expectations based on habits: if a pilot is used to parking his/her plane in “Parking Area Alpha” and exceptionally receives the instruction to park in “Parking Area Golf”, the risk exists that the pilot still goes to Alpha, even after having accurately read back the instructions to the tower. This bias explains why so many executives keep repeating the same patterns or decisions even when their environment is clearly shifting (this is currently happening in the telecommunications industry).
You will find additional information about strategic biases in our article “Strategic Blindspot Index”.
When under stress, revert to checklists
To allow the mind to focus on the important task of assessing the situation, checklists have been built to let pilots process information fast and get the data they need. The human mind can only hold about seven items of short-term information at a time (which is why, in most countries, phone numbers have seven digits). Offloading from pilots the burden of remembering the steps to follow in an emergency has been crucial to the profession. As far as I know, no such checklists have been developed for CEOs and executives (except in extreme crisis situations).
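The offloading idea can be made concrete as a tiny checklist runner: the procedure lives in an ordered list, and working memory only ever holds the single step being confirmed. A minimal sketch in Python; the step names are illustrative, not taken from any real flight manual.

```python
# Illustrative emergency checklist (not from any real flight manual).
ENGINE_FAILURE_CHECKLIST = [
    "Airspeed - best glide",
    "Landing site - select",
    "Fuel selector - switch tanks",
    "Mixture - rich",
    "Ignition - check both",
]

def run_checklist(steps, confirm):
    """Walk the checklist in order; stop at the first unconfirmed step.

    confirm is a callable that returns True once a step is verified.
    Returns (completed_steps, first_failed_step_or_None).
    """
    completed = []
    for step in steps:
        if not confirm(step):
            return completed, step   # surface the failed item immediately
        completed.append(step)
    return completed, None           # every step confirmed

# Example: a crew that cannot confirm the mixture step.
done, failed = run_checklist(ENGINE_FAILURE_CHECKLIST,
                             lambda step: "Mixture" not in step)
# done holds the three confirmed steps; failed is "Mixture - rich"
```

The design point is that the sequence, not the person, carries the memory load: the checklist guarantees that no step is skipped and that a failed step is noticed at the moment it occurs, not after the fact.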
If one looks closely at how those checklists have been developed over time, an interesting process emerges: checklists have been crowdsourced to the entire flight community. Any incident or accident is logged, discussed, and shared in publications and training sessions so that the knowledge of the industry can collectively grow. No such process exists in the corporate world. Companies struggle to share knowledge internally, and certainly no repository exists today to share failures and debate them publicly. A few initiatives have emerged, such as the MIX (Management Innovation Exchange), but they have gained only limited visibility.
Be physically shaped for decision making
A significant percentage of human errors occur under stress. There is a code of eating behavior that guides pilots to be physically shaped for decision making.
In addition, the aerospace community has studied those factors carefully and is able to anticipate situations in which decision making might fail.
A questionnaire concerning life changes, personality factors, and adjustmental and leadership qualities of U.S. Naval aircrewmembers involved in aircraft accidents was sent to investigating flight surgeons during 1977-78. The responses were divided into two groups: those who were causally involved in accidents and those who were not. In order to cross-validate the results, data were collected and analyzed. Results indicate that aircrewmembers in the process of deciding about staying in the service are more likely to fall into the causally involved group. So were those who had trouble with interpersonal relationships, had no sense of humor or humility concerning themselves, were immature, or had recently lost a friend or family member through death.
Source: “A questionnaire study of psychological background factors in U.S. Navy aircraft accidents” (Alkov RA, Borowsky MS).
I have not seen one company that has adopted an eating guide or a health and fitness code (and requirement) for its executives. Few have tackled the issue of executive support and coaching to deal with stress factors. Finally, I have also never seen a “sense of humor” test as part of the recruiting toolkit…
Understand the role of intelligence
When flying, pilots have an array of information they can tap into: numerous maps of existing weather and wind conditions, 3-hour forecasts, radar, insights from the control tower, etc.
Similarly, executives can base their decisions on a flow of information and data. The competitive and strategic intelligence process is therefore a key part of their decision-making process. A recent survey of North American companies (the 2011 Global Market Intelligence Survey) shows that 84% of companies have implemented a structured intelligence process in house. On average, North American companies have teams of 10 people with competitive intelligence as their primary role, which cater to 1,162 internal clients. Furthermore, nearly 70% of North American companies intend to increase their investments in competitive intelligence (also known as market intelligence) in 2012-2013.
Flight instructors know the law of intensity: the best way to anchor learning in a student is to let him or her experience the effect of a mistake emotionally. For example, we often let the student pilot stall the aircraft and enter a spin. A new pilot will never forget this experience – and will hopefully recognize the signs early in the future.
Pilots will rehearse the same crisis situation many times, each time with an added twist or new factor, to learn to make decisions without anchoring to a past situation.
Similarly, role playing, business simulations, and war gaming can help executives play out possible outcomes in crisis situations. “World Without Oil” is a great example of how this might be done.
WORLD WITHOUT OIL is a serious game for the public good. WWO invited people from all walks of life to contribute “collective imagination” to confront a real-world issue: the risk our unbridled thirst for oil poses to our economy, climate and quality of life. It’s a milestone in the quest to use games as democratic, collaborative platforms for exploring possible futures and sparking future-changing action.
Do not trust your instruments
Pilots learn early not to trust their instruments entirely, and to rely on them only when they can triangulate the information, as instruments can lie or break down. Again, the recent accident of the Rio–Paris Air France flight illustrated how difficult it can be to make the right decision when one instrument (in that case, a frozen speed indicator) fails. It is only by looking at the full picture that pilots make up their mind about the real situation.
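Redundant-sensor voting is one way to make "triangulating the information" concrete: compare independent sources, trust the median, and flag any source that strays beyond a tolerance. A minimal sketch in Python; the sensor names, values, and tolerance below are invented for illustration and are not real avionics logic.

```python
from statistics import median

def cross_check(readings, tolerance):
    """Vote among redundant readings: trust the median, flag outliers.

    readings: mapping of source name -> value (e.g. three airspeed sensors)
    Returns (consensus_value, list_of_suspect_sources).
    """
    consensus = median(readings.values())
    suspects = [name for name, value in readings.items()
                if abs(value - consensus) > tolerance]
    return consensus, suspects

# Example: one frozen airspeed probe among three (illustrative values,
# in knots). The median is robust to a single failed source.
speed, bad = cross_check(
    {"pitot_left": 255.0, "pitot_right": 253.0, "pitot_standby": 80.0},
    tolerance=10.0)
# The consensus is 253.0 and the standby probe is flagged as suspect.
```

The median is used rather than the mean precisely because a single wildly wrong instrument should not drag the consensus toward itself; with three sources, any one failure leaves the picture intact.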
Similarly, companies should revisit their key indicators. One way to avoid blind spots is to systematically revisit the type of information that is collected and communicated, and to take some key assumptions out of the equation: if my sales go up, and the number of customers as well, does it mean I am attracting the best ones (or am I failing, as in the subprime case)? What if the indicators say the opposite? Am I monitoring the correct dashboard?
Train for the situation
Pilots are trained specifically for each type of aircraft they fly. In fact, they will get additional training each time they change the type of aircraft they are flying.
Managers don’t. When the new CEO of HP joined from SAP, he applied to his new business the same principles that had seemed to work at SAP: he divested hardware and directed the company toward software. I do not know of business schools today that differentiate their training based on the type of company managers will work with. What about an MBA in telecommunications management? An MBA in retail management? (Of course, it would also be interesting – and innovative – for a telecom company to hire a retail-management MBA to bring in a different way of doing business…)
Managers and executives are still largely trained today the same way they were trained 50 years ago. Little emphasis is put on decision making, which is one of the key skills in today’s turbulent environment. As other professions have developed an acute understanding of the decision making process, we should, as a business community, learn from them.
Estelle Métayer brings vast experience and fresh perspective to the ever-changing world of Competitive and Strategic Intelligence. A noted expert, her intuitive, precise research provides managers, CEOs, and board members with the right tools to effectively build and hone their competitive intelligence and strategic planning – to avoid blind spots, capitalize on strengths and excel. Estelle is also a commercial pilot and Certified Flight Instructor.
References
- Boyd, John R., Destruction and Creation, U.S. Army Command and General Staff College, September 3, 1976.
- Boyd, John R., The Essence of Winning and Losing, June 28, 1995 (a five-slide set by Boyd).
- Greene, Robert, “OODA and You”.
- Hillaker, Harry, “John Boyd, USAF Retired, Father of the F-16”, Code One magazine, July 1997.
- Kotnour, Jim, “Leadership Mechanisms for Enabling Learning Within Project Teams”, in Proceedings of the Third European Conference on Organizational Knowledge, Learning and Capabilities (OKLC 2002).
- Linger, Henry, Constructing the Infrastructure for the Knowledge Economy: Methods and Tools, Theory and Practice, p. 449.
- Metayer, Estelle, “Decision making: It’s all about taking off – and landing safely…”, Competia, December 2011.
- Osinga, Frans, Science, Strategy and War: The Strategic Theory of John Boyd, Abingdon, UK: Routledge, ISBN 0-415-37103-1.
- Richards, Chet, Certain to Win: The Strategy of John Boyd, Applied to Business (2004), ISBN 1-4134-5377-5.
- Ullman, David G., “‘OO-OO-OO!’ The Sound of a Broken OODA Loop”, CrossTalk, April 2007.
- Ullman, David G., Making Robust Decisions: Decision Management for Technical, Business, and Service Teams, Victoria: Trafford, ISBN 1-4251-0956-X.