“The weather ahead looks worse than forecast. Should I turn around and go back?”
“Those two aircraft look like they’re going to be a potential conflict. Do I need to pass traffic information?”
“This airline passenger is behaving erratically. Do I need to intervene?”
“We have a sick passenger on board. Do we need to divert?”
Decisions, decisions, and more decisions. Everyone who participates in aviation faces a multitude of safety-related decisions day to day. Sometimes there’s only one course of action, but when there are multiple options, we need to make a decision.
If aviation decisions were as easy as deciding what to have for breakfast, aviation decision-making would be straightforward. But instead, we’re often faced with a complex range of factors to weigh up.
For more in-depth information about the processes involved in decision-making, see:
Decision-making on Skybrary
But if you want a bit more of a summary, then keep reading.
Decision-making falls into one of three categories – strategic, tactical, or operational.
This kind of decision-making is for longer-term outcomes or high-level decisions, for example those that determine the direction of an organisation, and it requires a lot of forethought. CEOs and senior managers will be heavily involved in strategic decision-making as they guide their organisation.
This kind of decision-making relates to immediate action-taking, focusing mainly on doing the work in the here-and-now. Pilots, air traffic controllers, engineers, cabin crew, and aviation security officers make a lot of tactical decisions while carrying out their tasks.
This kind of decision-making takes both strategic and tactical goals and puts them into practice. Operational managers who deal with the day-to-day activities associated with aviation will make operational types of decisions.
Situational awareness is a key factor in making effective decisions. We need to clearly understand what’s going on around us and what’s likely to happen in the future, so that we can choose an appropriate plan of action.
Read more about situational awareness
In a complex and dynamic environment such as aviation, it’s not surprising that decision-making can be tricky at times, and we don’t always get it right.
When making decisions or judgements, humans use what are called ‘heuristics’. You might know these by the more common name ‘rules of thumb’. These are shortcuts our brains take to make information-processing more efficient. Being able to short-cut information processing is an amazing feature of our brains, but the downside is the potential for these rules of thumb to lead to biases.
We have a lot of information coming in constantly from our environment, and our brains cannot possibly take the time to carefully process each piece of information that comes our way. So, because searching for information or evidence takes time and energy, our brain looks for these mental shortcuts to make information processing and decision-making more efficient.
Everyone’s prone to bias, and it’s difficult to prevent and often difficult even to identify. The best way for us all to reduce the chances of our decisions being affected by bias is through awareness. If you know and understand what can influence you, then you’re more likely to identify and correct it when it happens to you.
Below are some of the common biases that can affect our decision-making and judgement, with examples.
You’re flying your light aircraft towards your home airfield after a weekend away. The weather forecast is a bit marginal but is meant to be above minima the whole way, so you figure you should be fine.
As your flight progresses, the cloud seems a bit lower than perhaps the forecast suggested. Is it going to be okay to get all the way home? The cloud base has been lower than anticipated, with a few patches right on minima. The forecast didn’t indicate it would be that low though, so you figure it will probably improve as you continue. There were a few patches of blue sky just behind you. There’ll probably be some more ahead… and those skies ahead do look lighter.
This is confirmation bias. When confronted with unusual situations or factors that we didn’t expect, we tend to look for the information that confirms our expectations, beliefs or ideas, instead of giving our attention to the clues that suggest we need a new plan.
In the example above, the pilot put more weight on the factors telling them the weather was going to be okay ahead – the forecast, the previous patches of blue, and the lighter-looking sky. They dismissed the fact that they’d just experienced low cloud right on minima.
Plan continuation bias is an unconscious bias where we continue with a plan or decision to preserve the money, time, or effort that we’ve invested in the flight. And the closer we get to where we’re going, the stronger the bias is.
Plan continuation bias is one form of confirmation bias and is a common factor in fatal accidents, particularly those involving controlled flight into terrain (CFIT). Pilots will know this bias by its more common name, ‘get-there-itis’, and, if we’re honest with ourselves, we can all probably recall a flight where we may have experienced it. Perhaps we had an important family event to get to, or we were picking someone up and didn’t want to let them down. Whatever the reason, it’s very easy to fall into the trap of just wanting to get where we’re going, even when red flags are popping up telling us that a change of plan might be in order.
Several fatal accidents have occurred in New Zealand where plan continuation bias has been a contributing factor.
Fatal accident examples of plan continuation bias
CAA occurrence 22/4516 [PDF 2.3 MB]
Cessna 182H Skylane
ZK-MGB
Collision with terrain
McCoy Glacier, Froude Range, Southern Alps
04 August 2022
CAA occurrence 19/4241 [PDF 1.1 MB]
Van’s Aircraft Incorporated RV-12
ZK-LSV
Controlled flight into terrain
Kakatarahae Hill, Coromandel Range
14 June 2019
Here's a personal account from Vector magazine about the loss experienced by the family of the pilot in this accident:
CAA occurrence 13/5710 [PDF 1.7 MB]
Hughes Tool Company H269B
ZK-ING
Collision with terrain
Haast Pass, Mount Aspiring National Park
11 April 2014
Combating plan continuation bias
Have you ever expected a certain result, but what actually happened wasn’t what you expected at all? You wouldn’t be the first. Expectation bias is exactly that: expecting something to happen in a certain way. This often occurs when tasks are repetitive or routine.
Expectation bias is another form of confirmation bias. When faced with uncertainty, our brain tends to rely on what it knows from experience to process information and make decisions quickly. This happens because our brains have evolved to recognise patterns to make sense of what’s going on around us. That helps cut down on the amount of processing our brains need to do, but it also means they sometimes take shortcuts. When reality doesn’t match what we expected, our brain may still see or hear the expected thing rather than the reality, which can result in errors.
Expectation bias has been a factor in some fatal accidents in New Zealand. You can read about one here:
CAA occurrence 22/2536 [PDF 714 KB]
Piper PA-25-235 Pawnee
ZK-CIG
Tow upset during glider aerotow
Feilding Aerodrome, Feilding
07 May 2022
‘We’ve done it this way before… I know what will happen.’
Outcome bias is similar to expectation bias and is the tendency to assess a pending decision on an outcome that we may have seen or experienced before in a similar situation, rather than assessing the quality of the decision at the time we’re making it.
You may have heard of the term ‘normalisation of deviance’. It’s the name for a fairly common problem where we tend to ignore the hazards, risks, warnings or contradictory evidence, or not follow standard operating procedures. This behaviour becomes ‘normalised’ because, well, it’s always worked in the past, so why would it be different this time?
You can read more about the ‘normalisation of deviance’ concept in previous Vector articles:
Vector magazine: Normalisation of deviance [PDF 164 KB]
Vector Online: Drifting away from safety
Anchoring bias is a tendency to put a lot of weight on the very first piece of information we hear, which then influences how we assess the situation afterwards.
Some studies have shown that if a pilot is initially told the weather is good, they’ll tend to view the conditions as good for flying. If they’re told the weather is bad, they’ll tend to view the conditions as not good for flying – despite the conditions being exactly the same in both cases.
This bias can lead us to make estimates, judgements and decisions based on that piece of anchoring information, using it as an arbitrary benchmark for all subsequent information we receive.
The weather is deteriorating ahead of you on your flight planned route. It’s worse than forecast. This is disappointing. Your plans at the destination will be scuppered if you don’t get there, and you don’t want to let anyone down. You decide to press on and see what it’s like further ahead.
OR . . .
The weather is deteriorating ahead of you on your flight planned route. It’s worse than forecast. While this is disappointing, it would be safer for you to divert and assure your safety rather than press on into worsening weather. The people you were going to meet will understand. You decide to divert and land at a nearby airfield.
Framing bias is the tendency to respond differently to a situation depending on the way information is presented either to, or by, the decision-maker. In the first example, not making it to the destination is framed as wrecking the pilot’s plans and letting people down. The pilot decides to continue because they perceive a diversion as a ‘loss’. Riskier behaviour is more likely when a decision is framed negatively.
On the other hand, when the pilot framed the decision to divert as having a positive outcome – landing safely and living to tell the story – then the decision is perceived as a ‘gain’. When a decision is framed positively, safer behaviours are more likely.
If we want to make good decisions, having enough time to consider and evaluate the options would obviously be ideal. But the aviation environment doesn’t always present us with an ideal situation. We can’t just pull over to the side of the road like we might in our car. An air traffic controller can’t put their radar screen on pause and have an extended think about what they need to do next.
In a constantly changing environment, sometimes our decisions are made under time pressure. This means we don’t always perceive or understand all the available information, and we don’t always effectively evaluate the available options before making a decision.
Read a Vector article with an interesting personal account of decision-making under time pressure, with some suggestions for managing this issue:
Fatigue and stress can have a significant effect on decision-making. Fatigue can arise from a poor sleep, or several successive poor sleeps. It can occur because we work a late or night shift and we’re awake when our bodies are meant to be asleep. When working through the night, the catch-up sleep we get during the day is generally not of the same quality as the sleep we get at night. Fatigue can also arise from ongoing stress in our work or personal lives.
Fatigue affects different parts of the brain in different ways. Guess which part is badly affected by fatigue? The part of our brain that makes decisions – the prefrontal cortex. So it’s not surprising to find that our decision-making abilities become much poorer when we’re fatigued. Even simple decision-making can be harder, but the kinds of complex decisions we might have to make in the uncertain environment of aviation are at high risk of error if we’re fatigued.
If you’re a recreational pilot, you have considerable choice about when to go flying. You need to be aware of how fatigue can affect you, and follow the I'M SAFE principles when deciding whether to fly.
Front-line staff in a commercial aviation environment may have less choice, and are often rostered at times that are less than optimal for humans. This is why aviation operators are required to have fatigue risk management plans in place, to mitigate the serious risks that fatigue can pose to aviation safety. All professionals in an aviation environment also need to actively consider the I'M SAFE principles when deciding whether they’re fit to go to work.
Fatigue can have serious implications for aviation safety, and because it’s so important, we have more information here:
Distraction is when our attention is drawn away from a task. A key skill in roles such as piloting, ATC, and engineering is to manage several tasks concurrently. It’s part of life when you operate in the aviation environment.
Distractions and interruptions can have serious consequences for flight safety if they occur at critical times. For pilots, this might be during critical phases of flight such as take-off and landing. For air traffic controllers, this might be during times of peak traffic. And for engineers, when they’re focused on a complex maintenance task.
Distractions and interruptions can cause us to lose focus, which can lead to decision-making errors that could have catastrophic consequences.
Sometimes distractions are simple attention-grabbing events or annoyances that pull our attention away from critical tasks. But at other times, the distracting event might actually provide us with information critical for safe operations. So we can’t just declare the solution to be ‘ignore all sources of distraction’. The key to preventing distraction from negatively affecting our decision-making is to manage the distractions effectively.
Here's a Vector article with suggestions for managing distractions and interruptions:
Vector magazine: Go back three steps [PDF 84 KB]
Whether you’re a recreational pilot joining at a busy, unfamiliar aerodrome, an air traffic controller with multiple circuit aircraft plus some IFR arrivals and departures, an airline flight crew managing an unanticipated go-around, or an engineer with a heavy taskload for the day, you may find yourself experiencing a heavy mental workload. When we’re under a high workload, our capacity to handle an additional task may become somewhat limited because we’re already using up all our mental resources.
When our workload is very high, our performance can decrease. Effective decision-making becomes more difficult and decision-making errors can creep in.
The best mitigation for workload-induced decision errors is to prepare for high workload periods as much, and as soon, as possible. Anticipate possible challenges – clearance changes for example – and manage distractions.
Low workload can also be a problem for decision-making. Our brain needs a certain level of stimulation, otherwise it gets a bit bored and stops noticing things going on around us. Essentially, we become ‘out of the loop’.
What’s worse, if our brain is suddenly required to move from boredom to dealing with an emergency within a few seconds, that can really cause us a lot of problems because it takes time to figure out what’s going on and decide what to do about it. This time could mean the difference between life and death.
If you’re in a low mental workload situation, such as a pilot in the middle of a long-haul flight, or an air traffic controller with a low volume of traffic, it’s worth taking steps to keep your brain active, such as making sure there’s variation in your tasks, setting goals, taking breaks, and breaking up routine, monotonous tasks.
Decision-making in groups has both positive and negative potential outcomes, and the kind of group dynamic that works best depends on the situation. Group decision-making can result in a far more robust decision that has been ‘tested’ on its merits through discussion and evaluation.
But there are several downsides to group decision-making. Groups are made up of multiple personality types. Some are leaders, some followers. Some hold positions of power, some don’t. Some are assertive, some less so. These dynamics influence how the group will make decisions.
One of the most well-known problems with group decision-making is the trap of ‘groupthink’. Groupthink can often lead to poor decisions for various reasons – people may ignore or silence opposing viewpoints, they may overlook potential dangers or take excessive risks, and the desire for group cohesion may win out over robust discussion.
It’s important to be aware of groupthink so you can recognise it when you see it. This video about the Challenger space shuttle disaster demonstrates, in three short minutes, what groupthink is.
You’re heading out in your light aircraft for a local joyride with some friends on board. It’s a beautiful day for flying and your friends are excited to be coming along with you. Shortly after you’ve taken off, at just a few hundred feet, the engine suddenly goes very quiet.
Or perhaps you’re a radar controller skilfully coordinating the flow of air traffic through your designated piece of airspace, when suddenly the short-term conflict alert goes off. But how can that be? You had all the separations in place… didn’t you?
When something comes out of ‘left field’, it can cause immediate stress, anxiety, and confusion that can lead to delayed or incorrect actions and decisions. This is known as the ‘startle’ response. Pilots and air traffic controllers train for this, but when faced with a real-life situation, the shock can cause our brain to temporarily ‘freeze’, making it harder to make correct decisions and compromising safety. Depending on the degree of startle, it can take several seconds, and in some cases up to a minute, before we can overcome its effects.
Most of us probably know about the Airbus landing on the Hudson River after a bird strike caused both engines to fail. It took several seconds for the experienced flight crew to fully comprehend what had happened, leaving them only a handful of seconds to decide how to respond.
“The sudden loss of thrust was shocking. The startle factor was huge, and we began looking for a place to land over Manhattan, one of the most heavily developed areas on the planet.”
- Chesley Sullenberger
You can listen to Captain Chesley Sullenberger’s account of what happened, his comments on the startle factor, and how this accident affected the crew, both at the time and after, by watching this short video.
The successful river landing resulted in no lives lost, but that’s not always the case. Startle has been a factor in several fatal accidents in New Zealand over the years.
It doesn’t matter whether you’re new to aviation or a seasoned operator, all humans can be affected by the startle response when something unexpected happens.
We can’t simulate a genuine startle response very easily in training, so how do we prepare ourselves should we ever be faced with this situation?
Airline pilots, like Sully, have simulators to repeatedly practise their emergency scenarios. But recreational pilots and commercial pilots of smaller aircraft don’t usually have this luxury. Being prepared is vital to give yourself the best chance of responding to an emergency in the correct way.
Do some ‘chair flying’, role play with an instructor, rehearse your emergency response checklists, and keep current. The more you can practise on the ground, the more likely it is you’ll be able to push through the startle response to successfully manage an emergency.
Here's a Vector article on startle that includes additional suggestions for preventing and managing startle:
Vector magazine: Startle [PDF 191 KB]
CAA occurrence 21/5661 [PDF 964 KB]
Baby Great Lakes
ZK-ULM
Departure from controlled flight
Benmore Station, near Omarama
25 October 2021
CAA occurrence 20/4164 [PDF 674 KB]
Sonex
ZK-NAF
Departure from controlled flight during emergency landing
Ōtaki Aerodrome
17 August 2020
Decision-making errors have led to a number of fatal accidents in New Zealand. Here's a Vector article about a Cessna 185B accident near Wānaka in 2015, followed by three other fatal accident examples.
Vector magazine: Inexplicable [PDF 89 KB]
CAA occurrence 21/529 [PDF 1.6 MB]
Cessna 172G
ZK-COM
Controlled flight into terrain
Upper Waikaia Valley, Southland
03 February 2021
CAA occurrence 19/6687 [PDF 2.2 MB]
Tecnam P2002 Sierra RG
ZK-SGO
Collision with terrain
Tararua Range, 8nm west of Eketahuna
29 September 2019
CAA occurrence 17/7309 [PDF 3.5 MB]
Schempp-Hirth Discus-2C Glider
ZK-GXG
Departure from controlled flight
Huxley Range, Central Otago
21 November 2017