
Combating human error

Why did I do that—again?

By Ben Berman

I push the throttles up and the Cessna Citation lunges forward. Runway stripes flash under the nose, and at the right instant I point the nose up at a clear blue sky. We smoothly rise as the ground falls away.

Photo by Chris Rose

A few seconds later, we’re climbing through 400 feet. Airspeed is plentiful. The tower tells me about traffic off the parallel runway. I look for it and then accept the invitation to contact departure control. Climbing through 5,000 feet, it’s time for the after-takeoff checklist. Gear is up, lights are out. Flaps? Oh boy, they’re still at the takeoff setting. I pop the selector up. We’re going 190 knots, so I just barely didn’t overspeed the flaps. Freed of that drag, the airplane lunges forward again. I normally raise the flaps at 400 feet. Why not today?

Hindsight is 20/20. Thinking back on the takeoff and climb, I was distracted by the tower’s traffic advisory. I could have, would have, should have retracted the flaps. But let’s use some foresight and try to figure out how to keep this from happening again, so I don’t go on and overspeed the flaps someday. Sadly, I can’t reliably prevent this kind of mistake. The human brain, so capable that it allows us to fly airplanes, also fails us in a few characteristic ways. We can’t eliminate these built-in vulnerabilities of our brain’s processing, but we can fight back.

Multitasking and distraction

Pilots are excellent multitaskers, right? In truth, none of us can multitask at all. Instead, if I could monitor my brain’s workings while I flew, I would see my mind rapidly switching from one thing to the next, never really doing two things at once.

This is why we get distracted. If the brain is focused on the control tower or traffic outside at the moment we climb through 400 feet, the cue to raise the flaps is gone a split second later when we focus back on the airplane.

Sometimes the system sets us up for distraction. For their own procedural reasons, the tower is going to switch us to departure right after takeoff. And centers set up some of their sector boundaries by altitudes, so they’re likely to give us a new frequency just as we need to level off. 

It’s another day, this one not-so-beautiful. My destination is straight ahead and there are thunderstorms in the area. The Cirrus has a great weather radar display, and it’s showing a red cell northwest of the field. The airport is good VFR with calm winds. I have it in sight with a clear shot to Runway 5. Good to go! I turn final, and I keep the speed up just in case.

Checking the storm on the radar, I can see that the red is showing patches of yellow. It looks like the storm is dissipating. That’s an encouraging trend. Just to be sure, I put the radar image in motion. The storm is moving southeast, it seems to be moving slowly, and I can clearly see some red turning to yellow, especially at the edges. This is fine news, so I keep on.

Now I’m at 600 feet, on profile. The tower transmits, “Cirrus 52X, wind three three zero at 20 gusting 30.” As I cross the threshold, the ride gets really wild. I have to throw in full sidestick left and up to touch down in the correct attitude. Unloading my luggage from the airplane a few minutes later, I get caught in the downpour. How could I have misread those signs?

Confirmation bias

A cognitive psychologist, comfortably seated in a coffee shop that’s not bouncing around and trying to touch the ground nose-low and wing-down, would recognize the seeds of my storm encounter as confirmation bias. I looked at the storm to the northwest and had the airport in sight, and I decided the storm was not a threat. That became my interpretation of reality.

Next, I turned to the weather radar display. What stuck out to me was the slow cell movement and its dissipating intensity, and that made me feel good about continuing. I thought I was being extra careful, checking the radar. Without realizing it, though, I was also looking for reasons to keep believing it was correct to continue the approach. The psychologist would say that it’s human nature for all of us, including pilots, to seek out information that confirms what we already believe. 

I then got the tower’s report of gusty wind conditions. This didn’t trigger the thought that the wind was part of the thunderstorm. I didn’t put together the new reality that the thunderstorm had arrived at the field before I had. In confirmation bias, we also tend to downplay, or not even recognize, information that contradicts our current interpretation of the situation. This is one of the reasons why, once we’ve made up our minds about something, it’s awfully hard to dislodge what we’ve been thinking and change.

This time the thunderstorms are en route, in a solid line across my course. I’m looking at the weather radar display and trying to figure out how to get through, or whether to turn around. There’s a hole about 30 degrees left. It’s a little tight and downwind of one of the storms. I decide to give it a try. The ride is a little bumpy, with some moderate rain, but after a few minutes I break into bright sunshine. Winner!

A couple of months later, here’s another line of storms to cross or avoid. Once again, there is a hole. The radar looks okay, so I decide to go through. This time, though, the ride downwind of the displayed cell is terrible, and there are strong updrafts and downdrafts. I’m really sorry to be in there. How did I get there?

Normalized deviance

Cutting close to thunderstorms and relying on radar composites that might be 10 to 20 minutes old when displayed in the aircraft is taking a risk. In 15 minutes, a cell moving at just 25 knots covers more than six nautical miles, so the gap on the display may already have closed.

You can take a risk and not pay the price—most of the time the hole in the weather will be wide enough, the storm not moving or developing quickly enough.

It’s human nature. Every time we take a risk and get away with it, our risk/reward matrix tilts toward taking that risk again. This is called normalized deviance, because bit-by-bit what was deviant can become our new normal. If it worked last time, we’re more likely to do it again. It only works in reverse if we scare ourselves badly enough to tilt our internal matrix the other way.

Without our awareness, the bias can develop gradually over time. It’s a creeping reduction of safety margins, until one day there is no margin left at all.

These are just a few of the reasons why, as pilots who are human beings, we make mistakes. Because we can’t eliminate our mistakes, we have to focus on catching them when we can and avoiding their circumstances to the max—whether it’s changing our routines to limit distractions, working hard to be open to a new interpretation of reality, or stopping ourselves from going down a slippery slope to normalizing flying that can bite.

Ben Berman is a former airline captain, NTSB accident investigator, and NASA human factors researcher.
