“This is a very powerful psychological tendency. It’s not quite as powerful as some people think, and I’ll get to that later.” –Charlie Munger
This article is part of a multi-part series on human misjudgment by Phil Ordway, managing principal of Anabatic Investment Partners.
Munger cites the Milgram experiment, in which an academic posing as an authority figure convinced ordinary citizens to administer what they believed were painful, dangerous electric shocks. He later adds that it wasn’t just over-influence by authority at work but also the contrast principle (the shocks were ramped up in small increments) and the use of “why,” even though the explanation given was false.
Munger also noted that in simulator studies, co-pilots allowed an intentionally errant pilot to “crash” the plane 25% of the time.
“Man is often destined to suffer greatly when the leader is wrong or when his leader’s ideas don’t get through properly in the bustle of life and are misunderstood.”
“In World War II, a new pilot for a general, who sat beside him in the co-pilot’s seat, was so anxious to please his boss that he misinterpreted some minor shift in the general’s position as a direction to do some foolish thing. The pilot crashed the plane…
“Cases like this one get the attention of careful thinkers like Boss Buffett, who always acts like an over quiet mouse around his pilots.”
Pilots and co-pilots: “There is an old saying in aviation that the reasons you get into trouble become the reasons you don’t get out of it.” Consider three examples: Air France 447 (a loss of airspeed indicators for one minute and 17 seconds, ending in the loss of 228 lives), Asiana 214 (simple pilot error resulting in three fatalities), and US Airways 1549 (the complete loss of both engines resulting in zero fatalities). In the cases of both AF447 and US1549, the crew had assessed the problem within 10-11 seconds – about as quickly as reasonably possible – but from there the responses diverged dramatically. In all three cases, a critical factor was over-influence by authority (among others).
Air France 447 crashed in the Atlantic due to a cascading series of pilot errors and miscommunications, all of which were likely compounded by cockpit culture. “If [the pilots] had done nothing, they would have done all they needed to do” and there would have been no tragedy. Pierre-Cédric Bonin, the Pilot Flying, was the co-pilot (first officer). He was 32 and had relatively low-quality experience. Marc Dubois, the Pilot Not Flying, was the captain. He was 57 and had more than 11,000 hours of experience, much of it high-quality, but he had slept only one hour the previous night and seemed distracted. They addressed each other with “tu,” as is typical, but the co-pilot Bonin “was almost too deferential, and perhaps too aware of rank.”
On Asiana 214, pilot error during the final approach was compounded by poor communication and role confusion despite perfect flying conditions. The NTSB found that “the flight crew mismanaged the airplane’s vertical profile during the initial approach…leading to a period of increased workload that reduced the pilot monitoring’s awareness of the pilot flying’s actions. About 200 ft, one or more flight crewmembers became aware of the low airspeed and low path conditions, but the flight crew did not initiate a go-around until the airplane was below 100 ft, at which point the airplane did not have the performance capability to accomplish a go-around. The flight crew was experiencing fatigue, which likely degraded their performance during the approach. Nonstandard communication and coordination between the pilot flying and the pilot monitoring when making selections…resulted, at least in part, from role confusion and subsequently degraded their awareness of AFDS and A/T modes. Insufficient flight crew monitoring of airspeed indications during the approach likely resulted from expectancy, increased workload, fatigue, and automation reliance. The delayed initiation of a go-around by the pilot flying and the pilot monitoring after they became aware of the airplane’s low path and airspeed likely resulted from a combination of surprise, nonstandard communication, and role confusion.” Having just crash-landed in spectacular fashion, the pilots then instructed the crew not to evacuate given their ongoing communications with the tower. A flight attendant reported a major fire, but the order to evacuate took a further 90 seconds. The cabin manager, Lee Yoon-hye, was the last person off the burning plane; the San Francisco fire chief said, “she was a hero.” As an aside, six people were ejected from the aircraft when it hit the seawall and the tail broke off. Four of them were flight attendants who were properly restrained and survived the crash despite being ejected.
Two of the three fatalities were ejected passengers who were not wearing seatbelts – they “would likely have remained in the cabin and survived if they had been wearing them.”
In response to the crash, many pilots chimed in on a popular message board (pprune.org):
“…if I flew with a new F/O, straight out of Line Training, once I’d completed my departure brief I would say: ‘There’s one important thing I need to add which is this. The reason I’m in this left-hand seat is because I’ve been doing this job longer than you. It doesn’t mean that I’m incapable of making mistakes. So if you see or hear anything which you don’t understand or appears to be not right, please speak up and tell me.’”
“I don’t like to use the work [sic] ‘Rank’ as it has caused many cockpit issues with the PNF being “Barked” at by the PF for making any remarks/suggestions/observations.”
“The sad thing is the guy who speaks up will maybe avoid a disaster , so you wont [sic] read about it , but mysteriously will fail his next medical and be looking for a job.”
“We could all very easily say ‘idiots….I wouldn’t do that’ and walk away. But the reality is (most likely) that the pilots were not idiots and that they had good intentions and were trying very hard to do a good job. If they were put in a different environment they would most likely be as capable and competent as the next airline pilot. So what do we need to change about the environment they were operating in? If you can answer that question you actually make an impact on flight safety…”
On US Airways 1549, the Captain (the now famous “Sully” Sullenberger) and First Officer were not immune from distractions. On the ground – during engine startup, taxi, etc. – they discussed the horrific state of the industry/economy, wondering if pilots at other airlines had it any better. They had flown together before but were in no way unusual in that regard. And when disaster struck, it took them approximately 11 seconds to declare and confirm the Captain’s sole control of the aircraft and for the First Officer to begin (at the Captain’s instruction) to consult the Quick Reference Handbook for loss of thrust on both engines. Within 30 seconds the Captain called mayday and declared his intention to return to LGA. Less than 60 seconds into the incident they were already working down the list of checklist items while considering a range of options in technical terms and communicating with the ground. In the first minute the Captain also stated the impracticality of returning to LGA and the possibility of ditching in the Hudson. The First Officer (FO) continued to handle some communications while searching for ideas. Less than two minutes after the bird strike, and with almost a minute and a half of warning, the Captain told the passengers and crew to brace for impact. After approximately 120 seconds Air Traffic Control (ATC) was still trying to direct the pilots to a landing at Teterboro, NJ, but the Captain replied, “We’re gonna be in the Hudson.” ATC: “I’m sorry say again…?” The Captain ignored the noise from ATC and the automated warning systems and focused on flying the plane. He also maintained a clear chain of command, communicating with the FO about restarting either engine or finding a technical solution. Approximately two minutes and 32 seconds into the crisis, the Captain declared, “Ok let’s go put the flaps out.”
At this point, the Captain had misstated the call sign, ATC had misstated the call sign, ATC was generally in disbelief, and the plane was rapidly descending toward the river. But not once was there a single instance of anything other than calm, professional, unemotional communication and decision-making. As the emergency warnings blared (“Terrain! Pull up!”) the Captain remained calm and asked the FO, “Got any ideas?” The FO responded, “Actually not.” Seconds later, the Captain successfully landed in the Hudson at a speed of approximately 130 knots / 240 km/h / 150 mph. “According to the flight attendants, the evacuation was relatively orderly and timely.” Following the evacuation, the Captain and First Officer inspected the cabin to ensure that no more passengers or crewmembers were on board; the pilots were the last people to leave the aircraft.
And the issue of automation raises another, related issue that Munger identified: attenuation of skill from disuse. “Since [‘fourth generation’ airplanes’] introduction, the accident rate has plummeted to such a degree that some investigators at the National Transportation Safety Board have recently retired early for lack of activity in the field. There is simply no arguing with the success of the automation. The designers behind it are among the greatest unheralded heroes of our time. Still, accidents continue to happen, and many of them are now caused by confusion in the interface between the pilot and a semi-robotic machine. Specialists have sounded the warnings about this for years: automation complexity comes with side effects that are often unintended. One of the cautionary voices was that of a beloved engineer named Earl Wiener, recently deceased, who taught at the University of Miami. Wiener is known for ‘Wiener’s Laws,’ a short list that he wrote in the 1980s. Among them:
- Every device creates its own opportunity for human error.
- Exotic devices create exotic problems.
- Digital devices tune out small errors while creating opportunities for large errors.
- Invention is the mother of necessity.
- Some problems have no solution.
- It takes an airplane to bring out the worst in a pilot.
- Whenever you solve a problem, you usually create one. You can only hope that the one you created is less critical than the one you eliminated.
- You can never be too rich or too thin (Duchess of Windsor) or too careful about what you put into a digital flight-guidance system (Wiener).
“Wiener pointed out that the effect of automation is to reduce the cockpit workload when the workload is low and to increase it when the workload is high. Nadine Sarter, an industrial engineer at the University of Michigan, and one of the pre-eminent researchers in the field, made the same point to me in a different way: ‘Look, as automation level goes up, the help provided goes up, workload is lowered, and all the expected benefits are achieved. But then if the automation in some way fails, there is a significant price to pay. We need to think about whether there is a level where you get considerable benefits from the automation but if something goes wrong the pilot can still handle it.’”
The dynamic between analysts and portfolio managers, or bosses and subordinates of any kind, often displays over-influence by authority. Throw in some incentive-caused bias, some reciprocation, some liking/disliking tendency, and before long some terrible decisions are made.
In investing the world is rife with authority-influenced decisions. For decades the NRSROs (Moody’s, S&P, Fitch) carried an imprimatur – or at least represented a critical piece of the plumbing – in the financial world that was impossible to replicate. I remember hearing on at least a half dozen occasions in 2007 and 2008 that there was just no way Moody’s would slap a triple-A rating on something that could default. And these were brilliant, high-powered people making that error.
How long would Madoff have been able to go on without over-influence by authority? There was a load of bias from his feeder funds’ and helpers’ incentives, along with waves of social proof. But his chairmanship of Nasdaq, his enormous size in the market, and his name recognition all conveyed a simple air of authority. All of these factors led many people to look past – or choose not to see – the obvious evidence: that nothing goes up and to the right at all times, that there wasn’t nearly enough liquidity in the market to do the trades he claimed, that his so-called auditor barely existed, and that nobody bothered to confirm cash balances.
“Cloning” ideas from other investors is also an area of interest in this regard. How much money has been lost because someone’s judgment was short-circuited (or because an otherwise robust investment process was abridged) on the evidence that some other legitimately brilliant and successful investor already owned the security in question? Cloning could work if done with no emotion, with no other psychological tendencies at play. But that’s clearly not the case, and if the over-influence by authority that is an inherent risk of cloning can’t be overcome, it may well produce inferior results.
One trick that I’ve seen used repeatedly by authority figures is the use of complexity. I immediately get uncomfortable if someone can’t explain to me in three sentences what he or she does for a living, or why an investment makes sense. It’s also a red flag when someone defaults to flowery language or unnecessary jargon. I’ve never seen definitive numbers, but there has to be a connection between success and the ability of investment professionals or management teams to speak in clear, concise language. Conversely, it might be worth avoiding people who automatically slip into dense, nonsensical drivel. When “consultant-speak” or “banker talk” becomes the default mode of communication it can muddy the actual thought process as well.
Selling negativity also works. It’s not just sex that sells – negativity, pessimism, and doomsday warnings also get a disproportionate share of attention. Look at all the smart-sounding authorities pitching their wares, or just their opinions, on public platforms. They take on an air of authority because they use impenetrable jargon and convey exceptional seriousness. It wouldn’t be as vivid, or as entertaining, for someone to get on TV and explain that things are likely to slowly improve.
Prestigious university degrees and professional credentials are another source of mis-influence by authority. Many investors and business leaders get a pass based on their superficial profile, but we all know that the real world doesn’t work that way. We’ve all seen people who are absolutely brilliant and far more effective than their peers in a given field but went to a no-name school, or no school at all. Likewise, we’ve all seen people who have sterling résumés but couldn’t handle the task of managing a sock drawer.
I had an acquaintance who, after roughly a decade in the industry and with a prestigious undergraduate degree in business, a world-class MBA, and a CFA, asked me about the difference between “shareholder equity” and “book value” and between “tangible shareholder equity” and “tangible book value.” Likewise, in my first job out of college there were about 20 of us in a two-month full-time training session on accounting and finance. At least half had earned degrees in accounting or finance from the best undergrad programs in the country, but when it came time to apply that material they got smoked by a history major, a math major, and a psychology major, none of whom had studied accounting or finance in their lives.
About The Author: Philip Ordway
Philip Ordway is Principal and Portfolio Manager of Anabatic Fund, L.P. Previously, Philip was a partner at Chicago Fundamental Investment Partners (CFIP). At CFIP, which he joined in 2007, Philip was responsible for investments across the capital structure in various industries. Prior to joining Chicago Fundamental Investment Partners, Philip was an analyst in structured corporate finance with Citigroup Global Markets, Inc. from 2002 to 2005, where he was part of a team responsible for identifying financing solutions for companies initially in the global power and utilities group and ultimately in the global autos and industrials group. Philip earned his M.B.A. from the Kellogg School of Management at Northwestern University in 2007 and his B.S. in Education & Social Policy and Economics from Northwestern University in 2002.