This article is part of a multi-part series on human misjudgment by Phil Ordway, managing principal of Anabatic Investment Partners.


Recommended Reading

  • Thinking, Fast and Slow by Daniel Kahneman
  • Any/all of the various papers published by Kahneman and Tversky
  • “The Marvels and the Flaws of Intuitive Thinking: Edge Master Class 2011” with Daniel Kahneman[74]
  • Poor Charlie’s Almanack by Charlie Munger, edited by Peter Kaufman
  • The various books and articles by Jason Zweig, especially Your Money and Your Brain
  • “The Base Rate Book” by Michael Mauboussin
  • The Smartest Guys in the Room by Bethany McLean and Peter Elkind
  • Superforecasting by Phil Tetlock
  • Influence by Robert Cialdini
  • Pre-suasion by Robert Cialdini
  • “How Wells Fargo’s Cutthroat Corporate Culture Allegedly Drove Bankers to Fraud” by Bethany McLean
  • How We Know What Isn’t So by Thomas Gilovich
  • “The Human Factor” by William Langewiesche (Vanity Fair, October 2014)
  • “Investing in the Unknown and Unknowable” by Richard Zeckhauser
  • Fooled by Randomness by Nassim Taleb
  • Seeking Wisdom by Peter Bevelin
  • The Most Important Thing Illuminated by Howard Marks
  • The Undoing Project by Michael Lewis
  • Predictably Irrational by Dan Ariely
  • The various books and articles by Atul Gawande, especially The Checklist Manifesto
  • Against the Gods by Peter Bernstein
  • When Genius Failed by Roger Lowenstein

Potential Sources of Other Examples of Human Misjudgment

(These are meant solely as potential ideas for future case studies or as examples of the tendencies, both good and bad, mentioned above)

  • Puerto Rico
  • NRSROs – Moody’s and S&P in the GFC
  • Valeant
  • Theranos
  • Arthur Andersen
  • WorldCom
  • Adelphia
  • Global Crossing
  • Martha Stewart – ImClone insider trading scandal
  • Fannie/Freddie
  • HealthSouth
  • Workers’ comp fraud
  • Tyco
  • Refco
  • Parmalat
  • Bear Stearns
  • Lehman Brothers
  • Northern Rock
  • AIG
  • Anglo Irish Bank
  • Stanford Financial
  • Livestrong and Lance Armstrong
  • The “Libor rigging” scandal
  • Options back-dating
  • Volkswagen emissions scandal
  • Barings Bank and Nick Leeson
  • Societe Generale and Jerome Kerviel
  • Bruno Iksil, “The London Whale”
  • First Bank of Oak Park
  • Corus Bank
  • Ikea
  • In-n-Out Burger
  • Waste Management
  • American Express
  • The AOL / Time Warner merger
  • The Ron Johnson strategy at J.C. Penney (JCP)
  • Eastman Kodak’s development of the digital camera in the 1970s
  • Raj Rajaratnam
  • Michael Milken

Munger and Belridge Oil

“In those days, Belridge was a pink-sheet company. It was very valuable. It had a huge oil field, it wasn’t even leased, they owned everything, they owned the land, they owned the oil field, everything. It had liquidating value way higher than the per share price — maybe three times. It was just an incredible oil field that was going to last a long time, and it had very interesting secondary and tertiary recovery possibilities and they owned the whole field to do whatever they wanted with it. That’s rare, too. Why in the hell did I turn down the second block of shares I was offered? Chalk it up to my head up a place where it shouldn’t be. So, that’s why I made that decision. It was crazy. So if any of you made any dumb decisions, you should feel very comfortable. You can survive a few. It was a mistake of omission, not commission, but it probably cost me $300 – $400 million. I just tell you that story to make you feel good about whatever investment mischances you’ve had in your own life. I never found a way of avoiding them all.”[75]

In a different telling, Munger adds some detail to the story. “A guy called me offering me 300 shares of [Belridge] Oil and I had the cash and I said, ‘Sure, I’ll take the listing.’ It was selling there [for] maybe a fifth of what the oil companies were. They owned the oil field. So I bought it. Then he called me back and said, ‘I’ve got 1,500 more.’ I didn’t have the money on hand. I had to sell something. I think about it and I said, ‘Hold it for 10 minutes and I’ll call you back.’ I thought about it for 10 minutes and called him back and didn’t buy it. Well, [Belridge] Oil sold for about 35 times the price I was going to pay within a year and a half. If I had made the different decision, the Mungers would be ahead by way more than a billion dollars, as I sit here now. To count the opportunity cost, it was a real bonehead decision. There was no risk. I could have borrowed. There wasn’t the slightest [risk] in borrowing money to buy [Belridge] Oil. The worst that would happen was I would get out with a small profit. It was a really dumb decision. You don’t get that many great opportunities in a lifetime. When life finally gave me one, I blew it. So I tell you that story to say you’re no different from me. You’re not going to get that many really good ones — don’t blow your opportunities. They’re not that common, the ones that are clearly recognizable with virtually no downside and big upsides. Don’t be too timid when you really have a cinch. Go at life with a little courage. There’s an old word commonly used in the south that I never hear anybody use now, except myself, and that’s gumption. I would say what you need is intelligence plus gumption.”[76]

Air France 447

Summary only, with emphasis added; please read the full article, “The Human Factor,” by William Langewiesche (Vanity Fair, October 2014).

“These were highly trained people, flying an immaculate wide-bodied Airbus A330 for one of the premier airlines of the world, an iconic company of which all of France is proud. Even today—with the flight recorders recovered from the sea floor, French technical reports in hand, and exhaustive inquests under way in French courts—it remains almost unimaginable that the airplane crashed. A small glitch took Flight 447 down, a brief loss of airspeed indications—the merest blip of an information problem during steady straight-and-level flight. It seems absurd, but the pilots were overwhelmed.”

Dubois was listening to opera on a headset, and insisted that Bonin have a listen too. Dubois later bungled a controller’s communication by answering to the wrong call sign; Bonin weakly noted the mistake but backed down when Dubois insisted. “Similar confusions arose over required reporting points and frequencies ahead, but Bonin did not intervene.”

Bonin repeatedly insisted on flying at RECMAX altitudes, where the plane would be operating close to aerodynamic stall, despite standard procedure dictating a lower altitude to afford a margin of safety.

Dubois was reading a magazine, barely engaged in small talk, and as Bonin became more nervous with the approaching thunderstorms, Dubois decided to take his break. “The chief French investigator, Alain Bouillard, later said to me, ‘If the captain had stayed in position through the Intertropical Convergence Zone, it would have delayed his sleep by no more than 15 minutes, and because of his experience, maybe the story would have ended differently. But I do not believe it was fatigue that caused him to leave. It was more like customary behavior, part of the piloting culture within Air France. And his leaving was not against the rules. Still, it is surprising. If you are responsible for the outcome, you do not go on vacation during the main event.’”

*****

“In the late 1970s, a team of researchers at NASA began a systematic assessment of airline-pilot performance. One of them was a young research psychologist and private pilot named John Lauber, who later served for 10 years as a member of the National Transportation Safety Board and went on to run the safety division at Airbus in France. As part of the NASA effort, Lauber spent several years riding in airline cockpits, observing the operations and taking notes. This was at a time when most crews still included a flight engineer, who sat behind the pilots and operated the airplane’s electrical and mechanical systems. What Lauber found was a culture dominated by authoritarian captains, many of them crusty old reactionaries who brooked no interference from their subordinates. In those cockpits, co-pilots were lucky if occasionally they were allowed to fly. Lauber told me about one occasion, when he entered a Boeing 727 cockpit at a gate before the captain arrived, and the flight engineer said, “I suppose you’ve been in a cockpit before.” “Well, yes.” “But you may not be aware that I’m the captain’s sexual adviser.” “Well, no, I didn’t know that.” “Yeah, because whenever I speak up, he says, ‘If I want your f***ing advice, I’ll ask for it.’ ”

*****

“NASA talked the airline into lending it a full-motion simulator at the San Francisco airport with which to run an experiment on 20 volunteer Boeing 747 crews. The scenario involved a routine departure from New York’s Kennedy Airport on a transatlantic flight, during which various difficulties would arise, forcing a return. It was devised by a self-effacing British physician and pilot named Hugh Patrick Ruffell Smith, who died a few years later and is revered today for having reformed global airline operations, saving innumerable lives. John Lauber was closely involved. The simulator runs were intended to be as realistic as possible, including bad coffee and interruptions by flight attendants.

“Lauber told me that at Pan Am some of the operations managers believed the scenario was too easy. “They said, ‘Look, these guys have been trained. You’re not going to see much of interest.’ Well, we saw a lot that was of interest. And it had not so much to do with the pilots’ physical ability to fly—their stick-and-rudder skills—or their mastery of emergency procedures. Instead, it had everything to do with their management of the workload and internal communication. Making sure that the flight engineer was doing what a flight engineer needs to be doing, that the co-pilot was handling the radios, that the captain was freeing himself to make the right decisions.”

“It all depended on the captains. A few were natural team leaders—and their crews acquitted themselves well. Most, however, were Clipper Skippers, whose crews fell into disarray under pressure and made dangerous mistakes. Ruffell Smith published the results in January 1979, in a seminal paper, ‘NASA Technical Memorandum 78482.’ The gist of it was that teamwork matters far more than individual piloting skill. This ran counter to long tradition in aviation but corresponded closely with the findings of another NASA group, which made a careful study of recent accidents and concluded that in almost all cases poor communication in the cockpit was to blame.

“The airlines proved receptive to the research. In 1979, NASA held a workshop on the subject in San Francisco, attended by the heads of training departments from around the world. To describe the new approach, Lauber coined a term that caught on. He called it Cockpit Resource Management, or C.R.M., an abbreviation since widened to stand for Crew Resource Management. The idea was to nurture a less authoritarian cockpit culture—one that included a command hierarchy but encouraged a collaborative approach to flying, in which co-pilots (now ‘first officers’) routinely handled the airplanes and were expected to express their opinions and question their captains if they saw mistakes being made. For their part, the captains were expected to admit to fallibility, seek advice, delegate roles, and fully communicate their plans and thoughts. Part of the package was a new approach to the use of simulators, with less effort spent in honing piloting skills and more emphasis placed on teamwork. This was known as line-oriented flight training. As might be expected, the new ideas met with resistance from senior pilots, many of whom dismissed the NASA findings as psychobabble and derided the early seminars as charm schools. As in the old days, they insisted that their skill and authority were all that stood in the way of death for the public. Gradually, however, many of those pilots retired or were forced to change, and by the 1990s both C.R.M. and line-oriented flight training had become the global standard, albeit imperfectly applied.

“Though the effect on safety is difficult to quantify, because these innovations lie inseparably among others that have helped to improve the record, C.R.M. is seen to have been so successful that it has migrated into other realms, including surgery, where doctors, like pilots, are no longer the little gods they were before. In aviation, the change has been profound. Training has changed, co-pilots have been empowered, and the importance of airplane-handling skills by individual pilots has implicitly been de-valued. But the most important point as it applies to Air France 447 is that the very design of the Airbus cockpit, like that of every recent Boeing, is based upon the expectation of clear communication and good teamwork, and if these are lacking, a crisis can quickly turn catastrophic.

“The tenets of C.R.M., which emerged from the United States, fit naturally into the cultures of Anglo-Saxon countries. Acceptance has been more difficult in certain Asian countries, where C.R.M. goes against the traditions of hierarchy and respect for elders. A notorious case was the 1997 crash of a Korean Air Boeing 747 that hit a hillside on a black night, while on approach to Guam, after a venerated captain descended prematurely and neither the co-pilot nor the flight engineer emphatically raised concerns, though both men knew the captain was getting things wrong. In the impact 228 people died. Similar social dynamics have been implicated in other Asian accidents.

“And Air France? As judged from the cockpit management on display in Flight 447 before it went down, NASA’s egalitarian discipline has devolved within the airline into a self-indulgent style of flying in which co-pilots address the captain using the informal ‘tu’ but some captains feel entitled to do whatever they like. The sense of entitlement does not occur in a void. It can be placed in the context of a proud country that has become increasingly insecure. A senior executive at Airbus mentioned to me that in Britain and the United States the elites do not become airline pilots, whereas in France, as in less developed countries, they still do. This makes them difficult to manage. Bernard Ziegler, the visionary French test pilot and engineer behind the Airbus design, once said to me, ‘First you have to understand the mentality.’ I said, ‘Do you really think they are so arrogant?’ He said, ‘Some, yes. And they have the flaw of being too well paid.’ ‘So there must be no problem in the United States.’ But Ziegler was serious. He said, ‘Second, the union’s position is that pilots are always perfect. Working pilots are perfect, and dead pilots are, too.’”

*****

“Sarter has written extensively about ‘automation surprises,’ often related to control modes that the pilot does not fully understand or that the airplane may have switched into autonomously, perhaps with an annunciation but without the pilot’s awareness. Such surprises certainly added to the confusion aboard Air France 447. One of the more common questions asked in cockpits today is ‘What’s it doing now?’ Robert’s ‘We don’t understand anything!’ was an extreme version of the same. Sarter said, ‘We now have this systemic problem with complexity, and it does not involve just one manufacturer. I could easily list 10 or more incidents from either manufacturer where the problem was related to automation and confusion. Complexity means you have a large number of subcomponents and they interact in sometimes unexpected ways. Pilots don’t know, because they haven’t experienced the fringe conditions that are built into the system. I was once in a room with five engineers who had been involved in building a particular airplane, and I started asking, ‘Well, how does this or that work?’ And they could not agree on the answers. So I was thinking, If these five engineers cannot agree, the poor pilot, if he ever encounters that particular situation . . . well, good luck.’

“In the straight-on automation incidents that concern Sarter, the pilots overestimate their knowledge of the aircraft systems, then do something expecting a certain result, only to find that the airplane reacts differently and seems to have assumed command. This is far more common than the record indicates, because rarely do such surprises lead to accidents, and only in the most serious cases of altitude busting or in-flight upsets are they necessarily reported. Air France 447 had an additional component. The blockage of the pitot tubes led to an old-fashioned indication failure, and the resulting disconnection of the autopilot was an old-fashioned response: trust the pilots to sort things out. There were definitely automation complications in what followed, and to that mix one can add the design decision not to link the two control sticks. But on Air France 447, the automation problem ran still deeper. Bonin and Robert were flying a fourth-generation glass-cockpit airplane, and unlike the pilots who think they know more than they do, these two seemed to fear its complexities. The Airbus was reacting in a conventional manner, but once they ventured beyond the routine of normal cruise they did not trust the nature of the machine. It is hard to imagine that this would have happened under the old Clipper Skippers, the stick-and-rudder boys. But Bonin and Robert? It was as if progress had pulled the rug out from beneath elementary aeronautical understanding.”

*****

“For commercial-jet designers, there are some immutable facts of life. It is crucial that your airplanes be flown safely and as cheaply as possible within the constraints of wind and weather. Once the questions of aircraft performance and reliability have been resolved, you are left to face the most difficult thing, which is the actions of pilots. There are more than 300,000 commercial-airline pilots in the world, of every culture. They work for hundreds of airlines in the privacy of cockpits, where their behavior is difficult to monitor. Some of the pilots are superb, but most are average, and a few are simply bad. To make matters worse, with the exception of the best, all of them think they are better than they are. Airbus has made extensive studies that show this to be true. The problem in the real world is that the pilots who crash your airplanes or simply burn too much fuel are difficult to spot in the crowd. A Boeing engineer gave me his perspective on this. He said, ‘Look, pilots are like other people. Some are heroic under pressure, and some duck and run. Either way, it’s hard to tell in advance. You almost need a war to find out.’ But of course you can’t have a war to find out. Instead, what you do is try to insert your thinking into the cockpit.

“First, you put the Clipper Skipper out to pasture, because he has the unilateral power to screw things up. You replace him with a teamwork concept—call it Crew Resource Management—that encourages checks and balances and requires pilots to take turns at flying. Now it takes two to screw things up. Next you automate the component systems so they require minimal human intervention, and you integrate them into a self-monitoring robotic whole. You throw in buckets of redundancy. You add flight management computers into which flight paths can be programmed on the ground, and you link them to autopilots capable of handling the airplane from the takeoff through the rollout after landing. You design deeply considered minimalistic cockpits that encourage teamwork by their very nature, offer excellent ergonomics, and are built around displays that avoid showing extraneous information but provide alerts and status reports when the systems sense they are necessary. Finally, you add fly-by-wire control. At that point, after years of work and billions of dollars in development costs, you have arrived in the present time. As intended, the autonomy of pilots has been severely restricted, but the new airplanes deliver smoother, more accurate, and more efficient rides—and safer ones too.

“It is natural that some pilots object. This appears to be primarily a cultural and generational matter. In China, for instance, the crews don’t care. In fact, they like their automation and rely on it willingly. By contrast, an Airbus man told me about an encounter between a British pilot and his superior at a Middle Eastern airline, in which the pilot complained that automation had taken the fun out of life, and the superior answered, to paraphrase, “Hey asshole, if you want to have fun, go sail a boat. You fly with automation or find some other job.”

“He kept his job. In professional flying, a historic shift has occurred. In the privacy of the cockpit and beyond public view, pilots have been relegated to mundane roles as system managers, expected to monitor the computers and sometimes to enter data via keyboards, but to keep their hands off the controls, and to intervene only in the rare event of a failure. As a result, the routine performance of inadequate pilots has been elevated to that of average pilots, and average pilots don’t count for much. If you are building an airliner and selling it globally, this turns out to be a good thing. Since the 1980s, when the shift began, the safety record has improved fivefold, to the current one fatal accident for every five million departures. No one can rationally advocate a return to the glamour of the past.

“Nonetheless there are worries even among the people who invented the future. Boeing’s Delmar Fadden explained, ‘We say, ‘Well, I’m going to cover the 98 percent of situations I can predict, and the pilots will have to cover the 2 percent I can’t predict.’ This poses a significant problem. I’m going to have them do something only 2 percent of the time. Look at the burden that places on them. First they have to recognize that it’s time to intervene, when 98 percent of the time they’re not intervening. Then they’re expected to handle the 2 percent we couldn’t predict. What’s the data? How are we going to provide the training? How are we going to provide the supplementary information that will help them make the decisions? There is no easy answer. From the design point of view, we really worry about the tasks we ask them to do just occasionally.’ I said, ‘Like fly the airplane?’ Yes, that too. Once you put pilots on automation, their manual abilities degrade and their flight-path awareness is dulled: flying becomes a monitoring task, an abstraction on a screen, a mind-numbing wait for the next hotel. Nadine Sarter said that the process is known as de-skilling. It is particularly acute among long-haul pilots with high seniority, especially those swapping flying duties in augmented crews. On Air France 447, for instance, Captain Dubois had logged a respectable 346 hours over the previous six months but had made merely 15 takeoffs and 18 landings. Allowing a generous four minutes at the controls for each takeoff and landing, that meant that Dubois was directly manipulating the side-stick for at most only about four hours a year. The numbers for Bonin were close to the same, and for Robert they were smaller. For all three of them, most of their experience had consisted of sitting in a cockpit seat and watching the machine work.
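
(A quick back-of-the-envelope check of that “four hours a year” figure; the minimal Python sketch below simply reruns the article’s own counts and its four-minutes-per-event allowance, and doubling the six-month total to annualize assumes the prior six months were typical.)

    # Rerun the article's estimate of Captain Dubois's hands-on flying time.
    takeoffs = 15            # takeoffs over the previous six months (per the article)
    landings = 18            # landings over the previous six months (per the article)
    minutes_per_event = 4    # the article's "generous" allowance per takeoff or landing

    hands_on_minutes_6mo = (takeoffs + landings) * minutes_per_event  # 132 minutes
    hands_on_hours_per_year = hands_on_minutes_6mo * 2 / 60           # annualize, convert to hours

    print(f"Approximate hand-flying time: {hands_on_hours_per_year:.1f} hours per year")
    # Prints ~4.4 hours, consistent with "at most only about four hours a year."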

“The solution might seem obvious. John Lauber told me that with the advent of C.R.M. and integrated automation, in the 1980s, Earl Wiener went around preaching about ‘turn-it-off training.’ Lauber said, ‘Every few flights, disconnect all that stuff. Hand-fly it. Fly it like an airplane.’ ‘What happened to that idea?’ ‘Everybody said, ‘Yeah. Yeah. We gotta do that.’ And I think for a while maybe they did.’

“Sarter, however, is continuing with variations on the theme. She is trying to come up with improved interfaces between pilot and machine. In the meantime, she says, at the very least revert to lower levels of automation (or ignore it) when it surprises you.

“In other words, in a crisis, don’t just start reading the automated alerts. The best pilots discard the automation naturally when it becomes unhelpful, and again there appear to be some cultural traits involved. Simulator studies have shown that Irish pilots, for instance, will gleefully throw away their crutches, while Asian pilots will hang on tightly. It’s obvious that the Irish are right, but in the real world Sarter’s advice is hard to sell. The automation is simply too compelling. The operational benefits outweigh the costs. The trend is toward more of it, not less. And after throwing away their crutches, many pilots today would lack the wherewithal to walk.

“This is another unintended consequence of designing airplanes that anyone can fly: anyone can take you up on the offer. Beyond the degradation of basic skills of people who may once have been competent pilots, the fourth-generation jets have enabled people who probably never had the skills to begin with and should not have been in the cockpit. As a result, the mental makeup of airline pilots has changed. On this there is nearly universal agreement—at Boeing and Airbus, and among accident investigators, regulators, flight-operations managers, instructors, and academics. A different crowd is flying now, and though excellent pilots still work the job, on average the knowledge base has become very thin.

“It seems that we are locked into a spiral in which poor human performance begets automation, which worsens human performance, which begets increasing automation. The pattern is common to our time but is acute in aviation. Air France 447 was a case in point. In the aftermath of the accident, the pitot tubes were replaced on several Airbus models; Air France commissioned an independent safety review that highlighted the arrogance of some of the company’s pilots and suggested reforms; a number of experts called for angle-of-attack indicators in airliners, while others urged a new emphasis on high-altitude-stall training, upset recoveries, unusual attitudes, flying in Alternate Law, and basic aeronautical common sense. All of this was fine, but none of it will make much difference. At a time when accidents are extremely rare, each one becomes a one-off event, unlikely to be repeated in detail. Next time it will be some other airline, some other culture, and some other failure—but it will almost certainly involve automation and will perplex us when it occurs. Over time the automation will expand to handle in-flight failures and emergencies, and as the safety record improves, pilots will gradually be squeezed from the cockpit altogether. The dynamic has become inevitable. There will still be accidents, but at some point we will have only the machines to blame.”

Abridged text of Munger’s Harvard School Commencement Speech

“I can still recall Carson’s absolute conviction as he told how he had tried these things on occasion after occasion and had become miserable every time…I add my voice. The four closest friends of my youth were highly intelligent, ethical, humorous types, favored in person and background. Two are long dead, with alcohol a contributing factor, and a third is a living alcoholic – if you call that living. While susceptibility varies, addiction can happen to any of us, through a subtle process where the bonds of degradation are too light to be felt until they are too strong to be broken. And I have yet to meet anyone, in over six decades of life, whose life was worsened by over-fear and over-avoidance of such a deceptive pathway to destruction.

“Envy, of course, joins chemicals in winning some sort of quantity prize for causing misery. It was wreaking havoc long before it got a bad press in the laws of Moses.

“Resentment has always worked for me exactly as it worked for Carson. I cannot recommend it highly enough to you if you desire misery.

“For those of you who want misery, I also recommend refraining from practice of the Disraeli compromise, designed for people who find it impossible to quit resentment cold turkey. Disraeli, as he rose to become one of the greatest Prime Ministers, learned to give up vengeance as a motivation for action, but he did retain some outlet for resentment by putting the names of people who wronged him on pieces of paper in a drawer. Then, from time to time, he reviewed these names and took pleasure in noting the way the world had taken his enemies down without his assistance.

“Well, so much for Carson’s three prescriptions. Here are four more prescriptions from Munger:

“First, be unreliable. Do not faithfully do what you have engaged to do. If you will only master this one habit you will more than counterbalance the combined effect of all your virtues, howsoever great. If you like being distrusted and excluded from the best human contribution and company, this prescription is for you. Master this one habit and you can always play the role of the hare in the fable, except that instead of being outrun by one fine turtle you will be outrun by hordes and hordes of mediocre turtles and even by some mediocre turtles on crutches.

“I must warn you that if you don’t follow my first prescription it may be hard to end up miserable, even if you start disadvantaged. I had a roommate in college who was and is severely dyslexic. But he is perhaps the most reliable man I have ever known. He has had a wonderful life so far, outstanding wife and children, chief executive of a multibillion dollar corporation. If you want to avoid a conventional, main-culture, establishment result of this kind, you simply can’t count on your other handicaps to hold you back if you persist in being reliable.

“My second prescription for misery is to learn everything you possibly can from your own personal experience, minimizing what you learn vicariously from the good and bad experience of others, living and dead. This prescription is a sure-shot producer of misery and second-rate achievement.

“You can see the results of not learning from others’ mistakes by simply looking about you. How little originality there is in the common disasters of mankind – drunk driving deaths, reckless driving maimings, incurable venereal diseases, conversion of bright college students into brainwashed zombies as members of destructive cults, business failures through repetition of obvious mistakes made by predecessors, various forms of crowd folly, and so on. I recommend as a memory clue to finding the way to real trouble from heedless, unoriginal error the modern saying: ‘If at first you don’t succeed, well, so much for hang gliding.’ The other aspect of avoiding vicarious wisdom is the rule for not learning from the best work done before yours. The prescription is to become as non-educated as you reasonably can.

“Perhaps you will better see the type of non-miserable result you can thus avoid if I render a short historical account. There once was a man who assiduously mastered the work of his best predecessors, despite a poor start and very tough time in analytic geometry. Eventually his own original work attracted wide attention and he said of that work: ‘If I have seen a little farther than other men it is because I stood on the shoulders of giants.’ The bones of that man lie buried now, in Westminster Abbey, under an unusual inscription: ‘Here lie the remains of all that was mortal in Sir Isaac Newton.’

“My third prescription for misery is to go down and stay down when you get your first, second, or third severe reverse in the battle of life. Because there is so much adversity out there, even for the lucky and wise, this will guarantee that, in due course, you will be permanently mired in misery.

“My final prescription to you for a life of fuzzy thinking and infelicity is to ignore a story they told me when I was very young about a rustic who said: ‘I wish I knew where I was going to die, and then I’d never go there.’ Most people smile (as you did) at the rustic’s ignorance and ignore his basic wisdom. If my experience is any guide, the rustic’s approach is to be avoided at all cost by someone bent on misery. To help you fail, you should discount as mere quirk, with no useful message, the method of the rustic, which is the same one used in Carson’s speech.

“What Carson did was to approach the study of how to create X by turning the question backward, that is, by studying how to create non-X. The great algebraist, Jacobi, had exactly the same approach as Carson and was known for his constant repetition of one phrase: “Invert, always invert.” It is in the nature of things, as Jacobi knew, that many hard problems are best solved only when they are addressed backward…[Charles] Darwin’s result was due in large measure to his working method, which violated all my rules for misery and particularly emphasized a backward twist in that he always gave priority attention to evidence tending to disconfirm whatever cherished and hard-won theory he already had. In contrast, most people early achieve and later intensify a tendency to process new and disconfirming information so that any original conclusion remains intact. They become people of whom Philip Wylie observed: ‘You couldn’t squeeze a dime between what they already know and what they will never learn.’ The life of Darwin demonstrates how a turtle may outrun the hares, aided by extreme objectivity, which helps the objective person end up like the only player without a blindfold in a game of pin-the-donkey. If you minimize objectivity, you ignore not only a lesson from Darwin but also one from Einstein. Einstein said that his successful theories came from: “Curiosity, concentration, perseverance and self-criticism.” And by self-criticism he meant the testing and destruction of his own well-loved ideas.

“It is fitting now that a backward sort of speech end with a backward sort of toast, inspired by Elihu Root’s repeated accounts of how the dog went to Dover, “leg over leg.” To the class of 1986: Gentlemen, may each of you rise high by spending each day of a long life aiming low.”[77]

“Investing in the Unknown and Unknowable” by Richard Zeckhauser

“The essence of effective investment is to select assets that will fare well when future states of the world become known. When the probabilities of future states of assets are known, as the efficient markets hypothesis posits, wise investing involves solving a sophisticated optimization problem. Of course, such probabilities are often unknown, banishing us from the world of the capital asset pricing model (CAPM), and thrusting us into the world of uncertainty. Many [great] investments…are one-time only, implying that past data will be a poor guide.”[78]

“From David Ricardo making a fortune buying British government bonds on the eve of the Battle of Waterloo to Warren Buffett selling insurance to the California earthquake authority, the wisest investors have earned extraordinary returns by investing in the unknown and the unknowable (UU). But they have done so on a reasoned, sensible basis. This essay explains some of the central principles that such investors employ.”[79] “Were the financial world predominantly one of mere uncertainty, the greatest financial successes would come to those individuals best able to assess probabilities. That skill, often claimed as the domain of Bayesian decision theory, would swamp sophisticated optimization as the promoter of substantial returns. The real world of investing often ratchets the level of non-knowledge into still another dimension, where even the identity and nature of possible future states are not known. This is the world of ignorance. In it, there is no way that one can sensibly assign probabilities to the unknown states of the world. Just as traditional finance theory hits the wall when it encounters uncertainty, modern decision theory hits the wall when addressing the world of ignorance. I shall employ the acronym UU to refer to situations where both the identity of possible future states of the world as well as their probabilities are unknown and unknowable.”[80]

Another level – Unknown, Unknowable and Unique events (UUU)

Speculation 1: UUU investments – unknown, unknowable and unique – drive off speculators, which creates the potential for an attractive low price.

“The major fortunes in finance, I would speculate, have been made by people who are effective in dealing with the unknown and unknowable. This will probably be truer still in the future. Given the influx of educated professionals into finance, those who make their living speculating and trading in traditional markets are increasingly up against others who are tremendously bright and tremendously well-informed.”[81]

Corporate governance. (Matt Levine, Bloomberg View, May 19, 2017)[82]

Well this, from Alan Palmiter of Wake Forest University School of Law, is lovely:

Recent research in the nascent field of moral psychology suggests that we humans are not rational beings, particularly when we act in social and political settings. Our decisions (moral judgments) arise instantly and instinctively in our subconscious, out of conscious view. We rationalize our moral decisions — whether to feel compassion toward another who is harmed, to desire freedom in the face of coercion, or to honor those matters we consider sacred — after we have made the decision. We layer on a veneer of rationality, to reassure ourselves of our own moral integrity and to signal our moral values to like-minded others in our group. This is particularly so when we operate in the “super-organism” that is the corporation, where specialized roles have led to almost unparalleled human cooperation.

Thus, the decision-making and actions that arise from the shareholder-management relationship are best understood as the product not of rational economic incentives or prescriptive legal norms, but instead moral values. On questions of right and wrong in the corporation, the decisions by shareholders and managers, like those of other human actors, are essentially emotive and instinctive. The justifications offered for their choices — whether resting on shareholder primacy, team production, board primacy, or even corporate social responsibility — are after-the-fact rationalizations, not reasoned thinking.

A Chat With Daniel Kahneman[83]

On persistence: “When I work I have no sunk costs. I like changing my mind. Some people really don’t like it but for me changing my mind is a thrill. It’s an indication that I’m learning something. So I have no sunk costs in the sense that I can walk away from an idea that I’ve worked on for a year if I can see a better idea. It’s a good attitude for a researcher. The main trap that young researchers fall into is sunk costs. They get to work on a project that doesn’t work and that is not promising but they keep at it. I think too much persistence can be bad for you in the intellectual world.”

On the benefits of groups: “I’m a skeptic about people’s ability to improve their own thinking or to get control over their own intuition. It can be done but it’s very difficult. But I’m really optimistic about the potential for institutions and organizations to improve themselves, because they have procedures and they think slowly. They can have control over the way they interpret things. They can ask questions about the quality of evidence. Thinking about how to improve the decision-making in organizations is a challenge that I think we’re up to. This is something that can be done.”

On empathy: “There have been many experiments in which you bring together Palestinians and Israelis and good things happen between them. You just bring them together. But it’s an artificial construction. It’s very difficult to turn that into a massive thing. It is absolutely true that when you put together strangers in a positive atmosphere that good things are going to happen. They are going to find that they are more like the other than they were inclined to believe earlier. They’re going to recognize each other’s humanity. Lots of good things happen when people are in close contact. But it’s extraordinarily difficult to generate that in a big way.”

On flip-flopping: “Ideas become part of who we are. People get invested in their ideas, especially if they get invested publicly and identify with their ideas. So there are many forces against changing your mind. Flip-flopping is a bad word to people. It shouldn’t be. Within sciences, people who give up on an idea and change their mind get good points. It’s a rare quality of a good scientist, but it’s an esteemed one.”

On collaborations: “One of the quotes attributed to [his late partner] Amos Tversky is, ‘The world is not kind to collaboration.’ That’s an interesting phrase. What he meant by that is when people look at a joint project, they are very curious about ‘who did it.’ The assumption is that one person did it.

“But neither of us could have done what we did by ourselves. We had two people who were both quite good, but our joint work is clearly superior to anything we could have done alone. And yet, either one of us could talk about our work and it sounded as if we had done it alone. It didn’t sound as if we needed somebody else. Amos said, ‘I talk to people about our joint work and people don’t think I need anybody else.’ So there is a problem of how to treat collaborations and how to foster them. Quite often the actions of the environment are destructive to collaborations. The urge is to allocate credit and to single out people and not treat collaborations as units. If Michael Lewis’s book makes people think about the value of collaborations it would be useful.”

On education changing thinking: “There are studies showing that when you present evidence to people they get very polarized even if they are highly educated. They find ways to interpret the evidence in conflicting ways. Our mind is constructed so that in many situations where we have beliefs and we have facts, the beliefs come first. That’s what makes people incapable of being convinced by evidence. So education by itself is not going to change the culture. Changing critical thinking through education is very slow and I’m not very optimistic about it.”

Asked if he could comment on brain trauma: “No, because I don’t know anything about it and I strongly believe people should stay in their own lanes when giving opinions.”

On where the world is going: “People in their 30s know where the world is going because they’re going to do it. I’m in my 80s so I have no idea.”

[74] https://www.edge.org/conversation/daniel_kahneman-the-marvels-and-the-flaws-of-intuitive-thinking-edge-master-class-2011
[75] https://www.forbes.com/sites/phildemuth/2014/10/01/charlie-munger-and-the-2014-daily-journal-annual-meeting-part-three/#71d76db371d7
[76] http://www.rbcpa.com/DJCO_Meeting_Detailed_Notes_2013.pdf
[77] Charlie Munger, Harvard School Commencement Speech; June 13, 1986; reprinted in Poor Charlie’s Almanack. (Abridged, emphasis added)
[78] https://www.hks.harvard.edu/fs/rzeckhau/InvestinginUnknownandUnknowable.pdf
[79] https://www.hks.harvard.edu/fs/rzeckhau/InvestinginUnknownandUnknowable.pdf
[80] https://www.hks.harvard.edu/fs/rzeckhau/InvestinginUnknownandUnknowable.pdf
[81] https://www.hks.harvard.edu/fs/rzeckhau/InvestinginUnknownandUnknowable.pdf
[82] https://www.bloomberg.com/view/articles/2017-05-19/relationships-and-glass-steagall
[83] http://www.collaborativefund.com/blog/a-chat-with-daniel-kahneman/