Jason Zweig has been one of the giants of the investment universe for several decades. He was chosen as the editor of the revised edition of Ben Graham’s The Intelligent Investor, and he spent two years working with Danny Kahneman on his book Thinking, Fast and Slow. Zweig is also the author of a long-running column in The Wall Street Journal and has written for numerous other publications. The archives on his website are a treasure trove of good reading. Even his Twitter feed is a rare harbor of intellectual honesty in the hurricane of noise that defines social media.
Presented below are the comments that Jason was gracious enough to share when I reached out for his help with this project. As I was thinking through this material, he naturally came to mind as the obvious source of expertise on the subject, and I’m extremely grateful for his contribution. I can’t improve upon his comments, so everything is presented as he wrote it.
In the spirit of his avoidance of confirmation bias, I actually stopped reading his comments after his third sentence until I was done with my own work on this project.
Please note that Jason included the following warning: “the best researchers in the field would find all kinds of errors and misinterpretations in what I said…To get these things strictly correct, I’d have to re-read all the papers and talk to the authors.”
From: Zweig, Jason [mailto: ]
Sent: Monday, March 27, 2017 2:16 PM
To: Philip C. Ordway <[email protected]>
Subject: Re: project to update “Psychology of Human Misjudgment”
What an interesting idea.
I’m going to reply without reading any of the details in what you sent or re-reading Charlie’s original. It’s best, in this kind of exercise, not to have references handy (to avoid confirmation bias!). So some of what I set down here for you may be redundant or repetitive, but it’s what feels top-of-mind. It will also be unstructured and a bit stream-of-consciousness, although I hope that won’t bother you for this sort of purpose. The numbers are arbitrary and for my convenience, not to imply any sort of ordinal structure; in fact, most of these thoughts are so interrelated that it’s hard to separate the threads from each other.
I like this idea so much that I’m going to write a lot. Whatever you don’t use, I will. Everything is grist for the mill, and this is a fun exercise.
1) People love stories. Danny Kahneman loves to say “Stories trump statistics.” As I’m sure you know, Yuval Noah Harari argues in Sapiens that storytelling is, above all, what makes us human. We not only love stories, we love perceiving stories that might not even be there. In our minds, objects and events acquire propensities, or apparent tendencies from which we believe we can extrapolate the next outcomes. Once you see an animal in a Rorschach blot, you can’t unsee it; the story you instantaneously told yourself is off and running inside your head. Or think of the classic Fritz Heider experiments, in which geometric shapes start to “behave.” Then think of traders saying that a stock “isn’t acting right,” or attributing intentional behavior to an entire market. What, then, turns a bull market into a bear market, or vice versa? The emergence of a new narrative, as Bob Shiller has often pointed out. Part of what makes market timing so difficult is that these narrative shifts can be so swift…
2) … which leads to a related point, that markets seem to behave somewhat like physical systems in a state of self-organized criticality. Piles of sand can grow breathtakingly high and appear to defy all physical logic, until one last grain of sand is added and the whole dune collapses. Physicists don’t yet seem to understand this process fully; likewise, we know very little about the self-organized criticality of emotion. Following along with the crowd feels safer and safer and safer until the whole pile collapses, at which point following along feels terrifying. What determines critical mass in the social psychology of markets? I don’t think we know. Until we do, we all need to recognize that investing with the herd is almost irresistible, offers an illusion of safety, and tends to end in a smash that astounds almost everyone.
3) The halo effect is ubiquitous. Evaluating one aspect of a person, a product, an event, a company, an industry or an idea will inevitably bias your subsequent evaluations of its other aspects. If I rate a CEO’s physical attractiveness, declaring him a 9 out of 10 on the handsomeness scale will prime me to rate him higher, also, on competency, persistence, leadership, financial sophistication, general ability and so forth. By the same token, focusing on a company’s high stock price will lead me to take a rosier view of the underlying business. The human mind likes to iron out inconsistencies. We will intuitively believe that a more handsome CEO is also more competent and that a higher-priced stock signals a better-run company. Here, Danny Kahneman’s idea of structured evaluation, in which a decision-maker assesses each distinctive attribute separately on the same numerical scale, can be very helpful.
4) Positive illusions about ourselves are beneficial for us as people but toxic for us as investors. Overconfidence, unrealistic optimism, the illusion of control — without these qualities, most of us would probably (and quite logically!) spend our lives curled up in a ball in the dark. It’s misunderstanding of or blindness to the laws of probability that gets us through daily life. Given the high rates of failure, who would ever get married or start a business without a kick in the pants from a positive illusion? Without positive illusions, capitalism itself would clank to a halt. But with them, financial capitalism becomes more dangerous as they drive markets to extremes. We are so accustomed to being successfully guided by positive illusions in many aspects of daily life that it is fiendishly difficult to recognize how dangerous they are in investing. One of Peter Bernstein’s wisest aphorisms is, “The most dangerous moment is when you are right.”
5) Perhaps the most pernicious of all cognitive biases is the bias blind spot. I say: “Phil is overconfident; I, however, am well-calibrated. Phil is loss-averse; I am a rational Bayesian. Phil falls prey to the law of small numbers; I rely only on base rates.” Meanwhile, you are thinking exactly the same thing, in reverse. Studying cognitive biases seems to make it much easier to see them in other people, but barely any easier at all to find them in ourselves. It’s been known for more than 40 years that people find it extraordinarily hard to apply research findings to their own behavior; each of us believes we are the exception. Checklists can help here, but you would be blind to the bias blind spot if you thought you could ever cure it completely.
6) Self-serving attributions of success and failure are cognitively corrupting. Tom Gilovich has shown that I will attribute my success to overcoming obstacles, rather than to the many advantages of (for example) living in a liberal democracy — while I will attribute yours to a beneficial environment. My failures are the result of fierce headwinds, while yours came from your inability to rise to the occasion. The old adage that success has many fathers while failure is an orphan seems to be slightly wrong: My success has only one father — me — while yours has many. And my failure has many fathers, while yours has only one: you. Thus we witness portfolio managers blaming their underperformance on unnaturally high correlations or the unfair advantage of index funds. When (as it surely will) their performance turns, they will declare that their alpha came from security selection. Being honest about this requires almost superhuman strength of character.
7) Agency problems are still underrated as a cause of destabilizing behavior among professional investors. Paul Woolley and Dimitri Vayanos have demonstrated that once investors start pulling money from actively managed funds, it is rational for the active managers to chase overvalued stocks. Most clients still don’t understand that their definition of risk and their managers’ definition of risk are drastically different. It’s impossible to quantify how much of Berkshire’s success is attributable to its structure, which effectively minimizes agency conflicts. But Munger and Buffett both think it played a major role. In my opinion, portfolio managers who don’t design their companies, from Day One and the ground up, to minimize agency conflicts are hamstringing their own results before they even invest a dollar.
8) People are terrible affective forecasters. Dan Gilbert and Tim Wilson have written brilliantly about this. We underestimate how long we will feel good about a positive event or outcome. And we overestimate how long we will feel bad about a negative result. We get our predictions of intensity wrong, too. What Gilbert calls “the psychological immune system” seems to work so well because we are generally so unaware we even have one. When our lover jilts us and we say “I’ll never fall in love again,” we mean every word of it — just as the investor who bails at the bottom means it when he says he’ll never buy stocks again. When we do fall in love again or buy stocks again, we can do so only by pretending that we knew we would all along. Fibbing to ourselves about our own past seems to be the natural way to navigate the present. These emotional habits also seem to train us not to be honest in our intellectual life. Put another way, our preferences are constructed, not innate. We have a surprisingly poor grasp on what did make us happy, what does make us happy and what will make us happy. So the idea of individuals as rational utility maximizers is silly. As a result, many of the tools that investors rely on — focus groups, survey data and so on — may be unreliable. And the unreliability of affective forecasting helps explain why bear markets so often seem to end in a startling vertical leap: It’s as if investors everywhere suddenly realize they aren’t going to feel as bad as they expected to for as long as they worried they would. And then, it’s off to the races.
9) Much of what keeps us busy all day long is the attempt to minimize cognitive dissonance. Human beings will do almost anything to avert the collision of empirical evidence against their own cherished beliefs. We will tell stories to ourselves and others. We will hold positive illusions. We will overweight confirming evidence, no matter how weak, and ignore disconfirming evidence, no matter how strong. Danny Kahneman calls the human mind a machine for jumping to conclusions. But it is also, I think, a machine for reducing dissonance by keeping stories simple. And one of the simplest and most appealing of all stories is “That Doesn’t Apply to Me…Because I’m Special.”
Add all of this up, and it seems clear to me that I’ve been wrong for many years in saying that the single greatest challenge for investors is to develop self-control. In fact, the single greatest challenge we investors face is to see ourselves as we actually are. What makes Warren Buffett and, perhaps even more, Charlie Munger so remarkable is how honest they are about themselves with themselves.
The rest of us can aspire to only a fraction of their level of self-honesty.
“A frequently asked question is, how do you learn to be a great investor? First of all, you have to understand your own nature,” said Munger. “Each person has to play the game given his own marginal utility considerations and in a way that takes into account his own psychology. If losses are going to make you miserable (and some losses are inevitable), you might be wise to utilize a very conservative pattern of investment and saving all your life. So you have to adapt your strategy to your own nature and your own talents. I don’t think there’s a one-size-fits-all investment strategy I can give you.” – Charlie Munger, quoted in Damn Right! by Janet Lowe