In every good piece of legal writing, there is a narrative. It’s a plain, easily digestible story that the writer wants you to follow. It cuts through jargon and pomp and circumstance and clues the judge in to what the writer thinks is actually going on. A good narrative will smooth over or strip away all of the weakest details and magnify those that most support the story being told. When this happens, as you can readily see in many legal briefs, the narrative soon turns into one of good versus evil.
This is problematic. The stronger the narrative, the more persuasive the argument. The more persuasive the argument, the more likely the chance of success. But success doesn’t always mean the right party prevailed, only the most persuasive party. When we get swept up in a narrative, our objectivity is often overcome.
The more interesting point is that we actually do this to ourselves, constantly. We craft narratives about why we can skip today’s gym session, or steal from our employers, or cheat on our wives. And, the smarter you are, the better the narrative you can craft to convince yourself you’re on the side of good.
Tyler Cowen gave an interesting TEDx talk that touches on this topic. One bit I found particularly good:
One interesting thing about cognitive biases – they’re the subject of so many books these days. There’s the Nudge book, the Sway book, the Blink book, like the one-title book, all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories.
And why don’t these books tell us that? It’s because the books themselves are all about stories. The more of these books you read, you’re learning about some of your biases, but you’re making some of your other biases essentially worse. So the books themselves are part of your cognitive bias.
Often, people buy them as a kind of talisman, like “I bought this book. I won’t be Predictably Irrational.” It’s like people want to hear the worst, so psychologically, they can prepare for it or defend against it. It’s why there’s such a market for pessimism. But to think that buying the book gets you somewhere, that’s maybe the bigger fallacy. It’s just like the evidence that shows the most dangerous people are those that have been taught some financial literacy. They’re the ones who go out and make the worst mistakes. It’s the people that realize, “I don’t know anything at all,” that end up doing pretty well.
We often pick up some new, exciting piece of information and we immediately want to apply it. We learn about a fallacy and we want to pop it into our knowledge machine like a new bolt, hoping to make the machine stronger. And I believe this is possible. But don’t get carried away; don’t pretend that because you read a few pages on the sunk-cost fallacy in Sway or Blink you now have a deep understanding of the psychology behind it, and therefore won’t be affected by it. That, as Cowen explains, is where the real danger lies.