Jan 18, 2012
Last fall, I promised to do a series of posts on cognitive biases. A cognitive bias, in case you’re not familiar with the term, is a kind of built-in mental shortcut that we take when making decisions. Usually, we’re not aware we’re doing it.

Here’s a classic example, the framing bias: You’re sitting in front of a bunch of potential investors. What do you tell them – A: that your product has a 1 in 10 chance of succeeding, or B: that it has a 90% chance of failure?  Doh. 

Here’s another, the anchoring bias. You’re in a store. Everything on the rack is over $200. You see a sweater for $50. What a good deal! Now you’re in another store. Everything on the rack is under $50. You see the exact same sweater, for $50. Not such a good deal after all?

I’ve been doing a bit of reading on the subject, and it turns out we’re brimming with biases. We think we’re rational, that we can look objectively at a situation or listen to someone and weigh his or her words impartially, but we can’t. We’re highly illogical creatures bound up in a complex net of emotion and suggestion, living under the illusion that our brains work like computers.

It’s so bad that I’ve totally lost my desire to write about it. If you’d like to get depressed, too, here’s a short reading list:

Part of my disgust stems from my observation that the current political charade that is going on in the US, otherwise known as the Republican Primaries, is a stunning and vivid example of how to exploit most of these biases. Biases like the exposure effect, in which you tend to increasingly prefer something when you are repeatedly exposed to it (like a face or a slogan), and the illusion-of-truth effect, in which you are more likely to believe a statement is true if you have heard it before – whether or not it really is true. That the machine so blatantly exploits our cognitive weaknesses should perhaps not come as a surprise, but I still find it deeply disappointing and disturbing.

Perhaps the most interesting thing I’ve taken away from all this reading is that humans have an innate need to tell a story. We may vary somewhat in our tendency to make irrational decisions, to act on instincts that lie far beneath the surface of our consciousness, but the one thing we all have in common is our need to create a narrative that ties it all together. We do something or make a choice, and then we make up a story to explain why we did the thing or made the choice. Experiment after experiment has shown this to be the case. That’s what most of the billions of neurons inside those big brains of ours are doing: making up stories to rationalize our knee-jerk reactions.

We’re storytelling machines.

I think it’s high time we stopped settling for fairy tales, where everything is nice and simple – good versus evil, rags-to-riches, a stranger came to town – and started insisting on stories that reflect the real complexity of the world we live in. Then I think we might actually have a fighting chance of making decisions that would benefit us and our children in the long run.

Here’s a snippet from a TED talk on the subject by Tyler Cowen. I highly recommend you watch the talk or read the transcript. And then think about the political landscape. Then pass the Prozac.

One interesting thing about cognitive biases – they’re the subject of so many books these days. There’s the Nudge book, the Sway book, the Blink book, like the one-title book, all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories.

So what are the problems of relying too heavily on stories? You view your life like “this” instead of the mess that it is or it ought to be. But more specifically, I think of a few major problems when we think too much in terms of narrative. First, narratives tend to be too simple. The point of a narrative is to strip it away, not just into 18 minutes, but most narratives you could present in a sentence or two. So when you strip away detail, you tend to tell stories in terms of good vs. evil, whether it’s a story about your own life or a story about politics. Now, some things actually are good vs. evil. We all know this, right? But I think, as a general rule, we’re too inclined to tell the good vs. evil story. As a simple rule of thumb, just imagine every time you’re telling a good vs. evil story, you’re basically lowering your IQ by ten points or more. If you just adopt that as a kind of inner mental habit, it’s, in my view, one way to get a lot smarter pretty quickly. You don’t have to read any books. Just imagine yourself pressing a button every time you tell the good vs. evil story, and by pressing that button you’re lowering your IQ by ten points or more.[...]

If the presidential candidates followed this rule, lowering their IQ by 10 points every time they told a good versus evil story, they’d be brain dead by now. Wait. They are! Welcome to the world of zombie politics.

I saw a video the other day on YouTube, classic Newt Gingrich blasting Mitt Romney. You know what he saves for last? The ultimate thing that should slam Romney into the ground, make him a laughingstock, unelectable? Hint: he shares it with John Kerry.

You’ll have to see it to believe it.

Now, enough depressing stuff. Here’s the good news: Somehow I got through the quality control filter and snagged a spot in Friday’s TEDx Lausanne audience!  I’m particularly looking forward to a talk by Steve Edge, who describes himself as “Designer, branding guru, dyslexic and madman.” More on that next week.

Image: wingedwolf
