The One About Human Brains

The truth is, so many things influence our judgment, attitudes and behavior – and most of the time we’re not even aware of them!

That’s the argument of Nobel Prize winner Daniel Kahneman in his book Thinking, Fast and Slow. Backed up by years of research, this book sets out to make psychology sexy.

In particular, it shines a spotlight on insights from cognitive science that explain how we process information. From intuition and statistics to stock market gambles and illogical thinking, Kahneman deals with it all.

Kahneman uses an analogy of two systems to account for our thinking. It’s a simplified way to avoid delving into the deep biological complexity of our brain structure, but it still captures the essence of our decision-making processes.

Essentially, our brains are composed of two characters: one that thinks fast (System 1) and one that thinks slow (System 2). System 1 operates automatically, intuitively, involuntarily, and effortlessly – like when we drive or read an angry facial expression. System 2 requires slowing down, solving problems, and concentrating – like when we calculate a math problem.

I’ve managed to make it through this book – which I think is really helpful for any creatives looking to develop empowering ideas – but I don’t think everyone should have to cover nearly 500 pages to supercharge their understanding of the human mind.

So, here’s my rundown of the top 10 mental shortcuts & biases covered in the book:

  1. Priming – we can all be consciously or subconsciously exposed to ideas that “prime” us to think about an associated idea. For instance, if we’ve been talking about food we’ll fill in the blank SO_P with a U, but if we’ve been talking about cleanliness we’ll fill in the blank SO_P with an A.
  2. Cognitive Ease – things that are easier to grasp will seem more true than things that require hard thought.
  3. Coherent Stories – to make sense of the world we tell ourselves stories about what’s going on. We make associations between events, circumstances, and regular occurrences. The more these events fit into our stories, the more normal they seem. Things that don’t occur as expected take us by surprise, and to fit those surprises into our world we tell ourselves new stories to make them fit. We are evidently ready from birth to have impressions of causality that don’t depend on reasoning about patterns of causation. We posit intention and agency where none exists, confuse causality with correlation, and make more of coincidences than is statistically warranted.
  4. The Halo Effect – the warm emotion we feel toward a person, place, or thing leads us to like everything about that person, place, or thing – just as we’re more likely to believe fake news when it comes from a friendly source.
  5. Substitution – when confronted with a difficult question we make life easier for ourselves by answering a simpler, substituted question. So, instead of answering the mind-bending philosophical question “What is happiness?” we answer the easier question “What is my mood right now?”
  6. The Anchoring Effect – we subconsciously make skewed estimates based on quantities we’ve previously encountered. This means that when buying a phone, for instance, £200 would seem high if the asking price had been raised from £180, but low if it had been lowered from £220.
  7. The Availability Heuristic – when we have experienced a tragedy we over-estimate the potential for risk and danger going forward. We are prone to under- or over-estimating the frequency of an event based on how easily we can retrieve a memory of it happening, NOT how often the stats tell us it happens.
  8. The Hindsight Bias – everything feels more obvious after the fact. Once something has happened, we easily forget what we thought beforehand.
  9. Loss Aversion – we all hate to lose, right? Turns out that’s something that runs deeper than we may think! People will work harder to avoid losses than to achieve gains, and harder to avoid pain than to achieve pleasure. This is linked to the “sunk cost fallacy”: to avoid feeling bad about cutting our losses and being called a failure, we tend to throw good money after bad, stay too long in abusive marriages, and stick with unhappy careers.
  10. Ignoring Frames – turns out that how a problem is pitched determines our choices a great deal. We’ll prefer a medical treatment when the outcome is described as a “one-month survival rate of 90%” rather than a “10% mortality rate.” Both statements mean the same thing statistically, but the “survival” frame has greater emotional value than “mortality.”

So it turns out we’re all far more suggestible than we realise! In total I counted 47 different biases covered in the book, so do check out Thinking, Fast and Slow if you want to wrap your head around them all.

In the meantime, I hope this cheat sheet helps you understand how you and those around you think – and empowers you to design more creative solutions geared towards how people really think.
