In the highly anticipated Thinking, Fast and Slow, Kahneman takes us on a groundbreaking tour of the mind and explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. Kahneman exposes the extraordinary capabilities—and also the faults and biases—of fast thinking, and reveals the pervasive influence of intuitive impressions on our thoughts and behavior. The impact of loss aversion and overconfidence on corporate strategies, the difficulties of predicting what will make us happy in the future, the challenges of properly framing risks at work and at home, the profound effect of cognitive biases on everything from playing the stock market to planning the next vacation—each of these can be understood only by knowing how the two systems work together to shape our judgments and decisions.
Engaging the reader in a lively conversation about how we think, Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and our personal lives—and how we can use different techniques to guard against the mental glitches that often get us into trouble. Thinking, Fast and Slow will transform the way you think about thinking.
Daniel Kahneman (Hebrew: דניאל כהנמן; 5 March 1934 – 27 March 2024) was an Israeli-American psychologist and winner of the 2002 Nobel Memorial Prize in Economic Sciences, notable for his work on behavioral economics and hedonic psychology.
With Amos Tversky and others, Kahneman established a cognitive basis for common human errors arising from heuristics and biases (Kahneman & Tversky, 1973; Kahneman, Slovic & Tversky, 1982), and developed prospect theory (Kahneman & Tversky, 1979), the work for which he was awarded the 2002 Nobel Memorial Prize. He was professor emeritus of psychology at Princeton University's Department of Psychology.
This is a fascinating book. Reading this book means not having to read so many others. For example, you could avoid having to read Sway, Blink, Nudge, and probably a dozen or so other books on behavioural economics. And the best part of it is that this is the guy (or, at least, one half of the two guys) who came up with these ideas in the first place.
I was thinking that perhaps the best way to explain those other books would be to compare them to Monty Python. I want you to imagine something - say you had spent your entire life and never actually seen an episode of Monty Python's Flying Circus. That wouldn't mean you wouldn't know anything about Monty Python. It is impossible to have lived at any time since the late 60s and not have had some socially dysfunctional male reprise the entire Parrot sketch or Spanish Inquisition sketch at you at some stage in your life. I suspect, although there is no way to prove this now, obviously, that Osama bin Laden could do the Silly Walk like a natural. Well, if you had never seen an episode of Monty Python and your entire experience of their work was via the interpretation of men of a certain age down the pub - then finally getting to see an episode of the original would be much the same effect as reading this book. Hundreds of people have already told all this guy's best stories in their own books - but all the same it is a pleasure to hear them again by the guy that first said, 'this parrot is dead' or rather, 'framing effects make fools of us all'.
You need to read this book - but what is particularly good about it is that you come away from it knowing we really are remarkably easy to fool. It's because we think we know stuff that this comes as a constant surprise to us. Years ago I was talking to a guy who liked to bet. Everyone needs a hobby and that was his. Anyway, he told me he was playing two-up - an Australian betting game - and he realised something like tails hadn't come up frequently enough, and so he started betting on tails and sure enough he made money. I told him that coins don't remember the last throw, and so the odds of getting a tail were still 50%, just as they had always been. But I had no credibility - I'd already told him I never bet - so, how would I possibly know anything if I wasn't even brave enough to put my own money on the outcome? And didn't I understand the point of this story was he had already WON?
Still, when faced with a series of coin flips that run - H, H, H, H, H, T, H, H, H - it does feel like tails are 'due'. This is the sort of mistake we are all too prone to make. The thing to remember is that while there is a law of large numbers - toss a coin often enough and in the very long run the proportions of heads and tails will converge towards 50/50 - that isn't the case in the short run, where just about anything is possible.
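The coin-flip point is easy to check empirically. Here is a quick simulation of my own (a sketch, not anything from the book) testing whether tails are ever 'due' after a run of heads:

```python
import random

def tails_after_heads_streak(trials=200_000, streak=5, seed=1):
    """Estimate P(tails) on the flip right after `streak` heads in a row.
    If coins remembered past flips, this would drift above 0.5."""
    rng = random.Random(seed)
    run = 0              # current run of consecutive heads
    after, tails = 0, 0
    for _ in range(trials):
        heads = rng.random() < 0.5
        if run >= streak:        # this flip follows a heads streak
            after += 1
            tails += (not heads)
        run = run + 1 if heads else 0
    return tails / after

p = tails_after_heads_streak()
print(f"P(tails after 5 heads) ~ {p:.2f}")  # stays near 0.50
```

Streaks of five or more heads turn up thousands of times in 200,000 flips - anything is possible in the short run - but the conditional probability of tails on the next flip never budges from one half.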
We (that is, we humans) are remarkably bad at mental statistics. And what makes it worse is that we are predictably bad at statistics. And this brings me to Bourdieu and his saying that sociology is a kind of martial art. He means that sociology allows you to defend yourself from those who would manipulate you. Well, this book is the Bruce Lee book of advanced self-defence. Learning just how we fool ourselves might not make you feel terribly great about what it means to be human - but at least you will know why you have stuffed up next time you do stuff up. I'm not sure it will stop you stuffing up - but that would be asking for an awful lot from one book.
If you want the short version of this book, he has provided the two papers that probably got him the Nobel Prize - and they are remarkably clear, easy to understand and comprehensive. But look, read this book - it will do you good.
In the last few years two books took me FOREVER to get through. The first was Daniel Dennett's "Darwin's Dangerous Idea" and the second is Kahneman's "Thinking, Fast and Slow." What caused this? What do they have in common? Both books explain, in minute detail, simple concepts with immensely far-reaching implications, and both have been... after the slog... the most intellectually rewarding reading of my adult life.
Where to begin... I have a number of theories running around in my head, and occasionally I try to corral them on paper. I organize, sequence and interconnect them in a way that will prevent my reader from meaningfully widening their eyes, in an aside, while winding their finger around one ear... ("Cuckoo!") Good writing about complex topics is very, very difficult, and Kahneman has corralled 30+ years of science, his career and all he has learned into a perfectly arranged sequence that leads the reader into a wilderness... provisioning you in each chapter with the tools you'll need for the next part of the journey.
The second most striking effect on me is the number of times I said, "Yes... YES!!! this is what I've been saying!" In my case it has usually been some sort of "intuitive" (excuse me, Mr. Kahneman... I mean "System 1") recognition of a pattern in my observations about the way we think. In Kahneman's case those intuitions have been converted into theoretical propositions, each meticulously researched in well-designed experiments. Clearly, this is at least one difference between me and a Nobel Prize-winning researcher.
So why does this stuff matter? In the context of broader discussions of free will, intention, choice and control over the directions our lives take, this book can provide powerful insights that might currently be obscured by these "cognitive illusions" and the inherent limitations of "System 1/System 2" thinking.
Perhaps we're not as "free" in our decisions as we might like to think, if "priming" has such a stunningly reproducible effect. Perhaps we're not so determined, if activities that initially require "System 2" attention can be turned into second-nature, "technical-expertise intuitions." That is, learning and training MATTER in our ability to detect and respond to events that... if untrained... might take advantage of our brain's inherent "blind spots" or weaknesses.
Perhaps childhood religious indoctrination is a very adept recognition of these mental tendencies/flaws, so profoundly (if intuitively/naively) expressed by Ignatius Loyola, founder of the Jesuit order, "Give me the boy until 7, I will give you the man." (paraphrased; forgive me)
Kahneman's discoveries and documentation of mental capacity and biases could form the basis of a "Mental Martial Arts" program: an alternative form of indoctrination, in which students are trained to understand their brains' weaknesses, and learn to take stances or engage in practices that eliminate or reduce the errors to which these weaknesses can lead.
This book will rearrange the way you think... about how you think.
An unrelentingly tedious book that can be summed up as follows. We are irrationally prone to jump to conclusions based on rule-of-thumb shortcuts to actual reasoning, and in reliance on bad evidence, even though we have the capacity to think our way to better conclusions. But we're lazy, so we don't. We don't understand statistics, and if we did, we'd be more cautious in our judgments, and less prone to think highly of our own skill at judging probabilities and outcomes. Life not only is uncertain, we cannot understand it systemically, and luck has as much to do with what happens to us -- maybe even more -- as we care to admit. When in doubt, rely on an algorithm, because it's more accurate than your best guess or some expert's opinion. Above all, determine the baseline before you come to any decisions.
If you like endless -- and I mean endless -- algebraic word problems and circuitous anecdotes about everything from the author's dead friend Amos to his stint with the Israeli Air Defense Force, if you like slow-paced, rambling explanations that rarely summarize a conclusion, if your idea of a hot date is to talk Bayesian theory with a clinical psychologist or an economist, then this book is for you, who are likely a highly specialized academically-inclined person. Perhaps you are even a blast at parties, I don't know.
But if you're like me and you prefer authors to cut to the chase, make their point, and then leave you with a whopping big appendix if you're interested in the regression analysis of how many freshmen would watch a guy choke to death because they think someone else will come to the rescue, then this book is not for you.
If you want to take the Reader's Digest pass through the book, then Chapter 1 and Section 3 are probably the most accessible and can be read in less than an hour, and still leave you with a fair understanding of the author's thesis.
I kind of want to cut this book in half, praise the first part, and stick the second part in some corner to gather dust. Not that the second part is bad, mind you; the entire book is well-written and obviously the product of someone who knows their field. There’s just a lot of it. Thinking, Fast and Slow is kind of like a guest who shows up to your party and then dazzles everyone with an impromptu, 15-minute oration on the geopolitical situation in South Ossetia; and, everyone applauds and turns to go back to their own conversations, only for the guest to launch into another story about the time they parachuted into the Balkans to break up a nascent civil war, a story which is followed quickly by a similar tale of a visit to Southeast Asia…. Well, I think you catch my drift. Daniel Kahneman spins an interesting tale of human psychology and the way our brains interpret and act on data. But the book overstays its welcome by a few hundred pages.
Kahneman’s thesis breaks our decision-making systems into two pieces, System 1 and System 2, which are the respective “fast” and “slow” of the title. System 1 provides intuitive judgements based on stimuli we might not even be conscious of receiving; it’s the source of snap signals that we might not even know we are acting upon. System 2 is the more contemplative, cognitively taxing counterpart that we engage for serious mental exertion. Though the two systems are often oppositional in the types of decisions they produce, Kahneman is keen to emphasize that it’s not about System 1 versus System 2. Instead, he’s out to educate us about how the interplay between these systems causes us to make decisions that aren’t always rational or sensible given the statistics and evidence at hand.
Kahneman takes us through an exhaustive tour of biases and fallacies people are prone to making. He talks about the halo effect, the affect heuristic, confirmation bias, and even regression to the mean. As a mathematician, I liked his angle on probability and statistics; as a logician, I appreciated his brief segues into the logical aspects of our contradictory decision-making processes. Lest I give the impression Kahneman gets too technical, however, I should emphasize that, despite its length, Thinking, Fast and Slow remains aggressively accessible. There are a few points where, if you don’t have a basic grasp of probability (and if Kahneman demonstrates anything, it’s that most people don’t), then you might feel talked over (or maybe it’s those not-infrequent, casual mentions of “and later I won a Nobel Prize”). But this book isn’t so much about science as it is about people.
There are two other things I really appreciated about this book, both of which are related to psychology. I’m a fairly easygoing person, and I don’t always like to make waves, but sometimes I like to make some trouble and argue with some of my friends about whether psychology is a science. The problem for psychology is that it’s actually a rather broad term for a series of overlapping fields of investigation into human behaviour. On one end of this continuum, you have Freud and Jung and the various psychoanalysts who, let’s face it, are one step up from astrologers and palm-readers. On the other end, you have the cutting-edge cognitive psychology informed by the neuroscience of MRIs, split-brain studies, and rat research. So claiming that psychology is or isn’t a science is a little simplistic, and I’m willing to grant that there are areas within psychology that are science. For what it’s worth, Kahneman went a long way to reinforcing this: it’s clear he and his collaborators have done decades of extensive research. (Now, yes, it’s social science, but I won’t get into that particular snobbery today.)
The other thing I liked about Thinking, Fast and Slow is its restraint when it comes to evolutionary psychology. Once in a while, Kahneman alludes to System 1’s behaviour being the result of evolutionary adaptation—and that’s fine, because it is true, almost tautologically so. But he never quite delves into speculation about why such behaviour evolved, and I appreciate this. There’s a difference between identifying something as an adaptation and determining why it’s an adaptation, and I’m not a fan of evolutionary psychologists’ attempts to reduce everything to the trauma of trading trees for bipedalism … I’m willing to admit I have an ape brain, but culture must count for something, hmm?
I suppose it’s also worth mentioning that this book reaffirms my supercilious disregard for economics. According to Kahneman, stock brokers and investors have no idea what they are doing—and some of them know this, but most of them don’t. Economists are, for the most part, highly-trained, but they seem bent upon sustaining this theoretical fantasy land in which humans are rational creatures. Aristotle aside, the data seem to say it isn’t so. I occasionally try my hand at reading books about the economy, just so I can say I did, but they usually end up going over my head. I’m a mathematician and I don’t get numbers—but at least I’m not the only one.
So Thinking, Fast and Slow is genuinely interesting. I learned a lot from it. I would rate it higher, but I was starting to flag as I approached the finish line. Truth be told, I skipped the two articles Kahneman includes at the end that were the original publications about the theories he explains in the book. I’m sure they are fascinating for someone with more stamina, but at that point I just wanted to be done. That’s never good: one of the responsibilities of a non-fiction author is to know how to pace a book and keep its length appropriate. Too short and the book is unsatisfying—too long, and maybe it’s more so. And I think this flaw is entirely avoidable; it’s a result of Kahneman’s tendency to reiterate, to circle back around to the same discussions over and over again. He spends an entire chapter on prospect theory, then a few chapters later he’s telling us about its genesis all over again, just from a slightly different angle. Like that party guest, Kahneman is full of interesting stories, but after telling one after another for such a long period of time, it starts sounding like white noise. And he ate all those little cocktail snacks too.
I inevitably ended up comparing Thinking, Fast and Slow to How We Decide, a much slimmer volume along much the same lines as this one. Whereas Lehrer’s focus is on the neurology behind decision-making, Kahneman is more interested in the psychology. Both books boil down to: we suck at automatic decision-making when statistics are involved; therefore, we behave less rationally than we believe we do. Lehrer explains why things go wrong, and Kahneman categorizes all the different ways things go wrong. In many ways the books are complementary, and if this is an area of interest for you, I’ll recommend them both. For the casual reader, however, Thinking, Fast and Slow is a rather dense meal. By all means, give it a try, but take it slow.
Also posted on Kara.Reviews, where you can easily browse all my reviews and subscribe to my digest newsletter.
As the blurb summarises very well, in “Thinking, Fast and Slow, Kahneman takes us on a ground-breaking tour of the mind and explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. Kahneman exposes the extraordinary capabilities—and also the faults and biases—of fast thinking, and reveals the pervasive influence of intuitive impressions on our thoughts and behaviour.” Kahneman won the Nobel Memorial Prize in Economic Sciences, so expect a lot of technical material and experiments in this one. Exactly how I like my non-fiction to be. I learned so many interesting facts about how our brain functions and how it is influenced by different factors.
Some aspects mentioned in this volume:
- People do not understand statistics well. I am a fan of the subject and base many decisions on statistics. Apparently, most people don't. I guess I now understand why people ignore statistics about the pandemic.
- Luck plays a major role in success.
- Our brain tends to be lazy; System 2 does not rush to help.
- Intuition vs. formulas: the formulas usually win.
- Investment bankers are useless.
- We overestimate our ability to predict the future.
- Stereotypes matter more than statistics.
- We tend to be more risk prone when we have something to lose than when we have something to gain.
- What you see is all there is. We tend to form opinions based only on what we know, and ignore that there might be other relevant information we are missing.
- Priming can be used to influence people. For example, pictures of eyes can make people feel watched.
- And many more.
The last part was a bit too technical and a bit boring but I still think the book deserves 5*.
Daniel Kahneman is a Genius. But if you know his work, you know that already.
A Nobel Prize winner, his work is weighty and a bit recondite into the bargain.
But hasn't he ignored the CHRISTIAN worldview, the world of good and evil? For isn't this book SPIRITUALLY rather trite, being addressed only to those sharpies who only wanna learn how to PLAY THE GAME?
Even if that ends in emotional bankruptcy?
I think so. So here's my own, Christian take on it:
We all live in a postmodernist, secular world now. When we come of age into that scenario, many of us learn a bit of caution. Unless this brutal coming of age makes us hip and glib.
So there are two ways of thinking now. One is thinking fast (hip and glib) and the other is thinking slow (religiously cautious). Thinking fast, in this book, is Optimal.
But here’s a point Kahneman neglects...
The hip and glib guys get hurt by those postmodernistic sharp edges more easily than the cautious guys. So the hip side becomes cautious, and, of course as they age, the hard knocks confuse them. They end up more confused and conflicted than the cautious ones most of the time.
Such is the Moral Levelling of Age.
The cautious folks believe in true love, and often eternal verities, though. We’ll call them the sheep, because they follow their hip friends as only sheep do. The hip guys, the planners, believe in basically nothing - they’re all fast talk and action. We’ll call them the goats: they love to butt heads with you.
As I say, this sheep/goat take on Thinking Fast is my own. Kahneman never goes there. Where he DOES go is to the value of experience in thinking fast:
To think fast, he says, experience is key. Experience gives us heuristic benchmarks.
The more experienced folks think faster. And because they’re so sure of the facts, we often ill-advisedly trust them.
But now back to my own take: hip guys HAVE some of this experience, because they are hip. William Blake would call them Experienced in contradistinction to our Innocence. It’s an Experience that can’t discern. It has no wisdom.
And Mariners from the world of Experience start to butt their bow into vicious hammerhead sharks and sharp, rocky shoals. Aggressive Experience runs out of motivation early, unlike the restful boat of Innocence. Innocence isn’t conflictual. It BENDS rather than confronts.
It lives longer and healthier. And learns discernment.
Remember Aesop’s fable of the Tortoise and the Hare? The lowly tortoise wins the race.
If your objective, as with a self-help book, is to implement what Mr. Kahneman has to say in real life and benefit from it, I should warn you: you will be sorely disappointed. Believe it or not, in my opinion Mr. Kahneman is telling you exactly that in this book - that, whether you like it or not, your entire life is guided, or may I say decided, by two fundamental systems, and that there is very little you can do to change it. Period.
Mr. Kahneman is probably the villain in every modern-day spiritual guru's life; he argues very effectively that, contrary to what these gurus may say, the external world - your environment, your surroundings, or even society for that matter - has a large say in your personal deliberate actions. You don't have a choice.
So, having said that, shelving this book in the psychology section would be a gross injustice. In my view it is such a good commentary on human nature. The two are different, very much so.
Read it; totally worth it, in my opinion. It can get a little drab, but hang in there - this book is an eye-opener.
A few years ago I read almost 30% of this book, but I did not finish it.
I decided to read it again from the first page because it was recommended by many YouTubers, websites, and podcasts.
I want to learn how to think better and make good decisions.
I used my System 1 when I looked at the cover and title of this book. (It seemed easy and attractive) 😉
When I started reading, I spent most of my time using my System 2 to think and understand. (It takes time and brain energy to read and understand) 😅
"Thinking, Fast and Slow" explains the complexities of human decision-making, unveiling two cognitive systems: intuitive System 1 and analytical System 2.
Our minds have two systems: System 1, fast and automatic, guiding intuition and snap judgments, and System 2, slower and deliberate, crunching complex problems.
These systems often clash, leading to biases and errors.
The book explores these inner battles, revealing how our thinking works, why we make mistakes, and how to improve our decisions.
The book is a lengthy, self-conscious, and challenging read, but highly recommended if you're interested in why human beings behave the way they do. It's given me so many 'oh snap, so that's why we're so dumb' moments that at this point I don't even want to admit I'm a human to any space-time-traveling race that comes into contact with 21st-century Earth.
Citing behavioral research studies, he's convinced me that human confidence is a measure of whether a person has built up a coherent story, not of whether the person truly knows what she's doing. He's convinced me that the feeling of 'ease' is just cognitive familiarity. He's convinced me why first impressions matter more than we think, due to the halo effect. He's convinced me that the human mind doesn't understand non-events. We think we understand the past, but we really don't. We create coherency by attributing causality to events, but not to non-events. In other words, we underestimate the role of luck, or the role of unknown variables, in a given situation. He has given me reason to believe that in low-validity environments it's better to use formulas than to listen to expert human judgment. For example, the stability of a marriage can be better predicted by a simple equation like [stability = frequency of love making - frequency of arguing] than by an expert opinion.
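The marriage-stability rule quoted above is a one-line linear model; here is a toy sketch of it, with couple names and monthly frequencies that are entirely invented for illustration:

```python
def marital_stability(lovemaking_per_month, arguing_per_month):
    # The simple equation quoted in the review:
    # stability = frequency of love making - frequency of arguing
    return lovemaking_per_month - arguing_per_month

# Hypothetical couples with made-up monthly frequencies.
couples = {"couple_a": (12, 4), "couple_b": (5, 11)}
scores = {name: marital_stability(*freqs) for name, freqs in couples.items()}
print(scores)  # {'couple_a': 8, 'couple_b': -6}
```

The point of such formulas in low-validity environments is not that they capture everything; it is that they apply the same few cues consistently, which expert judgment tends not to do.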
But one of the most interesting hypotheses he builds up is the existence of two systems in the mind. System 1 is prone to the cognitive biases described above, but it's also where morality comes from, not to mention intuitive judgment and heuristic answers to life's everyday questions. Would you believe it? Morality is more of an intuitive thing than a logical and reasonable framework! And the funny thing is that without System 1 we wouldn't survive a day in the life, not to mention we wouldn't act human. System 2, on the other hand, is more introspective and rational, and is capable of being aware of the cognitive biases created by System 1. If my understanding is correct, then we could replicate System 2 with a machine or artificial intelligence. But that machine would not have the same extent of morality that we have... food for thought!
In later chapters of the book, he describes another variation of duality in the human mind: an Experiencing Self and a Remembering Self. With countless examples (both experimental and anecdotal) he vividly paints a picture of how humans have this notion of "I am my remembering self, and strangely my experiencing self is a stranger to me." We're actually okay with letting our Experiencing Self suffer for the good of the Remembering Self!! This ties in to the cognitive bias of the "focusing illusion" (focalism) and how we tend to overestimate a certain aspect of life.
To put the icing on the cake, he finishes the book by analyzing how we appreciate, value, and judge the quality of our lives with all these biases combined. And it's amazing how irrational we are in doing so. Not only have I realized from this book that I should stop worrying about societal standards (because they are mostly based on irrational biases), but also that I should spend a significant amount of my time and effort on creating a value structure ideally suited to myself. Now, if only I had a bit more memory and CPU speed on System 2...
Thinking, Fast and Slow was the 2012 winner of the National Academies Communication Award for best creative work that helps the public understanding of topics in behavioral science, engineering and medicine. Engaging the reader in a lively conversation about how we think, Daniel Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking.
Titles published in Iran: "Tafakkor, Sari' va Kond" and "Tafakkor, Sari' va Aheste" (both renderings of "Thinking, Fast and Slow"); author: Daniel Kahneman; date of first reading: 9 October 2016.
Title: Thinking, Fast and Slow; author: Daniel Kahneman; translator: Forough Talusamadi; Karaj: Dorr Danesh Bahman, 1394 SH [2015]; 681 pp.; ISBN 9789641741770; 2nd printing 1395 SH; 3rd and 4th printings 1396 SH; 5th through 8th printings 1397 SH, 556 pp.; subjects: thought and thinking; decision making; American authors; 20th century.
Title: Thinking Fast and Slow; author: Daniel Kahneman; translator: Sanaz Toutounchi; Mashhad: Zarrin Kelk Aftab, vol. 2, 1395 SH [2016]; in three volumes; ISBN 9786008164425.
Title: Thinking, Fast and Slow; author: Daniel Kahneman; translator: Naghmeh Razavi; editor: Mehran Arhachi; Tehran: Hormozd, 1398 SH [2019]; 604 pp.; ISBN 9786006958637.
"Thinking, Fast and Slow" covers three phases of Kahneman's career: his early work on cognitive biases, then prospect theory, and finally his research on happiness. The book's central theme is the duality between two modes of thought: System 2 is slower, more deliberate, and more logical, while System 1 is fast, instinctive, and emotional.
The book maps the cognitive biases associated with each kind of thinking, beginning with Kahneman's own research on loss aversion. Drawing on decades of academic research, it shows that people place too much weight on human judgment, reframing questions and substituting simpler ones. By describing how the two systems work, Kahneman tries to show when we can trust our intuitions in decisions about life and business and when we should not, and how recognizing our cognitive biases can help us avoid mistakes. The book won the National Academy of Sciences Best Book Award and the Los Angeles Times Book Prize, and was named one of the ten best books of 2011 by the New York Times Book Review.
Updated: 27/04/1399 and 12/02/1401 (Solar Hijri); A. Sharbiani
It is very difficult to judge, review, or analyze a book that basically challenges the very idea of human "rationalism". Are humans perfectly rational? This dude, Daniel Kahneman, got a Nobel Prize in Economics for saying they are not. An ordinary person might have been treated to a glare or a stinging slap for saying that to someone's face. We simply don't like being told that we are not very rational, and certainly not as intelligent as we think we are. Hidden in the depths of our consciousness are some 'actors' that keep tampering with our 'rationality'. And we almost consciously allow this to happen.

All in all, this book is a tour de force of behavioral psychology. Explaining how our mind comes to conclusions and makes decisions, Kahneman explains that the intuitive, decision-making part of our brain has two personalities. These personalities, he says, are not two physically distinct systems, but assigning them personalities helps us both understand them better and relate to them on a personal level. The two are called System 1 and System 2, for the sake of convenience. System 1 is vigilant, impulsive, judgmental, easily manipulated, and highly emotional. System 2, on the other hand, is the total opposite: very intelligent but indolent, mostly drowsing off in the back of our head, difficult to convince and extremely stubborn, and it only comes into action when there is some sort of 'emergency'. Both systems are susceptible to a number of biases, System 1 more than System 2.

I thought Kahneman would build up this narrative systematically, but instead he gives us a tour of his years of research, experiments, and surveys, exploring every nook of the conscious human mind. He focuses on a diverse set of heuristics and biases that influence our judgments in everyday life.
With some brilliant experiments and survey reports, he convincingly elaborates the effects that these biases have on our decisions. Never forgetting to highlight the fallacies of our consciousness, he touches on a number of other important breakthroughs in the world of psychology.
This is a very simple case of visual illusion (the Müller-Lyer figure), in which two lines of the same length appear to differ. Even after we know that they are equal and that the illusion is created by the fins attached to them, our System 1 still impulsively signals that one of them is longer than the other. Through this simple illustration, he moves on to introduce cognitive illusions, which are more fascinating and drastically more effective. Kahneman contends that it is extremely difficult to overcome heuristic biases, although through methods like statistical formulas and deliberate scrutiny we can 'rationalize' our decisions to some extent. Still, we are inherently prone to fall for dazzling rhetoric and dashing figures; we believe in myths and incidents that are as improbable as they are ludicrous, because this is the way we see things. But this is not altogether undesirable: some of our intuitive abilities are an evolutionary blessing that helps us understand emotions and make correct decisions in split seconds. Nor does the author deem it expedient to overcome these biases entirely, only to recognize them and put our System 2 to work before making crucial judgments. I am afraid this review is getting a bit too long and, to be honest, I don't think anyone reads long reviews. (Except some of my nerdy Goodreads friends, who then leave an equally baffling Proustian comment, which, of course, takes quite a while to be properly understood.) So I will mention a summary of some critical biases, ideas, and psychological phenomena that I found interesting.
I have attempted to summarize some of the heuristics, biases, and psychological principles that I thought would make a fascinating introduction, enough to tempt a novice like me to explore the subject further. They are just the tip of the iceberg, by no means exhaustive, and comprise only a small part of what this book is about.
Associative machine: System 1 works in surprising ways. Read these two words: BANANA VOMIT. Now, a lot happened in the last few seconds as you read them. You wore an expression of disgust, a very unpleasant image came to your mind, your body too reacted in disgust, and for a short time you might not want to eat bananas. All of this was automatic and beyond your control. It was “the associative machine” of System 1. We associate seemingly unrelated words and, with some imagination, form an image. Our brain loves patterns, and sometimes it sees things that aren’t even there. A very interesting clip in which Simon Singh shows the associative machine at work: https://www.youtube.com/watch?v=0bG7E...
Priming: Exposure to a word causes immediate changes in the ease with which many related words can be evoked. If you have recently heard the word EAT, you are temporarily more likely to complete the word fragment SO_P as SOUP than as SOAP. The opposite would happen if you had just seen WASH. Similarly, exposure to an idea or event can have a similar temporary effect on our behavior (the “Florida effect”).
Cognitive Ease: We all love it when we don’t have to work too hard, because System 2 doesn’t like being bothered, so we admire and even seek out cognitive ease. Things that are less complex have a positive effect on our behavior. As psychologists put it, “a mind at ease puts a smile on the face”. Similarly, smiling and laughing can ease our mind (System 1) and make us feel confident and in control. Anything that is easy to understand (to read or to see) is likely to have a more positive effect on us than anything we have a hard time understanding or visualizing.
Exposure Effect: We are more likely to choose the thing we are more familiar with. The principle that “familiarity breeds liking” suggests that we are inclined towards anything familiar, anything we have been exposed to in the past; the greater the exposure, the stronger the inclination. This principle is excellently illustrated in Will Smith’s “Focus” (2015), in which a similar trick is used to con a billionaire.
Normality illusion: Things that recur with greater frequency come to be considered normal, no matter how horrendous they are. Two people killed in a terrorist attack in a Western country are more likely to be mourned than hundreds of children killed in Gaza by a missile strike, simply because children in Gaza get bombed all the time, while a terrorist attack that kills innocents is a rarity in Europe and America. (The same idea appears in Orwell’s Animal Farm, in which the pigs start to dominate the other animals and it becomes the norm after a while.)
Substitution: If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and answer that instead. For instance, when asked “How happy are you with your life these days?”, it is likely that we don’t use a broad frame and instead substitute the simpler question “What is my mood right now?”. System 1 can readily answer the substitute question, but to answer the real one, System 2 would have to be roused, which, as we know, it doesn’t like. In everyday life we use this to avoid decisions grounded in factual background, and so end up giving an impulsive and sometimes irrational answer to a difficult question.
What you see is all there is: We take pride in our intuitive abilities, which leads us to believe that we know the whole truth, no matter how fallible our sources are, and notwithstanding the fact that there is always another side to the picture. When we hear a story or an incident, we tend to accept it as fact without considering any dissenting or contradicting view. Psychologists call this “WYSIATI” (what you see is all there is); we are much more gullible than we like to believe. It is again the mischief of System 1 that leads us to believe a narrative impulsively, without further inquiry into its authenticity, and it is another example of our intuitive tendency to see things in a narrow frame.
Loss Aversion: Call it a gift of evolution or a survival instinct, but we are naturally loss-averse in most of our decisions. We are more likely to abandon a huge potential profit if there is some probability of an equally huge loss. We do want more, but not at the cost of putting what we already own at stake; we relish our possessions more than our desire for more.
Overconfidence and Hindsight bias: A general limitation of our mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed. We see people every day claiming that what just happened was exactly what they always expected, and, in their overconfidence, they come to believe in hindsight that they always knew such an event was probable. (See also the halo effect.)
Prospect theory: This theory attempts to explain how people choose between probabilistic alternatives that involve risk, where the probabilities of the outcomes are known. Kahneman illustrates it with a graph of its S-shaped value function.
This theory, one of his most important contributions to behavioral economics, is too complex to summarize properly here.
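The curve Kahneman graphs can still be sketched in a few lines. Below is a minimal, illustrative implementation of the prospect-theory value function; the parameter values (alpha = beta = 0.88, lambda = 2.25) are the ones Kahneman and Tversky estimated in their later cumulative-prospect-theory work, and are used here only as reasonable defaults.

```python
# A minimal sketch of the prospect-theory value function. Parameters are
# illustrative defaults (Kahneman & Tversky's later estimates), not gospel.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss x, relative to a reference point."""
    if x >= 0:
        return x ** alpha          # gains: concave (diminishing sensitivity)
    return -lam * (-x) ** beta     # losses: convex and steeper (loss aversion)

# Losses loom larger than gains: a $100 loss hurts more than twice as much
# as a $100 gain pleases, and a $200 gain is worth less than two $100 gains.
print(value(100), value(-100), value(200))
```

Running it shows the two asymmetries the graph encodes: the magnitude of `value(-100)` exceeds `2 * value(100)` (loss aversion), and `value(200)` is less than `2 * value(100)` (diminishing sensitivity).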
P.S. I highly recommend this book to anyone with a serious interest in behavioral psychology. Don’t waste your time on self-help books when you can read the real stuff.
Thinking, Fast and Slow is just okay. It's being marketed as a book on psychology (and economic psychology, in particular) for the layperson. I'm not sure if other laypeople agree, but this wasn't really for me. And it's not that the prose is too technical (okay, sometimes it is) but rather that Kahneman is stuck somewhere between academic technicalities and clear expressive prose.
I came across Thinking, Fast and Slow when I was reading about Richard Thaler’s work and his contribution to behavioural economics. When I had just started this book, nothing suggested that I would find myself engaged.
Daniel Kahneman gives a description of our behavioural patterns and the reasons behind the decisions we, human beings, make. Do we always behave in a rational way? What is the difference between “Econs” and “Humans”? Complex theories and concepts are explained in relatively simple language and accompanied by many examples. There is no need for any special knowledge to absorb this work of non-fiction and enjoy the process of reading, although it contains a lot of statistics.
The main characters of the book, according to the author, are two modes of reasoning - System 1 and System 2 - the two systems of our brain. The latter is very slow and prone to analytical reasoning, whereas the former is much faster and intuitive. System 1 often replaces a difficult or an ambiguous question with a simpler one and promptly answers this ‘new’ simplified question. Decisions that System 1 tends to take are often based on intuition. Such an approach may prove itself viable, for example, when it comes to chess grandmasters with vast experience. However, often we should not rely on this mode of reasoning, especially when making important decisions, such as choosing an insurance or retirement plan. System 2, ready to thoroughly analyse facts and compare different options, is at our disposal to help make choices that are going to have a substantial impact on our lives. The tricky part is that, to be able to switch between the two systems, humans have to at least make an effort to distinguish between them. The best option seems to be letting these two modes cooperate, but that is not as easy as one might think.
What about rationality? For decades, the leading economists have been telling us about the idea of maximising profits as the key principle propelling people to take an action. Kahneman tests this statement and shows that humans are often irrational in their decisions and actions, not striving to benefit themselves most but driven by their emotions and preconceptions. Moreover, sometimes random factors turn out to be crucial and determine our behaviour. Ordinary people, unlike ‘fictional’ economic agents, are not rational, events do not always have a causal connection, and stories of our lives often lack coherence and formal logic.
Our brain is inclined to produce cognitive illusions that come on the scene on different occasions. The effect of framing is one of the prominent examples of such cognitive traps. It results in people changing their decision or their answer if the offer that has been made to them or question they have been asked is simply reworded. In other words, another formulation of exactly the same thesis can lead to opposite results. This is how our brain works whether we like it or not.
In conclusion, this is one of those books capable of helping us understand ourselves slightly better.
Whew! Wrestled this one down to the ground. It's got so much in it; I've got all I can for now. I'm leaving it out in the living room for now, though--for refreshers.
The author's aim is to prove to us that we are not rational beings to the extent we think we are, that evolution has seen to that. And that being the case, the book outlines what we need to know so as not to mess up decisions like we have been doing--like we all do.
And he's made it accessible. He pulls you in. You will get your share of "Aha!" moments.
You can read it at whatever level you want. You can skim over the more complicated parts and go for the pithy conclusions. Or if you are really into the science and scholarship, there are footnotes in the back--stealth footnotes without the little numbers on the book's pages, so as not to intimidate the general audience.
All based on science. It's true whether you like it or not. And it is applicable to your life. You can't go over it, you can't go under it, so go through it--with this book.
If we all used our brains just a little more, what couldn't we accomplish!
This book had me laughing and smiling more than many a book described in its blurb as side-splittingly funny or something similar, because I recognised the cognitive illusions described in this book as my own, and in any case I am the kind of person who, on falling into a good mood, wonders whether it's due to the pint and the pie eaten earlier.
In my case the preacher wasn't talking to the choir, but I had been to the church before and enjoyed the services. It doesn't set out to be a new book full of new discoveries. It's a comfortable round-up of research, investigation and thought, polished off with a couple of Kahneman's early articles as appendices. If you've read The Halo Effect ... and the Eight Other Business Delusions That Deceive Managers (which puts some of these cognitive delusions in a business context, and has an excellent anecdote about the failure of a Lego product), or something along those lines, you'll be familiar with some of the ideas here.
By now I'm quite comfortable accepting that I am not rational, that other people aren't either, and that statistical thinking is probably alien to almost everybody, and Kahneman's book happily confirms my opinion. And few things make us as happy as having our own biases confirmed.
There are however a couple of problems. Firstly, there are some people who apparently are wedded to the notion that people are entirely rational. They will either not read this book, read it and reject it, or indeed read it and accept its findings but mentally file them as curious aberrations that don't affect their belief (this is discussed in the book).
More seriously, society is organised on the tacit assumption that we are not only capable of being rational but will put the effort into being so when required. Unfortunately, studies demonstrate the effect of meals on judges reviewing parole cases (like the state pawnbroker in Down and Out in Paris and London, they are more lenient after lunch and harsher beforehand, and again once they get hungry), and voter behaviour turns out to be influenced by where the polling booth is located. This makes me wonder. My polling station used to be in the Adult Education Centre; now that's been closed down. If the polling centre were moved to the police station, would my voting habits transform into those of a Fishin', Huntin' and Floggin' Tory who froths at the mouth on hearing the words 'illegal immigrants'? Maybe I need a snack.
Much in the book is useful: '90% fat free' does sound better than '10% fat', and there's a lot to be learnt here about how to describe or state a problem so as to push people towards certain responses by framing or anchoring the information you give. Of course, this happens to us all the time as it is.
One of my favourite of Kahneman's examples comes from his work with Israeli flight instructors. They were convinced that shouting and swearing at trainee pilots was the best method of improving their performance, and experience proved it: when a pilot underperformed they swore at him, and on the next attempt the trainee would do better. Plainly, shouting works. Kahneman, perhaps with a sigh, pointed out that this was simply regression to the mean. After a poor performance, whatever they did would be followed by improved performance; swearing and shouting have no magic power. To demonstrate, he had the instructors throw balls of paper over their shoulders into a waste-paper bin and tracked the results on a handy blackboard, showing that performance varied up and down irrespective of swearing. Still, I wonder whether, on returning to work, the instructors developed an enlightened instruction method, or whether they rapidly regressed to the mean themselves and shouted and swore again.
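Regression to the mean lends itself to a tiny simulation. This is a hedged sketch in the spirit of the flight-instructor story, with all numbers invented: every "pilot" has the same true skill, so each attempt is pure noise, yet a bad attempt is almost always followed by a better one, shouting or no shouting.

```python
# Toy simulation of regression to the mean. Thresholds and sample sizes
# are arbitrary; performance here is pure luck around a constant skill.
import random

random.seed(0)

def attempt():
    # performance = constant skill (0) + random luck
    return random.gauss(0, 1)

trials = 10_000
bad = improved = 0
for _ in range(trials):
    first, second = attempt(), attempt()
    if first < -1:          # a notably bad first attempt (cue the shouting)
        bad += 1
        if second > first:  # ...followed by a better second attempt
            improved += 1

# The "improvement rate" is high even though nothing happened between
# attempts: extreme results tend to be followed by more ordinary ones.
print(f"improved after a bad attempt: {improved / bad:.0%}")
```

Swap the intervention in (or out) and the improvement rate doesn't budge, which is exactly Kahneman's point to the instructors.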
I used to think that politicians answered a different question to the one given by the interviewer in an attempt to be evasive. Post Kahneman I wonder if this is just the natural tendency of the brain to substitute an easier question for a harder one. Who knows.
This is an excellent book about how we think, written by a Nobel-prize-winning economist. Kahneman explains how two "systems" in the mind make decisions. "System 1" is the fast, intuitive aspect of the mind. "System 2" is the slower, logical and reasoning part of the mind. We generally make decisions quickly with System 1, often because System 2 is simply lazy. It takes effort to think things out rationally, and our rational minds are not always up to the job.
This book is a long, comprehensive explanation of why we make decisions the way we do. Both systems are necessary, but both are subject to fallacies. Kahneman explains many of these fallacies. Most people do not really understand probability, so we are not good at judging relative levels of risk. Our decisions are strongly colored by how we frame questions in our minds. Simply re-framing a question can easily cause people--even professionals like doctors--to reverse decisions. We need to understand these framing issues, to avoid bad decisions. Elements of causality and Bayesian probability are described in some detail.
One of the most interesting aspects of the ways we think is the concept of availability. Often, when faced with a difficult question, we answer immediately. But really, we do not answer the question at hand; we have made a subtle switch to a simpler question without even realizing it. Kahneman describes this quick switch to an available answer in quite a bit of detail. Another interesting aspect is what he calls "hedonic" theory. Our memories of pleasant and unpleasant experiences are strongly colored by their peak intensities and their endings, but hardly at all by their durations. In other words, a short, very unpleasant experience can be remembered as much worse than a very long unpleasant experience.
Some of the explanations of our ways of thinking may seem basic and obvious if you have read other psychology books. But then you realize--Kahneman and his colleague Amos Tversky discovered these aspects of psychology, by conducting a wide variety of clever experiments. Very well written, and understandable to the non-specialist, I highly recommend this book to anybody interested in psychology.
What a monstrous chore to read! I've been working on this book since September or August (4-6 months) and just could not take reading it for more than a few minutes at a time. Many times did it put me to sleep.
The book covered a lot of great material and really fascinating research, but oftentimes in such plodding, pedantic, meticulous detail as to nearly obfuscate the point. I have heard of the majority of the research (or at least their conclusions) as well, so while I thought it offered excellent insight and useful material for a lot of people to learn, I didn't think this collection of it--more of a history of the field than an introduction--added anything novel or unique for one already well-versed in the material. I guess I didn't care for the details in how the studies were conducted for every minor point in the author's theories--though I largely agreed with the theories and interpretations.
A line near the end of the book struck a dissonant chord with me, and I wonder if that offers an additional cause for my dislike: "That was my reason for writing a book that is oriented to critics and gossipers rather than to decision makers." I wouldn't count myself among 'decision makers' in any important sense (it's surprising how little responsibility a person can have sometimes!), but I often felt like the book wasn't speaking to me. Many times the author wrote "we think..." or "we act..." in ways I don't think I would. This isn't to say I'm a purely 'rational agent' or 'Econ' or anything like that--the majority of the author's theories (thinking can be either instinctual or effortful, rational agents act differently than emotional humans, and the experiencing self and the remembering self are different things) are eminently true--but I do think he was generalizing for a WEIRD (Western, Educated, Industrialized, Rich and Democratic) audience, and despite my background, I don't think I think that way.
Recommendation: read the introduction and the conclusion (and perhaps the major section intros), cherry-pick anything else of interest.
Nothing in life is as important as you think it is when you are thinking about it.
I think this book is mistitled. For years, I assumed that it was some kind of self-help book about when to trust your gut and when to trust your head, and thus I put off reading it. But Thinking, Fast and Slow is nothing of the sort. As I finally discovered when the book was gifted to me (the ecstatic blurbs in the front pages were the first clue), this book is the summary of Daniel Kahneman’s study of cognitive errors. The book should probably be called: Thinking, Just Not Very Well.
Granted, my initial impression had a grain of truth. Kahneman’s main focus is on what we sometimes call our gut. This is the “fast thinking” of the title, otherwise known as our intuition. Unlike many books on the market, which describe the wonders of human intuition and judgment, Kahneman’s primary focus was on how our intuition can systematically fail to draw correct conclusions. So you might say that this is a book about all of the reasons you should distrust your gut.
Every researcher of the mind seems to divide it up into different hypothetical entities. For Freud it was the conscious and unconscious, while for Kahneman there are simply System 1 and System 2. The former is responsible for fast thinking—intuition, gut feelings—and the second is responsible for slow thinking—deliberative thought, using your head. System 2, while admirably thorough and logical, is also effortful and sluggish. Trying any unfamiliar mental task (such as mental arithmetic) can convince you of this. Thus, we must rely on our fast-acting System 1 for most of any given day.
System 1 generates answers to questions without any experience of conscious deliberation. Most often these answers are reasonable, such as when answering the question "Would you like a hamburger?" (Answer: yes). But, as Kahneman demonstrates, there are many situations in which the answer that springs suddenly to mind is demonstrably false. This would not be a problem if our conscious System 2 detected these falsehoods. Yet our default position is to simply go with our intuition unless we have a strong reason to believe our intuition is misleading. Unfortunately, the brain has no warning system to tell you that your gut feeling is apt to be unreliable. You can call these sorts of situations "cognitive illusions."
A common theme in these cognitive illusions is a failure of our intuition to deal with statistical information. We are good at thinking in terms of causes and comparisons, but situations involving chance throw us off. As an example, imagine a man who is shy, quiet, and orderly. Is he more likely to be a librarian or a farmer? Now consider the answer that springs to mind (librarian, I assume): how was it generated? Your mind compared the description to the stereotype of a librarian, and made the judgment. But this judgment did not take into account the fact that there are many times more farmers than male librarians.
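The librarian/farmer point can be made explicit with a quick Bayes calculation. All of the counts and conditional probabilities below are invented for illustration; the only claim taken from the book is that the base rate dwarfs how well the stereotype fits.

```python
# Hedged back-of-the-envelope Bayes for the librarian/farmer example.
# Every number here is assumed, chosen only to make the base-rate point.

librarians, farmers = 1, 20          # assumed base rate: far more farmers
p_shy_given_librarian = 0.40         # assumed: description fits librarians well
p_shy_given_farmer = 0.10            # assumed: fits farmers far less well

# Unnormalized posterior weights, then normalize.
w_librarian = librarians * p_shy_given_librarian
w_farmer = farmers * p_shy_given_farmer
posterior = w_librarian / (w_librarian + w_farmer)

print(f"P(librarian | shy) = {posterior:.2f}")  # prints 0.17
```

Even with the stereotype fitting librarians four times better, the shy man is still five times more likely to be a farmer, because there are simply so many more farmers.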
Another example of this failure of intuition is the mind’s tendency to generate causal stories to explain random statistical noise. A famous example of this is the “hot hand” in basketball: interpreting a streak of successful shots as due to the player being especially focused, rather than simply as a result of luck. (Although subsequent research has shown that there was something to the idea after all. So maybe we should not lament too much about our intuitions!) Another well-known example is the tendency for traders to attribute their success or failure in the stock market to skill, while Kahneman demonstrated that the rankings of a group of traders from year to year had no correlation at all. The basic point is that we are generally hesitant to attribute something to chance, and instead invent causal stories that “explain” the variation.
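The trader result can be sketched the same way. Assuming, purely for illustration, that every trader's yearly result is luck alone, the correlation between one year's results and the next comes out near zero, which is what Kahneman found in the real rankings.

```python
# Simulated traders whose yearly "performance" is pure luck. With no skill
# in the data, year 1 tells you essentially nothing about year 2.
import random

random.seed(1)
n = 500
year1 = [random.gauss(0, 1) for _ in range(n)]
year2 = [random.gauss(0, 1) for _ in range(n)]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(round(pearson(year1, year2), 3))  # close to zero: no persistent "skill"
```

Yet eyeballing the top of the year-1 ranking, it is very tempting to tell a story about who the "good" traders are.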
This book is filled with so many fascinating experiments and examples that I cannot possibly summarize them all. Suffice to say that the results are convincing, not only because of the weight of evidence, but mainly because Kahneman is usually able to demonstrate the principle at work on the reader. Our intuitive reactions are remarkably similar, apparently, and I found that I normally reacted to his questions in the way that he predicted. If you are apt to believe that you are a rational person (as I am) it can be quite depressing.
After establishing the groundwork, Kahneman sets his sights on the neighboring discipline of economics. Conventional economic theory presupposes rational actors who are able to weigh risks and to act in accordance with their desires. But, as Kahneman found, this does not hold for actual people. Not only do real humans act irrationally, but real humans deviate from the expected predictions of the rational agent model systematically. This means that we humans are (to borrow a phrase from another book in this vein) predictably irrational. Our folly is consistent.
One major finding is that people are loss-averse. We will take a bad deal in order to avoid risk, and yet will take a big risk in order to avoid a loss. This behavior seems to be motivated by an intense fear of regret, and it is the cause of a certain amount of conservatism, not only in economics, but in life. If an action turns out badly, we tend to regret it more if it was an exceptional rather than a routine act (picking up a hitchhiker rather than driving to work, for example), and so people shy away from abnormal options that carry uncertainty.
Yet, logically speaking, there is no reason to regret a special action more than a customary one, just as there is no reason to weigh losses so much more heavily than gains. Of course, there is good evolutionary logic for these tendencies. In a dangerous environment, losing a gamble could mean losing your life, so it is best to stay to the tried-and-true. But in an economic context, this strategy is not usually optimal.
The last section of the book was the most interesting of all, at least from a philosophical perspective. Kahneman investigates how our memories systematically misrepresent our experiences, which can cause a huge divergence between experienced happiness and remembered joy. Basically, when it comes to memory, intensity matters more than duration, and the peaks and ends of experiences matter more than their averages. The same applies with pain: We may remember one experience as less painful than another just because the pain was mild when it ended. And yet, in terms of measured pain per minute, the first experience may actually have included more experiential suffering.
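The peak-end effect described here is easy to make concrete. Below is a minimal sketch with invented minute-by-minute pain scores: the longer episode contains twice the total pain, yet the peak-end average remembers it as milder because it tapers off gently.

```python
# Peak-end rule sketch, in the spirit of the cold-hand experiments.
# Pain profiles are invented, one number per minute on a 0-10 scale.

short_trial = [2, 6, 8]             # short, but ends at its worst moment
long_trial = [2, 6, 8, 7, 5, 3, 1]  # same peak, longer, but tapers off

def remembered(pain):
    """Remembered intensity: average of the peak and the final moment."""
    return (max(pain) + pain[-1]) / 2

def experienced(pain):
    """Total pain actually endured, summed over the whole episode."""
    return sum(pain)

print(remembered(short_trial), remembered(long_trial))    # prints 8.0 4.5
print(experienced(short_trial), experienced(long_trial))  # prints 16 32
```

The two rankings disagree: memory prefers the episode that contained strictly more suffering, which is the divergence between the experiencing and remembering selves that the review describes.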
As a result of this, our evaluations of life satisfaction can often have very little to do with our real, experiential well being. This presents us with something of a paradox, since we often do things, not for how much joy they will bring us in the moment, but for the nice memory they will create. Think about this: How much money would you spend on a vacation if you knew that every trace of the experience would be wiped out as soon as the vacation ended, including photos and even your memories? The answer for most people is not much, if anything at all. This is why so many people (myself included) frantically take photos on their vacations: the vacation is oriented toward a future remembering-self. But perhaps it is just as well that humans were made this way. If I made my decisions based on what was most pleasant to do in the moment, I doubt I would have made my way through Kant.
This is just a short summary of the book, which certainly does not do justice to the richness of Kahneman’s many insights, examples, and arguments. What can I possibly add? Well, I think I should begin with my few criticisms. Now, it is always possible to criticize the details of psychological experiments—they are artificial, they mainly use college students, etc. But considering the logistical constraints of doing research, I thought that Kahneman’s experiments were all quite expertly done, with the relevant variables controlled and additional work performed to check for competing explanations. So I cannot fault this.
What bothered me, rather, was that Kahneman was profuse in diagnosing cognitive errors, but somewhat reticent when it came to the practical ramifications of these conclusions, or to strategies to mitigate these errors. He does offer some consequences and suggestions, but these are few and far between. Of course, doing this is not his job, so perhaps it is unfair to expect anything of the kind from Kahneman. Still, if anyone is equipped to help us deal with our mental quagmires, he is the man.
This is a slight criticism. A more serious shortcoming was that his model of the mind fails to account for a ubiquitous experience: boredom. According to Kahneman’s rough sketch, System 1 is pleased by familiarity, and System 2 is only activated (begrudgingly, and without much relish) for unfamiliar challenges. Yet there are times when familiarity can be crushing and when novel challenges can be wonderfully refreshing. The situation must be more subtle: I would guess that we are most happy with moderately challenging tasks that take place against a familiar background. In any case, I think that Kahneman overstated our intellectual laziness.
Pop psychology—if this book can be put under that category—is a genre I dip into occasionally. Though there is a lot of divergence in emphasis and terminology, the consensus is arguably more striking. Most authors seem to agree that our conscious mind is rather impotent compared to all of the subconscious control exerted by our brains. Kahneman’s work in the realm of judgments closely parallels Jonathan Haidt’s work in morals: that our conscious mind mostly just passively accepts verdicts handed up from our mental netherworld. Indeed, arguably this was Freud’s fundamental message, too. Yet it is so contrary to all of our conscious experiences (as, indeed, it must be) that it still manages to be slightly disturbing.
Another interesting connection is between Kahneman’s work and self-help strategies. It struck me that these cognitive errors are quite directly related to Cognitive Behavioral Therapy, which largely consists of getting patients to spot their own mental distortions (most of which are due to our mind’s weakness with statistics) and correct them. And Kahneman’s work on experiential and remembered well being has obvious relevance to the mindfulness movement—strategies for switching our attention from our remembering to our experiencing “self.” As you can see from these connections, Kahneman’s research is awfully rich.
Though perhaps not as amazing as the blurbs would have you believe, I cannot help but conclude that this is a thoroughly excellent book. Kahneman gathers many different strands of research together into a satisfying whole. Who would have thought that a book about all the ways that I am foolish would make me feel so wise?
Reading "Thinking, Fast and Slow" (this month's local book club choice) was not exactly bedtime reading for me.
I had already pre-judged it before I started reading, certain I would discover I'm a FAST, INTUITIVE-type thinker (quick, often influenced by emotion). Once in a while I use basic common-sense logic, but even that is usually with 'righteous emotions'. Just being honest!
I understand this is an intellectual giant of a book about "how we think", thinking 'deeply' about how we think, but this book hasn't changed me, transformed me, or enlightened me. Not so far.
It's too technical. I understand the author is brilliant, but I found myself skimming pages. However, what I did understand, I enjoyed.
Kahneman has a great talent for slow, rational, logical, and reflective thinking. However, fast, intuitive thinking is more influential on us than experience suggests, he says, contrary to the belief that we are very rational decision-making people.
A few things in the book were interesting information, yet I still believe it's incomplete, that there are other ways of speaking about the way our minds work that are not found in this big book.
1) Two basic systems of thinking: System 1 is the intuitive, quick thinking; System 2 is the slower, rational, logical, and reflective thinking.
.....20% of our energy goes into our brain.
.....We tend to be lazy thinkers ('the lazy controller', he calls it), and do not involve our slow-thinking brain unless it is needed.
.....A running theme in the book is that although the brain does contain a statistical algorithm, it is not accurate: the brain does not understand basic normal distributions.
.....Our brain often jumps to conclusions.
.....Our brain knows how to answer easy questions, like "What did you have for breakfast?", but finds it more challenging to answer a question like "How do you feel about yourself today?"
.....We have biases.
.....Often stereotypes will override statistics (again because we have impressionable, lazy, judging brains).
.....He talks about predictions. For example, if a child gets great grades in the lower years of school, we assume she will keep excelling; we tend to overestimate our ability to predict the future.
.....When it comes to intuition versus formulas, the formula often wins.
.....We are also inclined to expect far more regularity in our lives than really exists.
You won't find any data in this book about "The Power of Now" thinking, or discussion about "You are not your Mind," chakras, or myths about healing... but it's a book about THE WAY WE THINK... (technical... some of it I resonated with, but when it got too scientifically technical, he lost me).
I look forward to my book club discussion: 25 people will be attending this month (many bright people)... I'm sure to gain value and more insights.
Often I find myself in conversations with people who are criminally opinionated but have little in the way of empirical grounding. It’s common, in these situations, to hear them malign opponents of their views by reducing the conflict to a single factor: “My opponent is so dumb they couldn’t follow a chemical gradient if they were bacteria!” Now, putting aside the fact that single-factor analysis is a mug's game when discussing things of any complexity (which is basically everything), when resorting to these oversimplifications with human behavior, you asymptotically approach infinite incorrectness. My common refrain in these times is to dip into my quote bag and castigate the misguided with Popper’s glib witticism: “A theory that explains everything, explains nothing.” Or, channeling the Archbishop of astuteness, John Stuart Mill, I rise up, gesturing dramatically and pitching my voice just so: “He who knows only his side of the case knows little of that.” Hoping their snotty self-assurance will recede before my rational indignation like an anabolic hairline.
This shit never works. Putting aside the fact that I’m subject to the same cognitive limitations, quotations often arrive on the scene like a flaccid member, with intimations of a proper impression hidden somewhere in that bloodless noodle, if only the other party would play with it. But, much like idioms, there’s just not enough chemistry to warrant heavy petting.
Next I will resort to recalling numerous studies which have totally pin-cushioned the quaint notion that we are dispassionate, logical thinkers, when, in fact, barring a commitment to scientific principles, we have strong intuitions that we seek to justify by means of strategic reasoning. “We’re more like lawyers than Vulcans,” I say solemnly, staring off into the distance for dramatic effect. Pensive. Avoiding eye contact for an appropriate interval before turning to peer into their soul and nod as we grasp, however tenuously, our feeble position before the Logos. Inevitably, when this numinous moment arrives, I am instead greeted with a vacant stare, or, much worse, an objection! Which, if you’ve been following me so far, means that I switch from attempting to persuade and instead silently chide my opponent for being a hopeless imbecile.
How do we opt out of being unrelentingly self-righteous pricks? Well, we probably can’t fully; the gravity is just too strong. Escape velocity would require some fundamental redesigns to a cognitive apparatus which evolved to intuit a subset of phenomena on the African savanna which bore a relationship to our reproductive success. But we can sure as hell beat one another with books like this until we piss blood and can’t hold our toothbrushes due to nasty rotator cuff injuries. That’ll teach us.
I consider this to be the Mac Daddy of bibliophilic bludgeoning implements on this topic. I once blasted a man in the chest so hard with the spine of this book that, in addition to the bastard rolling clean over a Pizza Hut table like it was the hood of a speeding vehicle, the pages burst from between the covers like a fox vomiting hen feathers. So incensed by this needless destruction of literary property, I stood over the man and berated him on the importance of properly breaking in the spines of hardcovers. As he wormed about in pepperoni and soda, nodding (if for no other reason than to avoid another terrible sounding of his sternum) I also took the time to explain the central message of this book:
“Look, man. You need to realize that we’ve got these two modes of cognition. One is accessible to us. It’s slow and deliberative and subject to systematic interventions of logic if we but choose to learn and apply them. The other does pretty much whatever it damn well pleases based on input it receives from the environment that you’re often not consciously aware of. It’s good when it’s helping you get out of the way of deranged book wielders, but it’s bad when it goes awry in matters that are deeply counter intuitive (much of modern life) and mucks about with your ability to properly steer the system you have access to.”
This is an important book. Humanity would be much improved if these insights could percolate through society and really take hold. But they probably won’t. Because we’re assholes.
Thinking, Fast and Slow by Nobel Prize winner Daniel Kahneman is a lifetime's worth of wisdom. I purchased a paper copy after I came across the book while preparing for IELTS. The tone of the book is very formal, so it is a good read when preparing for the English exam. Thinking, Fast and Slow introduces two systems of our brains: System One and System Two. The first one is fast and the second one is slow. There is a plethora of examples the author accumulated throughout his professional life. It was pleasant, yet difficult, to read. I struggled to finish because of the formal language.
"Thinking, Fast and Slow" is one of the best books I ever read. I have read it 3x now. It's the gift that keeps on giving.
The conclusions of the specific studies in the book are the meat. I constantly reference them in practical human matters, especially those in which we easily delude ourselves: the endowment effect, expert intuition, the law of small numbers, confirmation bias, the planning fallacy, risk aversion, loss aversion, sunk cost (throwing good money after bad), etc.
These insights help us think more rationally and make better decisions, including in financial matters, where we might be prone to impulse, allowing our emotions to get the better of us and really cost us.
===========
Kahneman questions the certitude of success based on formulae in business books and other business publications....
1- In this book, Daniel Kahneman examines the errors we humans commit because System 1 is so active and System 2 is so lazy: errors some of which we do not even recognize as errors, which we believe in, and which we encounter every day. In Kahneman's view, the way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and try to call in System 2 for help. Unfortunately, this sensible procedure is least likely to be applied when it is needed most. We would all like a warning bell that rings loudly whenever we are about to make a serious error, but no such bell exists, and cognitive errors are, in general, much harder to detect than perceptual ones.
2- The book has 37 chapters, each devoted to a single topic, which is why this hefty volume (680 pages in the Persian translation) grows more engaging the further you go. One of its good features is that each chapter ends with a few short sentences giving examples of the topic under discussion, which helps greatly with understanding the material and reviewing the concepts later.
3- Unfortunately, Ms. Talousamadi's Persian translation is not good and is exasperating in places; there are spelling and even content errors, which may be due to Kahneman's difficult writing style. I hope these problems are reviewed and corrected in future printings. Wherever you are unsure or do not follow, check against the English text. I have put the book file up for download at: https://goo.gl/52G5q2
4- Kahneman writes: "Decision making is like speaking prose: people do it all the time, knowingly and unknowingly." As expected, the heart of the book is in the "Conclusions" chapter, where the author asks: given all these errors in human behavior, should people be left free to choose and decide for themselves, or should a responsible society help individuals make the right choices?
Freeman “Dyson Sphere” Dyson wrote the New York Times review, which has me swooning right there. Dyson was a particularly apt pick because Kahneman helped design the Israeli military screening and training systems back when the country was young, and Dyson at 20 years old cranked statistics for British Bomber Command in its youth. Dyson was part of a small group that figured out the bombers were wrong about what mattered to surviving nighttime raids over Germany, surviving a tour being something only about a quarter of the crews managed. Dyson figured out the Royal Air Force's theories about who lived and died were wrong. But no data-driven changes were made because “the illusion of validity does not disappear just because facts prove it to be false. Everyone at Bomber Command, from the commander in chief to the flying crews, continued to believe in the illusion. The crews continued to die, experienced and inexperienced alike, until Germany was overrun and the war finally ended.” http://www.nybooks.com/articles/archi...
Why did the British military resist the changes? Because it was deeply inconsistent with the heroic story of the RAF they believed in. I suppose there are stories I’d die for too. But not the myth that Kahneman dethroned. Kahneman got the Nobel Prize for Economics for showing that the Rational Man of Economics model of human decision making was based on a fundamental misunderstanding of human decision making. We are not evolved to be rational wealth maximizers, and we systematically value and fear some things that should not be valued so highly or feared so much if we really were the Homo Economicus the Austrian School seems to think we should be. Which is personally deeply satisfying, because I never bought it, and deeply unsettling, because of how many decisions are made based on that vision.
If that was all this book was, it’d just be another in a mass of books that have as their thesis “You’re wrong about that!” Which I appreciate knowing, but there’s a point where it’s a little eye rolling because they don’t offer any helpful suggestions on how not to be wrong, or why these patterns of wrongness exist and endure. But Kahneman has a theory. He theorizes that humans have two largely separate decision-making systems: System One (the fast) and System Two (the slow). System One let us survive monster attacks and have meaningful relationships with each other. System Two got us to the moon.
Both systems have values built into them and any system of decision-making that edits them out is doomed to undercut itself. Some specifics that struck me:
Ideomotor Effect: (53) Concepts live in our heads in associative networks. Once triggered, they cascade: make someone walk slowly, and they think about old age; make someone smile, and they’ll be happier. Seeing a picture of cash makes us more independent, more selfish, and less likely to pick up something someone else has dropped. Seeing a locker makes us more likely to vote for school bonds. Reminding people of their mortality makes them more receptive to authoritarian ideas. (56) “Studies of priming effects have yielded discoveries that threaten our self-image as conscious and autonomous authors of our judgments and our choices.” (55)
Halo Effect (82) “If you like the president’s politics, you probably like his voice and appearance as well.” We find someone attractive and we conclude they’re competent. We find emotional coherence pleasing and lack of coherence frustrating. However, far fewer things are correlated than we believe.
What You See Is All There Is (WYSIATI) (85). Our System 1 is pattern-seeking. Our System 2 is lazy, happy to endorse System 1 beliefs without doing the hard math. “Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI. . . . System 1 is radically insensitive to both the quality and quantity of information that gives rise to impressions and intuitions.” (86) Absolutely essential for not getting eaten by lurking monsters, and it “explains why we can think fast, and how we are able to make sense of partial information in a complex world. Much of the time, the coherent story we put together is close enough to reality to support reasonable action.” Except when it doesn’t. Like in our comparative risk assessments: we panic about shark attacks and fail to fear riptides; freak out about novel and unusual risks and opportunities and undervalue the pervasive ones.
Answering an Easier Question (97). If one question is hard, we’ll substitute an easier one. It can be a good way to make decisions. Unless the easier question is not a good substitute. I have an uneasy awareness that I do this. Especially since it often REALLY ANNOYS me when people do it to me.
The Law of Small Numbers. (109) The counties with the lowest level of kidney cancer are rural, sparsely populated, and located in traditionally Republican states. Why? Good clean living? The counties with the highest level of kidney cancer are rural, sparsely populated, and located in traditionally Republican states. Why? Lack of access to health care? Wait, what? The System 1 mind immediately comes up with a story to explain the difference. But once the numbers are cranked, apparently, it’s just an artifact of the fact that a few cases in a small county skew the rate. But if you base your decision on either story, the outcomes will be bad.
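The county puzzle is easy to reproduce with a toy simulation (all figures hypothetical, not Kahneman's data): give every county the exact same true incidence rate, vary only population size, and the small counties still end up dominating both the highest and the lowest observed rates, purely through sampling noise.

```python
import random

random.seed(0)

TRUE_RATE = 0.01  # identical underlying rate in every county (hypothetical)

# Hypothetical mix: 100 small rural counties, 100 large urban ones
counties = [("rural", random.randint(100, 500)) for _ in range(100)] + \
           [("urban", random.randint(10_000, 50_000)) for _ in range(100)]

# Observed rate per county = same true rate + sampling noise
observed = []
for kind, pop in counties:
    cases = sum(random.random() < TRUE_RATE for _ in range(pop))
    observed.append((cases / pop, kind))

observed.sort()
lowest10 = [kind for _, kind in observed[:10]]    # ten lowest observed rates
highest10 = [kind for _, kind in observed[-10:]]  # ten highest observed rates
print("counties with the lowest rates: ", lowest10)
print("counties with the highest rates:", highest10)
```

Both printed lists come out almost entirely "rural": nothing causal is going on, small samples just swing harder in both directions, which is exactly the artifact the book describes.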
Anchors (119). We seize on the first value offered, no matter how obviously absurd it is. If you want to push someone in a direction, get them to accept your anchor.
Regression to the Mean. (175) There will be random fluctuations in the quality of performance. A teacher who praises a randomly good performance may shape behavior, but likely will simply be disappointed as statistics asserts itself and a bad performance follows. A teacher who criticizes a bad performance may incentivize, but likely will simply have a false sense of causation when statistics asserts itself and a good performance happens. Kahneman describes it as “a significant fact of the human condition: the feedback to which life exposes us too is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty.” (176).
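Kahneman's perverse-feedback point can be sketched the same way (hypothetical scores, no real data): if performances are pure noise, the outing right after a praised one tends to look worse, and the outing right after a criticized one tends to look better, even though the praise and the criticism changed nothing at all.

```python
import random

random.seed(1)

# Hypothetical performances: pure luck, no learning or feedback effect
scores = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# What happens on the outing right after a very good or very bad one?
after_praise = [scores[i + 1] for i in range(len(scores) - 1) if scores[i] > 1.5]
after_blame = [scores[i + 1] for i in range(len(scores) - 1) if scores[i] < -1.5]

mean = lambda xs: sum(xs) / len(xs)
print(f"average score after a praised outing:    {mean(after_praise):+.2f} (was > +1.50)")
print(f"average score after a criticized outing: {mean(after_blame):+.2f} (was < -1.50)")
```

Both follow-up averages sit near zero, the overall mean: performance "drops" after praise and "improves" after criticism with no causation whatsoever, which is why the instructor in Kahneman's story wrongly concluded that shouting works.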
The Illusion of Understanding (204) The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. “The illusion that one has understood the past feeds the further illusion that one can control the future.” These illusions are comforting. They reduce the anxiety that we would experience if we allowed ourselves to fully acknowledge the uncertainties of existence. We all have a need for the reassuring message that actions have appropriate consequences, and that success will reward wisdom and courage. But it doesn’t. (212) For example, we’re totally wrong about whether you can beat the stock market. Formulas are often much more predictive than learned intuition. I’m going to have to wrestle with this one, but he alluded to a claim by Robyn Dawes that “marital stability is well predicted by a formula: frequency of lovemaking minus frequency of quarrels.” (226) Snicker.
Premortems Can Help. (264) before making a decision, assign someone to imagine it’s a year into the future and the plan was a disaster. Have them write a history of the disaster.
We value losses more than gains. (349) Which is fine except when that means we expose others to more risk because we did the math wrong.
The Focusing Illusion (402) “Nothing in life is as important as you think it is when you are thinking about it.” We overvalue what’s in our mind at the moment, which is subject to priming.
He closes by stressing he does not mean to say that people are irrational. But, he says, “rational” in economic terms has a particular meaning that does not describe people. “For economists and decision theorists, [rationality] has an altogether different meaning. The only test of rationality is not whether a person’s beliefs and preferences are reasonable, but whether they are internally consistent. A rational person can believe in ghosts, so long as all her other beliefs are consistent with the existence of ghosts. . . . Rationality is logical coherence – reasonable or not. Econs are rational by this definition, but there is overwhelming evidence that Humans cannot be. . . .
“The definition of rationality as coherence is impossibly restrictive; it demands adherence to rules of logic that a finite mind is not able to implement. Reasonable people cannot be rational by that definition, but they should not be branded as irrational for that reason. Irrational is a strong word, which connotes impulsivity, emotionality, and a stubborn resistance to reasoned argument. I often cringe when my work with Amos [Tversky] is credited with demonstrating that human choices are irrational, when in fact our research only showed that Humans are not well described by the rational-agent model.” (411)
If we are so intelligent and have so much experience, why do we make such bad decisions? Why do we repeatedly go down a mistaken, dead-end road? Where are we failing?
This book is a masterpiece for understanding how our brain truly works. It is long, tedious (though not always), and even repetitive in many stretches, but it is possibly the best option for understanding in depth what we need to learn about the most powerful tool we possess. It is a book that can change the way we make decisions forever.
A very long review is coming, in which I hope to explain the reasons I read this work and what I learned from it, but that will not be easy, because this is not a normal book. This book is the result of many years of work, experimentation, and failure by a Nobel laureate in Economics who dedicated his life to it. It is a work worthy of respect.
I could not bring myself to finish this book. The book is filled with shady experiments on undergraduates and psychology grad students and wild extrapolations of the associated results. I find it exceedingly difficult to take many of the conclusions seriously. I can't read into them. I can't trust them. I can't base my decisions on them and I resist incorporating them into my world view with anything more than 0.01 weight. In fact, several of the experiments that this book mentions were also found to be not reproducible by a recent meta-study on reproducibility in psychology studies.
Here's a characteristic example of me reading the book. The author says: "Consider the word EAT. Now fill in the blank in the following: SO_P. You were much more likely to fill in the blank with a U to make SOUP than with an A to make SOAP! How amazing. We call this phenomenon priming, system 1, something something." In fact, no, SOAP came to my mind immediately.
All I could think about when I read this book is my own experience of participating in a friend's psychology study once. He designed an experiment and asked me to do some things and answer some questions, but at some point it became extremely clear to me what the experiment was about, or how he hoped I would behave. I went along with it, but I couldn't believe that this would eventually become part of a paper. It was a joke. I'm afraid you can't go through a similar experience and take these studies seriously from then on.
All that being said I do find the broad strokes of the system1/system2 division proposed in this book to be interesting and appealing. A small few of the examples were fun to contemplate, and it was okay. 3/5, aborting reading.
My issue with this book, which is one I've tossed aside after 60 pages, is not so much that it's poorly done or that it's hard to understand - in fact, the exact opposite is true.
The issue is that this book is simply more in depth about psychology and psychological processes than I truly have a short-term interest in. This is more the type of book you keep near your desk or bedside, read a 12-page chapter or so, and digest. This may be a book I need to own and do that with, as opposed to tearing through it after borrowing it from the library and then hating myself as I slog through it.
A long book that requires real mental exertion, Thinking, Fast and Slow is a worthwhile read by Nobel laureate Daniel Kahneman. It delves into the two complex systems of the mind. System 1 is impulsive, emotional, and often led astray, while System 2 is rational, thoughtful, and takes more time to make decisions. He analyzes how humans use (and sometimes fail to use) both systems, and the resulting implications for topics ranging from how we perceive happiness to behavioral economics.
Thinking, Fast and Slow is one of the most in-depth Psychology books I've read. I fell in love with the subject after taking AP Psychology last year as a junior in high school, and am currently craving more books and articles related to the field. Daniel Kahneman satisfied my thirst. I had a solid understanding of some concepts beforehand, like the confirmation bias and hindsight bias, but had never heard of other terms like base rate or the illusion of validity. The sheer amount of statistics and experiments referenced throughout the book proved Kahneman's thoroughness and dedication.
I recommend Thinking, Fast and Slow to anyone who wants to learn about how we think, or about psychology in general. I liked how Kahneman progressed from simple ideas like heuristics to more complex concepts, like prospect theory. Even if you have no background in psychology or economics, a mere interest in either should suffice for this book.
Mr. Kahneman, a Nobel Prize winner, explores the general subject of how and why we frequently make irrational decisions. We've all seen articles over the years on various aspects of this phenomenon, but I venture to say that never before have the various aspects and permutations been explored in this depth and specificity. Mr. Kahneman has spent much of his life researching the subject, and since the book includes both his research and that of others, it must stand as the definitive compendium on the subject. His credentials are indisputable, and he tries gamely to bring the subject to life, but -- mea culpa -- I just couldn't stay interested in the myriad of data and specific examples. The book is good for someone really interested in the details, and it does contain real life examples, but after 400 pages it's hard to remember them. My takeaway: Our intuition is frequently wrong, and even our experience (or what we believe our experience to have been) may not be reliable as a decision guide. So, be careful!