A full discussion of the study in written form is available here.
It really can't be stressed enough -- the more loyal we are to a political party, or a football team, or a religion, or virtually anything else, the less likely we are to objectively sort fact from fiction, and the less likely we are to reach sound conclusions on matters of fact and logic relating to the subject of our loyalty. The human brain, as I tried my best to explain from my own layman's understanding, creates avenues of self-delusion that we often aren't even aware of. Only by becoming aware of our own biases can we work to overcome them and reach genuinely objective conclusions on issues to which we have any degree of emotional attachment.
This explains why so many talking heads, bloggers, and less-than-objective media outlets spend so much time, money, and effort distributing false (or misleading) information -- they know that putting out that kind of propaganda can be very effective in creating an alternative reality in the minds of their viewers/readers/listeners, and that fictional reality happens to be one in which their party/ideology/etc. is always right.
This book looks like a disturbing/entertaining collection of examples of "confirmation bias" in play. When people want to believe something, they always seem to find ways to filter the world around them so that all that enters their own personal world is material supporting what they want to believe.
There are many more like it, but this survey adds more fuel to the fire.
On a related note, this Q&A is pretty interesting, and makes me want to add this book to my long list of "books I want to read but never seem to have the time for."