comment by aeontech
aeontech  ·  4454 days ago  ·  link  ·    ·  parent  ·  post: How to confuse a moral compass

I think they accounted for that; it wouldn't be much of a study otherwise. I'm going to guess that they did not ask "How would you defend this position?" but rather something like "Can you explain your answer to this question in detail?"

In the first case, yes, the person would play devil's advocate, but that's not really news worth publishing as a study. The interesting part is that when they ask it the second way, it seems to trigger a mental shortcut where the mind essentially says "oh, I already thought about this question and came up with an answer; there's no reason to think it through deeply again" and simply justifies the answer instead of re-examining the question.

Our minds are much less logical, consistent, or rational than we like to imagine. The concept of cognitive shortcuts is not new: analytical thinking requires effort and energy, so our minds tend to filter out the majority of input and process it subconsciously instead of expending energy on consciously analyzing every little decision. Unfortunately, the filtering mechanisms are automatic and kick in even for things that should be considered thoughtfully.

Some further reading on cognitive biases and shortcuts:

- The Motivated Tactician model tries to explain why people use stereotyping, biases, and categorization in some situations and more analytical thinking in others.

- Framing of a problem or question affects how we process it, and even the answer we arrive at.

- The Affect heuristic is "going with your gut": deciding based on the emotion evoked by the question.

- The Availability heuristic is the "if you can think of it, it must be important" shortcut, which leads people to fear flying more than driving and terrorism more than flying, even though the chances of dying in a car accident are far higher than those of being in a plane crash or a terrorist attack.





NotPhil  ·  4454 days ago  ·  link  ·  

    Our minds are much less logical, consistent, or rational than we like to imagine.

Thanks for the links.

But I know of no one who imagines that our minds are consistently logical. As you say, we've understood for millennia that people take cognitive shortcuts, that these shortcuts are generally useful and occasionally dangerous, and that they have always been subject to manipulation by unscrupulous actors.

What I was expressing skepticism about is the validity of some psychological experiments and the interpretations that experimenters assign to them.

aeontech  ·  4454 days ago  ·  link  ·  

Hmm, actually, I think a belief in one's own rationality is a pretty normal assumption to make for a self-aware being. If it were commonly accepted that we are not wholly rational, humanity would not be constantly surprised by experiments that demonstrate bugs in our cognition, and we would be less susceptible to manipulation and errors of judgement.

It would certainly be interesting to read the original paper; right now, all we have is a pop-science summary that omits the details that would address your concerns about the study.

NotPhil  ·  4454 days ago  ·  link  ·  

    I think a belief in one's own rationality is a pretty normal assumption to make for a self-aware being.

I keep a blog and was thinking about writing an entry on rationality. I was planning to contrast economists' idea of rationality (material self-interest) with the general meaning of the term (agreeableness to reason). But perhaps it would be more interesting to contrast the idea you think is common (that all people are always exercising only reason) with the idea I think is common (that people have the capacity to reason, along with the capacity for many other ways of thinking and feeling).

If you could tell me a little more about why you think people assume they're always being logical, and when and where you think this idea became prevalent, it might help point me in the right direction for some research on the subject.

aeontech  ·  4450 days ago  ·  link  ·  

Maybe I didn't word that clearly. I am not saying that "all people are always exercising only reason"; I am saying that "no person likes to believe that they are crazy", and the common definition of sanity in this day and age is behaving "rationally". This is why you often see people coming up with justifications and excuses for their irrational behavior rather than accepting that they act irrationally. I think that the reason we want to believe in a rational mind is that it provides a framework within which we can predict other people's reactions and responses, and therefore a consensus within which we can operate as a society.

I would think that the fact that one's mind and cognition have flaws that are invisible to oneself is an uncomfortable thought for most people; furthermore, most people are not introspective enough for the question to come up in the first place. This is why evidence of these flaws is surprising and distressing each time we encounter it, and why they are so dangerous: when they are exploited, our thinking can be altered without our being aware of the manipulation.

If you're interested in rationality and cognition, take a gander at lesswrong.com - it has an extensive library of discussion and writing on the subject. I'm starting to work through it myself; I'd love to discuss the concepts.

NotPhil  ·  4450 days ago  ·  link  ·  

    I think that the reason we want to believe in a rational mind is that it provides a framework within which we can predict other people's reactions and responses, and therefore a consensus within which we can operate as a society.

My initial research indicates that this is probably the case, if you substitute "researchers in particular disciplines" for "we."

Apparently, this began in earnest shortly after WWII, when economists, strategists, AI researchers, and geneticists all adopted a re-interpretation of Enlightenment-era views of how people behave in society, based on a highly selective and out-of-context reading of that era's work, in order to make their own work more easily quantifiable. Because they were regarded as experts, their work informed public policy in the late 20th century, and their ideas filtered out into society at large.

I've posted a couple of documentary essays I've found while researching this, if you're interested. One is on game theory, and another is on the computer metaphor.