[N]early a thousand scientists from every corner of the globe were preparing to gather in Santa Fe, New Mexico, on May 23-25 for what was billed as “the cold fusion shootout.” Tens of thousands of others would gather to watch the proceedings live via satellite. The secretary of energy, Admiral James Watkins, had asked a distinguished Nobel laureate, J. Robert Schrieffer, to organize the Santa Fe meeting and to ensure that all points of view were fairly represented. The objective was to exchange every scrap of theoretical and experimental evidence that might shed light on the question. Schrieffer invited Pons and Fleischmann to lead off the conference, and they accepted, promising to present the results of their new tests. But that was before the raccoon attack.
The tale of cold fusion is hilariously recounted in Robert L. Park’s Voodoo Science, quoted above. You see, there was this raccoon that had wandered into a transformer. The transformer exploded and the raccoon received a surprise cooking. Unfortunately for the future of humanity, the transformer was connected to Pons’s laboratory, causing a power outage and disrupting the crucial experiment whose results were promised at the Santa Fe meeting. At least this was the story coming from Pons and Fleischmann.
The media excitement over cold fusion was enormous. Congress got involved; politicians were taking photo-ops with Pons and Fleischmann; millions of dollars in public and private donations were being thrown about; venture capitalists came clamoring for deals. All the while, scientists were becoming increasingly skeptical of Pons and Fleischmann, not only because of their lack of results but because of what increasingly looked like shenanigans on their part. When at last the raccoon attacked, scientists were not entirely surprised.
The point I wish to make about Pons and Fleischmann is that they did not start out as frauds. The subtitle of Park’s book is The Road from Foolishness to Fraud. It gives some interesting accounts of individuals who traveled along this road.
It starts out with a few not-quite-true but well-intentioned statements, rationalized as serving a greater purpose (and which, after all, do not matter anyway). Some time later a foolish decision is made, perhaps in part to compensate for one of the previous not-quite-truths. Eventually, a raccoon is introduced.
Now I’ll switch gears to another book, Mistakes Were Made (Point of Inquiry podcast, For Good Reason podcast), one which I’ve flogged before.
The guy depicted above has just made a small mistake. Say, he just formed an opinion on some issue using faulty data or he said something which was technically true but misleading. Since he has only taken a small step in the wrong direction, there is still hope for him. He can be persuaded to change his newly formed opinion; he won’t be too embarrassed to backtrack on what he said.
Mistakes Were Made describes the fascinating psychological apparatus that causes our protagonist to start sliding down that pyramid. As he slides, he becomes more invested in his viewpoint while becoming ever more insulated from reality.
We already know a phenomenon of this sort exists just by looking at the Amazon reviews of a book on a controversial topic, although the dichotomy there is not necessarily reality/unreality but simply PositionA/PositionB. All things being equal, one should expect the 1- to 5-star histogram to roughly resemble a bell curve. But when the book is about a contentious issue the graph is often U-shaped, and steeply so! There is an attractive force pulling readers toward one side or the other, thinning out the middle.
Since the reason for the sliding is not the purpose of this post, I won’t delve into the explanation. If you are interested then you can read the book and/or listen to the podcast(s). Most likely you have observed this kind of sliding, and the fact that it happens is not in dispute. So for now I will simply assume: that guy is sliding.
The most important thing about that guy is that he could be you. And he could be you at a moment when you are not quite thinking straight–when you are angry, upset, or whatever. If so, then in effect you’re about to start sliding without your full consent. Your own psychological apparatus is going to screw you. The only preventative measure I can imagine is to be attentive and to recognize the situation when it arises.
Next consider someone you know who is in that position. What do you do? I don’t have an answer except that if you can recognize yourself in that person then you’ll be more able to pull them back.
And finally, how do we deal with those who have traveled far down the pyramid, far away from reality? Most likely they are beyond reach, but not always. For the truly obstinate ones, we can hold them up as warnings to us all. We can show the clear, unassailable evidence of why they are wrong, but also–and this is what I wish to add–we can provide a full explanation of how they arrived at such unreality and why they persist in it.
The above image is based on art from N by Metanet Software.
Yes, it could be any of us and probably most of us have been there some time or another.
This can be done, and it can require a lot of effort. I am not sure what it will accomplish.
Suppose Alice and Bob are eyewitnesses to a hit-and-run accident. Neither was close to the accident, but they could at least see the color of the offending car.
However Alice and Bob disagree on the color of the car. A local security camera shows almost definitively that Alice is right. Given camera evidence with a confirming eyewitness, we are inclined to believe the color of the car is that which Alice attested.
But what about Bob? He is insistent that he is right. But why? We can chalk it up to Bob just being nutty, but there’s still something unsatisfying about that.
If we can find a consonant reason for Bob’s behavior then we’ll have a much better case. Lo and behold, it turns out Bob is color blind. Now we have an explanation for the testimony of both Alice and Bob.
By “full explanation” in the post above, I just meant pointing out how beliefs operate under the snowball effect, the process of sliding down that pyramid.
It is common for non-skeptical thinkers to be impressed by the sheer passion of, say, creationists. It is tempting to split the difference: “Well, there must be something in what they’re saying.”
But we have an explanation for the behavior of creationists, and it says that splitting the difference is the worst possible conclusion. Creationists are just normal people who have been sliding down that pyramid, away from reality. That’s a more satisfying explanation than simply declaring them to be nutty (like in the example with Bob above).
[…] You, the questioner, do not have to be an expert on X. You are merely probing to see if your friend has done his homework. There is a well-understood explanation for why he would avoid counter-arguments to X, if that turns out to be the case. The explanation is cognitive dissonance, and it is the reason people can all-too-easily slide away from reality. […]
I like to think about similar things though I think of it in a different category.
The reaction of former fans of YNH is a case in point. They still defend the blog on most counts except the one he simply can no longer defend against.
Why is this?
My theory is that people are very, very unlikely to change a position to which they have heavily committed, even if it turns out not to be grounded in reality, or true, or fair.
Take the anti-vaccine community. Reality doesn’t change the perspective of the most ardent followers. As you rightly point out, same with cold fusion.
And those who do change their minds do so silently. It’s embarrassing in our society to come out and say point-blank, “You know, I was completely wrong, pretty delusional, didn’t evaluate evidence correctly.”
Why? Because people will use that as a rhetorical attack against you. They will claim that since you have once proven to have bad judgment, it must be true in a future argument as well.
We as a society do not really help people admit that they are wrong about something. We encourage people to present a pristine “I am right and smart” image that has to be defended at virtually all cost.
But yes, if we can prevent people early on from committing heavily to unreality (believing a story without sufficient evidence, and so on), then we won’t have to contend with the symptoms of how hard it is to get them to uncommit later.
The idea that admitting a mistake leaves one vulnerable to rhetorical attack is generally a misconception. That’s likely to be a rationalization people employ to continue with the avoidance strategy.
One example cited in Mistakes Were Made was this article, which mentions that patients are less likely to sue when their doctor apologizes for a mistake.
The book also cites a study which says that polls show the public values integrity and transparency at an “ever-higher premium,” though I wish it were more exact. Of course everyone detested Reagan’s notpology for Iran-Contra (also mentioned).
It does make sense when you think about it. You make a mistake, so the opposition’s argument is–what?–that they don’t make mistakes? If it’s an honest blunder–not negligence or part of a pattern–then it doesn’t have that much sticking power in the hands of opponents.
I agree that being honest is always good. In fact, some health providers have not only an admission policy but also a policy of immediate negotiation for reparations. They found that the average cost is lower if they offer to negotiate damages jointly with the patient, and that mutual satisfaction with the whole process is much higher.
But this is the case where two parties have decided to be honest and found mutual utility in that honesty.
There are many cases where the system is adversarial. There are many upstanding honest whistle-blowers who got shafted. Honesty alone is no protection from being ripped to shreds in our system.
Heck, if honesty won, our political parties would completely rock and everything would be awesome. Unfortunately it’s not that simple.
If a politician ever admits they lied or were wrong, they are basically out of the system. Ads will run with just the sound bite: “I lied”. The reelectability of confessing transgressors is zilch. It is wrong and bad, but that is how it is. And I’m not defending Reagan or anything. I’m saying that we have a system that discourages honesty. It does not broadly reward it.
We do have to fight for honesty, and we do have to guard ourselves even when we are honest. Being afraid of unfair attack after having made oneself completely vulnerable is not completely irrational.
We should strive to make it irrational though.
But all that said, the main part of it all is simple psychology. Some guilt is too big to admit. Do you think that in the anti-vaxx debacle Jenny McCarthy will ever come out and say, “I was all wrong, and I bear part of the responsibility for hundreds of deaths and preventable illnesses, and I’m sorry”? People that deeply invested in one story have a hard time coming out and simply stating what would be the right and honest thing. Anna Freud discovered the strength of these psychological defense mechanisms in children. It’s rare to see people grow out of them.
I agree that the situation is pretty forlorn, but perhaps not that bad. Obama admitted candidly that the Daschle appointment was a mistake, and we haven’t much heard about that episode since. Lots of politicians survive sex scandals, and at least one survived a manslaughter charge (Ted Kennedy—but probably a bad example since he’s a Kennedy). And many politicians still in office frankly admitted to being mistaken about invading Iraq (which however is one case where I think they should be thrown out).
But generally it is difficult to be hopeful about politics or anti-vax or any of the craziness around us. Mankind’s psychological maturity has not kept pace with our technological advancements. We are an ancient, irrational tribal folk trying to make do in a high-tech globalized society.
[…] Will is that Oedipus, that guy who did most of the legwork to out you for the fake you are, had a post that speaks to exactly the predicament you find yourself in now posted on his blog directly after […]
Interesting post. Love the pyramid graphic. However I think there may be another element that needs to be included.
From time to time an opening appears in the pyramid that allows a person to travel directly from far down one side way over to the other side. I have seen this many times.
Perhaps someone is a fierce supporter of a political ideology. They start sliding down into the unreality of that position based on bad information. Then, for whatever reason, they find themselves in an environment that so clearly proves their position wrong that they are instantly ported to the other side of the pyramid.