Confirm bias or desire?
A piece in the New York Times by the authors of a paper in press in the Journal of Experimental Psychology: General has kind of baffled the Grumpy Geophysicist for a few days now. It argues that confirmation bias is not a problem, but desirability bias is. In essence, you favor new information that aligns with what you want to happen rather than what you think is true.
Now, had you asked GG before this to define confirmation bias, he might have said “favoring information that says what you want it to say.” What the paper says, though, is that this describes desirability bias. To tease the two apart, you need a situation where what you want and what you expect are two different things [frankly, though, this is a capsule definition of a pessimist].
The experiment described used the desired and anticipated results of the last U.S. presidential election as expressed by 811 participants (89 others were disqualified, 48 of them for saying they had made a mistake or been dishonest). If you believed candidate A was likely to win but wanted candidate B, and you were given information indicating that candidate A was ahead in the polls, you didn’t change your estimate of who would win by much. If you saw information that candidate B was ahead, you gave candidate B a substantially greater chance of winning. The authors then assert that confirmation bias isn’t an issue, but desirability bias is.
It isn’t hard to expect a disconnect on other topics. Do you want the climate to change and likely disrupt society? Probably not, yet many people exposed to evidence that the climate is warming increase their belief that it is warming, no? This possible conundrum didn’t slip by the study’s authors, who wrote (R1 version of their manuscript):
When confronted with new information regarding global temperature increase, strong believers updated their beliefs more upon receipt of ostensibly undesirable information (i.e., a faster temperature increase than expected), whereas weak believers updated their beliefs more upon receipt of ostensibly desirable information (a slower increase than expected). Though this pattern appears consistent with an independent confirmation bias, such an outcome may emerge when individuals are personally invested in “being right”—indeed, for many climate change activists a belief that the world is warming constitutes a core part of their identity (Stern et al., 1999). For such people, objectively undesirable (but confirming) information about the rate of global warming may be subjectively desirable: vindicating their commitment to combatting climate change (Sunstein et al., 2016) and affirming their cultural group identity (Kahan et al., 2012).
In other words, those anticipating climate change want to be proven right, so their acceptance of evidence confirming their evaluation that climate change is occurring is because they desire to be right more than they desire the climate to not change. Um, precisely how is this different from confirmation bias again? Is confirmation bias supposed to be free of emotions? It feels like you can always make it seem as though desirability bias is at the root, making the term confirmation bias irrelevant.
The op-ed closes with this summary: “Our study suggests that political belief polarization may emerge because of peoples’ conflicting desires, not their conflicting beliefs per se. This is rather troubling, as it implies that even if we were to escape from our political echo chambers, it wouldn’t help much. Short of changing what people want to believe, we must find other ways to unify our perceptions of reality.”
Sorry, but this is feeble. It implies we are prisoners of our present beliefs. This is the mindset underlying the mantra that science advances one funeral at a time–a mindset precisely contrary to what science should be. If this is so, how exactly did acceptance of same-sex marriage advance while the echo chambers kept up their respective drumbeats? A lot of folks who are OK now with same-sex marriage don’t personally want it or even like it, yet they have come to feel that allowing it is the fair thing to do. How did they come to change their minds if they didn’t favor it all along? At a more general level, how could we ever recognize hazards? Why would you want to believe that DDT killed birds? Why would you want to believe that humans created an ozone hole?
Hell, how did all those study participants ever reach the point where they expected their preferred candidate to lose? Doesn’t the existence of those folks somehow disprove the rigidity of this hypothesis?
GG’s feeling is that in parsing the question of belief versus desire, the study’s authors have drawn a distinction with little value. The cute example they used feels artificial.
Maybe instead of studying why we don’t change our minds, we need to study why we do. There are folks working on this (GG has noted one example before). Hopefully all they desire is to get a correct answer–then their desirability bias will work for us all.