Can Curiosity Kill The Sciences?

There’s a book out there that seems to be attracting lots of lightning bolts (Steven Pinker’s Enlightenment Now). GG is not interested in reading or discussing that, per se. It sounds as though logic and empirical observation got confused in there (they are not the same). What got his attention was one of the responses, by Ross Douthat of the New York Times, who essentially argues that smugness by those who purport to know better will stifle real science. The nub of the argument is in this quote:

I’m reasonably confident that both of the stranger worlds of my childhood, the prayer services and macrobiotic diet camps, fit his [Pinker’s] definition of the anti-empirical dark. And therein lies the oddity: If you actually experienced these worlds, and contrasted them with the normal world of high-minded liberal secularism, it was the charismatic-religious and “health food” regions where people were the most personally empirical, least inclined to meekly submit to authority, and most determined to reason independently and keep trying things until they worked.

Basically, he argues that these are the most empirical people: the ones really out there being curious, the ones really sparking science.

There is a grain of truth there. If everyone in society passively accepts what the Doctor Authority Figure Type (DAFT) tells them, we aren’t going to get far. For a long time, explanations of the world were attempts to logically extend notions from really old DAFTs. So yes, curiosity and intellectual ferment are good for making progress.

But there is empiricism and then there is empiricism. Doing empirical tests like seeing where in your garden the carrots grow best is a pretty clean experiment with a pretty clear outcome. But what Douthat describes are people who are trying everything to get healthier or avoid death. Presumably some in his experience got healthier by praying; some by eating macrobiotic foods. And no doubt some did not. When you figure in the complexity of human medicine and fold in the amazing strength of the placebo effect, you expect quite a number of people to find a cure in things that, frankly, are not curative. Thinking you can find a better way is a pretty universal behavior: Steve Jobs, hardly an idiot, initially rejected modern medicine for his pancreatic cancer. All are free to explore this with their own lives, but there is a point where society suffers, and presumably this is what Pinker might have been driving at (remember, GG is not reading that book). But, you ask, when is it bad to ignore the DAFTs out there?

Frankly, it is when you mistake your empirical skills as producing results superior to those from people better trained than you, especially if you aren’t willing to put in the elbow grease to overcome your initial deficit. If you think you are a highly skilled analyst based on your successful treatment, say, of a wart by eating fig leaves, and so reject out of hand what DAFTs say, you are a fool. In the age of Big Data and Google, more and more people are simply assuming that they can find the better cure, the real story, the true science by applying their own noggin to some problem. And so we emerge with people convinced that GMOs are inherently evil, that vaccines are bad, that climate is not changing or is changing for natural reasons, or that evolution doesn’t happen. And this is troublesome because we have people who work in quite rigorous ways to examine these questions; substituting our unchanneled curiosity for their expertise is a bug, not a feature.

Now again, Douthat has a point, and what you’d like to do is find a way to encourage curiosity without losing respect for those doing the job well. Consider two examples, one from seismology and one from climate change.

Seismology first. A segment of the U.S. population, dominantly along the West Coast, thinks they have discovered a way to predict earthquakes (er, well, discovered many ways, as each individual has their own special sauce). Some are silly: one woman was convinced a big earthquake was about to happen because there were a lot of flies on the wall, just like before the one big earthquake she had experienced. (This is a common human effort to identify dangerous situations: lots of details from such events stick in your mind, and you search them for clues to avoid the next time that danger appears.) Others are more methodical but are often covering ground long ago found to be fallow. Many are genuinely curious but unskilled. How do you tell them they are wrong without just being a DAFT?

One approach is to give them some tools to really test their ideas rigorously. For many years, Alan Jones at SUNY Binghamton would tell prospective earthquake predictors on an earthquake newsgroup just what the chance was of a given prediction being right by random luck. Part of this was forcing predictors to actually make the prediction precise. Predicting a magnitude 5 somewhere in the world in the next day? Pretty near 100% chance of being right by chance. A magnitude 7 within a 100 km radius within a 12-hour window? Pretty darn low; hit that and we’ll talk.
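To see why those two odds differ so wildly, you can do the back-of-the-envelope version yourself. Below is a minimal Python sketch of that kind of baseline calculation (this is not Jones’s actual code, just the flavor of it): it treats earthquakes as a Poisson process spread uniformly over the globe, using rough order-of-magnitude global rates. Both the rate numbers and the uniform-coverage assumption are illustrative simplifications; real seismicity clusters along plate boundaries, so a careful evaluation would use regional catalog rates.

import math

# Rough global annual counts of quakes at or above a given magnitude
# (order-of-magnitude averages for illustration; real catalog numbers vary).
GLOBAL_RATE_PER_YEAR = {5.0: 1500.0, 6.0: 150.0, 7.0: 15.0}

EARTH_AREA_KM2 = 5.1e8   # Earth's total surface area, km^2
HOURS_PER_YEAR = 8766.0  # ~365.25 days

def chance_of_random_hit(magnitude, radius_km=None, window_hours=24.0):
    """Probability that at least one qualifying quake occurs by chance
    alone, modeling quakes as a spatially uniform Poisson process."""
    rate = GLOBAL_RATE_PER_YEAR[magnitude]  # events per year, worldwide
    if radius_km is not None:
        # Scale by the fraction of Earth's surface the prediction covers.
        rate *= math.pi * radius_km**2 / EARTH_AREA_KM2
    expected = rate * window_hours / HOURS_PER_YEAR  # Poisson mean
    return 1.0 - math.exp(-expected)

# "M5 somewhere in the world, next day": nearly certain by chance alone.
print(chance_of_random_hit(5.0))                                    # ~0.98
# "M7 within 100 km, within 12 hours": roughly one in a million.
print(chance_of_random_hit(7.0, radius_km=100.0, window_hours=12))  # ~1e-6

The point of the exercise: a prediction only starts to mean something when it reliably beats the baseline that blind luck provides, and making the prediction precise is what lets you compute that baseline at all.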

What happened varied. A few folks just couldn’t believe they were wrong and would continue to make prediction after prediction, ignoring that they were either predicting things about as certain as the sun rising in the morning or moving the target after the fact (that earthquake was only 300 km away from my area! And only half a magnitude too low! That was a hit!). But many, maybe most, when given real evaluation tools could see that their predictions were not working. They were disappointed, of course, but they took it well because they were allowed to really test their ideas.

A while back, GG mentioned a more professional case. Richard Muller, a Berkeley physicist, was a darling of the right for claiming that climate scientists were making horrible mistakes in estimating global warming. Now, he could have simply continued saying that these climate scientists were idiots, and at minimum he’d have collected lots of frequent flyer miles traveling to and from Congressional hearings. Instead he did the unexpected and tried to rigorously test his own ideas…and found that, if anything, the climate scientists were too conservative.

In both cases, skeptics of DAFTs could take matters into their own hands, and as a result they learned why the DAFTs were saying what they said. But you have to be willing to exert some effort, and a big part of that is making your opinion falsifiable. What is the evidence you need to collect? How do you collect it so you aren’t just reinforcing your original opinion? This is hard work, and here’s the thing: you have to be willing to do it. Just saying “hey, I’ve reached a different opinion, prove me wrong” is insulting to somebody who has worked for years on a topic; don’t expect them to drop everything to prove you wrong, especially if there is a train of other folks behind you making the same demands. That isn’t their job. But they might be willing to point to some tools you can use to explore and learn on your own.

As with many things in society, there has to be a balance between accepting what an authority figure tells you and questioning it. Being curious and making personal explorations is fine, but disregarding expert opinion simply because you’ve made your own guess is courting disaster. Raw unbridled curiosity is not the same as encouraging science; perhaps exploration of pseudoscience is a side effect of a culture willing to make scientific advances, but it is not the rootstock of science itself.

Long ago, a student of paleontologist Stephen Jay Gould was found to be a creationist. Some were surprised at Gould’s response. Let him stay, Gould said. If he can prove he is right from the evidence, that’s fine, but he has to play by the same rules as everybody else. So don’t expect the medical establishment to suddenly start recommending your wart cure because your wart went away; if you want it to be believed, best to design some good experiments to prove it.


2 responses to “Can Curiosity Kill The Sciences?”

  1. Paul Braterman says:

    Reblogged this on Primate's Progress and commented:
    I am in the middle of a series of posts about scientific method, so this seemed very much to the point. Especially: how do you avoid acting like a Doctor Authority Figure Type (DAFT) while still defending the value of expert evidence over anecdote (and, I would add, over ideology)?


  2. David McKnight says:

    Well, some DAFT types will remain daft whatever we do, but surely we cannot deviate from the advice, or rather ‘mantra’, that we encourage the maximum effort to apply scientific method on any and every occasion. We do not suddenly get wise through science. If we miss one observation, if we do not repeat enough times or collate and compare the results of others, then we cannot expect Rolls-Royce results. Encouragement to be religious about science is the only defence against people who hate science, or want to see holes in it, or at times wish for its downfall. Give them praise for having an hypothesis in the first place. Do they know what that implies? Encourage through every stage of the process until they want success to come from their own efforts, and then let go of your mentoring.

    A far more interesting challenge is “Are people born with curiosity, or can it be developed, and how?”

