The importance of stupidity in scientific research

I don’t agree with this piece in its entirety, but I stumbled across it while doing literature searches and thought it would make a great article for Climate Shifts. Titled “The importance of stupidity in scientific research”, the piece below is an editorial published in the Journal of Cell Science by the microbiologist Martin Schwartz, and it makes for interesting reading:

I recently saw an old friend for the first time in many years. We had been Ph.D. students at the same time, both studying science, although in different areas. She later dropped out of graduate school, went to Harvard Law School and is now a senior lawyer for a major environmental organization. At some point, the conversation turned to why she had left graduate school. To my utter astonishment, she said it was because it made her feel stupid. After a couple of years of feeling stupid every day, she was ready to do something else.

I had thought of her as one of the brightest people I knew and her subsequent career supports that view. What she said bothered me. I kept thinking about it; sometime the next day, it hit me. Science makes me feel stupid too. It’s just that I’ve gotten used to it. So used to it, in fact, that I actively seek out new opportunities to feel stupid. I wouldn’t know what to do without that feeling. I even think it’s supposed to be this way. Let me explain.

For almost all of us, one of the reasons that we liked science in high school and college is that we were good at it. That can’t be the only reason – fascination with understanding the physical world and an emotional need to discover new things has to enter into it too. But high-school and college science means taking courses, and doing well in courses means getting the right answers on tests. If you know those answers, you do well and get to feel smart.

A Ph.D., in which you have to do a research project, is a whole different thing. For me, it was a daunting task. How could I possibly frame the questions that would lead to significant discoveries; design and interpret an experiment so that the conclusions were absolutely convincing; foresee difficulties and see ways around them, or, failing that, solve them when they occurred? My Ph.D. project was somewhat interdisciplinary and, for a while, whenever I ran into a problem, I pestered the faculty in my department who were experts in the various disciplines that I needed. I remember the day when Henry Taube (who won the Nobel Prize two years later) told me he didn’t know how to solve the problem I was having in his area. I was a third-year graduate student and I figured that Taube knew about 1000 times more than I did (conservative estimate). If he didn’t have the answer, nobody did.

That’s when it hit me: nobody did. That’s why it was a research problem. And being my research problem, it was up to me to solve. Once I faced that fact, I solved the problem in a couple of days. (It wasn’t really very hard; I just had to try a few things.) The crucial lesson was that the scope of things I didn’t know wasn’t merely vast; it was, for all practical purposes, infinite. That realization, instead of being discouraging, was liberating. If our ignorance is infinite, the only possible course of action is to muddle through as best we can.

I’d like to suggest that our Ph.D. programs often do students a disservice in two ways. First, I don’t think students are made to understand how hard it is to do research. And how very, very hard it is to do important research. It’s a lot harder than taking even very demanding courses. What makes it difficult is that research is immersion in the unknown. We just don’t know what we’re doing. We can’t be sure whether we’re asking the right question or doing the right experiment until we get the answer or the result. Admittedly, science is made harder by competition for grants and space in top journals. But apart from all of that, doing significant research is intrinsically hard and changing departmental, institutional or national policies will not succeed in lessening its intrinsic difficulty.

Second, we don’t do a good enough job of teaching our students how to be productively stupid – that is, if we don’t feel stupid it means we’re not really trying. I’m not talking about ‘relative stupidity’, in which the other students in the class actually read the material, think about it and ace the exam, whereas you don’t. I’m also not talking about bright people who might be working in areas that don’t match their talents. Science involves confronting our ‘absolute stupidity’. That kind of stupidity is an existential fact, inherent in our efforts to push our way into the unknown. Preliminary and thesis exams have the right idea when the faculty committee pushes until the student starts getting the answers wrong or gives up and says, ‘I don’t know’. The point of the exam isn’t to see if the student gets all the answers right. If they do, it’s the faculty who failed the exam. The point is to identify the student’s weaknesses, partly to see where they need to invest some effort and partly to see whether the student’s knowledge fails at a sufficiently high level that they are ready to take on a research project.

Productive stupidity means being ignorant by choice. Focusing on important questions puts us in the awkward position of being ignorant. One of the beautiful things about science is that it allows us to bumble along, getting it wrong time after time, and feel perfectly fine as long as we learn something each time. No doubt, this can be difficult for students who are accustomed to getting the answers right. No doubt, reasonable levels of confidence and emotional resilience help, but I think scientific education might do more to ease what is a very big transition: from learning what other people once discovered to making your own discoveries. The more comfortable we become with being stupid, the deeper we will wade into the unknown and the more likely we are to make big discoveries.

5 thoughts on “The importance of stupidity in scientific research”

  1. Thanks for sharing this, Ove. One passage that I found particularly insightful was, “I don’t think students are made to understand how hard it is to do research. And how very, very hard it is to do important research.” I’ll save this for my PhD students and myself.

  2. Very insightful. I’ve copied some lines from it and will try to reread them whenever fear of feeling stupid stops me from doing important things.

  3. On the cartoon: something similar that I have heard three times is the claim that Y2K was a beat-up over nothing. I have heard it cited as a precedent: a hysterical warning about large-scale disaster that ultimately amounted to nothing. One denialist asserted that a similar thing was happening with warnings about AGW.

    The premise is faulty, though. When I worked in IT in the lead-up to Y2K, we did see instances of software failing because of date-related issues. These often happened in the months before Dec 31.

    But people didn’t see nuclear reactors run away or planes fall from the sky, so I guess some drew the conclusion that it was a beat-up.

    The lesson I took from Y2K instead was that, through early and proactive intervention, we were able to fix problems before they arose. Prevention is better than cure, right? We shouldn’t wait for foreseeable problems to manifest themselves before taking steps to avert them. I use my brakes before hitting the car in front, rather than hitting it and then going to the panel-beater. These days, anyway.

    Please use these thoughts when debating someone who likens climate change to Y2K.
