By Theresa Patten, PhD
This is the second post in our series about how scientific findings are communicated and the consequences thereof.
“Do your own research.” It sounds harmless, even admirable. But dig a little deeper, and you’ll find that this philosophy is a major antagonist in the fight against scientific misinformation. Anti-vaxxers “did their own research” when they found a Lancet paper (later retracted) that claimed a link between the MMR vaccine and the onset of autism. Despite the paper’s retraction and the absence of any scientific basis for its claims, a nationally representative survey conducted in 2019 found that 18% of respondents believed vaccines cause autism. Ivermectin, an anti-parasitic drug most commonly used in veterinary practice, received attention after a few early studies suggested it might prevent and treat COVID-19 infection. It certainly didn’t help when Joe Rogan, currently the most popular podcast host on Spotify, shared “his own research” and personal belief in ivermectin’s efficacy. Much of that early ivermectin research was later retracted by its authors or subjected to serious criticism. Still, thanks to the initial wave of hype, the CDC reported a 24-fold increase in ivermectin prescriptions in August of 2021 compared to the pre-pandemic baseline. Alongside this sharp rise in prescriptions, poison control centers and the FDA raised concerns about the number of individuals reporting medical complications and hospitalizations following inappropriate ivermectin use.
Scientific misinformation comes in various forms. Some of it comes from non-scientists spreading personal beliefs about a topic in the absence of expert knowledge. Some originates from (supposedly) scientific sources that did not intend to deceive but were later proven wrong. The examples above highlight how misinformation bolstered by “cherry-picked science” and famous personalities threatens individual and public health. “Do your own research” is a wonderful idea, in theory. But the reality is that, in 2023, the world is far too rich with information for us to find and understand it all without some help. When left to our own devices, we tend to seek out and believe more strongly in information that confirms our pre-existing beliefs. These tendencies make it difficult for non-experts to draw unbiased conclusions about unfamiliar topics. Instead, we are encouraged to turn to scientists who are experts in their fields, like Dr. Anthony Fauci during the pandemic, or to consensus reports that summarize our breadth of knowledge on a topic, like the IPCC’s report on climate change.
But if we expect Americans to turn to these kinds of vetted resources to address their scientific questions, it is essential that they trust scientists. Survey data show that science is held in fairly high regard by the American public relative to other institutions like the press and government. But why? What have scientists done to earn this trust? For a long time, we haven’t had to do much. Pleased that the public approves, we have mostly sat back and cashed in on America’s romantic image of the scientist as an earnest and bookish Nobel Laureate. Now, though, it seems that times have changed. The Pew Research Center reports that Americans’ confidence in scientists dropped in 2021 and 2022, after an upward trend in the 2010s. We can no longer afford to sit back and let public trust in science dwindle.
So what’s next? How can we build and maintain trust in science as misinformation runs rampant on the internet? For these “simple” questions, I spoke to Dr. Carl Bergstrom. Dr. Bergstrom is a professor of biology at the University of Washington who has devoted part of his career to the fight against scientific misinformation. Until recently, his strategy focused on improving data reasoning. He designed an undergraduate course, "Calling BS," where non-science majors learn to understand and use numeric information to make informed conclusions. He later wrote a book with his colleague, Dr. Jevin West, based on similar concepts. While Dr. Bergstrom still believes that improving his students' data reasoning is important, he no longer believes it is the silver bullet that will end all science misinformation. And it is certainly not an effective strategy for building general trust in science. What is his current focus, then? You might be surprised, as I was, to learn that it’s civics.
If you're like me, when you hear "civics," you are transported to your middle school classroom, watching a Schoolhouse Rock video in which a talking “Bill” dreams of becoming a law. Although we often immediately associate civics with government, its definition, "a social science dealing with the rights and duties of citizens," can easily be applied to any social structure, including science. Think about what your civics class probably taught you. You may have had to memorize the number of seats in the House of Representatives and how vetoes work. But your major takeaway was most likely that our government has checks and balances, and because of that, we can generally trust its ability to move us (sometimes on a very winding path) toward a better society. Science, too, has rules, norms, and checks and balances. Some examples include the peer review process, the expectation that a groundbreaking experiment will be reproducible by others, and the high bar needed for a theory to become scientific consensus. However, most students don’t learn about these concepts unless they continue to study science at the graduate level and conduct scientific research of their own. Dr. Bergstrom suggests that this failure to teach the civics of science is how we also fail to teach “why we ought to believe in science and why science as an institution is generally credible.”
My conversation with Dr. Bergstrom led me to two fundamentals of Science Civics 101 that I think could be among the most important to help non-scientists better understand and trust the norms and processes of science: (1) Emphasize the "Scientific Lifecycle" over the "Scientific Method" to build trust in science and (2) Teach people to be comfortable with the scientific uncertainty that always exists when discoveries are first reported.
1. Emphasize the scientific lifecycle over the scientific method to build trust in science.
In our science classrooms, we learn from textbooks that report scientific discoveries that have already become consensus. The earth is spherical. Water is made of hydrogen and oxygen atoms. Photosynthesis requires light. But, we are not taught how these hypotheses were developed, what experiments were performed to validate them, or why alternate theories were abandoned. By simplifying our presentation of science, we may give the illusion that science moves along a straight path and results in a list of explicit facts that can be printed in a book. The scientists reading this post are undoubtedly scoffing at this misrepresentation. They know an idea’s complex and often unsuccessful journey as it moves from hypothesis to consensus. But do people who have never worked in a lab know that? Maybe not. Perhaps, by changing our focus from teaching the individual-level concept of the scientific method to the community-level idea of the scientific lifecycle, we could demystify the checks and balances that inherently exist within science.
When the scientific method came up during my conversation with Dr. Bergstrom, he said, "It is not at all, in my opinion, why science is fairly reliable. It's reliable because of the way it functions as a social institution.” You might remember the scientific method from your own education. It is the idea that a single researcher should make an observation, develop a hypothesis, test it, and report the results. This concept has value. For example, it might teach a student to be inquisitive or when to trust the claims of other students. But it is unlikely to increase trust in science as a whole.
Now, consider the scientific lifecycle, which is roughly illustrated in the flowchart below. It starts with an idea, which becomes a hypothesis. If many other scientists collect evidence in support of this hypothesis, it could become a theory. Then, if this theory is backed up by data generated by many more scientists and verified using various techniques over decades, it could become a consensus. This more complex and realistic depiction of information flow through science gives a better sense of the number of people involved in a single consensus discovery and the frequency and duration with which those people are checking and re-checking each other’s work. Additional checks and balances exist within the scientific institution that can be mapped along the lifecycle. For example, science follows norms to determine who is qualified to evaluate certain research (i.e., the notion of expertise) and how to identify untrustworthy reports due to scientific error or misconduct (i.e., the peer review process and post-publication critique). Although these concepts are beyond the scope of the Science Civics 101 flowchart, they are an essential part of science’s system of checks and balances that, if taught universally from a young age, could instill broader trust in scientific research.
2. Teach people to be comfortable with the scientific uncertainty that always exists when discoveries are first reported.
Take a look at the arrows to the right of the flowchart. They indicate the passing of time and the concurrent building of certainty in a hypothesis. The American public does not yet accept this concept of uncertainty in science. Instead, the public perceives scientific uncertainty as either a lack of transparency or incompetence, undermining trust in scientists’ integrity and expertise. Americans' response at the beginning of the COVID-19 pandemic highlighted this discomfort with scientific uncertainty. People were asking legitimate and urgent questions: "How dangerous is COVID-19?", "Will a mask protect me?", "Are my children safe?" Because of the high level of uncertainty that exists at the beginning of the scientific lifecycle, public health leaders weren't providing clear and consistent messages. Not because they didn’t want to, but because clear and consistent answers didn’t yet exist. Making matters worse, when people went out to do their own research, they could easily find freshly published, and therefore highly uncertain, science that validated their pre-existing, and often partisan, beliefs.
Perhaps this should not come as a surprise. As I’ve mentioned, our science textbooks deliver facts and imply that we fully understand each topic at hand. We have chosen only to teach the things we are certain about. Perhaps this method worked fine when trained scientists were mostly the only ones interacting with science at the beginning of its lifecycle, when journal articles were mailed out as periodicals to paying subscribers. Now, science hot off the press is constantly available to anyone with a web browser via open-access journals and preprints. Ultimately, I think democratizing access to scientific research is a good thing, but it requires that we more broadly teach the relative uncertainty at all stages of the scientific lifecycle. People need to understand that when a new topic is first explored, like at the beginning of the COVID-19 pandemic, our confidence in any one publication is extremely low. A single result should be thought of as an intriguing observation and understood to be proof of nothing. Even if that interesting paper turns out to be true in the long run, it is decades and hundreds of publications away from consensus. Furthermore, it is important for people to understand the reverse: once the scientific community establishes a consensus, it is incredibly likely to be true. At that point, almost everyone who has devoted their life to understanding a phenomenon has agreed upon an answer, and we are encouraged to investigate how exactly they came to that conclusion with a bit less skepticism.
There's much more that we need to do to mitigate the many forms of science misinformation, which go beyond the problem of folks falling prey to cherry-picked science. Here, I’ve explored some of the concepts of science civics, which Dr. Bergstrom and other experts believe are critical to prepare Americans for our new world flooded with misinformation. In this new era, everyone interacts with science at all stages along its lifecycle. We need to prepare them for that. Although modifying our science curriculum is far less flashy than a misinformation-fighting bot, I've been convinced that improving Americans’ understanding of Science Civics 101 could reduce science misinformation and misunderstanding.