Philip Kitcher, Has Science Journalism Helped Unmask a “Replication Crisis” in Biomedicine?, LARB, Nov. 28, 2019. From the article:
During the past eight years, many astute people, inside and outside the scientific community, have worried about the quality of scientific research. They warn of a “replication crisis.” [...] What is going on?
Explanations typically fall into three categories. One possibility is that contemporary science, at least in some domains, is full of corrupt and dishonest people who routinely commit fraud, making up data for experiments that were never performed, or misreporting the results they have actually found, or tweaking their graphs and prettifying their images, and so on. [...] A second possibility is that incompetence or sloppiness is at play. As in Nick Carraway’s verdict on Tom and Daisy Buchanan, biomedical and psychological researchers are quite simply careless people who make a mess for others to clear up as best they can. And the third possibility: Neither fraud nor lack of rigor is responsible for the problem. Investigating some kinds of scientific questions may simply be devilishly difficult, sensitive to myriad factors that are hard for scientists to survey and control. In this case, the difficulties of replication represent the growing pains of an area of research as it struggles to achieve stable and reliable findings.
It's the third possibility that interests me:
But it is far from obvious that fraud or sloppiness lies behind most cases in which results prove difficult to reproduce. In fact, most scientists can report how, despite admirably conscientious procedures, they themselves have sometimes been unable to replicate experimental results they had obtained in one place or at one time. Relatedly, the tacit or unconscious knowledge of the laboratory investigator can have an impossible-to-discern impact on results. Recognizing the role of this tacit knowledge is one of the great achievements of recent sociological studies of science. However carefully a given researcher tries to describe how she had performed an experiment, the “methods” section of the published article will inevitably omit certain details. Indeed, she may be quite unaware of the tiny, but consequential, features of her laboratory practice that are crucial to the — repeatable — result she has found.
This point is worth further emphasis. It is actually part of almost everyone’s experience: few of us pass through high school science classes without, at some stage, failing to set up and run an experiment appropriately. Similarly, beginner cooks frequently can’t make a recipe work. And novice gardeners may over- or under-water. Most of us can’t assemble furniture from the parts delivered in the box without experiencing some frustration. It’s hardly surprising, then, that everyday difficulties are magnified when scientific investigation is at its frontiers and the experimental work envisaged outruns established conventions. In much biomedical and psychological research, investigators struggle for months and years to obtain acceptable data. Findings obtained on one occasion or in one sample may be at odds with those delivered by others. Only after much adjusting and tinkering do researchers finally arrive at a result they take to be stable. When others then try to repeat what has been done, the would-be replicators sometimes do not invest the time required to generate that same stability. Indeed, even when the original investigators themselves later attempt to redo the experiment, they more often than not have lost the skills they had built up in the initial long process of modification and tweaking. Like a tennis player who returns to the courts after a significant absence, they are rusty.
The rest of the piece goes on to discuss two books: Richard Harris, Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions, and Nicolas Chevassus-au-Louis, Fraud in the Lab. Kitcher is a bit skeptical of both, in effect, because neither author even considers the fundamental difficulty of doing science well, and so neither makes an effort to sort out the relative contributions of those three sources of flaws in the scientific record: fraud, incompetence, and difficulty.
He concludes:
Science journalism is crucial to democratic societies, whether or not it explains the details of new scientific findings or reports on a general feature of the scientific enterprise. Plato famously thought democracies would end in disaster since the majority of the citizens are too unintelligent to think through the issues confronting them. His elitism was wrong. But, as the world has learned, ignorance, often fed by misinformation, can be as toxic as stupidity. Had the message from climate science been clearly enunciated to the public two or three decades ago, our species might well have moved beyond bickering about the reality of anthropogenic global warming. We might now have been in the thick of discussions on hard policy questions that arise in the course of trying to preserve our planet.
Journalism can do much good, but also considerable harm when it lapses. Yet, as I acknowledged, delivering clear messages that capture and retain the attention of lay readers is exceptionally hard. News media and individual journalists are constantly tempted to fall into narrative traps, provide simple slogans, tell catchy stories, add human color, portray research as an exciting horse race, and pretend that issues remain open long after the evidence has closed them. That's the way to create clickbait, raise newspaper subscriptions, or sell books. As things now stand, science journalism suffers from the same perverse incentives to cut corners that both Chevassus-au-Louis and Harris identify in the social structure of biomedical research. In this case, the corner-cutting consists in not doing anything that might tax the reader. Never analyze. Never present a sustained line of reasoning. Entertainment is everything.
Schools of journalism might try addressing the problem by more actively seeking out students with strong backgrounds in science, offering them rewards for undertaking the training required for writing in ways that are informed, enlightening, and vivid. They might develop and inculcate Slow Science Journalism.
H/t 3QD.