“It wouldn’t obstruct replication, but it does give rise to an error in replication, so that the newly formed DNA strand carries with it a mutation and you’ve got a virus again.”
When I was 13 years old I watched a movie on television called Blade Runner. I had never heard of the movie and didn’t know at the time that it was a famous, ground-breaking science fiction film by a highly respected filmmaker. I just thought it was a cool-looking movie. It was truncated by commercial breaks, I was watching it on the 12-inch TV we had downstairs, and I missed the first twenty minutes, but nonetheless I was completely absorbed. I didn’t exactly understand a lot of it, but I kept watching to the end. When it ended I knew I had seen something of importance, even though I couldn’t have articulated what that was. One line in particular stuck in my mind, the line quoted above. Eldon Tyrell speaks these words to his robot creation, the Teutonic assassin Roy Batty, during a complex (and, apparently, scientifically accurate) debate the two have about the logistics of genetically modifying a living organism to extend its lifespan. The seemingly innocuous phrase “an error in replication” in particular stayed with me.
It came back to me tonight after reading my new issue of The Economist. Because I am over forty years old, my copy arrived printed on actual paper, delivered by a postal worker wearing a Canada Post uniform. Apparently, however, The Economist is also available in an excellent digital format. Either way, the cover story this week is titled “How Science Goes Wrong.”
There’s a lot of cynicism about science around these days, most of it from people who communicate with the dead, who use crystals to cure cancer, or who insist that anything “natural” must be good for you to eat (not realising, apparently, that a comprehensive list of natural substances would include uranium, arsenic, coal, and lead, to name but a few). So I’ll admit, I don’t have a lot of patience for people who have little, if any, formal knowledge of the physical sciences yet say things like “What do scientists know? They’re all biased just like everyone else.” Ordinarily I would probably just ignore a cover story titled “How Science Goes Wrong”, if it weren’t the cover story of a reputable publication like The Economist. When The Economist runs it, I take notice. The story was fascinating reading, both the leader itself (link above) and the main piece, titled “Trouble at the Lab”.
Let me stress that although I have three university degrees (shameless self-promotional plug, I know) none of them are in science (or the close cousin of science, mathematics). My training is in literature and education, so in any kind of debate on science I am just an ordinary guy who knows only what he reads in the newspapers (readers of The Economist will know that although The Economist is obviously a magazine, its editors insist on the strange conceit of referring to it as “this newspaper”). As Confucius himself once said, “I do not innovate, I merely transmit.” If it’s all right for him, it’s all right for me. So let me share with you one central point in The Economist piece that I found fascinating.
Let’s say that I’m a scientist and I’ve made a remarkable discovery: I have found that water is actually composed of two distinct elements, hydrogen and oxygen. I publish my findings in a scientific journal. That’s the end of it, right? Well, not quite. What happens next is that lots of other scientists attempt to replicate my experiment to see if they get the same results. If they do, it strengthens my claim; if they don’t, it weakens it. This is what is called “verification” in science and it is designed to keep science as honest and self-correcting as possible. And since scientific claims can be empirically tested (unlike religious doctrines, cultural folktales, or artistic expressions) the act of replication should, in theory, keep science free from error.
The problem is that there is not enough verification going on in science these days, for a distressingly simple reason: performing replication experiments doesn’t help one’s career. Unlike a doctorate in the humanities (whose holders often end up working part-time at Starbucks while teaching night classes at the local community college, all the while dreaming of one day paying off that $100,000 student loan), a PhD in science can lead to a six-figure salary in a relatively short period of time, along with a degree of status and prestige that used to be accorded poets, philosophers and theologians (see previous comment about Starbucks and community colleges). Thus science is now a highly competitive institution. Biologist Craig Venter, famous for his role in sequencing the human genome, once described academia as vastly more cut-throat than the business world (Venter is also a wealthy entrepreneur, so he knows whereof he speaks). You would think that the competitive nature of science would be a good thing for testing other scientists’ results, but not quite, and The Economist explains why:
“[R]eplication is hard and thankless. Journals, thirsty for novelty, show little interest in it . . . . Most academic researchers would rather spend time on work that is more likely to enhance their careers. This is especially true of junior researchers, who are aware that overzealous replication can be seen as an implicit challenge to authority. Often, only people with an axe to grind pursue replications with vigour–a state of affairs which makes people wary of having their work replicated.”
To use a humanities analogy, it’s kind of like this: Everyone wants to write a best-selling novel. That’s how you get rich and famous. There is no money in fact-checking, proofreading and editing someone else’s novel; the real money is in original writing. Everyone knows J. K. Rowling and Stephen King, but can you name their proofreaders? Scientists don’t want to bother conducting tedious verification experiments; they want to do original research. Yet, paradoxically, without the verification process, the results of the original research are less trustworthy.
“Trouble at the Lab” (or at least the one-page leader essay, “How Science Goes Wrong”) should be required reading for all students of the physical sciences. Science teachers, however, often run into a practical dilemma: although they would like to explore these issues in depth, they are often limited by very content-heavy (and exam-driven) courses.
And to be perfectly technical about it, reading and discussing an essay from The Economist isn’t science. The subject of the essay concerns science–meta-science, you could call it–but studying the essay isn’t actually the “doing” of science, which is what science classes are for. And since high schools (for the most part) lack courses in Philosophy of Science or Social Implications of the Breakdown in Scientific Methodology, I would respectfully suggest that English teachers (and Social Studies teachers as well) would be well-advised to take hold of these meta-science issues and run with them. I plan on reading “Trouble at the Lab” with my English 12 class at some point before the semester ends. After all, the unifying theme of my English 12 course is “Pattern Recognition, Narrative, and the Search for Meaning.” Is there a greater form of pattern recognition than science?
And just in case you accuse me of being a shill for The Economist, let me stress that I am immediately suspicious of any publication that fascinates me so greatly. So I looked for, and quickly found, a scathing rebuttal, brilliant and eloquent (and almost as convincing as The Economist piece itself, argh) by science journalist Faye Flam (who herself holds a degree in geophysics). Her essay appeared on M.I.T.’s science journalism webpage, and it is an essential read after reading The Economist. Her rebuttal is cleverly titled, “How Science Writing Goes Wrong.”
As for me, I think I’ll go back and dust off my old DVD copy of the 1992 Blade Runner Director’s Cut. (I used to think “old DVD” was an oxymoron; now it’s a fact of life.) An error in replication indeed . . .