On “scientific” journalism: An interview with Peter Aldhous

Ask 50 different science journalists how they started writing about science and you’re likely to get 50 different answers. Still, many of my colleagues came to the field with a science background. I did not. It wasn’t until I started writing about environmental health that I cracked a graduate-level molecular biology textbook to understand the sort of disease mechanisms you don’t learn about in a political science seminar. Then I started searching PubMed and was surprised to discover how much information lay locked away in the scientific literature — either sitting behind a paywall or comprehensible only to 100 experts. Though it took quite a lot of time and effort to separate the interesting trend from the interesting artifact at first, I thought readers had a right to know the difference between evidence-based and illusory environmental risks. (Whether communicating evidence increases public acceptance of evidence is a complex story for another day.)

Peter Aldhous

But we can do more than just report on patterns that scientists find. We can unearth those patterns ourselves. Science journalists who come to the enterprise with a science degree are in some ways uniquely qualified to do this. So why don’t more of them do it?

Science journalist Peter Aldhous raised this question on a National Association of Science Writers’ listserv earlier this month, prompted by a discussion of the “sting investigation” by John Bohannon recently published in Science. Bohannon sent a bogus “wonder drug” research paper to 304 pay-to-publish open access journals. More than half of the journals accepted it. (Full disclosure: I work part-time as a senior editor for the magazine section of the open-access journal PLOS Biology.)

Some criticized the story for not using a control group (journals that use the subscription model), while others argued that it’s not Bohannon’s job to run a case-control study. Peter, a contributor to MATTER and Medium who teaches investigative reporting at the University of California, Santa Cruz, Science Communication Program, wrote on the listserv (posted here with his permission):

There are definitely differences between science and reporting, as I discuss with my students at UC Santa Cruz in our classes on investigative journalism. However, I believe passionately that science journalism would be a richer profession if more of its practitioners considered when original data analysis, and even the design and commission or execution of methodologically sound studies, is appropriate.

I asked him to share his thoughts on using data in science reporting and why we don’t see more of it. The conversation was edited for length and clarity.

Those who have criticized Bohannon, mostly scientists, have focused on the lack of a control, calling it a shoddy science experiment. Do you see a problem, from a journalism perspective, with his methodology?

I’ve been a little critical as well. But my criticism, and surprise, was less with him and more with Science, because Science has a dog in this fight. I know there’s separation between the news side and the original research side. I used to work there, so I know that very well. But I think for a publication that is involved in the traditional model of publishing to run an article that focuses particularly on open access just seemed not terribly well thought through. Even if you think there’s likely to be a problem with open access journals lowering their standards of what they accept because of the business model, why not compare it with a traditional model? I just don’t get why you wouldn’t do it.

Some of the response to that has been, well, look, he’s not a scientist, he’s a journalist, you’ve got to look at these two things separately, they’re not the same discipline. I don’t agree. I believe that sound data analysis can be an integral part of journalism. And in that regard, though there are important differences between journalism and science, I believe journalism can include aspects of the scientific method in gathering the information you use to drive an informed piece of journalism. I think it’s a bit of a false distinction.

What, in your view, is one of the most important things an investigative reporter, or any reporter, should keep in mind when doing an investigation?

Just look at what investigative reporters are doing, look at the traditions of investigative reporting. A lot of it involves acquiring material through Freedom of Information and other public records requests. But there is also by now a fairly long tradition of doing methodologically sound data analysis within your reporting. It starts out with a guy called Philip Meyer, who used survey research to look at the causes of the 1967 Detroit riots, arguably something that the city never really recovered from. And he turned explanations for what triggered it on their head by applying methods from social science. In the book he wrote, Precision Journalism, which is a revered tome within the organization Investigative Reporters and Editors, he talks about precision journalism as “scientific” journalism. He means using the methods of science, using sound data analysis to inform your journalism.

Most of the people doing this call themselves investigative reporters. Very few come from a background in science. This is my surprise and disappointment, because a lot of people who are science journalists, like me, were originally trained in science. I happen to have a PhD, but I don’t think it’s necessarily important. But a lot of people have been trained in designing studies, doing statistical analyses, and so on. And I think it’s wholly appropriate for us to keep doing that as we report on the scientific enterprise. I’m not saying you should do a PhD project rather than a news story. There are certain questions, more discrete, more journalistic, that you can get at by applying some of those methods you learned. Unfortunately, we don’t see too much of that, and I think science journalism is poorer for it.

Many science writers take an explanatory or isn’t-science-cool perspective rather than a speaking-truth-to-power journalism perspective. But you’d think people would naturally embrace the same methods they once used to understand how something works in the natural world when reporting a story. Why do you think more people don’t do it?

I think you’ve put your finger on it. This type of journalism does take time and editorial resources. I realize the question of resources is a problem nowadays. However, with some exceptions, I don’t think this stuff was happening to a great extent in the golden days, when there were brimming science sections and generous editorial budgets.

What I think explains it, as you said, is a question of mindset. I think for most science journalists, their model of journalism is explanatory. It’s taking the arcane world of the high priests and priestesses of science and translating what they do into language the ordinary mortal can understand. And I think that’s incredibly valuable and very important if we’re to have an informed society. But it is a different mindset from thinking that part of your job is to keep an eye on these guys and check that science isn’t being used and abused, that there isn’t corruption or fraud. And once you get into that mindset, you’re going to approach things differently. I’d argue that when science journalists have that mindset and wed it to what their training allows them to do, in terms of data analysis and even studies done as part of the story, it can be very powerful.

It does seem a shame that many who are well-suited to this type of reporting aren’t interested in it. But what about journalists who don’t have a science background and might be intimidated by collecting data and then figuring out what to do with it? What words of wisdom can you share with them?

Whether or not you have training in the scientific method, you should be really aware of the danger of running with scissors. What you don’t want to do is flawed analyses. Unless you are absolutely on top of the method you’re using and really, really know you understand what you’re doing, you want to be taking expert advice.

But I don’t think this is fundamentally different from the rest of what we do. As science journalists we comment on papers that have conclusions based on analyses. If we’re doing our job properly, we have to work out whether it was the right way to do it. We do that by talking to independent sources. If we’re not being appropriately critical, questioning the methods of those we’re writing about, then that’s a problem as well.

But in terms of where do you learn the skills if you don’t have the training already, I’d go back to IRE, which is a fantastically training-centered and professional-development-centered organization. If you want to immerse yourself in this stuff and get a hothouse introduction to it in a very friendly and collegiate environment, you can do no better than to go to the annual IRE [and NICAR, National Institute for Computer-Assisted Reporting] meeting, which is in Baltimore in February. The whole meeting is devoted to teaching skills and data analysis in journalism, from knowing how to use a spreadsheet and database work, through geographic information systems and statistical analysis, and even coding to do data analysis. If you think you might be interested in this stuff, come along, I’ll see you there!

When I went to a training session at the New England Center for Investigative Reporting, one of the instructors, Maggie Mulvihill, talked about getting into a “data state of mind.” What does that mean to you?

I give a talk with a slide on exactly that point. I think journalism is all about questions. You don’t typically start a story by aimlessly phoning up sources to shoot the breeze. Stories start because there’s a question. Somebody, somewhere along the line had a question. If you’re doing enterprise scientific journalism, you’re starting with questions about what’s going on in the world. If you’re not in the data state of mind, you’ll think about who you can speak to. And you still absolutely need your human sources to negotiate what might initially be an unfamiliar landscape. But if you’re thinking about data, it’s not just who can I speak to about this, it’s what sources of information are available that I might be able to interview.

When you launch an investigative project, it pays to have a good support team. (Image credit: The British Library on Flickr.)

I talk about interviewing data because I think that’s a really good way to think about it. Data can give you answers if you know what sorts of questions to ask. It’s adding that string to your bow, saying not only can I ask a bunch of experts what they think about an issue, but how can I start poking into the information itself and seeing what it says. You may think, well, the experts have been all over it, they know everything. But they’re not necessarily asking the questions that your readers might want to know the answer to. It’s realizing that there are other ways of reporting a story and other ways of asking and answering questions that center around data, which may just be sitting there for you to download or may require you to go out and collect. And that’s the only way certain stories are going to come out.

What are particularly good examples of this type of reporting?

USA Today did an interesting series some years back called “The Smokestack Effect” that looked at what kids are breathing in school. Are there worrying levels of air pollution in the environment in which our kids are being educated, and if so what can we do about that? I thought it had an interesting approach. I don’t think there have been major criticisms of the methods and it certainly won some major awards. They did a couple of things. They employed a model of the distribution of air pollution used by the Environmental Protection Agency, and they also did some of their own air pollution monitoring by setting up sensors near schools and looked at the results. There was a whole bunch of conventional reporting around that but the story wouldn’t have been there without those two key bits of data analysis. Also, they didn’t march in blindly and start using a model, they got expert help. I think it’s a neat example of the sort of thing you could do, but it’s a really, really big project.

But these things don’t need to be huge. I’m not claiming that I’ve done anything particularly earth shattering, but there’s one example of a story that came up only because I was in a data state of mind. It was an online news story, fairly quick turnaround. I looked at two genome scan companies: 23andMe, one of the surviving companies that has been in the news lately, and the other main one at the time, deCODEme. I had my genome done. That cost a bit of money, but once I got the scans, I got story ideas just from playing around with the data.

I realized that one of the companies, deCODEme, seemed to be giving a really weird readout for my mitochondrial DNA. I couldn’t make sense of it. That was when the data was presented in a “genome browser” online. I downloaded the data from both companies and matched up the same mitochondrial markers – so there’s some database work there. It became apparent from the downloaded data, which was consistent across the two companies, that something was going on with the display online. I ended up with an annotated spreadsheet showing that there was a screw-up with what was presented online from deCODEme. It turns out it was a software bug, which raises some interesting issues about IT in genomics and the future of medicine. There was no harm done in this case. It just looked like, as I put it in the story, nonhuman DNA. But as we move toward personalized medicine, we don’t want our doctor prescribing drugs based on errors thrown up by software bugs. So there’s a story to write. But it only happened because I was in a data state of mind. It’s the type of story I think science journalists ought to be doing.
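The “database work” described above — matching the same markers across two downloaded scans and flagging disagreements — can be sketched in a few lines. This is a hypothetical illustration, not Aldhous’ actual analysis: the file layout, column names, and function names are all invented for the example.

```python
import csv

def load_markers(path):
    """Read a scan export (assumed CSV with 'marker' and 'genotype'
    columns) into a dict mapping marker ID to the reported call."""
    with open(path, newline="") as f:
        return {row["marker"]: row["genotype"] for row in csv.DictReader(f)}

def compare_scans(scan_a, scan_b):
    """Return the markers present in both scans whose calls disagree,
    mapped to the pair of conflicting genotypes."""
    shared = scan_a.keys() & scan_b.keys()
    return {m: (scan_a[m], scan_b[m]) for m in shared if scan_a[m] != scan_b[m]}
```

Any marker that turns up in the disagreement dict is a lead worth chasing: in the deCODEme case, the downloaded files actually agreed, which is what pointed the finger at the online display rather than the underlying data.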

Final thoughts on why science journalists should give “scientific” journalism a try?

Why do I do what I do? I really like science. I always loved designing experiments and doing data analysis, and the writing up part – which is probably true of a lot of science journalists. What’s also probably true is that you get out of science because you don’t want to be one of three people in the world who knows everything there is to know on one little tiny bit of an intellectual enterprise. I want to ask questions across a much broader landscape, which I think is the motivation for a lot of science journalists.

So bringing in the type of approach to journalism that I’m talking about is like having the best of both worlds. I’m having the broad look at the world as a journalist, asking the questions I’d always wanted to ask, which for me is fun and what I used to really like about being a scientist. I can do both. Why wouldn’t you do that?

* * * * 

Additional resources

Tools for getting started, based on Peter Aldhous’ UC Santa Cruz class

Tips and tools from IRE and ProPublica

A Guide to Computer-Assisted Reporting, by Pat Stith, investigative reporter for The News & Observer, on Poynter