I figure that some of y'all (including me) who snark at Jez's attitude towards writing about science might appreciate the Lifehacker guidelines...and possibly wish they were a Kinja-wide policy.
We cover a lot of topics at Lifehacker, including health, medicine, nutrition, and psychology. Whenever we do, we strive to make sure we're presenting clear, accurate, and scientifically backed information on those topics, derived from vetted and trustworthy sources. These guidelines aim to make sure we continue—and improve upon—that tradition.
These guidelines are primarily aimed at our writers, but in the interest of transparency, we've posted them publicly here.
The Short Version
The Long Version
A lot of media outlets play fast and loose with science reporting. Most strive for some level of accuracy, but too often science coverage takes the form of "How can I make this research/discovery interesting to a non-scientific audience?" Our answer to that question at Lifehacker should be: "If you have to ask that question, stop and rethink why you're writing about it."
Our primary goal when we cover topics in the scientific realm is to use the available data to support facts, to bolster our points with reference information, and to draw useful information from primary sources. Some of the best pieces we've written have been "myth busting" with the help of research and data. Others have been "interesting facts about medicine/psychology/food/etc. based on science," with research and literature cited accordingly.
What we shouldn't do, however, is look at articles based on studies, press releases, or science blog posts and try to massage them into Lifehacker tips. Doing so almost always walks us into the same trap that so many other media outlets fall into.
This doesn't mean we should shy away from those topics. The key is to make sure you're making a claim that's sufficiently backed by credible science, enough that it can be properly generalized to everyone (or at least to enough people that the post is relevant). Unfortunately, this means that when you see a new study on a site like PsyBlog or Science News, you can't take their angle for granted and assume it's what the researchers themselves concluded, or even studied. That brings me to our first rule:
Read and Write About Studies, Not Articles About Studies
If you see an article in the New York Times about how 10 minutes of sunlight every day staves off depression, don't just post it and cite the Times. Read the article for the names of the researchers who conducted the study, the journal the study was published in, or the institution the researchers are from. Ideally, the outlet will link the study in its article, but many don't, for various reasons: some assume readers aren't interested, and others skip the link because studies are usually behind paywalls. Neither reason applies to us. It's more important for us to give our readers the option to read the study, and we never assume they wouldn't want the raw data. If the article doesn't cite that information, do some clever Googling, like "researcher name" and "study," or "journal name" and "daylight," for example. You'll almost always turn up the actual study in question.
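Practically speaking, a DOI search can shortcut a lot of that Googling. Here's a minimal sketch of the lookup, assuming Python with the requests library and CrossRef's public REST API (api.crossref.org); the query string is just an example:

```python
import requests

def find_study(query):
    """Search CrossRef for scholarly papers matching a free-text query."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": query, "rows": 5},
        timeout=10,
    )
    resp.raise_for_status()
    for item in resp.json()["message"]["items"]:
        title = (item.get("title") or ["(untitled)"])[0]
        journal = (item.get("container-title") or ["(unknown journal)"])[0]
        doi = item.get("DOI", "")
        print(f"{title} | {journal} | https://doi.org/{doi}")

# Hypothetical example: hunting for that sunlight-and-depression study
find_study("sunlight exposure depression")
```

Every result carries a DOI, and https://doi.org/<DOI> resolves to the publisher's page for the paper, which is exactly the kind of primary link we want at the bottom of a post.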
Unless you have access to scholarly journals, you may not be able to read the full study either (although many journals do offer papers for free, and many universities and libraries make studies available to the public), but at the very least read the abstract and the conclusions. Read a sample of other articles on the topic. See if they all agree and, more importantly, whether the abstract and conclusions support the claim the article is making. If you can draw a direct line between them, good. If it's fuzzy, or they don't clearly relate, or you don't understand, hold off unless you can find more specific, corroborating information. Ideally, you want access to the full text.
Avoid Common Science Reporting Pitfalls
First, take a look at this graphic. Bookmark it, save it, whatever:
The whole thing is good to keep in mind when you're trying to avoid reporting a study or scientific announcement poorly, but we're going to hit on some of the ones that are relevant to Lifehacker and the kinds of science we write about.
- Make sure your science is relevant to everyone. That means avoiding studies with obviously small sample sizes. If a new study says that drinking orange juice in the morning will make you more productive all day, but the sample is something like "20 male college students from France," then the study was carefully designed to limit external factors and is probably meant as preliminary research, a starting point for future investigation. You would never say that 20 male French college students are representative of the world at large, so why cover the study on a site read by people around the world from all walks of life? (For a back-of-the-envelope feel for what small samples can support, see the sample-size sketch after this list.)
- Remember, not every study needs to be reported, and just because a study says something doesn't make it undeniable truth, or even news. People often forget that science is a slow process. Studies get published, but it's usually media outlets that try to proclaim a "truth" because "a study says so." In reality (and we'll discuss this more later), studies are almost always specific bits of research on very specific things. They're a tiny part of a big picture. Unless there's a broad wealth of data to support a generalization, or the takeaway is useful, practical, or good to think about, avoid citing a single study as any real "truth." Similarly, remember not every study is news, or even applicable beyond a starting point for more research.
- Correlation doesn't equal causation...but it's a start. Whenever you find a study that indicates a "correlative relationship" or says that one factor "correlated" with another, keep Tyler Vigen's site (http://www.tylervigen.com/) in mind. Correlation absolutely does not indicate causation, and if you cover a study or series of studies with only correlative relationships, you should say so in your coverage. That doesn't mean the takeaway isn't interesting, though, and for some studies (especially in social science, where proper controls are difficult, and at times impossible), correlation is all we're going to get. That's fine; just make sure you discuss it (again, if the study's worth discussing at all). The correlation sketch after this list shows how easily unrelated data can look "correlated."
- Avoid sensational, bombastic headlines. The job of the headline and the lede is to get readers interested in the story and convince them to click and read more. There's no doubt about that. What you don't have to do, however, is make a claim that the study or the science you're covering doesn't support. A study that indicates a "correlation between a and b given c and d as environmental conditions" doesn't become "avoid a if you don't want b" or "a causes b" in our headlines. In short, don't let your headline write checks your linked study can't cash.
- Avoid infomercial language, like "Scientists say" and "Studies show." Anyone can say "studies show" or "scientists say" as an empty appeal to authority. Let's not be anyone. Don't use empty appeals to validate your point. If you have a study that indicates something (mind you, studies "indicate," "point to," "suggest," and "explain"; they rarely "show," which implies permanence), be up front about who conducted the study, what institution they're from, and what journal it's in. For every "Studies show," you could (and should) write "A study by researchers at Boston University, published in the Journal of Cardiology, suggests." Don't be coy about your sources. Strive for transparency: our readers expect it, and anything less will (rightfully) set off their BS detectors.
- Avoid Dr. Oz-style, daytime talk show science. If you catch yourself writing things like "this new herb from the Amazon helps cure diabetes" or "this vitamin will make you healthier," take a step back for a second and really look at the study or the backing for that statement. Those kinds of "consume/buy/use X thing to get Y result" claims in the context of health, fitness, and medicine lead to things like "Acai berries cure cancer!" and "This funny herb from the Amazon will make you live to 100!" Check your sources: is this based on an article from a reputable news source, or a periodical with a track record of fact-based coverage? You should always take your source into account, especially when you're reading about things that could fall into the "one weird trick" or "miracle cure" category. Even if you have studies backing the idea that "the juice from this plant has tons of vitamin A, and vitamin A is shown to reduce lung disease in certain populations of rats," that doesn't mean we should tell people to start buying pills made from the plant or drinking its juice for their health. Remember, the science of medicine, health, and nutrition is complicated, difficult, and takes time. There will rarely, if ever, be a situation where we need to post a study saying some specific thing, food, vitamin, or nutrient is something you should get a ton of or avoid at all costs.
- Avoid super-speculative language too. If you catch yourself writing things like "may possibly be related in part to" because you're trying to spin a study into a tip that's meaningful to the general public, stop. Similarly, if you're trying to invent some potential meaning or social effect for the study or the news, stop. For example, if you're writing about a relationship between a chemical found in popular foods and productivity, and your angle is "X makes you more/less productive," stick to the data in the study that relates to that. You're probably working too hard to turn something a scientist said into something you think makes a good tip. Look at what the study actually said, preferably in the full text, and re-evaluate whether you should be drawing that conclusion at all.
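As promised above, here's the sample-size sketch. This is back-of-the-envelope math, not a substitute for reading the study's own statistics: the standard 95% margin of error for a proportion measured on n people is roughly 1.96 * sqrt(p(1-p)/n). A minimal Python sketch, with example sample sizes:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion measured on n people."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (20, 200, 2000):
    print(f"n = {n:>4}: about ±{margin_of_error(n) * 100:.0f} percentage points")
```

Twenty subjects buys you a margin of error around 22 percentage points, before you even ask whether those twenty French college students resemble our readers.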
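And here's the correlation sketch: two completely independent random walks will often look strongly "correlated" by pure chance, which is the engine behind Tyler Vigen's spurious-correlation charts. A minimal sketch assuming NumPy; the seed, series length, and threshold are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
trials, strong = 1000, 0
for _ in range(trials):
    a = rng.standard_normal(100).cumsum()  # unrelated trending series #1
    b = rng.standard_normal(100).cumsum()  # unrelated trending series #2
    if abs(np.corrcoef(a, b)[0, 1]) > 0.5:
        strong += 1

print(f"|r| > 0.5 in {strong} of {trials} pairs of unrelated series")
```

Run it and you'll likely see a sizable share of the pairs clear that bar, which is exactly why a single correlation in a single study isn't much to hang a headline on.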
Do a Little Due Diligence
Before you cover a study or journal article, do a little digging. When you claim the article, do a quick Google search for the topic and see who else has covered it. Obviously you should be looking primarily at the journal or the study in question, but look around and see if other reputable organizations have picked it up, and what they've said. Don't copy them, but read their angle and approach. Were you planning to go in more bombastic than they did? Are they overly excited about the news? Show some restraint in either case. It's easy to find science reporting done poorly, but it's difficult to do it right, so see if you can find examples of others before you put words to the page. Like we said, studies are rarely timely, have-to-post-now kinds of things, so you should have time to do your homework.
In the same vein, when you do that digging, look up the journal or periodical you're writing about. Beware niche journals, and try to get a feel for how large, respected, and rigorously peer-reviewed a journal is before you take it at its word. If the journal is published or indexed by services like Springer, Elsevier, PLoS, or PubMed, you're usually in good shape. If the journal is something tiny, like the "Journal of Aquatic Wicker Construction," and it self-publishes without a review process, you should probably shy away from its conclusions. Similarly, if the journal is "The Journal of Antiperspirant Studies" and is published by Unilever or Procter & Gamble, you may want to take its articles with a grain of salt.
On a related note: watch out for conflicts of interest in your sources. For example, in a recent article about sweating, one of us (prior to editing) cited a site called Antiperspirant Info as a source of information. The information itself was true (it was really just a medical definition of what a part of the human body does), but the site is owned by Unilever, which makes deodorant and antiperspirant. The information may be good, but the choice of source is poor.
Another example is the Gatorade Sports Science Institute. It's a huge sports and physical medicine research entity, but it's entirely funded by Gatorade, and its mission is to do research into sports science that supports and helps market sports drinks. Not too long ago it published some extremely questionable data (not in a scientific journal, mind you, just a marketing push of documents it put together) stating that Gatorade was better than water (with fine print acknowledging that this was only under certain conditions). A number of news outlets, and even health and medical departments at schools, universities, and other non-profit institutions, picked it up. It's not true at all, and I called them out in this piece and this one if you want more information on that specific example.
Both examples are reminders not to see a "journal" and assume it's peer reviewed, or see a research institution and assume it's on the up and up. Look around. You likely have varying opinions on the trustworthiness of certain websites, news organizations, and blogs. Keep in mind that not all journals and research institutions are the same, either.
Not All Science Topics Are Equal
It's important to remember that we're not a science "news" site. It's going to be rare that a brand-new study or revelation applies specifically to Lifehacker and our readers. People visit because they want to find ways to make their lives better, easier, and more productive, not because they want to hear about the latest 50-person, industry-backed survey that indicates beetroot juice can make you more energetic.
Apply extra scrutiny to social science, medical topics, and diet- and food-related topics. They fit well into our range of coverage, but they're also the types of science most prone to industry tampering, labs with a conclusion drawn before the experiment begins (often sponsored by related companies or organizations), poor replicability (meaning no one has been, or will be, able to obtain the same results a second time), and the aforementioned Dr. Oz syndrome.
While the article focuses on social science (topics like psychology, sociology, and other sciences we may like to think of as "mind hacks"), the issue applies to health, medicine, food, and diet as well. We don't want to fall into the trap of talking about how "fresh air boosts your creativity" if we can't find multiple studies to back it up, not just the one that crossed our desk. Similarly, just because one study indicates that people who adopt a vegetarian, paleo, fruitarian, or whatever diet live longer doesn't mean we should run to the compose window and share the news, especially in fields like food and health, where studies are at best preliminary and designed to elicit further research, and at worst contradicted over time as new information comes to light.
None of this is bad, mind you. That's how science works. People run experiments to test theories and hypotheses. They document their findings and methods and have them reviewed by their peers (other scientists in their field). Their research is published for what it is: research, not a harbinger of truth. Other researchers pick it up, do their own testing, and add their own findings (or refute the original ones). The puzzle comes together as more people get involved. No one study represents the truth, but a large group of studies by a large portion of a field's research base can certainly suggest it.
A Quick Guide to Scholarly Journals and Publishers
As we mentioned, not all scholarly journals, publishers, and databases are alike. Some index only peer-reviewed material (PubMed, for example, is a database of mostly peer-reviewed biomedical literature), while others host preliminary studies that haven't been peer reviewed (arXiv, for example). Some are entirely open to the public (like PLoS, an open-access publisher whose papers anyone can read) and others are closed unless you have access. You'll run into this often: the only parts of a study you can actually read are the abstract and the conclusion. Sometimes those offer enough information to get an idea of the study, but you shouldn't rely entirely on them.
If you can, Google around and see if there's a full-text version of the paper somewhere. If you can't find it, your local library, community college, or any other public institution probably has access to scholarly journals if you don't have it yourself. If that doesn't work, you can always reach out to the PR department of the school that's behind the research, or the researcher themselves, to ask about the study or the paper. Odds are they're willing to talk about it, or at least give you a full copy of the paper so your article is complete and accurate—especially if you tell them that's why you're reaching out to them.
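One more tool for that "find the full text" step: given a paper's DOI, a lookup service can often point you at a legal open-access copy. Here's a minimal sketch, assuming Python with the requests library and Unpaywall's public REST API (api.unpaywall.org), which asks for your email address as a query parameter; the email and the DOI in the usage comment are placeholders:

```python
import requests

def find_open_access_copy(doi, email="you@example.com"):
    """Ask Unpaywall for a legal open-access copy of the paper with this DOI."""
    resp = requests.get(
        f"https://api.unpaywall.org/v2/{doi}",
        params={"email": email},
        timeout=10,
    )
    resp.raise_for_status()
    best = resp.json().get("best_oa_location")  # None if no open copy is known
    if best:
        return best.get("url_for_pdf") or best.get("url")
    return None  # no open copy found; time to email the researchers or hit the library

# Usage (substitute the real DOI of the study you're covering):
# print(find_open_access_copy("10.1234/example-doi"))
```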
I ran down some sources and tips in this guide that we can all use as well.
For reference, here's a good list of academic databases and search engines you'll come across and their access requirements, and here's a list of open-access journals anyone can check out.
Finally, some basic grammar and style bits to keep in mind:
- Journal and other publication names should be italicized. Link to the journal name in your article on your first mention in-text. (They may not have a dedicated website, but a link to their homepage on their publisher's site is enough.)
- Your primary sources at the bottom of a post should be studies, not articles that talk about studies. If you found the study through another article, give that site the via at the end of your post.
- Make sure to always cite the journal name at the bottom of your article—not the publisher. So for example, you would link: Name of Study [link to study] | Journal of Awesomeness instead of Name of Study [link to study] | Elsevier
Here are some additional science reporting guidelines with solid tips to keep in mind and take to heart:
- 10 Best Practice Guidelines for Reporting Science & Health Stories (The blockquoted section here is of particular interest.)
- Tips on Reporting Science (The bulleted section at the beginning, and the top 10 tips at the end are especially good.)
- Reporting Science News: Resources for the Media
Title image by xkcd.