BS…Bad Science? Or just Bullshit?
I’ve spoken a lot about the importance of asking for evidence, and of not taking claims on blog posts, social media, or even press releases at face value. I still stand by this, and the recent media scare over acrylamide proves the point: the media love a sensationalised headline. To put your mind at ease, acrylamide has been linked to cancer in rodents, but no such link has been found in humans, so keep enjoying your toast. It’s not going to kill you. Really. We have more important things to worry about.
But it’s got me thinking about the importance of basic scientific literacy, and how I wish understanding and critiquing evidence were a part of our general compulsory education (along with things like how to do taxes, how to recognise abuse, how mortgages work… but that’s another story). I wish people knew how to be a BS detector and sniff out the bad science.
It struck me that there’s no real point just telling people to be more sceptical and more scientific in their thinking without explaining how. Most people don’t have a degree in science, and don’t understand the complex language scientists use in their publications; and even if they did, those papers are often trapped behind paywalls. Not ideal when we need a more scientifically literate population.
So, in continuing my mission to make nutritional science (and science in general) more accessible, I’m going to do my part and try to make topics like peer-review, statistics, hypotheses and methodologies more fun and exciting. Wish me luck!
Obviously this information cannot substitute for a BSc, and shouldn’t be taken as such, but it may at least help to clear up some of the confusion and (at least ever so slightly) bridge the gap between the general population and the scientific community.
How to become a BS detector:
I’m going to go through a lot of these in more depth in the coming weeks/months. But here’s a basic guide to what to look out for, and what should set your BS detector tingling.
1. The ‘Expert’?
Can this person be considered an ‘expert’? Do they have the relevant qualifications and expertise to be making these claims? Look them up. Is there any conflict of interest? If someone has a string of letters after their name it’s easy to simply assume they’re legit, but some qualifications hold more weight than others, and if it’s not an obvious BA/BSc/MSc/PhD/PGDip/etc. then it’s worth looking up. In fitness, for example, a personal training qualification is vital, and nutritionists and dietitians have their own set of qualifications, which I’ve spoken about in detail here.
2. One study found…
Are the claims based on “one study found…”? Public health guidelines are based on hundreds, if not thousands, of studies precisely to avoid basing important advice on outliers. A single study doesn’t tell us much. It’s a good start, but it could just be an anomaly, and it can’t show us the bigger picture. We can’t know for sure until we put the evidence from many studies together. There are levels of evidence, and other factors to consider such as the number of participants, study duration, and so on. More on this here.
3. Anecdotes as evidence
Relying on anecdotes is a clear warning sign. The plural of anecdote is not data. What works for a single individual cannot be applied to the general population. Anecdotes are the lowest form of scientific evidence, with meta-analyses/systematic reviews at the top of the pyramid, and a great many others in between. Using anecdotes to assume something will work for everyone is wrong.
4. Going against the scientific consensus
The scientific consensus is based on the accumulation of the evidence currently available. New evidence is constantly weighed against it to update what we know and to test the validity of currently accepted theories. Extraordinary claims that go against this consensus require extraordinary evidence.
5. The methodology
This is the trickier part. An article can seem legit and be written by someone with a PhD in a relevant area, but that doesn’t guarantee that the science is good. We have ‘gold standards’ when it comes to collecting and analysing data, but these vary according to what is being investigated. This is probably one of the hardest things to call BS on, simply because it requires such specific knowledge; if you’re not familiar with how these methods work, and with their strengths and limitations, then it’s almost impossible to tell whether the science has been done well. So instead of boring you with all the details, I’ll be guiding you through two papers – one good and one bad – and analysing the differences between them, so stay tuned for that coming soon.
6. Media article vs journal article
The media love sensationalised headlines; they love the outliers and the odd bits of science that are interesting and weird. Those are fun to write about. Writing about public health guidelines being the same as they were 5 years ago because the science is still robust? Yawn. Unfortunately what’s written in a newspaper article is often a massive extrapolation and exaggeration of what’s actually been reported in the results of a scientific paper. Even if you can’t access the paper due to a paywall, you can still read the abstract, and it’s worth comparing the newspaper headline (and content) to the results and conclusions of the abstract. Often you’ll find they don’t really match up. Trust the science over the media there for sure.
The paper, on the other hand, has had to go through a hardcore peer-review system, where scientists get to anonymously critique everything that’s wrong with the paper they’ve been given to assess, and to determine whether it’s worth publishing at all. As someone who’s been through this, let me tell you: it is fucking brutal. There are no niceties, no benefit of the doubt; every tiny flaw is attacked, and many papers don’t make it past this stage as a result.
7. Retracted papers
If a paper has been retracted, don’t trust it. Simple as. There’s clearly a major flaw in it that can’t be resolved with a simple amendment, and as such it’s been deemed unfit to share with the wider scientific community and the general public. In some cases it’s because the paper is actively dangerous – take Wakefield’s paper on the MMR vaccine and autism, for example. It’s been disproven so thoroughly, the paper has been retracted, he’s been named and shamed… yet people still use it as ‘proof’ that vaccines cause autism. Papers are removed for very good reasons, and thankfully it doesn’t happen too often.
8. The power of social media
Social media has its major downsides (using Instagram to find medical or nutrition advice, for example), but it also has its advantages. Following scientists, registered nutritionists, and registered dietitians on Twitter, for example, is a great way to connect and see their views on certain topics based on what they share. You can also use it to contact people directly to see what they think, and even message the author of a paper you’re interested in or would like explained in less complex terms. You can even use social media to find amazing events like the Rooted Project, which connects experts with the general public and allows them to discuss their areas of expertise in an exciting and engaging way.
So those are my quick top tips for becoming a BS detector. As I said, all of these points will be covered in extensive detail, and hopefully not in a way that’s dry and dull. We’ll add some examples, a dash of sarcasm, and some cool diagrams (hopefully), and that way I hope you’ll be both entertained and informed. Stay tuned – we’ll make a fine BS detector of you yet.