As I was writing this post about smoking cessation and exercise for The Metro Spirit, I felt kind of cheap. The reason: I was extolling the benefits of a study "recently released." I usually have a big problem with media reports of new studies. We know there is a problem with science literacy, and media reports often misrepresent the significance or implications of new research.
A quick search found an example of a media report entitled,
This report detailed observations of only 32 of the thousands of individuals who have weight loss surgery. There was no risk-benefit analysis, and the researcher even declined to comment on how common this issue is among people treated surgically for weight loss. The research itself seemed solid, but the fright factor of the headline seemed out of place.
I have decided, therefore, to pay more attention to the type of evidence I'm writing about. Type of evidence, you say?
Levels of Evidence
Scientific research/evidence is broken down into levels.
At the bottom are case reports, which are simple observations recorded objectively. Further up are Randomized Controlled Trials (RCTs), good prospective studies from which one can begin to infer causality. At the top are Systematic Reviews (SRs), in which statistical researchers sort through only the best RCTs and hold them to high standards. The conclusions of an SR can be held in very high regard.
For what it's worth, if you're still reading this, the study about smoking I mentioned above was a Systematic Review. I just want everyone to know that as I do my part to advance science literacy (and release my inner nerd energy).
I hereby vow, from now on, to report the level of evidence for the studies I cite, when appropriate. I'll simply tell you: Good, Fair, or Less Than Solid.
Labels: Metro Spirit, Research, Science Literacy