
Running for life?

In this week’s health news, running will make you live longer.  Maybe.  Maybe not.

The National Post headline announced that “Running for five to 10 minutes a day may add three years to lifespans, study suggests” (7/29/2014).  I shouldn’t blame the journalists for exaggerating when the journal article title oversells the study findings as well:  Leisure-Time Running Reduces All-Cause and Cardiovascular Mortality Risk.  (Lee D, Pate RR, Lavie CJ, Sui X, Church TS, Blair SN. J Am Coll Cardiol 2014; 64:472-81.)

The study basics:

  • Subjects: 55,137 people who visited a health clinic for a check-up between 1974 and 2002; mostly white, middle to upper class, and college educated.
  • Data were gathered at the first visit and, where possible, at a second visit.
  • The National Death Index was used to identify deaths through 2003.
  • Running was a self-reported estimate of activity over the previous six months, including frequency, duration, distance, and speed.

All of the graphs in the paper illustrate differences in hazard ratio (the risk of death in one group compared to another group) rather than differences in actual mortality rates.  The paper provides mortality rates adjusted for baseline age, sex, and examination year, so I’ve graphed those rates for comparison.

[Figure: running and mortality bar chart]

We can make predictions based on this information.  A person-year is one person observed for one year, so 10,000 person-years could be 10,000 people followed for one year, 5,000 people for two years, 1,000 people for ten years, or some other combination.  Comparing 10,000 person-years of non-runners to 10,000 person-years of runners, you would expect to observe 15 more deaths among the non-runners (45 vs. 30).

Another way to describe this is with the Absolute Risk Reduction (ARR) for all-cause mortality.  The ARR = 0.146%.  So, yes, runners did have a 30% lower relative risk of dying from any cause, but the risk of dying was low anyway (0.45% in non-runners), so that 30% doesn’t amount to much of an absolute difference.
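To make the arithmetic concrete, here’s a quick sketch (mine, not the paper’s) using the rounded adjusted rates from the chart above (roughly 45 deaths per 10,000 person-years in non-runners vs. 30 in runners); the paper’s exact ARR of 0.146% comes from the unrounded rates:

```python
# Rounded adjusted mortality rates (deaths per person-year)
non_runner_rate = 45 / 10_000
runner_rate = 30 / 10_000

arr = non_runner_rate - runner_rate  # absolute risk reduction
rrr = arr / non_runner_rate          # relative risk reduction
nnt = 1 / arr                        # person-years per one death "avoided"

print(f"ARR: {arr:.2%}")                # 0.15%
print(f"RRR: {rrr:.1%}")                # 33.3%
print(f"NNT: {nnt:.0f} person-years")   # 667 person-years
```

In other words, with these rounded numbers you’d need to observe roughly 667 person-years of running to see one fewer death, which is the absolute-risk framing the headline skips.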

The big problem with the conclusions stated by the researchers is that association does not equal causation.  They found that runners had lower all-cause mortality.  That does not prove that running reduces mortality.  In fact, in another part of the study, non-runners at the first visit who became runners by a later visit did not have lower mortality.  This study observed people who chose to run and chose how much they ran.  Higher duration, frequency, and/or intensity of running did not result in lower mortality.

Which leads to the newspaper headline.  Can running five minutes per day add three years to your life?  Probably not.  The study didn’t even attempt to answer that question.  The researchers estimate that the life expectancy of non-runners is three years shorter than that of runners.

This study is not proof that, if you start running, you will live longer.


Ban reporters from scientific meetings.

I’m serious.  Scientific meetings should be for scientists to disseminate results among themselves and plan future research.  Meeting coordinators need to stop holding press conferences and distributing press releases.  What is their goal?  Influencing funding?  All that seems to be accomplished is the proliferation of articles exaggerating or completely misinterpreting the results.  Studies presented at meetings are often pilot studies, preliminary analyses, and exploratory projects.  There’s no peer review beyond the selection of which studies are presented, based on very brief abstracts (summaries).

An example:

[Image: AD retina]

Your eye doctor could look at your eye and tell if you are developing Alzheimer’s Disease?!

No.

The studies described compared people with full-fledged Alzheimer’s Disease to people without the disease.  That is far from detecting it early.  In fact, based on the article, the test missed 15% of AD patients and misdiagnosed 5–15% of unaffected controls.

That’s another problem with reporters covering scientific meetings — we can’t read the actual source paper presenting the results because it doesn’t exist yet.  At a meeting, results are presented either as a short platform presentation with PowerPoint slides or as a large poster.  Sometimes results and conclusions change by the time the full paper is published, which could be months to years later, or never.  In this case, I had to read four articles before one actually named the meeting: the Alzheimer’s Association International Conference in Copenhagen.


This is the text of a Letter to the Editor that I’ve just submitted.

Heather Hanes is wrong about the research project being initiated at the University of Regina (SP, June 16, 2014 “U of R begins pesticide research study”). They don’t have a hypothesis. They have an agenda. Tanya Dahms admits that the goal is to eliminate pesticides. Her tone implies that they are simply going through the motions to gather evidence of what they already assume to be true. A real scientific inquiry would ask “How does pesticide use affect plant populations in a grass lawn?” It is possible that killing dandelions and other noxious weeds allows non-weed species to flourish. No one can know until the study is complete and even then they might have no useful answer since two test plots is a tiny trial and the researchers are obviously biased.

It’s just a mouse study.

Holy crap.

Here’s the headline that I read today:

Multiple sclerosis drug shows promise in treating post-traumatic stress disorder

No.  No.  No.  A thousand times no.

This article is based on a study of an MS drug in MICE published in Nature Neuroscience.  I have two biology degrees and a PhD in epidemiology and I can’t understand the abstract.  The headline seems to be derived from a chat with the paper’s author:

The possibility that the drug may treat PTSD is “one thing that comes to mind,” Spiegel said in a telephone interview. “That is a potential implication.”

That’s it.

Do not put faith in mouse studies.  Ever.

Junk Food Junk Headline

According to the National Post headline, “Eating junk food before getting pregnant spikes risk of premature birth: researchers”.

No, it doesn’t.

The study “Preconception dietary patterns in human pregnancies are associated with preterm delivery” (JA Grieger, LE Grzeskowiak, and VL Clifton) was e-published ahead of print in the Journal of Nutrition (April 30, 2014).  The authors concluded that nutrition before pregnancy was associated with pre-term birth.  This was a retrospective, cross-sectional study, which means that the 309 pregnant women were surveyed one time about what they ate during the 12 months before they became pregnant.  They were never asked what they ate during pregnancy, although the authors cite other studies showing that the two are typically similar.

Problem 1:  The results could just reflect that diet during pregnancy affects risk of premature delivery.

After a lot of complicated analyses, the researchers identified three dietary patterns (high-protein/fruit, high-fat/sugar/takeaway, vegetarian-type) and assigned each woman a score based on how she compared to the average.  They didn’t categorize mothers into separate groups and compare the incidence of pre-term delivery.  The overall pre-term rate was 10%.  What is the risk for those eating “junk food”?  We don’t know.  This type of analysis could produce completely different findings with a different group of moms.

Problem 2:  Even the researchers acknowledge in their discussion that the results of this study “may not be generalizable to other populations”.

The article and the headline clearly overstate the findings of this study.  I wouldn’t equate an odds ratio of 1.5 to a “spike” in risk.  As long as associations are being used to make dietary recommendations, they could have emphasized that the meat-based, high-protein diet was associated with lower risk of pre-term birth while the vegetarian diet showed no benefit.

Health reporters need to make much more liberal use of the words “may”, “might”, “suggests”, etc. and the headline writers need to stop being so sensationalistic.


Unintended consequences.

Sometimes, just when I need it, I get a reminder of something that I’d forgotten.

My last post was in response to movement by some to have age restrictions placed on tanning beds in Saskatchewan.  Click the link to read and you can check out the discussion on Twitter.

So, what did I forget?  When advocating looking at the big picture, I referred to absolute risk of melanoma and the impact of the proposed legislation. I forgot to take another step back and look at the really big picture of all cause mortality.  Often in health research, we discover that an intervention to prevent one outcome actually increases other outcomes.

A tweet from @#JWF# reminded me to do this with respect to this specific topic:

[Image: tweet from @#JWF#]

The study that he refers to was conducted in Sweden and published in the Journal of Internal Medicine.


Tanning beds: Shine a light on some facts.

Doesn’t the Saskatchewan NDP have bigger issues to worry about than banning teens from using tanning beds?

This is one of those issues where everyone “knows” what they know, but few bother to actually look up the facts.

In a well-designed Minnesota study published in Cancer Epidemiology, Biomarkers & Prevention, Lazovich et al. reported an odds ratio of 1.75 associated with ever use of tanning beds.  An odds ratio is an estimate of risk, although it tends to be an overestimate.  This finding is similar to the 2.06 reported in a nested case-control study (an even better design*) from the Nurses’ Health Study.  Based on an odds ratio of 1.75, the risk of developing melanoma is estimated to be 75% higher for someone who uses a tanning bed than for someone who doesn’t.  Therefore, some of the people who use a tanning bed and are later diagnosed with melanoma would still have developed melanoma even if they had not used a tanning bed.
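To illustrate the point that an odds ratio tends to overestimate risk, here’s a small sketch (my own made-up numbers, purely illustrative) showing that an odds ratio is close to the risk ratio only when the outcome is rare:

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

# Rare outcome: odds ratio is close to the risk ratio
print(odds(0.002) / odds(0.001))  # ~2.002, vs. risk ratio 2.0

# Common outcome: odds ratio overshoots the risk ratio
print(0.40 / 0.20)                # risk ratio = 2.0
print(odds(0.40) / odds(0.20))    # odds ratio ~2.67
```

For an outcome as rare as melanoma, treating the 1.75 odds ratio as roughly a 75% higher risk is a reasonable approximation.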

Some numbers:

  • Incidence of melanoma in Saskatchewan in 2003 was approximately 11 cases per 100,000 people (based on 106 cases).  Mortality was 2 deaths from malignant melanoma per 100,000 people.
  • 27% of women ages 16 – 24 in Saskatchewan use tanning beds (from SunSmart Saskatchewan).
  • In the MN study, only 18% of cases used a tanning bed before age 18 and first use before age 18 was not associated with a greater risk of melanoma. (Sorry MLA Chartier.)
  • The incidence of melanoma in the US has been increasing for almost 20 years but the mortality rate has remained relatively constant.

Therefore … banning tanning beds before age 18 would prevent almost precisely zero cases of melanoma.  Doubly true if a teen denied access to a tanning bed just decides to tan outside without sunscreen instead.  Maybe if the person denied access to tanning as a teen never uses a tanning bed after turning 18, a few cases of melanoma might be prevented.

Also, your risk of developing melanoma this year is about 0.011%.  If you never use a tanning bed (and assuming that the risk estimates in the studies reflect the true risk), your risk of developing melanoma this year is about 0.008%.  That’s not a huge change in absolute risk.
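One way to reproduce this kind of back-of-the-envelope calculation: take the overall incidence, an assumed prevalence of tanning-bed use, and the relative risk, and solve for the never-user baseline. The post doesn’t state the prevalence behind the 0.008% figure, so this sketch borrows the 27% figure cited above (which strictly applies only to women 16 to 24); that assumption lands near, but not exactly on, the quoted number:

```python
overall_risk = 11 / 100_000   # annual melanoma incidence (0.011%)
rr = 1.75                     # treating the odds ratio as a relative risk
prevalence = 0.27             # ASSUMED fraction of people ever using tanning beds

# overall = baseline*(1 - p) + baseline*rr*p, so solve for baseline:
baseline = overall_risk / (1 + (rr - 1) * prevalence)

print(f"never-user risk: {baseline:.4%}")       # ~0.0091%
print(f"ever-user risk:  {baseline * rr:.4%}")  # ~0.0160%
```

Whatever the exact prevalence assumed, the point stands: the absolute risks on both sides of the comparison are tiny.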

* A retrospective case-control study compares the exposure rate in people with the disease to that in people without it.  Because people are being asked to report exposures from sometime in the past, there is recall bias, which affects the estimate of risk: people with disease are more likely to recall negative exposures.  A nested case-control study uses prospective exposure data, collected before the disease started, which removes the potential for recall bias.