A round table topic on The Sara Mills Show yesterday was the under-reporting of sexual assaults. One of the guests said that only 3 of every 1000 sexual assaults resulted in a conviction. That number seemed really low so I asked about it. The source is an infographic produced by the YWCA and highlighted in a Huffington Post article.
Where did that 460,000 number come from? How do you count sexual assaults that aren’t reported?
The source cited is the chapter “Limits of a Criminal Justice Response: Trends in Police and Court Processing of Sexual Assault” by Holly Johnson in the book Sexual Assault in Canada: Law, Legal Practice and Women’s Activism, edited by Elizabeth Sheehy and published in 2012.
Where did she get the number?
Her source was the 2004 General Social Survey (Victimization) conducted by Statistics Canada. A summary is available. The question asked was “During the past 12 months, has anyone ever touched you against your will in any sexual way? By this I mean anything from unwanted touching or grabbing, to kissing or fondling.”
So, 460,000 is an estimate of “sexual assault” calculated based on a survey of ~24,000 people and includes unwanted touching, grabbing, kissing, or fondling. Is it any wonder that these “assaults” are not reported?
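For the curious, here is roughly how a number like 460,000 gets produced from a survey of ~24,000 people — a minimal sketch in Python. Only the 24,000 sample size and the published 460,000 come from the sources above; the respondent count and population figure are invented for illustration.

```python
import math

# Sketch of how a survey estimate like 460,000 is produced.
# Only the sample size and the published estimate come from the post;
# the "yes" count and the adult population figure are assumptions.

sample_size = 24_000
adult_population = 26_000_000   # rough adult population, assumed

# Suppose ~425 respondents answered "yes" to the broad screening question
yes_responses = 425
p = yes_responses / sample_size          # sample proportion
estimate = p * adult_population          # scaled up to the population

# 95% confidence interval via the normal approximation
se = math.sqrt(p * (1 - p) / sample_size)
low = (p - 1.96 * se) * adult_population
high = (p + 1.96 * se) * adult_population

print(f"estimate: {estimate:,.0f}  (95% CI {low:,.0f} to {high:,.0f})")
```

The point of the sketch: a few hundred “yes” answers to one broad question, multiplied up by a factor of a thousand, is where a headline number like this comes from.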
The truth is that we don’t know how many sexual assaults go unreported because they are not reported. Based on the overly broad definition used above, it’s safe to say that the number is much less than 460,000 per year. Exaggerating doesn’t strengthen your point; it weakens your position because your audience won’t trust your other “facts”.
While reading an article in the Globe and Mail about skin cancer, something about the article combined with the graphic for “What is the likelihood that you will get cancer?” bothered me. The bottom of the graphic emphasized lifetime probability of dying from cancer, 24% for women and 29% for men. The implication is that too many people get cancer and we need to reduce the number of people dying from cancer.
Here’s the problem: everybody dies eventually. Everybody. You can, to some extent, choose what you are more likely to die from, but immortality is not an option. Reducing the number of people dying from one cause will increase the number dying from another. I found a CDC graph which perfectly illustrates this.
If fewer people die from infectious diseases, more will die from heart disease. If fewer people die from heart disease, more people will die from cancer.
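That trade-off can be sketched with a toy competing-risks model. With constant per-cause hazards, the probability of eventually dying of each cause is just that cause’s share of the total hazard, so cutting one hazard raises every other cause’s lifetime share. All of the hazard numbers below are invented.

```python
# Toy competing-risks model: everyone dies of exactly one of three causes.
# Halving the hazard of one cause raises the lifetime share of the others.
# All hazard values are invented for illustration.

def lifetime_shares(hazards):
    """With constant per-cause hazards, the probability of eventually
    dying of each cause is its share of the total hazard."""
    total = sum(hazards.values())
    return {cause: h / total for cause, h in hazards.items()}

hazards = {"infectious": 0.010, "heart": 0.008, "cancer": 0.006}
before = lifetime_shares(hazards)

hazards["infectious"] /= 2          # a big win against infectious disease...
after = lifetime_shares(hazards)

for cause in hazards:
    print(f"{cause:>10}: {before[cause]:.0%} -> {after[cause]:.0%}")
# ...and every remaining cause's lifetime share goes up, because the
# shares always sum to 100%: everyone dies of something.
```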
According to the National Post headline, “Eating junk food before getting pregnant spikes risk of premature birth: researchers”.
No, it doesn’t.
The study “Preconception dietary patterns in human pregnancies are associated with preterm delivery” (JA Grieger, LE Grzeskowiak, and VL Clifton) was e-published ahead of print in the Journal of Nutrition (April 30, 2014). They concluded that nutrition before pregnancy was associated with pre-term birth. This was a retrospective, cross-sectional study which means that the 309 pregnant women were surveyed one time about what they ate during the 12 months before they got pregnant. They were never asked about what they ate during pregnancy although they cite other studies showing that the two are typically similar.
Problem 1: The results could just reflect that diet during pregnancy affects risk of premature delivery.
After a lot of complicated analyses, the researchers identified three dietary patterns (high-protein/fruit, high-fat/sugar/takeaway, vegetarian-type) and assigned a score to each woman based on how she compared to the average score. They didn’t categorize mothers into separate groups and compare the incidence of pre-term delivery. The overall pre-term rate is 10%. What is the risk for those eating “junk food”? We don’t know. This type of analysis could result in completely different findings with a different group of moms.
Problem 2: Even the researchers acknowledge in their discussion that the results of this study “may not be generalizable to other populations”.
The article and the headline clearly overstate the findings of this study. I wouldn’t equate an odds ratio of 1.5 to a “spike” in risk. As long as associations are being used to make dietary recommendations, they could have emphasized that the meat-based, high-protein diet was associated with lower risk of pre-term birth while the vegetarian diet showed no benefit.
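For reference, there is a standard approximation for turning an odds ratio into a relative risk when the baseline risk is known: RR = OR / (1 − p0 + p0·OR). Plugging in the odds ratio of 1.5 and the 10% overall pre-term rate quoted above shows why “spike” is a stretch:

```python
# Converting an odds ratio to a relative risk with the standard
# approximation RR = OR / (1 - p0 + p0 * OR), where p0 is the baseline
# risk. The OR of 1.5 and the ~10% pre-term rate are from the study as
# described above.

def odds_ratio_to_rr(odds_ratio, baseline_risk):
    p0 = baseline_risk
    return odds_ratio / (1 - p0 + p0 * odds_ratio)

rr = odds_ratio_to_rr(1.5, 0.10)
print(f"relative risk is roughly {rr:.2f}")
# i.e. a baseline risk of ~10% rising to ~14% -- an increase, but
# hardly what most readers picture when they read "spikes risk"
```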
Health reporters need to make much more liberal use of the words “may”, “might”, “suggests”, etc. and the headline writers need to stop being so sensationalistic.
A conclusion is only as good as the data that it is based upon and I am always curious about the data.
One of today’s big “news” topics was a story about how unfit Canadian kids are with the typical headline “Canadian kids continue to get failing fitness grade”. (I feel like this needs a dramatic sound effect). Canadian kids are not fit? How do they know? What does that mean?
Well … all of the news reports are a result of the release of the “2014 Active Healthy Kids Canada Report Card on Physical Activity for Children and Youth”. Active Healthy Kids Canada is a charity whose purpose is to increase kids’ activity levels. I would be surprised if they said anything other than “kids need to be more active”.
You can download a long version of the report card on their site and it includes the data sources that they used.
One example: Do children in various age groups meet the daily recommendation of at least 60 minutes of moderate-to-vigorous physical activity? This data came from the 2011-12 Canadian Health Measures Survey from Statistics Canada. The questions asked included “Over a typical or usual week, on how many days is he physically active for a total of at least 60 minutes per day?” (maximum answer is 4 or more days) and “About how many hours a week does he usually take part in physical activity that makes him out of breath or warmer than usual?”. The questions and answer choices appear to be carefully written but the problem comes with summarizing the results. For moderate-to-vigorous exercise, the answers must have come from the second type of question, asked for each of four classifications (school/home, organized/free play), and then grouped according to an average number of hours per day. Two full weekend days playing sports does not equal 60+ minutes each day but it would have to be classified that way given the available data.
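The weekend-athlete problem can be made concrete with a quick sketch. The classification logic here is my reading of how the report card must have worked, and the activity numbers are invented:

```python
# Sketch of the classification problem: averaging weekly hours into
# minutes per day can label a weekend-only athlete as meeting the
# "60+ minutes every day" guideline. Numbers are illustrative, and the
# averaging rule is an assumption about how the report card was built.

def meets_guideline_by_average(weekly_hours):
    """Average the week's activity over 7 days, compare to 60 minutes."""
    return (weekly_hours * 60) / 7 >= 60

def meets_guideline_daily(minutes_per_day):
    """What the guideline actually asks: 60+ minutes on every day."""
    return all(m >= 60 for m in minutes_per_day)

# A child who plays sports 4 hours each weekend day and nothing Mon-Fri:
weekend_athlete = [0, 0, 0, 0, 0, 240, 240]       # minutes per day
weekly_hours = sum(weekend_athlete) / 60          # 8 hours per week

print(meets_guideline_by_average(weekly_hours))   # averaging says yes
print(meets_guideline_daily(weekend_athlete))     # the guideline says no
```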
Other categories for report card marks relied upon completely different data sources including surveys of parents, surveys of students, surveys of schools, clinical exams, and direct measurement of steps per day. The data came from a few different years.
Comparisons were then made to 14 other countries ranging from Australia and Ireland to Mozambique and Nigeria. I can only assume that these countries also use a wide range of data sources of varying quality.
TL;DR: Statements on report card may not be 100% accurate.
And don’t even get me started on that headline. None of the variables in this report actually measured fitness. None of them.
Also, big surprise, non-government organizations promoting physical activity received an A-! Are those the same groups that support Active Healthy Kids Canada and use this report card to promote their agendas? How convenient.
A tweet today from The Council of Ontario Universities claimed that “a university education remains the best protection against unemployment” citing statistics that less than 4% of university graduates (Bachelor’s and Master’s) in Canada are unemployed compared to almost 7% of people without university degrees.
This is faulty logic. Although unemployment is correlated with educational status, a university degree is no guarantee that you won’t be unemployed. Think of it this way: the intelligence and work ethic necessary to meet the requirements of a university degree are the same characteristics that make a person more likely to succeed at any job. Having a university degree could just be a marker for “successful at work” regardless of whether that work requires a degree.
And what happens when more people have university degrees but the same number of jobs exist that require a degree? Unemployment will increase because more people will be competing for each job.
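A toy model makes the point. All of the job and graduate counts below are invented:

```python
# Toy model of credential inflation: the number of jobs a graduate can
# get stays fixed while the number of graduates grows.
# All numbers are invented for illustration.

def graduate_unemployment(graduates, degree_jobs, other_jobs_open_to_grads):
    """Graduates fill degree-requiring jobs first, then compete for
    the other jobs open to them; the rest are unemployed."""
    placed = min(graduates, degree_jobs + other_jobs_open_to_grads)
    return (graduates - placed) / graduates

# 100 degree-requiring jobs plus 50 other jobs a graduate might take:
print(graduate_unemployment(140, 100, 50))   # everyone placed
print(graduate_unemployment(200, 100, 50))   # more degrees, same jobs
```

The second case is the scenario above: hand out more degrees without creating more jobs, and graduate unemployment rises even though each individual graduate did everything “right”.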
Obviously, I’m not against university education. I have three degrees. However, I believe that we do a disservice to people by acting as if everyone should aspire to a degree and that a degree guarantees full employment.
Go to university if what you want to do requires a degree but realize that a degree is not a guarantee of success.
A headline in yesterday’s StarPhoenix stated that “1,026 aboriginal women killed merits inquiry”.
Don’t all murders merit inquiry? According to Stats Canada counts of homicide victims, in 2011 there were 598 homicides — 422 men, 276 women. Based on the RCMP report cited in the StarPhoenix article by Stephen Maher, 16% of murdered women are aboriginal. Therefore, we can extrapolate that 44 aboriginal women and 232 non-aboriginal women were murdered in 2011. Why does one group merit a special inquiry?
It is possible that many of the murdered women share characteristics which might provide clues about where to focus prevention strategies. Examining only aboriginal victims assumes that race is the primary factor and limits the power to identify other risk factors. For that matter, this discussion also ignores the far greater number of murdered men.
*If 1026 murdered aboriginal women represent 16% of all women murdered, the total number of women murdered over 30 years equals 6412.
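The arithmetic behind these extrapolations, using only the figures quoted above:

```python
# Checking the extrapolations. The 16% share, the 276 murdered women
# in 2011, and the 1,026 figure are the numbers quoted in the post.

aboriginal_share = 0.16

women_2011 = 276
aboriginal_women_2011 = round(women_2011 * aboriginal_share)
non_aboriginal_women_2011 = women_2011 - aboriginal_women_2011

# Footnote calculation: scale 1,026 up to all murdered women over 30 years
total_women_30yr = round(1026 / aboriginal_share)

print(aboriginal_women_2011, non_aboriginal_women_2011, total_women_30yr)
```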
Earlier this week, I noticed Tweets from people alarmed about a shockingly low life expectancy of only 37 years for aboriginals in Toronto. This instantly seemed wrong. The links lead to an article on the Kevin Newman Live website by a producer, Jordan Chittley. I found the number cited to be unbelievable and it is. Colby Cosh caught the error but many others did not. I sent a series of tweets to the author; the headline and article have since been edited but still contain errors.
The story reports the results of a study “Early deaths among members of Toronto’s aboriginal community” written by three doctors — C.P. Shah, R. Klair, and A. Reeves — from Anishnawbe Health Toronto. The data used for the study was from just 109 deaths of patients from four clinics in Toronto over three years.
The original article and headline referred to “life expectancy” which is quite different from “average age at death”. Calculating life expectancy requires knowing the risk of death at each age (or in each age group) in a population.
The authors of the report did not even actually calculate average age at death for aboriginals in Toronto. They only calculated the average age at death for patients in their clinics who died during those three years. This doesn’t account for all of the patients who haven’t died yet or all of the aboriginals who are not patients at their clinics. Their data is useless. They might be able to make a comparison if they could identify non-Aboriginal patients at similar clinics but even then they are just comparing people who died. What percent of patients died? What were the causes of death? Who is their patient population? Is it representative of the entire aboriginal population of Toronto? So many unanswered questions.
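To see why the two statistics diverge, here is a toy sketch: a simple discrete life table for life expectancy, versus a naive average over whoever happened to die. Every rate and patient count below is invented.

```python
# Life expectancy vs. "average age at death among recent deaths".
# A life expectancy weights death risk at every age across the whole
# population; the clinic number averaged only the people who died.
# All rates and counts below are invented for illustration.

def life_expectancy(qx):
    """Life expectancy at birth from annual death probabilities qx[age],
    using a simple discrete life table."""
    alive, total_years = 1.0, 0.0
    for q in qx:
        total_years += alive * (1 - q / 2)  # those dying live ~half the year
        alive *= (1 - q)
    return total_years

# Flat 1% annual mortality up to age 90, then everyone dies:
qx = [0.01] * 90 + [1.0]
e0 = life_expectancy(qx)
print(f"life expectancy: {e0:.0f} years")

# Now a clinic whose patients are mostly young adults. Even with the
# SAME death rates, the deaths it sees are mostly young deaths:
patients_by_age = {25: 900, 70: 100}                # who the clinic serves
deaths = {age: n * 0.01 for age, n in patients_by_age.items()}
avg_age_at_death = sum(a * d for a, d in deaths.items()) / sum(deaths.values())
print(f"average age at death at this clinic: {avg_age_at_death} years")
```

Same mortality risk at every age, yet the clinic’s “average age at death” comes out decades below the life expectancy, purely because its patient population skews young. That is the trap the 37-vs-75 comparison falls into.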
The errors in the report are perpetuated by the media including the corrected version of the CTV article. Almost every article that I’ve seen compares the average age of death of this small group (37 years) to the correctly calculated life expectancy of Toronto residents (75 years). The two statistics are completely different and should never be compared. At least Metro News got it right.
So what are the facts? Aboriginals do have a lower life expectancy than non-Aboriginals but only by a few years and both groups can expect to live far past 37 years.