
In the June 2015 issue of their magazine, Consumer Reports tries to convince me to wear a bicycle helmet. They do not succeed, nor should they. While it may be true that we should all be wearing helmets, nobody should be persuaded by Consumer Reports' statistical arguments, which are so silly as to be laughable.

It's actually a pretty big fail on Consumer Reports' part. Because, I'm sure, helmets do save lives, and it should be pretty easy to come up with data that illustrate that competently. Instead… well, it's almost like they don't take the question seriously enough to care about what the numbers actually mean. (The article isn't online, but here's a web page from their site that's similar but less ardent.)

Here's Consumer Reports' first argument:

'… the answer is a resounding yes, you *should* wear a helmet. Here's why: 87% of the bicyclists killed in accidents over the past two decades were not wearing helmets.'

Well, that's not enough to prove anything at all. From that factoid alone, there's really no way to tell whether helmets are good or bad.

If Consumer Reports doesn't see why, I bet they would if I changed the subject on them:

'… the answer is a resounding yes, you *should* drive drunk. Here's why: 87% of the drivers killed in accidents over the past two decades were stone cold sober.'

That would make it obvious, right? If the same argument that proves you should wear a helmet also proves you should drive drunk, the argument must be flawed. What's wrong with it? It doesn't work without the base rate.

In order to see whether '87%' is high or low, you need something to compare it to. If fewer than 87% of cyclists go helmetless, then, yes, helmetless riders are overrepresented in deaths, and you can infer that helmets are a good thing. But if more than 87% go without a helmet, that same statistic might be evidence that helmets are actually dangerous.

To debunk the drinking-and-driving argument, you have to show that more than 87% of drivers are sober (equivalently, that fewer than 13% are drunk). Neither demonstration is that hard. But you still have to do it!
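If you like, here's a minimal sketch of that base-rate check, with made-up base rates just for illustration:

```python
# Toy base-rate check (the 70% and 99% base rates below are made up):
# "87% of the dead were X" only tells you something once you compare it
# with the share of X in the population at risk.

def overrepresented(share_of_deaths, share_of_population):
    """True if the group shows up among the dead more often than on the road."""
    return share_of_deaths > share_of_population

# Helmet case: 87% of killed cyclists were helmetless. If, say, 70% of
# cyclists ride helmetless, helmetless riders are overrepresented among
# the dead, which is evidence that helmets help.
print(overrepresented(0.87, 0.70))   # True

# Drunk-driving case: 87% of killed drivers were sober. Since far more
# than 87% of drivers are sober (say 99%), sober drivers are actually
# UNDER-represented among the dead, and the same-shaped argument collapses.
print(overrepresented(0.87, 0.99))   # False
```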

Why would Consumer Reports not notice that their argument was wrong in the helmet case, but notice immediately in the drunk-driving case? There are two possibilities:

1. Confirmation bias. The first example fits in with their pre-existing belief that helmets are good. The second example contradicts their pre-existing belief that drunk driving is bad.

2. Gut statistical savvy. The Consumer Reports writers do feel, intuitively, that base rates matter, and 'fill in the blanks' with their common-sense understanding that more than 13% of cyclists wear helmets, and that far fewer than 13% of drivers are drunk.

As you can imagine, I think it's almost all number one. I'm skeptical of number two. In fact, there are many times that the base rates could easily go the 'wrong' way, and people don't notice. One of my favorite examples, which I mentioned a few years ago, goes something like this:

'… the answer is a resounding yes, you *should* drive the speed limit. Here's why: 60% of fatal accidents over the past two decades involved at least one speeder.'

As I see it, this argument actually may support speeding! Suppose half of all drivers speed. Then, in a two-car collision, there's a 75% chance that at least one of the drivers is a speeder. If that 75% of cases accounts for only 60% of the fatal accidents… then speeding must be safer than not speeding! And, of course, there's this classic Dilbert cartoon.
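Here's that arithmetic spelled out, under the assumptions that half of all drivers speed and that the two drivers in a crash speed independently (the 60% comes from the quote):

```python
# Quick check of the speeding example. Assumptions: half of all drivers
# speed, the two drivers in a fatal crash speed independently, and 60% of
# fatal crashes involve at least one speeder.
p_speeder = 0.5
p_pair_with_speeder = 1 - (1 - p_speeder) ** 2           # 0.75

share_of_fatals_with_speeder = 0.60

# Fatal-accident rate for each kind of pair, up to a common constant:
rate_with_speeder = share_of_fatals_with_speeder / p_pair_with_speeder                 # 0.60 / 0.75 = 0.8
rate_without_speeder = (1 - share_of_fatals_with_speeder) / (1 - p_pair_with_speeder)  # 0.40 / 0.25 = 1.6

print(rate_with_speeder / rate_without_speeder)   # 0.5: pairs with a speeder crash at half the rate
```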

It wouldn't have been hard for Consumer Reports to fix the argument. They could have just added the base rate:

'… the answer is a resounding yes, you *should* wear a helmet. Here's why: 87% of the bicyclists killed in accidents over the past two decades were not wearing helmets, as compared to only 60% of cyclists overall.'

It does sound less scary than the other version, but at least it means something. (I made up the '60%' number, but anything significantly less than 87% would work. I don't know what the true number is but, since we're talking about the average of the last 20 years, my guess would be that 60% to 70% would be about right.)
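And with a base rate in hand, even my made-up 60% would let you back out a rough effect size:

```python
# Rough effect size implied by the (made-up) comparison: 87% of the dead
# were helmetless, versus 60% of riders overall.
deaths_helmetless, riders_helmetless = 0.87, 0.60
risk_ratio = (deaths_helmetless / riders_helmetless) / \
             ((1 - deaths_helmetless) / (1 - riders_helmetless))
print(round(risk_ratio, 1))   # ~4.5: helmetless riders dying at roughly 4-5 times the helmeted rate
```

That's in the same ballpark as the 'four times as high' figure I'll use below.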

Even if Consumer Reports had provided a proper statistical argument that riding with a helmet is safer than riding without… it still wouldn't be enough to justify their 'resounding yes'. Because, I doubt that anyone would say,

'… the answer is a resounding yes, you *should* avoid riding a bike. Here's why: 87% of commuters killed in accidents over the past two decades were cycling instead of walking – as compared to only 60% of commuters overall.'

That would indeed be evidence that biking is riskier than walking – but I think we'd all agree that it's silly to argue that it's not worth the risk at all. You have to weigh the risks against the benefits. On that note, here's a second statistical argument Consumer Reports makes, which is just as irrelevant as the first one:

'… wearing a helmet during sports reduces the risk of traumatic brain injury by almost 70%.'

(Never mind that the article chooses to lump all sports together – we'll just assume the 70% is true for all sports equally.) So, should that 70% statistic alone convince you to wear a helmet? No, of course not.

Last month (this actually happened) my friend's mother suffered a concussion in the lobby of her apartment building. She was looking sideways towards the mail room, and collided with another resident who didn't see her because he was carrying his three-year-old daughter. My friend's mom lost her balance, hit her head on an open door, blacked out, and spent the day in hospital.

If she had been wearing a helmet, she'd almost certainly have been fine. In fact, it's probably just as true that:

'… wearing a helmet when walking around in public reduces the risk of traumatic brain injury by almost 70%.'

Does that convince you to wear a helmet every second of your day? Of course not. The relevant statistic isn't the percentage of injuries prevented. It's how big the benefit is as compared to the cost and inconvenience. The '70%' figure doesn't speak to that at all.

If I were to find the Consumer Reports writers and ask them why they don't wear their helmets while walking on the street, they'd look at me like I'm some kind of idiot – but if they chose to answer the question, they'd say that it's because walking, unlike cycling, doesn't carry a very high risk of head injury.

And that's the point. Even if a helmet reduced the risk of traumatic brain injury by 80%, or 90%, or even 100%… we still wouldn't wear one to the mail room. I choose not to wear a helmet for exactly the same reason that you choose not to wear a helmet when walking. That's why the '70%' figure, on its own, is almost completely irrelevant.

Everything I've seen on the web convinces me that the risk is low enough that I'm willing to tolerate it. I'd be happy to be corrected by Consumer Reports – but they're not interested in that line of argument. I bet that's because the statistics don't sound that ominous. Here's a site that says there's around one cycling death per 10,000,000 miles. If it's four times as high without a helmet – one death per 2,500,000 miles – that still doesn't sound that scary.

That latter figure is about 40 times as high as the death rate for driving. If I ride 1,000 miles per year without a helmet, the excess danger, over and above riding those miles with a helmet, is equivalent to driving 30,000 miles. I'm willing to accept that level of risk. (Especially since, for me, the risk is lower still: I ride mostly on bike paths rather than streets, and most cycling deaths result from collisions with cars.)
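For anyone who wants to check the back-of-envelope arithmetic behind that 30,000-mile figure, using the per-mile rates above:

```python
# Back-of-envelope behind "equivalent to driving 30,000 miles".
deaths_per_mile_no_helmet = 1 / 2_500_000     # cycling without a helmet
deaths_per_mile_helmet    = 1 / 10_000_000    # cycling with a helmet
deaths_per_mile_driving   = deaths_per_mile_no_helmet / 40   # "about 40 times as high as driving"

# Extra risk of 1,000 helmetless miles, over and above riding them with a helmet:
excess_deaths = 1_000 * (deaths_per_mile_no_helmet - deaths_per_mile_helmet)

print(excess_deaths / deaths_per_mile_driving)   # ~30,000 equivalent driving miles
```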

You can still argue that three extra deaths per ten million miles is a significant amount of risk, enough for Consumer Reports to call for a moral or legal requirement for helmets. But do you really want to go there? Because, if your criterion, like Consumer Reports', is the magnitude of the risk… well, cycling shouldn't be at the top of your list of concerns.

In the year 2000, according to this website, 412 Americans died after falling off a ladder or scaffolding. Now, many of those deaths were probably job-related: workers who spend a significant portion of their days on ladders. Suppose that covers 90% of the fatalities, so that only 10% of the victims were do-it-yourselfers working on their own houses. That's about 41 ladder deaths.

People spend a lot more time on bicycles than on ladders… I'd guess by a factor of at least 1,000. So 41 ladder deaths is the equivalent of 41,000 cycling deaths. But there were only 740 deaths from cycling. That makes climbing a ladder around 55 times as dangerous as riding a bicycle.
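Spelling out that guesswork (the 10% do-it-yourselfer share and the 1,000-to-1 exposure ratio are just my guesses from above):

```python
# Ladders vs. bicycles, per unit of time spent on each.
ladder_deaths_total = 412
diy_share = 0.10                  # guess: 10% of ladder deaths were do-it-yourselfers
diy_ladder_deaths = ladder_deaths_total * diy_share     # ~41

exposure_ratio = 1_000            # guess: ~1,000x more person-hours on bikes than on ladders
cycling_deaths = 740

# Scale ladder deaths up to cycling-level exposure, then compare:
print(diy_ladder_deaths * exposure_ratio / cycling_deaths)   # ~55 times as dangerous per hour
```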

And that's just the ladders – it doesn't include deaths from falling after you've made it onto the roof! If Consumer Reports were to acknowledge that risk level is important, they'd have to call for helmets for others at high risk, like homeowners cleaning their gutters, and elderly people with walkers, and people limping with casts, and everyone taking a shower.

Finally, Consumer Reports gives one last statistic on the promo page:

'Cycling sends more people to the ER for head injuries than any other sport – twice as many as football, 3 1/2 times as many as soccer.'

Well, so what? Isn't that just because there's a lot more cycling going on than football and soccer? This document says that in the US in 2014, there were six football fatalities. They all happened in competition, even though there are many more practices than games. All six deaths happened among the 1.2 million players in high-school level football or beyond – there were no deaths in youth leagues.

Call it a million players, ten games a year, on the field for 30 minutes of competition per game. Let's double that to include kids' leagues, and double it again to include practices – neither of which had any deaths, but both of which may still have produced head injuries.

That works out to 20 million hours of football. In the USA, cyclists travel between 6 billion and 21 billion miles per year. Let's be conservative and take the low value. At an average speed of, say, 10 mph, that's 600 million hours of cycling. So, people cycle 30 times as much as they play football. But, according to Consumer Reports, cycling sends only twice as many people to the ER with head injuries. That means that, hour for hour, cycling is only about 7% as dangerous as football.
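Here's the whole estimate in one place, using those rough, round numbers:

```python
# Exposure estimate: hours of football vs. hours of cycling per year.
players = 1_000_000
games_per_year = 10
hours_per_game = 0.5              # ~30 minutes on the field per game
football_hours = players * games_per_year * hours_per_game   # 5 million
football_hours *= 2               # add kids' leagues
football_hours *= 2               # add practices -> 20 million hours

cycling_miles = 6_000_000_000     # low end of the 6-21 billion range
cycling_hours = cycling_miles / 10            # at ~10 mph -> 600 million hours

exposure_ratio = cycling_hours / football_hours   # ~30x as much cycling as football
er_visit_ratio = 2                                # cycling sends ~2x as many to the ER (per CR)

print(er_visit_ratio / exposure_ratio)            # ~0.07 -> cycling ~7% as risky per hour
```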

That just confirms what already seemed obvious, that cycling is pretty safe compared to football. Did Consumer Reports misunderstand the statistics so badly that they convinced themselves otherwise?

 

This article first appeared on Phil's Sabermetric Research blog.

