First of all, Health Impact News has all the markings of a crank site. For instance, its banner claims it is a site for 'News that impacts your health that other media sources may censor.' This in itself ought to be a red flag, just like Kevin Trudeau’s Natural Cures They Don’t Want You to Know About.
But enough about that. Let’s see how this article, and the study it refers to, abuse statistics.
To start, this is a bit of a greased pig. Their link leads to a malformed PDF file on a site called vaccineinjury.info. The site’s apparent reason for existence is to host a questionnaire for parents who did not vaccinate their children. So I’ll have to go on what the article says. The authors claim:
'No study of health outcomes of vaccinated people versus unvaccinated has ever been conducted in the U.S. by CDC or any other agency in the 50 years or more of an accelerating schedule of vaccinations (now over 50 doses of 14 vaccines given before kindergarten, 26 doses in the first year).'
Here’s one. A simple PubMed search will bring up others fairly quickly. What follows this statement is a long chain of unsupported assertions about what data the CDC has and has not collected, which I really don’t have an interest in debunking right now.
So on to the good stuff. They have a pretty blue and red bar graph that’s just itching to be shredded, so let’s do it. The graph is designed to demonstrate that vaccinated children are more likely to develop certain medical conditions, such as asthma and seizures, than unvaccinated children. Pretty scary stuff, if the evidence actually held up.
One of the most important principles in statistics is defining your population. If you fail at that, you might as well quit, get your money back from SAS, and call it a day, because nothing that comes afterward is meaningful. You might as well make up a bunch of random numbers; they would be just as informative.
This study fails miserably at defining its population. As best I can tell, the comparison is between a population in an observational study called KIGGS and respondents to an open-invitation survey conducted at vaccineinjury.info.
What could go wrong? (Rhetorical question)
We don’t know who responded to the vaccineinjury.info questionnaire, but it is aimed at parents who did not vaccinate their children. This pretty much tanks the rest of their argument. From what I can tell, these respondents seem motivated to give answers favorable to the anti-vaccine movement; the fact that the data they present are supplemented with testimonials gives the game away. They are comparing apples to rotten oranges.
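To make the self-selection problem concrete, here is a quick simulation sketch in Python, with made-up numbers, not their data. Assume the true asthma rate is identical in both groups, but the "unvaccinated" numbers come only from a web survey that parents of healthy children are more likely to answer. The specific response rates are my assumption; the point is the mechanism, not the magnitude.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
true_rate = 0.10  # same true asthma rate in both groups, by assumption

# "Vaccinated" group: prevalence measured in a representative sample.
vaccinated_has_asthma = rng.random(n) < true_rate

# "Unvaccinated" group: same true rate, but seen only through a self-selected
# web survey. Hypothetical twist: parents of healthy children are three times
# as likely to respond as parents of children with asthma.
unvaccinated_has_asthma = rng.random(n) < true_rate
respond_prob = np.where(unvaccinated_has_asthma, 0.10, 0.30)
responded = rng.random(n) < respond_prob

print(f"True rate, both groups:       {true_rate:.1%}")
print(f"Observed 'vaccinated' rate:   {vaccinated_has_asthma.mean():.1%}")
print(f"Observed 'unvaccinated' rate: {unvaccinated_has_asthma[responded].mean():.1%}")
```

With no true difference at all, the self-selected survey makes the unvaccinated group look dramatically healthier. That is the entire "finding" in one mechanism.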
The right way to answer a question like this is a matched case-control study of vaccinated and unvaccinated children. An immunologist is probably the best person to determine which factors need to be included in the matching. That way, an analysis conditioned on the matching can clearly point to the effect of the vaccinations, rather than leave open the question of whether the differences in cases were due to differences in inherent risk factors.
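Here is a minimal sketch of what a conditioned analysis buys you, assuming pairs matched on a single hypothetical risk factor (say, family history of atopy). By construction, vaccination has no effect on the outcome here, and the within-pair comparison correctly finds none, because matching removes the risk-factor imbalance that a crude comparison could pick up.

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(0)
n_pairs = 5_000

# One hypothetical risk factor shared within each matched pair
# (e.g., family history of atopy), identical by construction of the matching.
risk = rng.random(n_pairs)

# The outcome depends on the risk factor, NOT on vaccination status.
p_outcome = 0.02 + 0.15 * risk
vaccinated_outcome = rng.random(n_pairs) < p_outcome
unvaccinated_outcome = rng.random(n_pairs) < p_outcome

# Conditional (within-pair) analysis: only discordant pairs carry information.
b = int(np.sum(vaccinated_outcome & ~unvaccinated_outcome))
c = int(np.sum(~vaccinated_outcome & unvaccinated_outcome))
print(f"Discordant pairs: {b} (vaccinated child affected) vs {c} (unvaccinated)")
print(f"Exact McNemar-style p-value: {binomtest(b, b + c, 0.5).pvalue:.3f}")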
I’m wondering if there isn’t some ascertainment bias going on as well. Though I really couldn’t tell what the KIGGS population was, it was represented as the vaccinated population. So in addition to imbalances in risk factors, I’m wondering if the 'diagnosis' in the unvaccinated population was derived from parents who were simply asked which medical conditions their children have.
In that case, we have no clue what the real rate is, because we are comparing parents’ judgments (and these parents are probably more likely than most to ignore mainstream medicine) with, presumably, a GP’s more rigorous diagnosis. That’s not to say that no children in the survey were diagnosed by a doctor, but without that documentation (which this web-based survey isn’t going to be able to provide), the red bars in the pretty graph are essentially meaningless.
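Here is the ascertainment problem in code form, again a sketch with assumed numbers: the same true prevalence, measured once with a GP-style diagnosis that catches most true cases and once with a retrospective parent questionnaire that misses many. The detection probabilities are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
true_rate = 0.10  # identical true prevalence, by assumption

has_condition = rng.random(n) < true_rate

# Hypothetical detection probabilities: a GP diagnosis catches most true
# cases; a retrospective parent questionnaire misses many of them.
p_detect_gp, p_detect_parent = 0.95, 0.50
gp_reported = has_condition & (rng.random(n) < p_detect_gp)
parent_reported = has_condition & (rng.random(n) < p_detect_parent)

print(f"True prevalence:                       {has_condition.mean():.1%}")
print(f"GP-diagnosed ('vaccinated') rate:      {gp_reported.mean():.1%}")
print(f"Parent-reported ('unvaccinated') rate: {parent_reported.mean():.1%}")
```

Identical underlying rates, and the parent-reported bar still comes out about half the height of the GP-diagnosed one.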
But let’s move on. The vaccineinjury.info site cites some other studies that seem to agree with its little survey. For instance, McKeever et al. published a study in the American Journal of Public Health in 2004 from which the vaccineinjury.info site claims an association between vaccines and the development of allergies. However, that apparent association, as stated in the study itself, is possibly the result of ascertainment bias (the association was only strong in the stratum with the least frequent GP visits). Even objections to the discussion of ascertainment bias leave the evidence for an association between vaccines and allergic diseases unclear.
The vaccineinjury.info site also cites the Guinea-Bissau study reported by Kristensen et al. in BMJ in 2000. They claim, falsely, that the study showed a higher mortality in vaccinated children.
What they don’t do is describe how they chose the studies to be displayed on the web site. What were the search terms? Were these studies cherry-picked to demonstrate their point? (Probably, but they didn’t do a good job.)
What follows the discussion of other studies is an utter waste of internet space. They report the results of their 'survey', I think. Or somebody else’s survey. I really couldn’t figure out what was meant by 'Questionnaire for my unvaccinated child ("Salzburger Elternstudie", i.e. Salzburg Parent Study)'. The age breakdown for the 'children' is interesting: 2 out of the 1,004 'children' were over 60! At any rate, if you are going to talk about diseases in children, you need to present the results by age, because age is a risk factor in disease development. They didn’t do this, and the sketch below shows why it matters.
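Consider a toy example with made-up rates: two groups with identical age-specific asthma rates but different age mixes. The crude, unstratified rates still differ, purely because of who happens to be in each group.

```python
# Identical age-specific asthma rates in both groups (made up for illustration)
age_specific_rate = {"0-4": 0.04, "5-12": 0.10, "13-17": 0.14}

group_a = {"0-4": 800, "5-12": 150, "13-17": 50}   # mostly very young children
group_b = {"0-4": 100, "5-12": 400, "13-17": 500}  # mostly older children

def crude_rate(age_counts):
    """Overall rate you get if you ignore age entirely."""
    cases = sum(n * age_specific_rate[age] for age, n in age_counts.items())
    return cases / sum(age_counts.values())

print(f"Crude rate, group A: {crude_rate(group_a):.1%}")  # about 5.4%
print(f"Crude rate, group B: {crude_rate(group_b):.1%}")  # about 11.4%
# Within every age stratum the rates are identical; the gap is pure age mix.
```

A crude comparison that lumps toddlers in with 60-year-old 'children' can manufacture a difference out of nothing but demographics.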
What is interesting about the survey, though, are the reasons the parents gave for not vaccinating their children, if only as a preliminary notion of the range of responses.
In short, vaccineinjury.info, and the reporting site Health Impact News, present statistics that are designed to scare rather than inform. Proper epidemiological studies, contrary to the sites’ claims, have been conducted, and they provide no clear evidence for the notion that vaccinations cause allergies except in rare cases. In trying to compile evidence for their claims, the sites failed to show that they did a proper systematic review, and they even misquoted the conclusions of the studies they presented.
All in all, a day in the life of a crank website.
This article first appeared on John Johnson's Realizations in Biostatistics blog.