
CDC: Vaccine Study Used Flawed Methods


(NOTE: My original post on this topic mischaracterized the 2003 CDC vaccine investigation as an "Ecological Study," which it was not. I am reposting this piece to reflect that information accurately, but also to point out that many of the weaknesses identified in the CDC's data and methods apply to the published 2003 "retrospective cohort" study, as much as they do to any future "ecological" ones. I regret and apologize for the error.)

A new report (PDF) that CDC Director Dr. Julie Gerberding has delivered to the powerful House Appropriations Committee casts new light -- and new doubt -- on the data and methodology that the CDC used in its landmark 2003 study that found no link between mercury in vaccines and autism, ADHD, speech delay or tics.

Gerberding was responding to a report from the National Institute of Environmental Health Sciences (NIEHS), which evaluated the strengths and weaknesses of the CDC's vaccine database, and showed how the weaknesses, in particular, would have to be addressed in conducting further studies of thimerosal and autism.

These weaknesses included: uncertainties in case ascertainment, heterogeneity of business practices within and across HMOs in the database and their systematic changes over time, misclassification of exposure status, and the inability to control for temporal changes in awareness, diagnostic practices and potential confounding factors.

Many of these weaknesses should be taken into account when moving into the future, but they also apply to CDC studies that have been done in the past, including the methodology that was employed in the CDC's flagship thimerosal safety study of 2003.

To begin with, the NIEHS panel had "identified several areas of weaknesses that when taken together reduce the usefulness of the project for conducting an ecologic study design to address the potential association between exposure to thimerosal and the risk of autism."

Ecological studies are large, epidemiological analyses of risks and trends using data from large populations without making efforts to link outcomes to actual individual patients. The 2003 CDC study was not, strictly speaking, an "ecological study," but rather a "retrospective cohort study."

CDC researchers did go back and review some of the charts of the children diagnosed with the outcomes under study -- though this accounted for less than 1% of all children enrolled in the study.

Dr. Gerberding said in her letter that thimerosal studies done by the CDC have not been "ecological," because they utilize "medical chart reviews, neurological assessments, and parent interviews." But in the 2003 study, chart reviews were not done on 99% of the study population, and no assessments or interviews were conducted to make sure that the diagnoses were accurate.

The NIEHS report was largely focused on the feasibility of conducting an ecological study of the database, but many of the weaknesses identified are also applicable to the 2003 CDC study of the Vaccine Safety Datalink (VSD) which contains the records of hundreds of thousands of HMO patients.

In that investigation, CDC officials conducted at least five separate analyses of the data over a four-year period from 1999 to 2003. The first analysis showed that children exposed to the most thimerosal by one month of age had extremely high relative risks for a number of outcomes, compared with children who got little or no mercury: the relative risk was 8.29 for ADHD, 7.62 for autism, 6.38 for ADD, 5.65 for tics, and 2.09 for speech and language delays.

Over time, however, all of these risks declined into statistical insignificance, statistical inconsistency or outright oblivion: The relative risk for autism plummeted from 7.62 in the first analysis, to 2.48 in the second version, to 1.69 in the third round, to 1.52 in the fourth, and down to nothing at all in the fifth, final, and published analysis, printed in the journal Pediatrics in November 2003.

Vaccine officials attributed the steady drop to the elimination of "statistical noise" from the data through due diligence and the endeavor for excellence in governmental statistical analysis.

Indeed, the VSD study was the main pillar of a hugely influential 2004 report by the Institute of Medicine, which also concluded that there was no evidence of a link between mercury, vaccines and autism.

To this day, public health officials routinely point to five "large epidemiological studies" representing the "highest quality science," none of which found any link to thimerosal.

In fact, the American VSD study has long been held up as the best and brightest of them all (the others -- one in Sweden, one in the UK, and two in Denmark -- WERE ecological studies, and presumably subject to some of the same weaknesses identified by the NIEHS). This reputation has stuck in the minds of medicine and the media.

Curiously though, even the study's lead author -- Dr. Thomas Verstraeten, an employee of vaccine maker GlaxoSmithKline -- protested that the VSD study "found no evidence against an association, as a negative study would." In fact, he said that additional study was needed, which "is the conclusion to which a neutral study must come."

That's when Congress stepped in.

In 2005, a group of Senators and Representatives headed by Sen. Joe Lieberman wrote to the NIEHS (an agency of the National Institutes of Health), saying that many parents no longer trusted the CDC to conduct independent-minded studies of its own vaccine program. Lieberman and his colleagues asked the NIEHS to review the CDC's work on the vaccine database and report back with critiques and suggestions for future investigations.

Among the official tasks given to the NIEHS panel was to "Identify the strengths and weaknesses of the VSD for evaluating the possible association between exposures to thimerosal-containing vaccines and AD/ASD" -- a charge that had nothing to do with "ecological studies."

In her letter to the House Appropriations Committee, the CDC Director responded directly to many -- though not all -- of the most important critiques and recommendations contained in the NIEHS panel report. The weaknesses she noted apply to Verstraeten as much as they do to any future studies, ecological or otherwise.

For example, the NIEHS report had said the VSD data failed to account for other mercury exposures, including maternal sources from flu shots and immune globulin, as well as mercury in food and the environment.

"CDC acknowledges this concern and recognizes this limitation," the Gerberding reply says.

The NIEHS also questioned why CDC investigators eliminated 25% of the study population for a variety of reasons, even though this represented "a susceptible population whose removal from the analysis might unintentionally reduce the ability to detect an effect of thimerosal." Such strict entry criteria would likely lead to an "under-ascertainment" of autism cases, the NIEHS reported. Again, this would have been an issue in the Verstraeten data.

"CDC concurs," Gerberding wrote, again noting that VSD data are "not appropriate for studying this vaccine safety topic. The data are intended for administrative purposes and may not be predictive of the outcomes studied."

Another serious problem is that the HMOs have changed the way they track and record autism diagnoses over time. Gerberding said this would be a problem going forward, but did not mention that the same principle might apply to past studies.

I hope everyone will read these documents, including the recommendations to make the VSD better, and the CDC's agreement with all of the suggestions. Hopefully, this data can still be used in some effective way.

As questionable as the US thimerosal methodology was, "it was an improvement on other studies, including the two ecological studies in Denmark, both of which had serious weaknesses in their designs," Dr. Irva Hertz-Picciotto, Professor of Public Health at UC Davis Medical School and Chair of the NIEHS panel, told reporter Dan Olmsted at UPI.

That leaves little for the CDC to go on in terms of proving that thimerosal and autism are not associated in any way.

Yes, there is always the disability services data from California -- where caseloads seem to be rising among the youngest cohorts of kids, who presumably received little or no mercury because thimerosal was largely removed from childhood shots.

But the California data amount to an "ecological study" as well, with problems of their own.

"Although (this) information is often used by media and research entities to develop statistics and draw conclusions, some of these findings may misrepresent the quarterly figures," cautions the website of the California Department of Developmental Services (DDS). "Increases in the number of persons reported from one quarter to the next do not necessarily represent persons who are new to the DDS system."

Even the CDC admits that "there are several limitations" with linking a VSD study design to the California data, Gerberding wrote to Congress, because, among other things, California only counts "persons who were referred to and/or voluntarily entered" the disability system.

It will be interesting to see how the House Committee -- and the mainstream media -- react to this report by the CDC, which does seem to want to conduct the best vaccine-autism science possible (see Gerberding's replies to NIEHS recommendations for improving the VSD: CDC officials are currently conducting in-depth follow-up studies with VSD patients).


This revised piece does raise two new questions, I think:

1) If the VSD is not necessarily appropriate to help determine the effect of reducing mercury levels in vaccines, are taxpayers getting their money's worth?

2) If studies done in Denmark, Sweden and California were also "ecological" in nature, are they subject to some of the same weaknesses and limitations?