The United States and most of the rest of the developed world enjoy very high vaccination rates for routine childhood illnesses, typically over 90 percent. But in the last decade or two, those rates have stagnated and even declined in some places. Measles vaccination coverage in the United Kingdom, for example, fell from a high of 92 percent in the mid-1990s to just 81 percent by the mid-2000s (it has since recovered). Poor countries often can’t achieve complete vaccination because of cost, but incomplete vaccination in the developed world is more often a matter of choice.
There are many reasons people choose not to vaccinate their children. Some parents view vaccinations as risky; some even believe they cause autism, a theory that has been thoroughly debunked. Some object for religious reasons. In other cases, medical conditions make vaccination impossible or dangerous.
And there are still others who aren’t necessarily convinced that vaccines are dangerous, but simply view the benefits as minimal. Why subject your child to a painful shot when you’ve never actually seen anyone get the measles? This logic relies on the idea that “herd immunity” will protect even unvaccinated children: if enough people are vaccinated against measles, the virus can’t sustain transmission, so even the unvaccinated are protected. Since vaccination rates are extremely high in the U.S., herd immunity surely operates, right?
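To put a rough number on that intuition: in the standard epidemiological model, sustained transmission stops once the immune share of the population exceeds 1 − 1/R0, where R0 is how many people a single infected person would infect in a fully susceptible population. The sketch below is just a back-of-the-envelope calculation using commonly cited R0 values for measles (roughly 12 to 18); it isn’t drawn from my own analysis.

```python
# Back-of-the-envelope herd immunity threshold: 1 - 1/R0.
# The R0 values below are illustrative figures commonly cited for measles.
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to stop sustained spread."""
    return 1.0 - 1.0 / r0

for r0 in (12, 15, 18):
    print(f"R0 = {r0:>2}: threshold ≈ {herd_immunity_threshold(r0):.0%}")
# R0 = 12: threshold ≈ 92%
# R0 = 15: threshold ≈ 93%
# R0 = 18: threshold ≈ 94%
```

At measles-level contagiousness, even the UK’s 92 percent peak sits right at the edge of that threshold, which is part of why the later slide to 81 percent was worrying.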
Well, it turns out, yes and no. High vaccination rates do provide significant protection. But when I analyzed the data, I was surprised to find that even within the U.S., states and counties with lower vaccination rates do experience higher rates of illness.
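For readers who want to poke at the question themselves, here is a minimal sketch of the kind of comparison I’m describing: lining up state-level coverage against reported incidence. The file name and column names are placeholders, not my actual data sources or pipeline.

```python
# Hypothetical sketch: relate state-level vaccination coverage to measles incidence.
# "state_vax_and_cases.csv" and its column names are placeholders, not a real dataset.
import pandas as pd

df = pd.read_csv("state_vax_and_cases.csv")  # columns: state, mmr_coverage_pct, cases_per_100k

# Simple correlation between coverage and incidence (the claim above predicts a negative value).
print(df["mmr_coverage_pct"].corr(df["cases_per_100k"]))

# Average incidence in low-, mid-, and high-coverage states.
df["coverage_group"] = pd.cut(df["mmr_coverage_pct"],
                              bins=[0, 90, 95, 100],
                              labels=["<90%", "90-95%", ">95%"])
print(df.groupby("coverage_group", observed=True)["cases_per_100k"].mean())
```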