Last week, I observed a friend frantically chasing her toddler grandson who had slipped out of her grip, run to a buffet table and grabbed a cookie. “Does it have nuts in it?” she yelled in abject fear to no one in particular.
Today, more than 1% of American children, like this little guy, and approximately 0.5% of adults in the United States are allergic to peanuts. That is an estimated one million kids and three million individuals in all, who could die simply from breathing the air in a room where someone ate a peanut butter sandwich.
When and how did this epidemic develop, and why is it continuing? Most important, what is its cause, and what can we do to stop it?
The frightening answers to these questions are in The Peanut Allergy Epidemic: What's Causing It and How to Stop It by Heather Fraser, a Canadian mom whose child had an anaphylactic reaction to peanut butter at 13 months of age. I could hardly put it down! You and everyone you know, especially your pediatrician, should read it.
The “perfect storm” that spawned the peanut allergy epidemic around 1990 (not surprisingly paralleling the autism epidemic) affects, like autism, more boys than girls. The “victims” are the same: picky-eater kids with a lessened ability to detoxify, consuming less nutritious food and receiving an ever-increasing number of vaccinations, growing up in an increasingly toxic environment. My friend Dr. Ken Bock wrote about them in his book Healing the New Childhood Epidemics: Autism, ADHD, Asthma, and Allergies: The Groundbreaking Program for the 4-A Disorders.
Bock knows from his busy practice that many children with autism have severe allergies, including life-threatening reactions to peanuts. Likewise, many children with peanut allergies are diagnosed with autism spectrum disorders, including attention deficits, pervasive developmental disorders, Asperger syndrome and full-blown autism. The commonality, he and others agree: an overburdened immune system. How did that happen? Let’s start by understanding allergy and the ONLY means by which mass allergy has ever been created: by injection.
What is Allergy?
Twentieth-century American researchers Rachel Carson and Theron Randolph, and a contemporary, MacArthur “genius award”-winning biologist Margie Profet, believed that allergy is an evolved, and often risky, protective response: the body's natural defense against toxins linked to benign substances. An “allergic” reaction occurs when the body is exposed to the proteins of unfamiliar foods, triggering immunoglobulin E (IgE) antibodies, the soldiers whose job it is to protect the body's mucous membranes from invaders. When they detect trouble, they deploy a biochemical cascade characterized by coughing, shortness of breath, itchy hives, and leaking of blood vessels causing swelling and potential asphyxia, vomiting and diarrhea. Scratching, vomiting, diarrhea and sneezing are a body's desperate attempts to eject a toxin as fast as it can. In severe reactions, blood pressure drops, draining vital organs and causing the heart to stop.
The term “allergy” was coined in 1906, just over a hundred years ago, by an Austrian pediatrician trying to reconcile an unexpected reaction to vaccination in some of his patients. The modern concept of allergy grew out of the occurrence of “serum sickness,” a man-made malady. Keep reading.
The Hypodermic Needle
Documented life-threatening mass allergic reactions were rare prior to the late nineteenth century. They first emerged as an “unintended consequence” of a new invention precipitated by the unprecedented need for pharmaceuticals near the end of the Civil War: the hypodermic syringe. Louis Pasteur was the first to use a hypodermic needle to inject a vaccine: anthrax for livestock, and later rabies for a boy bitten by a dog. Hypodermic needles were quickly adopted as a hygienic improvement over the messy, often dirty, transdermal lancets previously used to puncture or scratch the skin to insert pathogens.
As demand increased, costs became more reasonable, and production soared. Upjohn and Parke-Davis (both now owned by Pfizer) and Eli Lilly (the developer of thimerosal) were born out of the demand for hypodermically delivered vaccines. Their 1890s marketing methods closely resembled today's, minus television and computers. Sales reps visited physicians' offices, leaving promotional literature and samples in lively packaging. And don't forget the annual medical almanacs! By the turn of the 20th century, vaccine manufacturing was big business.
The Need for Preservatives
With an increased demand for vaccines for dreaded smallpox, tuberculosis, diphtheria and cholera, and the realization that a single vaccination did not confer lifelong immunity, the need arose for vaccines that could travel safely and be administered efficiently. Pus and scabs from sick animals decomposed quickly; sick animals were difficult to transport. The answer: preservatives suspended in an antibacterial carrier gel made of vegetable glycerin that extended shelf life and could be delivered by injection.
Early twentieth-century ingredients included mercury-based antifungals and various oils. Exact ingredients were fiercely guarded proprietary formulas, legally protecting the scientists, their companies and their shareholders.
A common outcome of the first mass-produced, preserved, hypodermically delivered sera for scarlet fever, tetanus and diphtheria was a poorly understood and potentially fatal condition. It was first called “serum sickness,” and later termed “anaphylaxis” by French Nobel laureate and immunologist Charles Richet, from the Greek ana (against) and phylaxis (protection): the opposite outcome from what was expected of vaccination. Symptoms included fevers, rashes, diarrhea, decreased blood pressure, lymph node swelling, joint pain, an enlarged spleen, kidney failure, breathing difficulties, and shock, lasting for days, weeks or a lifetime, and occasionally proving fatal.
What was causing so many people to get sick instead of staying well? Richet experimented with dogs to find the answer. He injected his subjects with raw meat proteins, and then fed them raw meat. The result was anaphylaxis! Two other researchers did the same, except with egg and milk, showing that without exception, all proteins, whether toxic or non-toxic outside the body, could produce anaphylaxis by injection. Richet discovered that this phenomenon is universal among animals.
Austrian pediatrician Clemens von Pirquet and his Hungarian colleague, Béla Schick, studied serum sickness in thousands of children, noting a paradoxical relationship between the two outcomes of vaccination: attaining immunity and acquiring serum sickness. In both outcomes, an incubation period occurs between the initial inoculation and the appearance of symptoms. Subsequent injections (just like secondary exposure to infections) are accompanied by an accelerated and exaggerated response resulting from “a collision of antigen and antibody.” This conjecture was confirmed by the fact that in 90% of von Pirquet's patients, immediate adverse reactions occurred following the “booster” injection 10-30 days after the first.
In 1934, up to 50% of children experienced post-vaccinal serum sickness. Families were forced to weigh their fears of fatal diseases such as smallpox against the risk of being injured or killed by a vaccine, and to choose the lesser of two evils. The only difference today is that, thanks to antibiotics, few of these dreaded diseases still kill many people in developed countries.
As Richet continued to experiment with cats, rabbits, horses and frogs, he deduced that “digestive juices” were required to break down the protein; if this did not happen, the body would mount an immune response. Experimental alimentary anaphylaxis is almost impossible to demonstrate in the presence of healthy digestion. A first injection of undigested protein into the bloodstream sensitizes and weakens an animal, making it susceptible to a second, smaller dose, which could then cause a serious, even fatal, reaction. Conclusion: healthy digestive juices actively transform potentially toxic proteins, rendering them innocuous; restated, inadequate digestion is a common-sense prerequisite for food allergy.
The “ingestion” theory of anaphylaxis has persisted to explain the vast majority of food reactions. Some of these reactions, however, are not life-threatening, but more subtle and hard to pinpoint, such as migraines, skin conditions, fatigue, anxiety, irritability and behavioral problems. Egg was a case in point: why did a young boy suffer from “egg poisoning” in 1908 when nobody had ever injected egg into him? Hmm… Unfortunately, his doctor did not know that for many years prior, emulsified egg lecithin had been used extensively in vaccines, and vaccine manufacturers had introduced fertile hen's egg as a medium for growing viruses. What was the link? The answer came in the 1940s, with the widespread use of penicillin.
When we first examined the peanut allergy epidemic, we recognized the attributes of the perfect storm for the “victims.” With the discovery of penicillin in 1928 by Scottish bacteriologist Alexander Fleming, all the pieces of the “perfect storm” for the “weapon of mass destruction” were in place: a pathogen suspended in an injected or encapsulated undigested protein from oil.
Both oral and injected forms of penicillin contained a new ingredient, cottonseed oil, a product whose proteins are considered potent allergens. A gelatin capsule sealed the drug, which was not released until it reached the small intestine, bypassing the modifying effects of digestive enzymes. I'm sure by now you can guess what happened!
From the 1930s through 1950, sensitivity to cottonseed oil grew, as did penicillin allergies. Scientists sought a cheap, plentiful replacement. You guessed right again. After World War II, the all-American peanut replaced cottonseed as the oil of choice in the manufacturing of penicillin and of almost all vaccines! It was plentiful, inexpensive, stable in heat, and, during the war, patriotic.
By 1953, Pfizer and others were producing six hundred tons of penicillin, laden with peanut oil mixed with beeswax (POB, for penicillin in oil and beeswax) to coat the penicillin particles in a concoction known as the Romansky formula. As the body metabolized the wax and oil, the drug was released into the system. By the mid-1950s, an estimated 2.5% of all children had developed an allergy to injected penicillin. Scientists reduced the amount of beeswax and oil in an attempt to eliminate undesirable reactions such as fatal anaphylaxis, antibiotic resistance, fungal overgrowth and dysbiosis.
Then came a new formula mixing penicillin with aluminum monostearate (PAM), also suspended in peanut oil. PAM was the delivery method of choice from the mid-fifties through the 1980s. More frequent and more severe allergic reactivity, including anaphylaxis, emerged during what was dubbed “the PAM era.” Penicillin had created an unparalleled outbreak of allergies and anaphylaxis.
During the late 1940s and throughout the 1950s, the peanut oil in penicillin was not suspect. It was used not only in this wonder drug, but in streptomycin, broad-spectrum antibiotics, injected epinephrine for asthma, anesthetics and vaccines. Unknown to consumers, peanut oil was also a popular ingredient in vitamins, skin creams and even infant formulas!
Prior to 1941, the literature shows no report of peanut allergies in adults or children. A survey found self-reported peanut allergies in 0.3% of those born 1944-47, 0.4% of those born 1948-57, and 0.6% of those born 1959-67. In 2008, over 1% of people born 1944-67 reported allergies to nuts, including peanuts.
Articles published in the late 1950s and early 1960s show a growing awareness of peanut allergy, but the first formal study of peanut allergy in children was not launched until 1973, and then on only 114 kids. Doctors watched the mysterious rise in peanut allergies, but few asked “why?” By the early 1990s, tens of thousands of peanut-allergic kindergartners were entering school, not only in the U.S. but in Canada, the United Kingdom and Australia. This allergy acceleration was concurrent with an unprecedented push of political, social, legal and economic reforms to alter and accelerate the vaccination schedule in these countries.
The Vaccine Connection
In 1964, pharmaceutical giant Merck announced a new vaccine ingredient promising to extend immunity: Adjuvant 65-4, containing up to 65% peanut oil as well as aluminum stearate. An adjuvant (from the Latin adjuvare, to enhance) is a vaccine additive that stimulates the immune system, upping the body's production of antibodies to a pathogen. Adjuvants reduce production costs, since the vaccine maker needs less of the expensive antigen; they also increase a vaccine's efficacy. They can also be dangerous: the more effective a vaccine, the greater the risk of allergies and other adverse effects.
The inventor of Adjuvant 65-4, Maurice Hilleman, and his colleagues at Merck knew that allergic sensitization to the peanut oil in the adjuvant was a distinct possibility, but they considered toxicity and allergenicity inevitable outcomes of vaccination. It was simply difficult to balance potency and safety.
The public clearly did not know what was being injected into their children; immunologist Charles Janeway called adjuvants “the immunologist's dirty little secret.” The peanut allergy epidemic in children was precipitated by vaccines. Lawsuits ensued, especially related to the DPT vaccine. By 1985, over 200 lawsuits were pending against four vaccine manufacturers. This litigious environment caused many pharmaceutical companies to abandon the lucrative vaccine market, causing a vaccine shortage. A solution: combination or conjugate vaccines.
Vaccines were combined for convenience. With speed and efficiency, the U.S. pediatric vaccination schedule took off, helped by President Clinton's Childhood Immunization Initiative in the mid-nineties. By 1998, childhood vaccination rates were at an all-time high. So was the incidence of peanut allergy in children. Between 1997 and 2002, the peanut-allergic pediatric population in the U.S. grew by an average of 58,000 children a year, and it doubled between 2002 and 2008. By 2008, more than one million children under 18 and another two million adults were allergic to peanuts in the United States alone.
According to Heather Fraser, “vaccination was the elephant in the middle of the room. Researchers glanced at it, knew it was there, but were reluctant to get too close.” The possibility that hundreds of thousands of children have been sensitized to peanuts by ingredients in one or more routine pediatric vaccinations is just too much to conceive. But it is too obvious to deny. The real clue is the sudden rise in peanut allergy following the escalation of the pediatric vaccine schedule.
Cross Reactivity and Vitamin K1
Most peanut-allergic patients have IgE antibodies against other legume proteins, including soybeans, and against other oil-seed proteins, such as castor. At the same time that vaccination schedules were accelerating in the mid-1980s, doctors in the U.S. and many other Western countries added a prophylactic injection for newborns. The purpose of this shot was to prevent hemorrhagic disease of the newborn (HDN), or vitamin K-deficiency bleeding (VKDB). The two available brands contained castor seed oil, as well as aluminum, a well-known IgE-stimulating adjuvant, 4% of which remains in the body indefinitely.
These ingredients remain in the body for an extended period of time, and they are still being released as a baby receives its first Hib, DTaP, and Hep B shots at one or two months of age. IgE to castor could cross-sensitize a child to peanuts.
Why don’t ALL children react to peanuts? Ken Bock and other doctors treating children with autism spectrum disorders believe allergenicity is inversely related to an individual’s ability to detoxify. Children with peanut and other allergies have compromised immune systems and are poor detoxifiers. Most have gut problems, including fungal and other infections. Most are male.
Prevention and Rationalization
Screening children before each vaccination could help, but it is antithetical to the goals of mass vaccination. Obviously, the “one size fits all” approach to vaccination is simply not right. We accommodate different shoe sizes and different ages for walking, teething, speaking and reading. We need to look individually at appropriate vaccine schedules.
But why should the burden be on the consumer and a family's health-care providers? Clearly, vaccine manufacturers must take some responsibility. Right now they are essentially exempt from financial liability for ANY damage. Why? Because vaccines are BIG business tied to the military and to school admission.
Furthermore, from an economic standpoint “food allergy” is BIG business. Think of all the enterprising companies producing peanut-, gluten-, casein-, soy-, and egg-free foods. Do we want to put them out of business? Hardly.
The biggest problem, though, is that it is virtually impossible to prove a causal link between vaccination and a later life-threatening allergy, even though the medical literature demonstrates that the ONLY means by which immediate and mass allergy has ever been created is by injection. Once the hypodermic needle was combined with vaccines at the end of the 19th century, mass anaphylaxis exploded into the Western world.
We MUST have a formal study of vaccinated vs. unvaccinated populations. For starters, peanut allergy is virtually unknown in Amish communities, which discourage vaccination. Now that parents of children with autism are choosing not to vaccinate subsequent children, perhaps a target group is emerging. The National Vaccine Information Center (NVIC) has promised to pursue this research. Let's hope it comes soon!
So for today, parents of peanut-allergic children are coping. Some have discovered ways to lessen their kids' reactivity with energy medicine, acupuncture, NAET, and other alternative medicine techniques. But coping with an outcome that was forced upon them is unfair and insufficient. These parents must join forces, as the autism community has, and say “Enough!” Only then can we stop this runaway train.