My daughter just turned twenty-one a few months ago. Watching her plow through a plate of Queso Espanoles (with Sangria) at one of her favorite spots near school, you’d never believe that as a toddler she couldn’t get near anything even resembling dairy. At about two years old, a single piece of buttered popcorn found between the sofa cushions just about closed up her throat.
There have always been people with allergies. But it’s clear to even the most casual observer that things are insane today compared to the past. Sorting out the nut, dairy, gluten, strawberry, etc. allergies is just an expected part of kid gatherings these days. So WTF is going on?
When our kids were little, the accepted recommendation was to strictly avoid common allergens until at least age six. We thought that exposure to this stuff too early was the, or at least one, cause of allergies. And boy did we listen — after all, I still can’t believe the world trusted me to raise an actual human — I wasn’t going to screw it up.
Whoops. Or at least, maybe whoops.
The scintillatingly-titled Randomized Trial of Peanut Consumption in Infants at Risk for Peanut Allergy, published last Monday in the New England Journal of Medicine, seems pretty clearly to show that we were exactly, 100% wrong with this approach.
Let’s make things really simple. Basically, one group of kids got peanuts in their diet, and the other did not. The peanut kids developed allergies 3.2% of the time. The non-peanut kids? 17.2%.
Holy crap, are you kidding me? By avoiding peanuts, the risk of allergy went up by 14 percentage points, more than five times the rate of the peanut-eating group. THAT IS A SCARY-HUGE DIFFERENCE.
Of course, there are lots of ways to pick at studies, and you can do that here too. But even if you seriously handicap things — say, assume that every kid who dropped out of the study had the “wrong” reaction (all peanut dropouts got allergies, all non-peanut dropouts did not) — the numbers are still impressive at 4.8% to 16.8%. You may get some bias out of the lack of diversity in the kids, or because the families knew which cohort they were in. But the difference is so significant, it’s hard to imagine any of those would flip the results completely.
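If you want to sanity-check the arithmetic yourself, here’s a quick sketch in Python. It just takes the allergy rates quoted above (3.2% vs. 17.2% for the headline result, 4.8% vs. 16.8% for the worst-case dropout assumption) and turns them into an absolute gap and a relative risk; the function name and structure are mine, not anything from the study.

```python
def risk_stats(exposed_rate, avoidance_rate):
    """Return (gap in percentage points, relative risk) for two allergy rates."""
    gap = avoidance_rate - exposed_rate
    relative = avoidance_rate / exposed_rate
    return gap, relative

# Headline result: peanut group 3.2%, avoidance group 17.2%
gap, rr = risk_stats(3.2, 17.2)
print(f"Headline: {gap:.1f}-point gap, avoidance ~{rr:.1f}x the risk")

# Worst-case handicap for dropouts: 4.8% vs 16.8%
gap_wc, rr_wc = risk_stats(4.8, 16.8)
print(f"Worst case: {gap_wc:.1f}-point gap, avoidance ~{rr_wc:.1f}x the risk")
```

Even under the pessimistic dropout assumption, the avoidance group still carries roughly three and a half times the risk.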
In retrospect, this makes sense. By exposing the immune system to these substances at low doses, you give it a chance to learn. Cells that freak out in the presence of these pseudo-antigens get suppressed by normal selection processes, so they aren’t given a chance to expand in the first place.
The longer I’m around this stuff, the more I believe that adaptive immunity really is the key to almost everything. Hopefully these results will be duplicated and we can reverse what is otherwise a pretty unsettling trend.
As for our daughter … sorry Alex, my bad.
PS. All this reminds me of that awesome Canadian professor who postulated that eating boogers could help boost the immune system. Yum!