Reprinted from FutureFood 2050, an initiative of the Institute of Food Technologists
A birthday party where every child can enjoy the cake and ice cream. Restaurant meals that don’t end in an emergency room visit. Halloween trick-or-treating without an adrenaline shot. This vision of the future is what motivates Steve L. Taylor, a food scientist at the University of Nebraska-Lincoln and a leader in food allergy research.
In fact, Taylor says, the food supply in the United States and many other developed countries has never been safer for people with serious food allergies, thanks to clearer labeling, increased vigilance and better risk assessment. Yet while Taylor says he’s cautiously optimistic, he believes the food industry and public health agencies can do more for the millions of people for whom food is a ticking time bomb.
Evolution of allergy awareness
Food allergies aren’t a new phenomenon, although it can feel that way as they become ever more common. According to the U.S. Centers for Disease Control and Prevention (CDC), the prevalence of food allergies in children rose steeply from 3.4 percent in 1997–1999 to 5.1 percent in 2009–2011. Although scientists are still trying to determine the cause of that rise, the increased prevalence has forced the public and the food industry to take notice. That awareness helped lead to a law, which took effect in 2006, requiring manufacturers to state in plain language whether a product contains any of the eight most common food allergens—a development that has had a positive impact on allergic consumers, Taylor says.
[Photo: Steve Taylor]
At that time, he says, most allergists weren’t focused on treating food allergies. “The practical advice was that if peanuts make you sick, don’t eat peanuts,” he says.