As I often do, I recently spent a few moments on a Friday morning reviewing the CDC's "Current Outbreak List" page. This resource is a single-page clearinghouse for recent United States and international outbreaks reported by the Centers for Disease Control and Prevention. On the day I reviewed the page, it listed recent Salmonella outbreaks linked to breakfast cereal, pre-cut melon, live poultry and backyard flocks, and pet guinea pigs. Other Salmonella-related outbreaks in 2018 included dried coconut, chicken salad, kratom, raw sprouts, cucumbers, and papayas.
The predominance of Salmonella among foodborne pathogens is not surprising: non-typhoidal Salmonellae are among the leading causes of foodborne disease in the U.S. and a major cause of diarrheal illness worldwide. The high prevalence of non-typhoidal salmonellosis in the food supply has been associated primarily with agricultural reservoirs, most commonly poultry and chicken eggs. In recent years, outbreaks associated with raw produce have increased in prevalence, perhaps related to contamination of the food chain by runoff from animal agricultural lots, or to secondary cross-contamination of produce during harvesting, processing, or food preparation. Yet this is not a new phenomenon: foodborne outbreaks have been an issue in the U.S. food industry for decades.
In the course of researching this topic, I came across a media report entitled "Expert: Technology could help make food safer." The article was an interview with Dr. Dennis Maki, a well-known professor of epidemiology at the University of Wisconsin School of Medicine and Public Health. Dr. Maki noted that the era of feeding a population of 300 million-plus through small farms and simple agricultural techniques has long since passed.
Dr. Maki said,
"It's important to realize there are 300 million people in the U.S. The only way you are going to feed 300 million people is by industrial techniques. The common complaint, or the plea, is, 'Let's return to 1945, let's have small farms, let's not have industrial farming and industrialized production of food on a huge scale. Then we wouldn't have the problems that we have now.'" But such an approach, he asserted, would not allow our agricultural industry to feed a 21st-century population.
Dr. Maki emphasized several approaches consumers can take to reduce their likelihood of acquiring foodborne illness: (1) Buy food from reputable sellers; most large grocery chains are highly reputable, or they would not be in business. (2) Wash fruits and vegetables thoroughly, as this will greatly reduce, though not completely eliminate, contamination. (3) Cook meats properly and to the recommended temperature.
The article featuring Dr. Maki, though very relevant today, was not from 2018 but from 2007. And over the last decade, few of the manufacturing components of the agricultural industry appear to have changed. Meanwhile, the grocery marketplace continues to emphasize foods that are more "natural," "organic," or preservative-free, and upscale grocery chains promote products marketed as less associated with the food-industrial complex (grass-fed beef, cage-free eggs, organic sprouts, etc.). At the same time, the public's insatiable demand for all-season produce requires a worldwide food distribution system (grapes from Chile, avocados from Mexico, pineapples from Thailand, oranges from Brazil, figs from Turkey, etc.).
So the original question in the 2007 article still applies today: "Can technology reduce infection in our food?" One of the most fascinating and progressive areas where technology is pushing the envelope is the meat industry. American consumers eat 26 billion pounds of beef a year. A single cow is said to consume up to 11,000 gallons of water annually. The fast-food chain McDonald's sells 75 hamburgers every second. As a result, the cultivation of meat products requires a huge worldwide industry that is estimated to be responsible for 15-20% of all greenhouse gas emissions. Furthermore, raising millions of poultry, pork, and beef livestock animals places considerable demands on agricultural areas to contain and manage the tremendous quantities of animal waste and other industrial byproducts of meat processing, much of which risks contaminating the environment and the food chain at various points.
As a result, several Silicon Valley technology start-ups have for several years been working on lab-based techniques for the production of meat. The concept is deceptively elementary: what if producing meat could be made as simple as growing yeast in a vat? What if we could turn the philosophy of "kill it and grill it" into "fill it and then distill it"? If we could culture meat in a lab and then grow it using industrial techniques similar to beer production, we would eliminate a large proportion of the land use, water consumption, greenhouse gas emissions, and animal byproducts whose runoff spreads enteric pathogens such as Salmonella and contaminates both the environment and our food.
Just, a Silicon Valley company, is perhaps the leader in attempts to bring the first "clean meat" products to market. Its approach involves developing a large-scale cell culture process for cultivating sustainable meat cells into high-grade meat and seafood, with a production process that promises to be over 10 times more efficient than the world's largest slaughterhouses. The company claims its process uses unmodified cells that occur naturally in livestock animals and can be grown antibiotic-free. Because the process does not involve slaughtering live animals, Just claims its manufacturing chain will carry a significantly lower risk of foodborne illness (and be far less harmful to the animals themselves). Other lab-based meat companies looking to develop similar technologies include Memphis Meats and Future Meat Technologies.
Such food technology elicits a wide variety of reactions from the public, ranging from those who say, "I would never eat that," to individuals who claim that "cultured meat is not natural," to the more adventurous who boast, "I'll eat it if it tastes good and is safe." But the history of food in the U.S. suggests that similar battles have been waged before, from the adoption of margarine as a butter substitute in the late 19th century to the rise of plant-based milk products (e.g., soy milk, almond milk) over the last 15 years, with the new products ultimately finding acceptance when they are perceived as economical, healthier, and no less palatable than their traditional counterparts. And with the production of meat and seafood estimated to double to 1.2 trillion pounds by 2050, given a finite land supply and limited environmental resources, current industrialized processes for meat and food production are unsustainable, and new techniques are clearly needed.
The development of cultured meat products has sparked an industry debate, led mainly by traditional agricultural firms and trade associations, over what may be labeled "meat" or "beef" at all. The FDA will hold public hearings on this topic this summer.
As consumers, all of us can play a role in ensuring the safety of the U.S. food system in the products we consume, the choices we make, the food preparation techniques we use, and in the ways we maintain awareness of what is going on with our food supply.
Please note that the opinions expressed in this blog post are my own and do not necessarily reflect the views of IDPodcasts, the Division of Infectious Diseases, USF Health, or the University of South Florida.