Can Technology Reduce Infection in the Food Industry?

July 13th, 2018

As I often do, I recently spent a few moments on a Friday morning reviewing the CDC’s “Current Outbreak List” page. This resource is a single-page clearinghouse for recent United States and international outbreaks reported by the Centers for Disease Control and Prevention. On the day I reviewed the page, recent Salmonella outbreaks linked to breakfast cereal, pre-cut melon, live poultry in backyard flocks, and pet guinea pigs were listed. Other Salmonella-related outbreaks in 2018 involved dried coconut, chicken salad, Kratom, raw sprouts, cucumbers, and papayas.

The predominance of Salmonella among foodborne pathogens is not surprising: non-typhoidal Salmonella is one of the leading causes of foodborne disease in the US and a major cause of diarrheal illness worldwide. The high prevalence of non-typhoidal salmonellosis in the food supply has been primarily associated with agricultural reservoirs, most commonly poultry and chicken eggs. In recent years, outbreaks associated with raw produce have increased in prevalence, perhaps related to contamination of the food chain by runoff from animal agricultural lots, or to secondary cross-contamination of produce during harvesting, processing, or food preparation. Yet this is not a new phenomenon; foodborne outbreaks have been an issue in the U.S. food industry for decades.

In the course of researching this topic, I came across a media report entitled “Expert: Technology could help make food safer.” The article was an interview with Dr. Dennis Maki, a well-known professor of epidemiology at the University of Wisconsin School of Medicine and Public Health. Dr. Maki noted that the era of feeding 300 million-plus people through small farms and simple agricultural techniques has long since passed.

Dr. Maki said,

“It’s important to realize there are 300 million people in the US. The only way you are going to feed 300 million people is by industrial techniques. The common complaint or the plea is, ‘Let’s return to 1945, let’s have small farms, let’s not have industrial farming and industrialized production of food on a huge scale. Then we wouldn’t have the problems that we have now.’” But such an approach would not enable our agricultural industry to feed a 21st-century population, he asserted.

Dr. Maki emphasized several steps consumers can take to reduce their likelihood of acquiring a foodborne illness: (1) Buy food from reputable sellers; most large grocery chains are highly reputable, or they would not stay in business. (2) Wash fruits and vegetables thoroughly, as this will greatly reduce, though not completely eliminate, contamination. (3) Cook meats properly and to the recommended temperature.

The article featuring Dr. Maki, though very relevant today, was not from 2018 but from 2007. Over the last decade, few manufacturing components of the agricultural industry appear to have changed. However, the grocery marketplace continues to emphasize foods that are more “natural,” “organic,” or preservative-free. Similarly, upscale grocery chains promote food products marketed as less associated with the food-industrial complex (grass-fed beef, cage-free eggs, organic sprouts, etc.). At the same time, the public’s insatiable demand for all-season produce results in a food distribution system that must span the globe (grapes from Chile, avocados from Mexico, pineapples from Thailand, oranges from Brazil, figs from Turkey, etc.).

So the original question from the 2007 article still applies today: can technology reduce infection in our food? One of the most fascinating and progressive areas where technology is pushing the envelope is the meat industry. American consumers eat 26 billion pounds of beef a year. A single cow is said to consume up to 11,000 gallons of water annually. The fast-food chain McDonald’s sells 75 hamburgers every second. As a result, the cultivation of meat products requires a huge worldwide industry that is estimated to be responsible for 15-20% of all greenhouse gas emissions. Furthermore, raising millions of poultry, pork, and beef livestock places considerable demands on agricultural areas to contain and manage tremendous quantities of animal waste and other industrial byproducts of meat processing, much of which risks contaminating the environment and the food chain at various points.

As a result, several Silicon Valley technology start-ups have for several years been working on different lab-based techniques for producing meat. The concept is deceptively elementary: What if the production of meat could be made as simple as growing yeast in a vat? What if the philosophy of “kill it and grill it” became “fill it and distill it”? If we could culture meat in a lab and then grow it using industrial techniques similar to beer production, we would eliminate a large proportion of the land use, water consumption, greenhouse gas emissions, and animal byproducts that result in the runoff of enteric pathogens such as Salmonella and the contamination of the environment and of our food.

Just, a Silicon Valley company, is perhaps the leader in attempts to bring the first “clean meat” products to market. Its process involves a large-scale cell culture technique for cultivating sustainable meat cells into high-grade meat and seafood, with a production process that promises to be over 10 times more efficient than the world’s largest slaughterhouses. The company claims its process uses unmodified cells that occur naturally in livestock animals and can be grown antibiotic-free. Because the process does not involve slaughtering live animals, they claim their manufacturing chain will carry a significantly lower risk of foodborne illness (and be far less harmful to the animals themselves). Other companies looking to develop similar lab-based meat technologies include Memphis Meats and Future Meat Technologies.

Such food technology elicits a wide variety of reactions from the public, ranging from those who say, “I would never eat that,” to individuals who claim that “cultured meat is not natural,” to the more adventurous who boast, “I’ll eat it if it tastes good and is safe.” But the history of food in the U.S. suggests that similar battles have been waged before, from the adoption of margarine as a butter substitute in the late 19th century to the rise of plant-based milk products (e.g., soy milk, almond milk) over the last 15 years, with the new products ultimately finding acceptance if they are felt to be economical, healthier, and no less palatable than their traditional counterparts. With the production of meat and seafood estimated to double to 1.2 trillion pounds by 2050, given a finite land supply and limited environmental resources, current industrialized processes for meat and food production are unsustainable, and new techniques are clearly needed.

The development of cultured meat products has sparked an industry debate, driven mainly by traditional agricultural industry firms and trade associations, over what constitutes “meat” and “beef” at all. The FDA will hold public hearings on this topic this summer.

As consumers, all of us can play a role in ensuring the safety of the U.S. food system in the products we consume, the choices we make, the food preparation techniques we use, and in the ways we maintain awareness of what is going on with our food supply.

Please note that the opinions expressed in this blog post are my own and do not necessarily reflect the views of IDPodcasts, the Division of Infectious Diseases, USF Health, or the University of South Florida.

Ten Years of IDPodcasts

May 23rd, 2018

By Richard L. Oehler, MD

There’s an expression I’ve often seen online: “what a difference a decade makes.” It resonates across many epochs of our recent history. Historians can look at the transition our country made in the 1960s, for instance, and see the United States’ emergence from the consumer-driven postwar era of the late 1950s into the turbulent, transformative year of 1969 as an illustration of how much things can change over 10 years.

As I reflect upon IDPodcasts’ ten-year milestone, recently celebrated in 2017, I think about the many ways medicine, education, and society have changed since 2007, when I first sat down with our Chief of Infectious Diseases and IDPodcasts’ co-founder, Dr. John T. Sinnott, to look at ways that USF’s Division of Infectious Diseases could broaden its outreach and share its teaching excellence beyond the small community of medical students, residents, and clinical faculty that had been attending our conferences for many years. I credit Dr. Sinnott with the original idea of broadening our educational footprint; I recall him saying, “Why don’t you take these lectures, record them, and place them online on a website?”

A light went on in my head: a website for infectious diseases podcasts, “IDPodcasts,” I thought. “Wait a minute. If we recorded these lectures, then maybe I and others could listen to them on an iPod.” I immediately started thinking about how I could set up and design a website and get us online.

It’s easy to forget: in early 2007, most online content was limited to desktop computers or laptops. Portable content was available only through mobile MP3 players such as the iPod (thus the name, “POD-casts”). Although the iPhone was first announced in January 2007, it was not released to consumers until July of that year. The first iPhone had no installable apps, sluggish 2G (EDGE) connectivity, and a relatively tiny, low-resolution screen. Modern tablets did not yet exist.

Online streaming was in its infancy in 2007. YouTube had been acquired by Google just the year before and was still relatively unknown among online users.

YouTube in 2006. (Source: Wayback Machine)

Other educational online streaming sites were also in figurative diapers. Khan Academy was founded in October 2006 but did not offer regular online content until 2008. Most educational medical content was still distributed via VHS cassette, CD-ROM, or DVD.

Built with website creation tools available through Apple’s now-obsolete iWeb program, IDPodcasts.net first went live on June 27, 2007. The inaugural podcast was “A Tour of the Medical Wing of the London Museum of Science.” A great many podcasts across a variety of infectious diseases subcategories soon followed, and the website celebrated its 50th podcast, “A Global Swarming: Infectious Diseases and Climate Change,” in 2008.

After IDPodcasts established itself as the very first infectious diseases online podcast site, we started looking at other ways we could share content online. Given the popularity of iPods during that time, we inaugurated IDPodcasts’ iTunes Podcast Channel in 2008.

While our website was attempting to establish a foothold online, another revolution was rapidly advancing in the mobile universe: smartphones. The first mobile online applications were being introduced with the next-generation iPhone, and our quest to capture medical users of smartphones became our next initiative. In 2009, we partnered with Absolute Mobile Solutions in Tampa, Florida to create our first IDPodcasts app. The IDPodcasts mobile viewer was the first streaming iPhone app at the University of South Florida and in the entire Florida state university system.

As mobile online devices became more popular, device manufacturers began to innovate and look for more ways that mobile devices could replace traditional desktop or laptop computers. In 2010, the iPad was born; IDPodcasts introduced the first iPad app created at the University of South Florida, and then a sister app for Android mobile devices.

In 2011, the website introduced social media integration into its main page, permitting users to be notified through their Facebook or Twitter pages of new online content. To capture users of the increasingly popular streaming site YouTube, we inaugurated the IDPodcasts YouTube channel in 2012; it has gone on to become one of our most successful online outlets, with thousands of views.

Our latest generation of smartphone and tablet apps premiered in 2016, incorporating many new features such as favorites and playlists, Apple Watch controls, enhanced search, and social media integration.

We commemorated our tenth anniversary in 2017 with the “IDPodcaster awards,” celebrating the best of our podcast presentations over our history. And as we began our second ten years in 2018, IDPodcasts premiered its most significant upgrade yet: a brand-new website designed for maximum usability, speed, and performance across any desktop, portable, or handheld device. We look forward to even more refinements and enhancements moving forward, including our new blog feature, which has recently highlighted the artistic and literary talents of our faculty and invited contributors.

Throughout the last ten-plus years, IDPodcasts has continued to publish high-quality, well-produced, and educationally innovative content at no cost to our listeners. We accept no commercial sponsorship, relying instead on university and small-contributor support. Our biggest reward is the feedback we get from our listeners, including clinicians and laypeople from more than 160 countries around the world. Our goal, as our new slogan suggests, is simply to “make infectious disease learning contagious.”

IDPodcasts owes a huge debt of gratitude to its co-founder and Chief of the Department of Internal Medicine, Dr. John T. Sinnott, as well as the Division of Infectious Diseases Director, Dr. Douglas Holt, for their unwavering support of this decade-long educational initiative. And IDPodcasts is indebted most of all to the more than 60 faculty contributors who have shared their teaching excellence with a worldwide audience over more than 10 years, as well as its loyal online audience.