Health officials in Illinois recently turned to the generative AI model ChatGPT to validate a working hypothesis during a complex investigation into a Salmonella outbreak. According to a newly released Morbidity and Mortality Weekly Report (MMWR), investigators in Brown County used the chatbot to confirm suspicions about a contaminated beer cooler at a local fair, marking a notable, albeit controversial, intersection of artificial intelligence and epidemiology.
The investigation began retrospectively following an event in August 2024, when the Brown County sheriff noticed an unusual number of potential jurors citing stomach bugs to avoid duty. Simultaneously, the state health department flagged a case of Salmonella enterica serotype Agbeni. These triggers led to the identification of 13 cases (seven confirmed and six probable) spanning five counties. The common denominator was the Brown County fair, a popular rural event attended by approximately 36,000 people.
The Suspect: A Makeshift Drainage Tile Cooler
Traditional contact tracing revealed a puzzling data point: while nine victims had eaten food at the fair, four had not eaten anything, ruling out food vendors as the sole source. However, all 13 individuals had consumed cold canned beer from the fair's single beer tent. Upon closer inspection, investigators discovered a severe hygiene lapse. The beer was stored in a makeshift cooler constructed from a "10-ft length of non-food-grade corrugated black plastic farm drainage tile" divided into four compartments.
The report details that this makeshift cooler was hosed off once at the start of the fair but never fully drained or sanitized again. Instead, it was topped off daily with ice made from municipal tap water. A critical tip from a worker revealed that leftover food had been stored in the cooler overnight, likely introducing the bacteria. The hypothesis was that the melting ice became a slurry of Salmonella, contaminating the lips of the beer cans, which patrons then touched to their mouths.
AI as a Public Health Consultant
With the physical evidence (the cooler) already dismantled by the time the connection was made, county health official Katherine Houser turned to ChatGPT for "situational awareness." Investigators fed the chatbot details of the outbreak and asked specific technical questions, including:
- "Will S. Agbeni grow in an improperly drained cooler?"
- "Are any other sources, other than ice, likely if only canned beverages and no foods were available at this location?"
- "What examples of similar outbreaks have been documented in scientific literature?"
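For readers curious how a query workflow like this could be scripted rather than typed into a chat window, here is a minimal, hypothetical Python sketch. The helper function, prompt wording, and summarized outbreak details below are illustrative assumptions, not code or text from the MMWR report; the resulting message list is the format accepted by common chat-completion APIs.

```python
# Hypothetical sketch: packaging outbreak details and investigator questions
# as chat messages for a generic chat-completion API. Nothing here is drawn
# from the actual investigation's tooling.

def build_messages(outbreak_details: str, questions: list[str]) -> list[dict]:
    """Bundle background details and follow-up questions into chat messages."""
    messages = [
        {"role": "system",
         "content": "You are assisting a public health outbreak investigation."},
        {"role": "user", "content": outbreak_details},
    ]
    # Each investigator question becomes its own user turn.
    for question in questions:
        messages.append({"role": "user", "content": question})
    return messages

# The three questions quoted in the report.
questions = [
    "Will S. Agbeni grow in an improperly drained cooler?",
    "Are any other sources, other than ice, likely if only canned beverages "
    "and no foods were available at this location?",
    "What examples of similar outbreaks have been documented in scientific "
    "literature?",
]

messages = build_messages(
    "13 Salmonella Agbeni cases linked to a county fair beer tent; beer was "
    "iced in a non-food-grade drainage-tile cooler.",  # assumed summary text
    questions,
)
# `messages` could then be passed to any chat-completion endpoint, with the
# model's answers reviewed against primary literature before acting on them.
```

As the report itself stresses, any output from such a pipeline would still need to be validated against primary sources before informing public health decisions.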
The chatbot affirmed that the cooler was a "credible and likely" source. Houser noted in the report that "AI was effective in this rural setting for rapid situational awareness." However, she explicitly acknowledged the risks, stating that all AI-generated summaries were "critically reviewed and validated against primary literature" due to the potential for inaccuracies and lack of source transparency.
My Take
The use of ChatGPT in this investigation highlights a growing trend where general-purpose AI tools are used to fill knowledge gaps in resource-constrained rural settings. While the outcome here was positive, it sets a precarious precedent. An AI chatbot is a probabilistic engine, not a verified database like PubMed. If officials had relied solely on the AI without the subsequent manual verification against primary literature, a hallucination could have derailed the entire investigation. This case proves AI can serve as a useful "second opinion" generator for hypothesis testing, but it must never replace rigorous, evidence-based lab work.