How did antibiotics become part of the food chain?
Evidence indicates that drug resistance spills over from livestock to people. Richard Conniff reports.
Cosmos
On Christmas Day in 1948, a biochemist named Thomas H. Jukes slipped away from his family and made the short drive to his workplace at drug company Lederle Laboratories in Pearl River, New York. A small experiment with chickens was in progress, and the subordinate who would normally have weighed and fed the birds was home for the holiday.
As he made his rounds, Jukes noticed something peculiar. In one group of a dozen chicks, the feed was being supplemented with a pricey new liver extract, which was expected to make those birds gain weight much faster than normal. But Jukes was puzzled to find that birds in another group were growing even faster. The only thing added to their feed was a new antibiotic called Aureomycin. “The record shows,” Jukes later wrote, staking his claim to the discovery, “that I weighed the chicks on Christmas Day, 1948.”
No one understood how or why an inexpensive antibiotic could cause animals to put on meat more rapidly. (Even now, researchers talk only about “proposed possible mechanisms” to explain it.) But after some quick follow-up work in the field confirmed the finding, Lederle rolled out its product. It was the beginning of a vast uncontrolled experiment to transform the biology of the animals we eat – and perhaps also the biology of the humans who eat them. Antibiotics added to feed at very small, or subtherapeutic, levels would quickly become a standard tool for rearing food animals, so much so that, even now, about 80% of antibiotic sales in the US go to livestock production rather than to human health care.