By Colleen Kim, Historian
In the few short weeks since the coronavirus outbreak began, social distancing and lockdown measures have changed the way many Americans shop for food. This pandemic is revealing the extent to which American shopping and eating habits rely on past technological innovation.
While non-perishable food hardly seems like a cutting-edge innovation, there have been dramatic developments in food preservation in the hundred years since the 1918 influenza pandemic. We’ve moved away from seasonally bound, locally produced, and short-lived foods. In situations like the one we find ourselves in today, we see that relying on shelf-stable foods to supplement fresh produce can help us stay home and limit the spread of disease.
Our great grandparents had very different relationships with products that are commonplace on today’s grocery store shelves. In this post, I’ll look at two examples of the pantry staples we take for granted—or perhaps took for granted, until faced with recent shortages.
Eggs are an essential ingredient in most American households. We tend to stock up before snowstorms, hurricanes, and, in this case, pandemics. However, before the widespread adoption of electricity, eggs were a seasonal food.
Historically, hens laid the majority of their eggs between April and June, when newborn chicks were most likely to survive. In the early 1900s, refrigerated storage techniques, efficient transportation networks, and hen breeding methods helped to even out egg supply throughout the year, but not completely. Consumer concerns about the quality and safety of months-old eggs remained.
In the 1910s, farmers made the accidental discovery that electric lighting in hen houses changed laying behavior. Hens in electrified coops began laying eggs even in winter. Knowledge about this practice spread and more farmers implemented it during the 1930s, as New Deal programs brought electricity to rural America. By the 1970s, fresh egg production was constant across the whole year.
Canned foods, in some form, have existed for more than two hundred years. For most of that time, canning methods were slow, labor-intensive, and could result in tainted food that led to illness or death. In the 1800s and early 1900s, canned foods were packed by hand in handmade tins. The heating process used to kill bacteria and preserve food was often uneven, resulting in some spoiled or unsafe cans. Before mechanically produced tins became the norm, tins were sealed with lead solder, sometimes causing lead contamination in acidic foods like tomatoes.
Over time, large-scale producers like Libby’s began employing industrial canning processes that were relatively safe and uniform. However, smaller producers still used older methods that carried the risk of contamination with bacteria such as Clostridium botulinum, the cause of botulism.
In the 1910s, anxious and distrustful Americans read newspapers reporting death and illness resulting from canned food. At the same time, the federal government strongly encouraged Americans to can their own food to preserve limited resources during World War I. One newspaper in 1917 put it this way: “Those billion cans are a huge monument to patriotism and the speed and efficiency with which democratic America can rally to meet an emergency.”
Botulism poisoning sometimes resulted from improper home canning, but false rumors spread that these cases were a result of German-made canning lids that had been intentionally contaminated. To combat these fears, newspapers printed instructions for reliable and hygienic methods of home canning. Anxiety over home canning made many consumers reluctant to purchase industrially canned foods as well.
By the 1930s, industrial canning had become reliably safe. Years of dedicated study by scientists, medical professionals, and government and industry groups produced an understanding of the science behind canned food contamination and disease. In a time of crisis, it is comforting to know that the food products we rely on will be safe for ourselves and our families.