Science

The Radical Way Microwaves Changed How We Eat

Transforming the way Americans cook in their homes.


By Timothy J. Jorgensen, Georgetown University

The year 2017 marks the 50th anniversary of the home microwave oven. The ovens were first sold for home use by Amana Corporation in 1967, but they had actually been used for commercial food preparation since the 1950s. It wasn’t until 1967, however, that technology miniaturization and cost reductions in manufacturing made the ovens small enough and cheap enough (a still-steep US$495, about US$3,575 in 2017 dollars) for use in the kitchens of the American middle class. Now, it would be hard to find a U.S. home without a microwave.

Amana, a subsidiary of Raytheon Corporation, actually called their first model the “Radarange” – a blend of radar and range (as in stove). What do microwave ovens have to do with radar?

Radar is an acronym for “radio detection and ranging.” Developed prior to World War II, the technology is based on the principle that radio waves can bounce off the surfaces of large objects. So if you point a beam of radio waves in a certain direction, some of the waves will bounce back to you if they encounter an obstruction in their path.

By measuring the bounced-back radio waves, operators can detect distant objects, or objects hidden from view by clouds or fog. Radar can detect planes and ships, but early on it was also found that rainstorms interfered with radar detection. It wasn’t long before that interference was itself put to use in tracking the movement of rainstorms across the landscape, and the age of modern radar-based weather forecasting began.

At the heart of radar technology is the “magnetron,” the device that produces the radio waves. During World War II, the American military couldn’t get enough magnetrons to satisfy their radar needs. So Percy Spencer, an engineer at Raytheon, was tasked with ramping up magnetron production. He soon redesigned the magnetron so that its components could be punched out from sheet metal – like sugar cookies are cut from dough – rather than each part needing to be individually machined. This allowed mass production of magnetrons, raising wartime production from just 17 to 2,600 per day.

Original cavity magnetron as used to develop radar.

Mrjohncummings, CC BY-SA

One day, while Spencer was working with a live magnetron, he noticed that a candy bar in his pocket had started to melt. Suspecting that the radio waves from the magnetron were the cause, he decided to try an experiment with an egg. He took a raw egg and pointed the radar beam at it. The egg exploded from rapid heating. Another experiment with corn kernels showed that radio waves could quickly make popcorn. This was a remarkably lucky find. Raytheon soon filed for a patent on the use of radar technology for cooking, and the Radarange was born.

Amana Radarange commercial from 1976.

As time passed and other companies got into the business, the trademarked Radarange gave way to more generic terminology and people started calling them “microwave ovens,” or even just “microwaves.” Why microwaves? Because the radio waves used for cooking have relatively short wavelengths. While the radio waves used for telecommunications can be as long as a football field, the ovens rely on radio waves with wavelengths measured in inches (or centimeters); so they are considered “micro” (from the Greek for “small”), as far as radio waves go.
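The scale difference is easy to check from the relation wavelength = c / frequency. A minimal sketch, assuming the standard 2.45 GHz operating frequency of home microwave ovens and a roughly 1 MHz AM broadcast frequency (neither figure appears in the article):

```python
# Wavelength from frequency: lambda = c / f
C = 299_792_458  # speed of light, meters per second

def wavelength_m(freq_hz: float) -> float:
    """Return the wavelength in meters for a given frequency in hertz."""
    return C / freq_hz

# AM radio broadcast near 1 MHz: roughly 300 m, longer than a football field
am_radio = wavelength_m(1e6)

# Home microwave ovens operate near 2.45 GHz: about 12 cm, roughly 5 inches
oven = wavelength_m(2.45e9)

print(f"AM radio (1 MHz): {am_radio:.0f} m")
print(f"Microwave oven (2.45 GHz): {oven * 100:.1f} cm")
```

The roughly 2,500-fold difference in wavelength is why the oven's waves count as "micro" among radio waves.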

Microwaves are able to heat food but not the paper plate holding it because, at the oven’s operating frequency, the waves readily agitate water molecules, causing them to rotate back and forth rapidly. It is this molecular motion that produces the heat. No water, no heat. So objects that don’t contain water, like a paper plate or ceramic dish, are not heated by the microwaves themselves. All the heating takes place in the food, not its container.

Despite their cooking speed, microwaves have never completely replaced conventional ovens, nor will they ever. Fast heating is not useful for certain kinds of cooking, like bread-baking, where slow heating is required for the yeast to make the dough rise; and a microwaved steak is no taste match for a broiled one. Nevertheless, as the fast-paced American lifestyle becomes increasingly dependent upon processed foods, reheating is sometimes the only “cooking” that’s required to make a meal. Microwave ovens’ uniform and rapid heating makes them ideal for this purpose.

Over the years, there have been many myths associated with microwave cooking. But the truth is that, no, they don’t destroy the food’s nutrients. And, as I explain in my book “Strange Glow: The Story of Radiation,” you don’t get cancer from either cooking with a microwave oven or eating microwaved food. In fact, the leakage standards for modern microwave ovens are so stringent that your candy bar is safe from melting, even if you tape it to the outside of the oven’s door.

What’s the deal with metal in the microwave?

Nevertheless, you should be careful about microwaving food in plastic containers, because some chemicals from the plastic can leach into the food. And, yes, you shouldn’t put any metal in the microwave, because metallic objects with pointed edges can interact with the microwaves from the magnetron in a way that can cause electrical sparking (arcing) and consequently damage the oven or cause a fire.

The microwave oven has definitely transformed the way most of us cook. So let’s all celebrate the 50th anniversary of the home microwave and the many hours of kitchen drudgery it has saved us from. But if you want to mark the date with an anniversary cake, best not to cook it in your microwave – you’d likely end up with just a very hot and unappetizing bowl of sweet mush.

Timothy J. Jorgensen, Director of the Health Physics and Radiation Protection Graduate Program and Associate Professor of Radiation Medicine, Georgetown University.

This article was originally published on The Conversation. Read the original article.
