REFRIGERATION. The preservation of winter ice, principally for cooling drinks in summer, was practiced from antiquity, when ice was stored in insulated caves. Ice cooling was so popular in America that, within a decade of the Revolution, the United States led the world in both the production and consumption of ice. As the use of ice was extended to the preservation of fish, meat, and dairy products, ice production became a major industry in the northern states, and ice was shipped not only to the southern states but to the West Indies and South America. By 1880 the demand in New York City exceeded the capacity of 160 large commercial icehouses on the Hudson River. The industry reached its peak in 1886, when 25 million tons were "harvested" by cutting lake and river ice with horse-drawn saws. The advent of mechanical refrigeration in the nineteenth century set off a slow decline in the ice trade; the natural ice industry had expired by 1920.
A method of making ice artificially, lowering the temperature of water by accelerating its evaporation, had long been practiced in India, where water in porous clay dishes laid on straw evaporated at night so rapidly as to cause ice to form on the surface of the remaining water. The method persisted as late as 1871, when a U.S. patent applied the same principle to a refrigerator that depended on the evaporation of water from the porous lining of its food compartment. But earlier scientific discoveries had suggested other, more efficient methods of cooling.
Several types of refrigeration machines were developed in the nineteenth century. All depended on the absorption of heat by expanding gases, which had been a subject of scientific research in the previous century. It had been observed that the release of humid compressed air was accompanied by a stream of pellets of ice and snow. This phenomenon, the cooling effect of expanding air, led to the development of the Gorrie ice machine in the 1840s. John Gorrie, a physician in Apalachicola, Florida, sought ways to relieve the suffering of malaria victims in the southern summer climate. Gorrie's machine injected water into a cylinder in which air was compressed by a steam engine; the water carried off the heat of compression. The cooled, compressed air was then allowed to expand in contact with coils containing brine, which was in turn used to freeze water. Gorrie's machine saw little use (although at least one was built, in England), but his 1851 patent, the first in the United States for an ice machine, served as a model for others, including the first large machines that in the 1880s began to compete with lakes and rivers as a source of ice.
More important were "absorption" machines, based on the observation, made by Sir John Leslie in 1810, that concentrated sulfuric acid, which absorbs water, could accelerate the evaporation of water in a dish to such a degree as to freeze the remaining water. This method of artificial ice production was patented in England in 1834. Efficiency was further improved by using ammonia as the evaporating fluid and water as the absorbing fluid; Ferdinand Carré developed the first important machine of this type in France in 1858. In this machine, Carré connected by tube a vessel containing a solution of ammonia in water to a second vessel. When the former was heated and the latter cooled (by immersing it in cold water), the ammonia evaporated from the first vessel and condensed in the second. Heating was then stopped and the ammonia allowed to reevaporate, producing a refrigerating effect on the surface of the second (ammonia-containing) chamber. Such a "machine" was no automatic refrigerator, but it was inexpensive, simple, and well suited to use in isolated areas. One of them, the Crosley "Icyball," was manufactured in large numbers in the United States in the 1930s. Supplied with a handle, the dumbbell-shaped apparatus had to be set occasionally on a kerosene burner, where the "hot ball" was warmed; the apparatus was then moved to the icebox, where the "cold ball" exerted its cooling effect (the hot ball being allowed to hang outside).
The vapor compression system would soon replace all of these early designs. In a vapor compression machine, a volatile fluid is circulated while being alternately condensed (with the evolution of heat) and evaporated (with the absorption of heat). This is the principle of the modern refrigeration machine. Oliver Evans, an ingenious American mechanic, had proposed in 1805 to use ether in such a machine, and in 1834 Jacob Perkins, an American living in London, actually built one, using a volatile "ether" obtained by distilling rubber. Perkins built only one machine, but improved versions, based in large part on the patents of Alexander C. Twining of New Haven, Connecticut, were developed and manufactured in the 1850s.
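The condensation-evaporation cycle described above has a theoretical performance limit set by its two working temperatures. As an illustration only (the temperatures and the helper function below are assumptions for this sketch, not figures from the historical record), the ideal Carnot coefficient of performance of a vapor compression cycle can be computed as:

```python
# Illustrative sketch: ideal (Carnot) coefficient of performance for a
# vapor compression cycle, i.e. heat removed per unit of work input.
# The temperature values below are assumed, typical modern figures.

def carnot_cop(t_cold_c: float, t_hot_c: float) -> float:
    """Ideal COP = T_cold / (T_hot - T_cold), with temperatures in kelvin."""
    t_cold = t_cold_c + 273.15
    t_hot = t_hot_c + 273.15
    return t_cold / (t_hot - t_cold)

# An evaporator near -18 °C (a household freezer) rejecting heat at 35 °C:
print(round(carnot_cop(-18.0, 35.0), 2))  # → 4.81
```

Real machines fall well short of this ideal, but the calculation shows why the cycle is attractive: each unit of work can move several units of heat.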
The earliest demand for refrigeration machines came from breweries, from the Australian meat industry, and from southern states that wanted artificial ice. Only when the operation of such machinery was made reliably automatic could it serve the now familiar purpose of household
refrigeration, a step successfully taken by E. J. Copeland of Detroit in 1918. Another important step was the use of less hazardous fluids than the ethers common in commercial refrigerators. Ammonia replaced the ethers in a few machines from the 1860s and became the most common refrigerating fluid by 1900. But ammonia was only slightly less hazardous than ether. The problem was to be solved through a research program remarkable for its basis in theoretical science. In the 1920s, Thomas Midgley Jr., with the support of the General Motors Corporation, studied the relevant characteristics (volatility, toxicity, specific heat, etc.) of a large number of substances. The result was a description of an "ideal" refrigerating fluid, including a prediction of what its chemical composition would be. Midgley then proceeded to synthesize several hitherto unknown substances that his theory indicated would possess these ideal characteristics. In 1930 he announced his success with the compound dichlorodifluoromethane. Under the commercial name Freon 12, it became the most widely used refrigerant.
The most noteworthy subsequent development in the use of refrigeration has been the introduction of frozen foods. It began in 1924 with an apparatus, developed by Clarence Birdseye of Gloucester, Massachusetts, in which prepared fish were carried through a freezing compartment on an endless belt. By 1929 Birdseye had modified the apparatus to freeze fresh fruit, and in 1934 frozen foods were introduced commercially. Since World War II, progress in refrigerator design has focused on efficiency. The energy crisis of the 1970s spurred the first state regulatory standard for refrigerator efficiency: a California law passed in 1976, over the objection of appliance manufacturers, required that 18-cubic-foot refrigerators sold in the state consume no more than 1,400 kilowatt-hours (kWh) per year. California's large market share made it relatively easy for the federal government to hold refrigerators to a standard of 900 kWh in 1990 and then 700 kWh in 1993. Advances in refrigerator design reduced the energy consumption of the average refrigerator by more than 60 percent over the last three decades of the twentieth century.
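The percentages implied by these figures can be checked with simple arithmetic. In the sketch below, the 1,400 and 700 kWh limits come from the standards cited above, while the roughly 1,800 kWh per year early-1970s average is an assumed round figure used only for illustration:

```python
# Back-of-the-envelope check of the efficiency figures cited in the text.
# The ~1800 kWh/yr early-1970s baseline is an assumption for illustration.

def reduction(old_kwh: float, new_kwh: float) -> float:
    """Percent reduction in annual energy consumption."""
    return 100.0 * (old_kwh - new_kwh) / old_kwh

print(round(reduction(1400, 700), 1))  # 1976 CA limit vs. 1993 federal limit → 50.0
print(round(reduction(1800, 700), 1))  # assumed 1970s average vs. 1993 limit → 61.1
```

The second result is consistent with the article's claim of a reduction of more than 60 percent over three decades, provided the pre-standard average exceeded about 1,750 kWh per year.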
R. P. Multhauf / a. r.