MANUFACTURING: AN OVERVIEW
Pamela L. Kester
When the Civil War began in 1861, neither side expected the conflict to last very long. The South, confident in the superiority of its fighting spirit, expected to defeat the North quickly on the battlefield and force it to recognize the independence of the Confederate States of America (McPherson 1988, pp. 316–317). Many people in the North were equally eager for a fight, expecting that the Union's greater population and industrial capacity would quickly bring the South to its senses (Foote 1958, pp. 50–60). As the next four years would demonstrate, both sides grossly underestimated the other's ability to sustain the war; in the long run, however, the North was correct about its capacity to overwhelm the states in rebellion. The North outstripped the South in manufacturing capacity and manpower, and while the Union struggled during the first few years of the conflict to coordinate its manufacturing resources, it was eventually able to meet all of its supply needs. The Confederacy, on the other hand, despite its attempts to develop a manufacturing base before and during the conflict, had an economy largely based on investment in slaveholding and land ownership (Luraghi 1978; Wright 1978; Genovese 1965). Even though the South desperately tried to marshal its manufacturing resources to supply its wartime needs, it simply lacked the manpower and industrial base to counter the North's advantages effectively.
An Unequal Contest
At the start of the conflict the manufacturing balance sheet between the North and the South was quite lopsided in favor of the Union. One historian notes that the North had 110,000 manufacturing establishments to the South's 18,000; the North had 1,300,000 industrial workers compared to the South's 110,000. Massachusetts alone produced over 60 percent more manufactured goods than all the Confederate states put together, Pennsylvania nearly twice as much, and New York more than twice as much (Foote 1958, p. 60). Another study demonstrates that the total value of manufactured goods in Virginia, Alabama, Louisiana, and Mississippi was less than $85 million while New York alone produced goods worth almost $380 million (Millett and Maslowski 1984, p. 164). Historian James McPherson points out the irony in this disproportion: while many in the South believed that cotton production would provide them with the wealth to fund their war, the states that grew the cotton possessed only 6 percent of the nation's cotton manufacturing capacity (McPherson 1982, p. 23).
Furthermore, the states that remained within the Union had a better than ten-to-one advantage in the gross value of manufactured goods over the states that made up the Confederacy, with the latter producing only 7.4 percent of the nation's gross value of these goods (Tindall and Shi 1992, p. 642). The free states had 84 percent of the nation's capital invested in manufacturing in 1860. In such crucial industries as iron production, the South lagged far behind: During the year ending June 1, 1860, the Confederate states produced 36,790 tons of pig iron, while the figure for Pennsylvania alone was 580,049 tons (Millett and Maslowski 1984, p. 164). In other industries with a direct connection to warfare, the Northern states had 97 percent of the country's firearms in 1860, 94 percent of its cloth, 93 percent of its pig iron, and more than 90 percent of its boots and shoes (McPherson 1988, p. 318).
Confederate manufacturers were at a considerable disadvantage because they had largely depended on Northern technological know-how to develop and run their industries, and many of these workers returned to the North at the start of hostilities. In addition, many Southern-born mechanics chose to seek glory in uniform rather than stay on the job in Southern manufactories (Dew 1966, pp. 90–91). Reflecting this dearth of home-grown expertise, James McPherson estimates that the South contributed only 7 percent of the important inventions in the United States between 1790 and 1860 (McPherson 1982, p. 25). Furthermore, many of the South's industries were located in the Upper South close to coastal areas and were therefore always vulnerable to Union invasion (Millett and Maslowski 1984, p. 164).
Another great advantage for Northern manufacturers was the population base from which they drew their workers. In 1861 the North had 22 million people compared with 9 million in the Confederacy. Of the latter, 3.5 million were slaves, and while the South increasingly tried to incorporate enslaved people into its manufacturing workforce as the war progressed, it found this a difficult endeavor (Tindall and Shi 1992, p. 642; Dew 1966, pp. 250–264). Not only did the North's population dwarf the South's, the North was also more intensively engaged in industry at the outset of hostilities. Only 40 percent of the Northern labor force worked in agriculture, while the slave states had 84 percent of their labor force occupied in farming. Consequently, the slave states had only 10 percent of their population living in urban areas of 2,500 people or more, while the free states had 26 percent of their population living in such areas (McPherson 1982, p. 24).
The North's advantage in terms of population resulted from the steady flow of immigrants to the port cities of New York, Boston, and Philadelphia, from which many of them fanned out across the North to take manufacturing jobs in inland towns. This influx of workers did not cease during the war, with more than 800,000 new immigrants arriving between 1861 and 1865 (Millett and Maslowski 1984, p. 163).
The Southern Response
Statistics like these certainly give the impression that the South faced an uphill battle in meeting its manufacturing needs. Struggle it did, but not for lack of effort. Historian Raimondo Luraghi has argued that despite the omnipresent issue of states' rights, the Confederate government took a decisive role in managing its industries, in what might be called the first instance of forced industrialization under state socialism; the South's industrialization program had no equal until the rise of the Soviet Union in the early twentieth century (Luraghi 1978, pp. 112–132). The authors of Why the South Lost the Civil War point to some of the ways in which the Confederacy brought its manufacturers under its control. One was conscription, which allowed the government to exempt skilled workers. Another was the imposition of public control over rail transportation. Thus the Confederate government in Richmond could force industrialists to do its bidding by denying them manpower or transportation (Beringer et al. 1986, p. 217). These authors also note that the government in Richmond invested in manufacturing itself, establishing factories such as the Augusta Powder Works in Augusta, Georgia, which produced nitre (potassium nitrate, a chemical used to make gunpowder), lead, rifles, shoes, buttons, and other items. Moreover, this government-induced and -controlled activity turned Southern cities into large industrial centers (Beringer et al. 1986, p. 217). Early in the short history of the Confederacy, the New York Herald reported on the frenetic pace of development it saw occurring in the South:
The Sultana Disaster, April 27, 1865
The explosion in April 1865 of the Sultana, a Mississippi River steamship transporting Union soldiers recently released from Confederate prison camps, was the greatest maritime disaster in United States history. It is estimated that more passengers on the Sultana lost their lives than on the Titanic in 1912.
The Sultana had been built in Cincinnati, Ohio, in 1862 for the cotton trade on the lower Mississippi. After 1864 the War Department commissioned the vessel for troop transport. On her last voyage, the Sultana left New Orleans on April 21, 1865, with livestock and about seventy-five cabin passengers bound for St. Louis, Missouri. The ship stopped at Vicksburg on April 24 to take on more passengers—Union soldiers recently released from prison camps at Andersonville and Cahaba. Although the Sultana had a legal capacity of 376 persons, more than 2,100 soldiers came aboard, filling every available space on the decks as well as the cabins. Many of these men had been severely weakened by wounds, malnutrition, and disease (Ambrose 2001).
The severe overcrowding helped set the stage for disaster. As the ship steamed north of Memphis, Tennessee, one of its poorly maintained boilers exploded at about 2 a.m. on April 27; two of the other boilers then followed. Many passengers were killed immediately by the blast; others were scalded by escaping steam or burned to death in the fire that engulfed the ship when hot coals from the exploding boilers set fire to the wooden decks. Still others drowned or perished from hypothermia after jumping into the cold waters of the Mississippi. For months after the disaster, bodies were found downstream as far as Vicksburg. Many victims were never recovered. About eight hundred people survived the initial explosion and fire, but three hundred of these died within a few days of severe burns or exposure. Estimates of the death toll range between 1,300 and 1,900.
It is thought that the boiler explosions resulted from the Sultana's careening: as the overcrowded and top-heavy ship struggled around the bends in the Mississippi, she tilted from one side to the other. The four boilers were interconnected in such a way that the tilting caused water to flow out of the uppermost boiler on that side of the ship. Since fires were still burning underneath the boilers, the empty boiler would develop hot spots. When the ship tilted in the other direction, water would rush back into the empty boiler, turn at once into steam, and create a temporary surge in pressure. This effect of careening could be prevented by keeping high levels of water in the boilers; however, the Sultana's boilers were leaky, and one had been hastily and improperly repaired when the ship stopped at Vicksburg.
The Sultana disaster received surprisingly little press coverage at the time, most likely because the other events of April 1865—Robert E. Lee's (1807–1870) surrender, the end of the Civil War, and Abraham Lincoln's (1809–1865) assassination—were foremost in the public's attention. The tragedy was soon forgotten, though it cost nearly two thousand lives.
Rebecca J. Frey
Ambrose, Stephen. "Remembering Sultana." National Geographic News, May 1, 2001. Available from http://news.nationalgeographic.com/.
Potter, Jerry O. "Sultana: A Tragic Postscript to the Civil War." American History Magazine, August 1998. Available from http://www.historynet.com/.
Salecker, Gene Eric. Disaster on the Mississippi: The Sultana Explosion, April 27, 1865. Annapolis, MD: Naval Institute Press, 1996.
We perceive that the States of the Southern confederacy are bestirring themselves in the manufacturing line, with a view to provide for their own wants in those articles for which they were heretofore dependent upon New England. Cotton mills, shoe factories, yarn and twine manufactories are being put extensively into operation in Georgia and other States. An association of Southern merchants is busily engaged in locating sites for all kinds of factories, with the assistance of competent engineers, where the indispensable water power can be made available. In the neighborhood of Columbus, Georgia, there are already established cotton and woolen mills, a tan yard and a shoe factory, grist mills and saw mills, of the capacity and operations of which a description will be found in another column. In New Orleans there is a very large factory at work in manufacture of brogans, an article of immense consumption on plantations, and hitherto supplied by the factories of Lynn and other New England towns. It is evident that the Southern confederacy is straining every point to make itself independent of the North commercially as well as politically. (March 17, 1861, p. 4)
The North's blockade of the Southern coastline meant that, besides being largely cut off from Northern goods, the South would see its supply of items manufactured abroad severely curtailed as well. Early in the conflict, the government in Richmond tried to secure some necessary manufactured items from abroad. A Savannah, Georgia, newspaper reported on April 10, 1861, just a few days before shots were fired at Fort Sumter and only nine days before President Lincoln announced the blockade of Southern ports, that an American visiting a Prussian arms factory had witnessed the manufacture of 60,000 rifles and 50,000 swords for the South (Daily Morning News, April 10, 1861, col. A). Whether this particular shipment of arms got through the blockade is unknown, but it is likely that it did. Historians have estimated that despite the Union Navy's best efforts to maintain the blockade, the South exported at least a million bales of cotton and imported 600,000 rifles (McPherson 1982, p. 179). McPherson goes on to point out that while such large numbers may call the blockade's effectiveness into question, the blockade nevertheless cut the South's seaborne trade to less than a third of normal (McPherson 1982, p. 179). The Confederate government would therefore have to get the most out of its own manufacturing base without being able to rely on help from abroad.
The most important manufacturing center of the Confederate States was Richmond, where factories such as the Tredegar Iron Works supplied the war effort with essential items such as cannon. The importance of Richmond's manufacturing establishments was one of the reasons General Robert E. Lee sacrificed so much to prevent the city from falling under Union control (McPherson 1982, pp. 235–236). Other historians point out that while the South as a whole was largely agricultural, Virginia was a partial exception: "Virginians envisioned a Confederacy filled with large factories, teeming cities, and prosperous merchants… protected from more efficient Northern competitors, Virginia would give Southerners the industrial muscle they needed to sustain political independence" (Carlander and Majewski 2003, p. 335).
Early visions of turning the South into an independent manufacturing behemoth eventually came to naught. With a much smaller industrial base, imports curtailed both by the loss of trade with the Northern states and by the naval blockade, and no adequate supply of free labor for its manufactories, the South failed to develop an effective manufacturing sector. The economic base of the South weakened as the war continued. The real output of the Confederacy declined as many of the best workers left the factories for the army. The blockade cut the Confederacy off from the benefits of foreign trade, led to inefficient use of Southern labor, and compounded the difficulties of replacing worn-out or destroyed machinery. In addition, Union troops concentrated on destroying railroad equipment and entire factories and on cutting the supply lines of raw materials (Lerner 1955, pp. 20–40).
The Confederacy had simply started too far behind the Union in terms of manufacturing, and despite trying to catch up to the North, it faced too many obstacles to be successful. The antebellum South's planter economy was no match for the capitalist entrepreneurial spirit of the Northern manufacturers.
Despite the North's advantages, however, it was unable to capitalize on them and overwhelm its weaker adversary early in the conflict. For the first few years of the war the South was able to take advantage of the North's inability to marshal its manufacturing resources. Initially, the North scrambled to meet its supply needs, even going abroad to import such essential items as rifles (Ransom 2006). For the Union the problem was not a lack of manufacturing facilities but a lack of coordination. During peacetime, manufacturers competed with one another and were not in business to cooperate. Moreover, most manufacturers in this period served local needs and did not ship their goods great distances or consider the needs of consumers in distant markets (Zunz 1990, pp. 12–15). Therefore military procurement officers had to encourage far-flung manufacturers to work together and pool their resources into large cooperative operations that met the military's needs. This wartime effort led to the development of a national market that linked distant consumers with manufacturers (Whitten and Whitten 2006, p. 5).
Even though effective coordination took some time, Northern producers were quick to see the money-making potential in supplying the needs of the Union Army and Navy. An article in the Chicago Tribune from the first year of the war illustrates some of the efforts taken by Midwestern manufacturers:
Every day brings with it illustrations of the widespread activity caused by the preparations of the government for a long war. Passing through an alley…we found it barricaded with packing boxes. The boxes are the work of a man who three months ago could hardly find any occupation. He is now making packing boxes for the government with all the hands he can employ…The receipts of clothing at the arsenal are enormous. To inspect the operations is well worth a day's time. One single establishment delivers daily 3,000 shirts and 2,000 pairs of drawers; from another is received an equal number of hose…The number of mills running solely upon army cloths and army flannel is becoming legion…scarce a day passes in which some cotton mill is not altered into a woolen mill, and set at work upon cloth and flannel. (October 21, 1861, p. 2)
To meet the needs of the war, many Northern manufacturers converted their operations to allow them to make goods needed by soldiers. For example, the Amoskeag Manufacturing Company in Manchester, New Hampshire, at that time the largest textile manufacturer in the world, began making rifles (Hareven and Langenbach 1978, p. 10). The same newspaper article quoted above describes other wartime manufacturing conversions:
Where hayforks and scythes took the attention of a manufacturer, sword blades and bayonets are produced instead. Brass turners have left off making faucets and door keys, and are doubling the product of their industry in making trappings for cavalry and the more delicate workmanship upon gun carriages, sword sheaths, &c. Trunk makers have taken to the fashioning of knapsacks, and men who once made carriages for the wealthy, are now making ambulances for the soldier. (October 21, 1861, p. 2)
What all this activity meant for cities like Chicago is described further in the Tribune article:
The result is that the city is gradually becoming one vast workshop, and the hum of industry each day grows louder and louder. From the streets beggary has almost disappeared, and the demands upon the committee by the families of absent volunteers are daily diminishing from the abundance of employment offered to the industrious. The present war may pinch in some places, but it carries employment and comparative ease to others. (October 21, 1861, p. 2)
Despite the ambitions of Northern manufacturers to supply the military, however, many industries suffered economically during the initial months of the war due to the loss of their Southern customers and the accompanying sudden changes in market conditions. In fact, almost six thousand Northern businesses failed in the first year of the war, with financial losses totaling an estimated $178.5 million (Whitten and Whitten 2006, p. 8). Textile manufacturers suffered because of the loss of their cotton supply from Southern plantations, which resulted in a 74 percent drop in production (McPherson 1982, p. 372). Other manufacturers who suffered losses were iron producers, shoe manufacturers, and coal producers (McPherson 1982, p. 372). As manufacturers adjusted to wartime production needs, however, things began to turn around. The manufacturing index for the Union states alone rose to a level 13 percent higher by 1864 than that for the country as a whole in 1860 (McPherson 1982, p. 372). A Boston newspaper article describes the result of war production for Worcester, Massachusetts:
The manufacturing interests of Worcester have been favorably affected by the war. Most of the establishments are in full operation, many of them running over time, and with much more than the usual complement of hands, in the manufacture of articles worn by soldiers, or in making tools and machinery for the manufacture of those articles. (Boston Daily Advertiser, November 29, 1861, col. D)
With the boom in wartime profits, some manufacturers must have worried about the end of the war and the accompanying drop in military orders. A Mississippi newspaper article from 1863 relates the claims of a traveler just returned from the manufacturing districts of the North, who reported that Northern manufacturers were doing so well that they callously did not want to see the war end: "All are making money by contract, working night and day, and are willing to pay three hundred dollars for substitutes out of their profits. Manufacturers make no complaint of their taxes. They feel none of the horrors of war, and care nothing about it" (Natchez Daily Courier, June 25, 1863, col. D).
The Civil War indeed proved a boon to Northern manufacturers, who supplied their nation with the tools of war needed to carry it to victory. The United States emerged from the conflict with a rapidly expanding manufacturing base that would within a few decades be the largest in the world. The 1860s were a period of transition from the small and mid-sized manufactory to the large factories that would come to dominate American life. While these changes would have occurred without the Civil War, wartime necessities served to nationalize markets, increase cooperation between government and industry, and expand the size of manufacturing operations. The war sped up the modernization of American industry but was not its cause (McPherson 1982, p. 373). Many veterans returned to their cities to join a new army: the legions of industrial workers that soon filled America's factories (Johnson 2003).
Beringer, Richard, Herman Hattaway, Archer Jones, and William N. Still Jr. Why the South Lost the Civil War. Athens and London: University of Georgia Press, 1986.
Boston Daily Advertiser, November 29, 1861, Issue 128, col. D.
Carlander, Jay, and John Majewski. "Imagining a Great Manufacturing Empire: Virginia and the Possibilities of a Confederate Tariff." Civil War History 49, no. 4 (2003): 334–352.
Chicago Tribune, October 21, 1861, p. 2.
Daily Morning News (Savannah, GA), April 10, 1861, Issue 85; col. A.
Foote, Shelby. The Civil War, A Narrative: Fort Sumter to Perryville. New York: Random House, 1958.
Genovese, Eugene D. The Political Economy of Slavery. New York: Vintage Books, 1965.
Hareven, Tamara K., and Randolph Langenbach. Amoskeag: Life and Work in an American Factory-City. New York: Pantheon Books, 1978.
Johnson, Russell L. Warriors into Workers: The Civil War and the Formation of Urban-Industrial Society in a Northern City. New York: Fordham University Press, 2003.
Lerner, Eugene. "Money, Prices, and Wages in the Confederacy, 1861–1865." Journal of Political Economy 63, no. 1 (1955): 20–40.
Luraghi, Raimondo. The Rise and Fall of the Plantation South. New York: New Viewpoints, 1978.
McPherson, James M. Battle Cry of Freedom: The Civil War Era. New York: Oxford University Press, 1988.
McPherson, James M. Ordeal by Fire: The Civil War and Reconstruction. New York: Alfred A. Knopf, 1982.
Millett, Allan R., and Peter Maslowski. For the Common Defense: A Military History of the United States of America. New York: The Free Press, 1984.
Natchez (MS) Daily Courier, June 25, 1863, Issue 193, col. D.
New York Herald, March 17, 1861, p. 4, col. C.
Ransom, Roger L. "Confederate States of America." In Historical Statistics of the United States, Millennial Edition Online, ed. Susan B. Carter, Scott Sigmund Gartner, Michael R. Haines, Alan L. Olmstead, Richard Sutch, and Gavin Wright. New York: Cambridge University Press, 2006.
Tindall, George Brown, and David E. Shi. America: A Narrative History, 3rd ed. New York: W. W. Norton & Company, 1992.
Whitten, David O., and Bessie E. Whitten. The Birth of Big Business in the United States, 1860–1914: Commercial, Extractive, and Industrial Enterprise. Westport, CT: Praeger Publishers, 2006.
Wright, Gavin. The Political Economy of the Cotton South: Households, Markets, and Wealth in the Nineteenth Century. New York: W.W. Norton & Company, Inc., 1978.
Zunz, Olivier. Making America Corporate, 1870–1920. Chicago and London: The University of Chicago Press, 1990.
At the beginning of the Civil War, neither side was prepared for the long, bloody struggle that lay ahead. The overwhelming superiority that the North enjoyed in its manufacturing capacity should have given it the clear advantage, but in 1861 the North's industries were not coordinated enough to efficiently supply the vast requirements of Northern armies. Furthermore, the majority of industries in the North were still small affairs, largely shops or mid-sized manufactories that primarily served local markets (Zunz 1990, p. 13). While there were many large factories as well, especially in textile manufacturing, where large factories with efficient modes of production had existed for some time, the widespread industrialization that would transform the country in the latter part of the nineteenth century was only in its beginning stages in most industries. For example, in Cincinnati in the 1850s, out of 1,259 manufactories only 21 employed more than 100 workers, while 1,207 establishments had fewer than 50 workers (Ross 1985, p. 80). Therefore, the Union government faced the task of coordinating disparate industries to produce maximum manufacturing efficiency in the service of its war machine. Over the four years of the war, government and manufacturers worked hand in hand to accomplish this goal, and in doing so paved the way both for victory and for the massive industries that would come to dominate much of American life in the decades following the conflict.
Despite its need to organize and consolidate industrial production, the North's stronger manufacturing base still gave it an advantage over the South at the start of the war, an advantage it would maintain and increase throughout the conflict. The 1860 census shows that the total number of manufacturing establishments in the free states and territories was 108,573, while the states that would make up the Confederacy counted only 20,573 establishments. In terms of capital investment, the North had $840,802,835 invested in industry, whereas the South had only $95,922,489 invested in its industrial base ("Manufacturing in the Slave States"). The North also had a huge population advantage, which enabled it to simultaneously supply its factories with workers and its armies with soldiers. Because of its commitment to free labor and manufacturing, the North had for decades attracted large numbers of immigrants, with most workers coming through ports in New York, Philadelphia, and Boston and then fanning out throughout the Northeast, where they supplied manufacturing establishments with an ever-increasing pool of workers (Wright 1978, pp. 121–125; Stott 1990, pp. 68–84). Even during the war, immigrants continued to arrive in large numbers: From 1861 to 1865, over 800,000 immigrants disembarked in the North (Millett and Maslowski 1984, p. 163). Southern factories, on the other hand, had to rely on a much smaller supply of white laborers, who were augmented by the numerous slaves hired out to Southern factories by their owners (Dew 1966).
The problem for the U.S. War Department was not a paucity of manufacturing establishments or workers, but a lack of coordination among the many establishments, which were used to competing with one another and were often narrowly focused on local markets. With the fall of Fort Sumter, federal officials realized that they would need to move quickly to consolidate the North's disparate modes of production, transportation, and communication by having government agents work closely with manufacturers (Whitten and Whitten 2006, p. 5). The challenges they faced were vast. At the start of the conflict, basic needs such as armaments and uniforms could not adequately be supplied by the nation's manufacturing system. In October 1861, Quartermaster General M. C. Meigs lamented to the secretary of war the "imperative demand for more army cloth than the present manufacturing resources of the North can furnish." The supply situation, Meigs asserted, was so bad that governors "daily complain that recruiting will stop unless clothing is sent in abundance and immediately to the various recruiting camps of regiments." Soldiers were reported to be "compelled to do picket duty in the late cold nights without overcoats, or even coats, wearing only the thin summer flannel clothing" (Chicago Tribune, October 31, 1861).
Besides its problems with uniforms, the North found arms manufacturers unable to supply its soldiers with the weapons of war. At the start of the conflict, the Northern states produced 97 percent of the nation's firearms at factories such as the New Haven Arms Company (which would become the Winchester Repeating Arms Company after the war), the Springfield Armory, and the Colt Manufacturing Company. However, massive government orders for weapons could not be met without vastly expanding arms production (McPherson 1988, p. 318). In fact, during the first year of the conflict, the North had to import 80 percent of its firearms (Morris 2005, p. 54). The Comte de Paris, Louis Philippe Albert d'Orleans, who served on the staff of General McClellan, described the North's inability to adequately supply its armies with arms:
The armory in Springfield had only the capacity for producing from ten to twelve thousand [weapons] yearly, and the supply could not be increased except by constructing new machines…. During the first year of the war the ordnance department succeeded in furnishing the various armies in the field, not counting what was left at the depots, one million two hundred and seventy-six thousand six hundred and eighty-six portable firearms (muskets, carbines, and pistols), one thousand nine hundred and twenty-six field or siege guns, twelve hundred pieces for batteries in position, and two hundred and fourteen million cartridges for small arms and cannon. (Commager 1973, p. 103)
However, the situation began to improve as the war dragged on: "In 1862," according to the Comte de Paris, "the Springfield manufactory delivered two hundred thousand rifles, while in the year 1863, during which there were manufactured two hundred and fifty thousand there, the importation of arms from Europe by the Northern States ceased altogether" (Commager 1973, p. 103).
The war was a boon to manufacturers. The DuPont Company, for example, massively expanded its operations to meet the military's voracious appetite for gunpowder, and emerged from the Civil War as the nation's largest manufacturer of gunpowder and other explosives (Zunz 1990, p. 16). Other companies expanded production rapidly as well, as the following notice in the New Haven Daily Palladium from May 19, 1865, testifies: "The Meriden Manufacturing Co. have a contract for five thousand breech-loading magazine carbines, Triplett's patent, for the State of Kentucky. The arms are to be finished in July, and the armory is run night and day."
The massive expansion of the arms industry, along with the growing demand for other products and parts made out of metal—from iron plates for warships, to railroad track, to horseshoes for army horses—greatly expanded the North's steel industry. The demand for military goods even prompted manufacturers to convert their operations. For example, the Amoskeag Manufacturing Company, the largest textile factory in the world, located in Manchester, New Hampshire, manufactured locomotives and rifles during the Civil War (Hareven and Langenbach 1978, p. 10).
Northern manufacturers also introduced several innovations. One of these was the repeating rifle, which allowed a soldier to fire several shots without reloading. According to the Comte de Paris, the rifle's reception by the troops was very favorable: "Many extraordinary instances have been cited of successful personal defence due to the rapidity with which this arm can be fired, and some Federal regiments of infantry which made a trial of it were highly pleased with the result" (Commager 1973, p. 104). John D. Billings, a member of the 10th Massachusetts battery of light artillery, recalled several other inventions produced by Northern manufacturers, such as the "combination knife-fork-and-spoon," the "water filterer," and the "fancy patent-leather haversack." One product that would seem essential for soldiers but that, according to Billings, never caught on was an early form of the bulletproof vest:
These ironclad warriors admitted that when panoplied for the fight their sensations were much as they might be if they were dressed up in an old-fashioned air-tight stove; still, with all the discomforts of this casing, they felt a little safer with it on than off in battle…. This seemed solid reasoning, surely; but, in spite of it all, a large number of these vests never saw Rebeldom. Their owners were subjected to such a storm of ridicule that they could not bear up under it…. [T]he ownership of one of them was taken as evidence of faint-heartedness. (Commager 1973, pp. 216–217)
A connection between manufacturers and the government had always existed in the United States, but the Civil War raised it to a new level. Many manufacturers used their ties to government agents to secure contracts that made them rich, and because the 1863 Conscription Act permitted the hiring of substitutes, many were able to stay home while other men did the fighting for them. A significant number of Civil War-era manufacturers and financiers, such as John D. Rockefeller, John Pierpont Morgan, Jay Gould, and Philip Armour, went on to become the famous "robber barons" of the decades of rapid industrialization that followed the Civil War. An excellent case in point is J. P. Morgan, perhaps the most famous investment banker in American history and the archetype of the robber baron. Morgan, who got his start before the Civil War and was already well off before Fort Sumter, made a fortune during the war from government manufacturing contracts.
Despite such corruption, and various road bumps and false starts, the North was able to use its superior manufacturing capacity, along with its vast numbers of workers and troops, to grind down the Confederate Army's ability to respond militarily to its invasion of the South. In 1865 Union armies marched in victory parades throughout the North to celebrate their efforts in preserving the Union. Northern manufacturers did not march in parades of their own, but they were also instrumental in their nation's victory as the suppliers of the instruments of war. Their efforts, too, laid the groundwork for the massive industrialization that was to follow, which by 1880 had transformed the United States into the world's foremost industrial power.
Commager, Henry Steele, ed. The Civil War Archive: The History of the Civil War in Documents. New York: Bobbs-Merrill, 1973.
Dew, Charles B. Ironmaker to the Confederacy: Joseph R. Anderson and the Tredegar Iron Works. New Haven: Yale University Press, 1966.
Dublin, Thomas. Women at Work: The Transformation of Work and Community in Lowell, Massachusetts, 1826-1860. New York: Columbia University Press, 1979.
Hareven, Tamara K., and Randolph Langenbach. Amoskeag: Life and Work in an American Factory-City. New York: Pantheon Books, 1978.
Josephson, Matthew. The Robber Barons: The Great American Capitalists, 1861-1901. New York: Harcourt, Brace, 1934.
"Manufacturing in the Slave States: Establishments, Capital Invested, Product Value, and Employment, by State: 1860-1870" (Series Eh40-49). In Historical Statistics of the United States: Millennial Edition, ed. Susan B. Carter et al. New York: Cambridge University Press, 2006. Available online from http://hsus.cambridge.org/.
McPherson, James M. Battle Cry of Freedom: The Civil War Era. New York: Oxford University Press, 1988.
Millett, Allan R., and Peter Maslowski. For the Common Defense: A Military History of the United States of America. New York: Free Press; London: Collier Macmillan, 1984.
New Haven Daily Palladium, May 19, 1865.
"The Purchase of Cloth Abroad." Chicago Tribune October 31, 1861.
Ross, Steven J. Workers on the Edge: Work, Leisure, and Politics in Industrializing Cincinnati, 1788-1890. Los Angeles: Figueroa Press, 1985.
Stott, Richard B. Workers in the Metropolis: Class, Ethnicity, and Youth in Antebellum New York City. Ithaca, NY: Cornell University Press, 1990.
Whitten, David O., and Bessie E. Whitten. The Birth of Big Business in the United States, 1860-1914: Commercial, Extractive, and Industrial Enterprise. Westport, CT: Praeger Publishers, 2006.
Wright, Gavin. The Political Economy of the Cotton South: Households, Markets, and Wealth in the Nineteenth Century. New York: W. W. Norton, 1978.
Zinn, Howard. A People's History of the United States. New York: HarperCollins, 1980.
Zunz, Olivier. Making America Corporate, 1870-1920. Chicago and London: University of Chicago Press, 1990.
On April 14, 1861, after a short, bloodless battle, the Confederate States of America raised its flag over Fort Sumter, a coastal fortification in South Carolina, and in doing so initiated a war that its manufacturing base was ill equipped to deal with. One challenge among many for the Confederacy in the coming conflict was to make the best use of its much smaller manufacturing base in order to keep both the civilian population supplied with the necessities of life and the military supplied with the necessities of war. The manufacturing advantages of the North seemed overwhelming when compared to the South's largely agriculturally based economy, and yet many in the South were confident that their region could rise to the challenge and mobilize and equip an army that could successfully challenge the North, both on the field of battle and in the factory. In fact, some saw the arrival of conflict with their Northern neighbors as being a positive impetus for the development of the nascent Southern manufacturing base. For example, the June 1861 issue of the Southern Cultivator boldly proclaimed: "We notice with unfeigned pleasure the impetus which the secession movement has given to the manufacturing enterprises in the South. Old establishments are being remodeled, improved and enlarged, and new ones are springing up, while a much larger number still determined upon, are not yet, however, located" ("Southern Manufactures," 1861, p. 187).
Despite such optimism, the South began the conflict with a substantial disadvantage in terms of its industrial capabilities. In 1861, the United States was only a few decades away from becoming a global manufacturing superpower, but most of that incipient productive might was located in the North. The gross value of manufactured items produced in the states that made up the Union was more than ten times greater than the gross value of items manufactured in the states that made up the Confederacy, which collectively produced just 7.4 percent of the nation's manufactured goods at the start of the war. Another stark statistic for the Confederacy was that the Union produced 97 percent of the nation's firearms (Tindall and Shi 1992, p. 642). The state of Massachusetts alone produced more industrial goods than all of the states of the Confederacy combined (McPherson 1982, p. 23). In terms of manpower for its manufacturing base, the North also clearly had the advantage: There were twenty-two million people in the states that made up the Union, whereas the Confederacy had a population of only nine million, with one-third of those being slaves (Tindall and Shi 1992, p. 642).
The reason for the South's distant position behind the North in manufacturing lay in a history of Southern investment in land and slaves, which had made the South largely rural in nature. At the start of the Civil War, the South had few cities of any size, and the vast majority of its people lived in rural areas. Whereas around 35 percent of the population of New England and around 21 percent of the population of the Middle Atlantic states lived in urban areas, in the South only Virginia had an urban population that even approached 10 percent of the state total, and the urban population of the South as a whole was less than 5 percent (Fox-Genovese 1988, pp. 74–76). The rural nature of the South, the labor-intensive methods of Southern agriculture—which did not require as many manufactured farming implements as were typically used on Northern and Western farms—and the heavy investment of capital in land and slaves provided little incentive for the establishment of a local industrial base.
Despite these stark figures, the South had a great deal of wealth that could have been put into manufacturing. Robert Fogel and Stanley Engerman in their seminal Time on the Cross (1974) point out that the South as an independent nation would have ranked fourth in the world in terms of wealth. The South's wealth, however, was based largely on the ownership of slaves and land, and with these two resources, the South became a major exporter of crops such as cotton, tobacco, rice, and indigo. An article in the October 12, 1861, edition of the Raleigh (North Carolina) Daily Register asserts that in 1860 the North exported slightly less than 98 million dollars' worth of products, whereas the South exported around 219 million dollars' worth. Unfortunately for the Confederacy's quartermasters, the South's exports could not be used to equip an army (Nicholson 1861).
Although the South was well behind the North in manufacturing, several historians have concluded that Southern manufacturing deficiencies were not the chief cause of the South's ultimate defeat. The Civil War was still largely a low-technology conflict, and Southern manufacturers, in conjunction with government planners, did a fairly good job of maintaining production throughout most of the war. Also, because it took so long for the North to coordinate the full capacity of its manufacturing base, Southern manufacturing deficiencies were not felt for some time as strongly as they might have been.
Despite the lack of urban centers and the structure of the Southern economy, several Southerners during the decades leading up to the Civil War had provided the region with a nascent industrial base. Historian Chad Morgan points out in his study of manufacturing in Georgia that during the 1850s, while the number of manufacturing establishments only increased from 1,527 to 1,890, the amount of money invested in these industries increased from $7,086,525 to $16,925,564 (2005, p. 10). According to the 1860 census, manufacturing establishments in what would become the Confederate states numbered 20,631, led by Virginia, home to more than 25 percent of these enterprises, followed by North Carolina, with nearly 4,000. The 1870 census indicates that the South had 33,360 manufacturing establishments, an increase of 62 percent over the course of the decade, despite the destruction brought to the South by Northern armies (Carter et al. 2006). The problem was that while the South was developing its industries, the North was developing its own faster, so that even though the South had 20 percent of the nation's industrial base in 1840, by 1860, despite a strong attempt by Southern industrialists to catch up, the South's share of the nation's manufacturing output was only 16 percent (Morgan 2005, p. 6).
From the beginning of the war, the government in Richmond encouraged manufacturing in two ways: It established direct state ownership of essential industries, especially ordnance and munitions production, and at the same time promoted private enterprise through the use of various incentives and penalties. These ranged from giving further contracts to businesses that produced according to Richmond's demands, to punitively taking over businesses that did not meet the requirements set for them. Conscription, which passed the Confederate Congress on April 16, 1862, also aided the wartime manufacturing effort by allowing the government to exempt certain industrial workers from military service. Many patriotic Confederates declared their intention to meet the manufacturing demands of war in the face of the Northern blockade of the South's coasts. In November 1861, Jefferson Davis, president of the Confederacy, wrote optimistically of the South's ability to meet its logistical requirements, even if doing so might require its citizens to sacrifice:
As long as hostilities continue the Confederate States will exhibit a steadily increasing capacity to furnish their troops with food, clothing, and arms. If they should be forced to forego many of the luxuries and some of the comforts of life, they will at least have the consolation of knowing that they are thus daily becoming more and more independent of the rest of the world. If in this process labor in the Confederate States should be gradually diverted from those great Southern staples which have given life to so much of the commerce of mankind into other channels, so as to make them rival producers instead of profitable customers, they will not be the only or even the chief losers by this change in the direction of their industry. (Davis 1906 , p. 643)
Similarly optimistic pronouncements came from many other quarters as well. An announcement concerning the production of gunpowder published in the September 14, 1861, edition of the Raleigh Daily Register proclaimed: "We are glad to see that North Carolina is taking the lead in the manufacture of this indispensable article in the prosecution of the war…. This company expects soon to be able to turn out one thousand kegs a day." And in an editorial titled "An Appeal to Planters," published in the March 19, 1862, Savannah (Georgia) Daily Morning News, the following is found: "We must not only have soldiers sufficient to prevent our gallant army from falling prey to superior numbers, but we must encourage domestic manufactures that shoes and clothing may be furnished for that army."
Despite the best efforts of Southern manufacturers and politicians to ensure that the war effort would be supplied, problems with supply abounded. In his 1907 memoir, Edward P. Alexander, who had served as chief of ordnance for the Army of Northern Virginia, recalled some of the logistical successes and failures. On the one hand, the Ordnance Bureau in Richmond was notable "for its success in supplying the enormous amount of ordnance material consumed during the war" (p. 54). On the other hand, the quality of at least some of the material produced by Southern manufacturers was questionable: "Our arsenals soon began to manufacture rifled guns, but they always lacked the copper and brass, and the mechanical skill necessary to turn out first-class ammunition" (p. 54). Through the capture of Northern guns, Alexander remarks, his soldiers were able to get the arms they needed, but "we were handicapped by our own ammunition until the close of the war" (p. 54). Colonel William Allan, who was the chief of ordnance of the Second Army Corps, recalled that in 1864 the Ordnance Department was not providing his men with enough nails and horseshoes. To compensate, he combed his units for blacksmiths and put them to work, producing what Richmond should have been providing him. However, this innovative plan was disrupted by a lack of iron. Allan responded by sending his own wagons and men to Richmond to obtain the iron from a manufacturer there. The story demonstrates the obvious problems with Southern manufacturing and supply during the Civil War. In this case, raw materials were available, but the capacity to turn them into manufactured products and deliver them to troops in the field was not.
Of course, a major problem the South faced in trying to keep its factories running stemmed from its need to supply its armies with soldiers while also supplying its manufacturing establishments with workers drawn from a population much smaller than its enemy's. Also, while the North gained over 800,000 immigrants between 1861 and 1865, the Confederacy not only did not see a gain in immigration during the war, it also lost many of its skilled workmen at the start of the conflict, as they returned to their homes in the North or donned Confederate uniforms (Dew 1966, pp. 228–264; Millett and Maslowski 1984, p. 163). As many as 750,000 men fought for the Confederacy, and this near total mobilization of young Southern white men inevitably translated into a shortage of industrial workers (Millett and Maslowski 1984, p. 163).
Despite starting the war well behind the North in terms of manufacturing capacity, the Confederacy made a strenuous effort to supply its needs. However, in the end Southern manufacturing was no match for the producing power and manpower of the Union.
Alexander, Edward Porter. Military Memoirs of a Confederate: A Critical Narrative. New York: Charles Scribner's Sons, 1907.
Beringer, Richard E., Herman Hattaway, Archer Jones, and William N. Still Jr. Why the South Lost the Civil War. Athens: University of Georgia Press, 1986.
Beringer, Richard E., Herman Hattaway, Archer Jones, and William N. Still Jr. The Elements of Confederate Defeat: Nationalism, War Aims, and Religion. Athens: University of Georgia Press, 1988.
Boritt, Gabor S., ed., Why the Confederacy Lost. New York: Oxford University Press, 1992.
Carter, Susan B. et al., eds. "Manufacturing in the Slave States: Establishments, Capital Invested, Product Value, and Employment, by State: 1860-1870" (Series Eh40-49). Historical Statistics of the United States: Millennial Edition. Cambridge, U.K., and New York: Cambridge University Press, 2006. Available from http://hsus.cambridge.org/.
Commager, Henry Steele, ed. The Civil War Archive: The History of the Civil War in Documents. New York: Bobbs-Merrill, 1973.
Davis, Jefferson Finis. "Letter from Jefferson Finis Davis, November 18, 1861." A Compilation of the Messages and Papers of the Confederacy, Including the Diplomatic Correspondence, 1861-1865, ed. James D. Richardson. Nashville, TN: United States Publishing Company, 1906.
Dew, Charles B. Ironmaker to the Confederacy: Joseph R. Anderson and the Tredegar Iron Works. New Haven and London: Yale University Press, 1966.
Fayetteville (NC) Observer, July 17, 1862.
Fogel, Robert William, and Stanley L. Engerman. Time on the Cross: The Economics of American Negro Slavery. New York and London: W. W. Norton, 1974.
Fox-Genovese, Elizabeth. Within the Plantation Household: Black and White Women of the Old South. Chapel Hill: University of North Carolina Press, 1988.
Genovese, Eugene. The Political Economy of Slavery. New York: Vintage Books, 1967.
J., H. H. "An Appeal to Planters." Savannah (GA) Daily Morning News, March 19, 1862.
Luraghi, Raimondo. The Rise and Fall of the Plantation South. New York: New Viewpoints, 1978.
McPherson, James M. Ordeal by Fire: The Civil War and Reconstruction. New York: Alfred A. Knopf, 1982.
Millett, Allan R., and Peter Maslowski. For the Common Defense: A Military History of the United States of America. New York: Free Press; London: Collier Macmillan, 1984.
Morgan, Chad. "The Public Nature of Private Industry in Confederate Georgia." Civil War History 50, no.1 (2004): 27–46.
Morgan, Chad. Planter's Progress: Modernizing Confederate Georgia. Gainesville: University Press of Florida, 2005.
Nicholson, A. O. P. "The Southern Confederacy: Its Commercial and Financial Independence." Raleigh (NC) Daily Register, October 12, 1861, Issue 82, col. A.
Raleigh (NC) Daily Register, September 14, 1861, Issue 74, col. C.
"Southern Manufactures." Southern Cultivator 19, no. 6 (June 1861).
Tindall, George Brown, and David Shi. America: A Narrative History, 3rd ed. New York and London: W. W. Norton, 1992.
Whitten, David O., and Bessie E. Whitten. The Birth of Big Business in the United States, 1860-1914: Commercial, Extractive, and Industrial Enterprise. Westport, CT: Praeger Publishers, 2006.
Wilson, Harold S. Confederate Industry: Manufacturers and Quartermasters in the Civil War. Jackson: University Press of Mississippi, 2002.
Wright, Gavin. The Political Economy of the Cotton South: Households, Markets, and Wealth in the Nineteenth Century. New York: W. W. Norton, 1978.
When the Civil War began in 1861, the large factory operation had not yet become the normal means of production in the United States. Instead, the 1860s were a transitional period between earlier forms of manufacturing and the factory system that would come to dominate the American landscape over the course of the decades to follow. While some industries, such as textiles and meat-packing, had already begun the transformation process from small shop and mid-sized manufactory to large factory, most industrial workers still labored in small manufacturing operations. These maintained a different rhythm and style of work than the factory system that would later transform the lives of workers and the landscapes of America's large cities and small towns.
Thus, most workers during the Civil War still worked in shops that reflected more traditional forms of manufacturing. For example, in New York City in 1860, small shops predominated in the metal trade: only fourteen out of fifty-eight establishments had more than ten workers (Stott 1990, p. 47). The smallness of shop operations meant that workers often labored alongside their supervisor, who was oftentimes the shop's owner. This individual was likely a skilled craftsman who had worked his way up in his profession to the point where he owned his own operation. The pace of work in such establishments was not regulated by the rhythm of machines, and labor was not usually subdivided into monotonous, simple tasks (Stott 1990, pp. 36–37).
One of the most important factors in the shift from small manufactories to large factories was the steam engine, which allowed factory owners to run larger and more complex machinery that could do the work previously done by several laborers. During the 1860s, around 50 percent of the country's manufacturing enterprises still relied on waterpower (Gutman 1977, p. 33). Waterpower fluctuated with the rise and fall of streams and rivers, and thus could limit the size and output of an industrial operation. Steam power, in contrast, promised a steady supply of energy, allowing costly work stoppages to be avoided. It also meant that there was no limit on the size of a manufacturing operation. The introduction of steam power also made it possible for a factory to be located anywhere, not just next to a river. Before the rise of steam power, the need to be positioned near a stream or river meant that many shops were located in rural areas, but with the rise of the steam engine these operations began to move to large cities and grow in scale (Hunter 1985, p. 104). The decade before the Civil War witnessed a massive growth in the use of steam power, and new industries using this technology sprang up. For example, in 1850 in Massachusetts, 43 percent of manufacturing operations used steam power, but by 1860, the percentage had increased to 56 percent (Hunter 1985, p. 110). This rise in the use of steam power directly contributed to the growth of American cities, with the accompanying rise of concentrated manufacturing establishments and the rapid growth of immigrant working-class populations that would fill the factories with workers. Between 1840 and 1880, the total U.S. population nearly tripled, the number of industrial workers more than quintupled, and capital investment increased more than eightfold (Hunter 1985, p. 112).
Printing establishments provide an excellent example of how the steam engine enabled operations to grow to massive size. For example, just before the Civil War, the New York firm Harper's employed numerous workers with subdivided tasks, supervised by scores of foremen in a seven-story operation (Stott 1990, p. 50).
While the small shop and mid-sized manufactory were still dominant in American manufacturing at the start of the Civil War, many industries—such as meatpacking, textiles, and arms manufacturing—had begun to move toward the large factory model. Indeed, the textile mills that began springing up in Lowell, Massachusetts, during the 1820s can be considered the first modern factories. By the time of the Civil War, many of these were mechanized to an unprecedented degree, and employed large and highly regulated workforces of a thousand or more (Rodgers 1974, p. 20).
The rise of the factory system increasingly meant that artisans who had previously worked on a product from start to finish now found themselves either tending a machine or performing the same task all day, as managers sought to subdivide labor in an effort to create more efficient systems of production. While many think of the assembly line as originating with Henry Ford in the early part of the twentieth century, James Barrett (1987) points out that assembly line production was employed in the meatpacking industry well before Ford's Model T (p. 20). Whereas in a small butcher shop, a butcher performed every task in the processing of each animal, thereby requiring that butchering be done by a skilled craftsman, large meatpacking plants divided the slaughter of each animal among multiple workers along an assembly line that could be sped up at will by the foreman. Many other industries went through the same development from small shops where workers performed tasks by hand from start to finish, at their own pace, to factories where work was highly subdivided, regimented, and mechanized.
As the war began, both the North and South faced the necessity of putting hundreds of thousands of men under arms, while at the same time continuing to keep their factories running. This did not present as much of a difficulty for the North, with its large pool of immigrants, as it did for the South, with its much smaller white population and its large black population that consisted primarily of slaves living in rural areas. The Tredegar Iron Works in Richmond, Virginia, provides an excellent example of some of the unique difficulties that Southern factories encountered during the Civil War. By the time the Civil War began, Richmond was the most important center of iron production in the South, and the Tredegar Iron Works was the largest of the mills, employing over 1,500 workers (Dew 1966, p. 3). The start of the conflict meant that Tredegar would have to rapidly increase production to meet the needs of the Confederacy. However, Tredegar initially encountered problems in stepping up production due to several factors: First, a number of workers rushed to join the newly formed Confederate Army; second, immigration as a source of labor had dried up due to the war; and finally, many of the mechanics with manufacturing know-how who had been employed at Tredegar were from the North and returned home at the start of the war. To fill these gaps, the managers at Tredegar hired almost three times as many youths as they had the previous year, and they increased the number of slave workers to around 10 percent of the labor force (Dew 1966, pp. 90–91). By 1863, Tredegar's labor shortage had become so acute that its owners were forced to hire 113 black convicts (p. 253). With large numbers of slave workers, Tredegar's management also faced the constant problem of its workforce liberating itself by running away (p. 255). By 1864, even slave labor was in short supply, and slave owners were able to command higher and higher rates for the renting of their slaves. 
Tredegar found itself in a bidding war for the renting of slaves with other manufacturing interests and even the army itself, so that prices for renting slaves jumped from a low of $225 to as much as $400 in a short time. Additionally, federal raids throughout northern Virginia further reduced the supply of available labor (p. 259). Nonetheless, by the end of the war, Southern factories were almost completely dependent on slave labor. For example, during the last few months of the war, more than three-quarters of the workers at the Naval ordnance works at Selma were slaves (p. 263).
Southern factories were hampered throughout the war not only by labor shortages, but also by problems with the supply of resources needed for manufacturing war essentials. By contrast, the North was able to utilize its factories more efficiently as the war progressed, a development that proved crucial to its eventual victory.
Barrett, James R. Work and Community in the Jungle: Chicago's Packinghouse Workers, 1894-1922. Urbana and Chicago: University of Illinois Press, 1987.
Dew, Charles B. Ironmaker to the Confederacy: Joseph R. Anderson and the Tredegar Iron Works. New Haven and London: Yale University Press, 1966.
Dublin, Thomas. Women at Work: The Transformation of Work and Community in Lowell, Massachusetts, 1826-1860. New York: Columbia University Press, 1979.
Gutman, Herbert G. Work, Culture, and Society in Industrializing America. New York: Vintage Books, 1977.
Hareven, Tamara K., and Randolph Langenbach. Amoskeag: Life and Work in an American Factory-City. New York: Pantheon Books, 1978.
Hunter, Louis C. Steam Power. Vol. 2 of A History of Industrial Power in the United States, 1780-1930. Charlottesville: University Press of Virginia, 1985.
Rodgers, Daniel T. The Work Ethic in Industrializing America, 1850-1920. Chicago: University of Chicago Press, 1974.
Ross, Steven J. Workers on the Edge: Work, Leisure, and Politics in Industrializing Cincinnati, 1788-1890. Los Angeles: Figueroa Press, 1985.
Stott, Richard B. Workers in the Metropolis: Class, Ethnicity, and Youth in Antebellum New York City. Ithaca, NY, and London: Cornell University Press, 1990.
The manufacturing of arms during the Civil War occurred mainly at two sites: the U.S. Government Arsenal in Springfield, Massachusetts, commonly known as the Springfield Armory, and—on the Confederate side—the Virginia Manufactory of Arms at Richmond, known as the Richmond Arsenal. Together these sites produced the majority of the shoulder-fired muskets used by soldiers in both armies during the four years of the war.
The Springfield Armory
The Springfield Armory's origins dated back to 1777, when a site once used for military training on the Connecticut River in southern Massachusetts began making gun carriages and cartridges. It became a major ammunition and weapons depot, and was a target of Shays's Rebellion in 1787, when a group of disgruntled farmers led by Daniel Shays attempted to seize control of the site and the state in order to prevent foreclosures on their land. The Springfield Armory began making muskets in 1795, after President George Washington decided that a domestic manufacturing site was necessary in order to reduce the fledgling nation's dependence on foreign-made firearms.
The nearby village of Springfield grew in size and wealth during the first half of the nineteenth century as the Springfield Armory's output expanded. Highly skilled gunsmiths, often German or Irish immigrants, were the most vital members of the workforce, but in 1819 a new lathe was developed that allowed relatively unskilled workers to mass-produce the muskets. The U.S. Ordnance Department's decision to foster the development of interchangeable parts for muskets led to several more innovations at Springfield, including an early example of an assembly-line system, in which individual workers—some skilled, some unskilled—made various parts of a product. The writer Henry Wadsworth Longfellow visited the armory, and the immense stockpiles of new weapons he saw prompted him to pen the 1843 antiwar poem "The Arsenal at Springfield."
Labor tensions arose at the Springfield Armory in the 1840s, prompting a Congressional investigation. Periodically, military control was imposed at the site in order to rein in workers who were determined to protect their job status and wages, which were extraordinarily high for the western Massachusetts area. Indeed, workers were paid so well that many remained there until retirement, and though the European-style apprentice system had long been officially abolished, many of them still managed to pass their jobs down to sons or other family members.
Military control resumed at the Springfield Armory in April 1861 when the Civil War began. Springfield became the sole federal armory after the site at Harpers Ferry, Virginia, was set ablaze by retreating federal troops at the onset of the war. The newly installed commandant at the Springfield Armory was Captain Alexander Dyer, who immediately required all workers to swear an oath of allegiance to the United States. Workers promised to "support, protect, and defend the Constitution and Government of the United States against all enemies… and… that [they would] well and faithfully perform all the duties which may be required… by law" (Whittlesey, p. 158). Dyer also instituted two ten-and-a-half-hour shifts, making the site a near round-the-clock operation, along with a raft of new rules prohibiting loitering, smoking, reading, and casual conversation on the job.
A Target of Espionage and Sabotage
In 1861 the Springfield Armory employed 350 men and produced 1,500 rifle-muskets per month (Wilson 2006, p. 119). In 1864 nearly 2,500 workers were employed at the Armory, and production that year reached a peak of 250,000 weapons (Hattaway 1997, p. 38). During the war, however, the Armory had a difficult time maintaining an adequate workforce, as smaller rifle-musket manufacturers began springing up in the city to answer the demand of the Ordnance Department, attracting workers who sought both more money and less draconian working conditions.
The Springfield Armory was an obvious target for Confederate spies, and was accordingly ringed by an iron fence and heavily guarded by a Union Army detachment. After the war's end, revelations surfaced that in 1864 two enemy agents, possibly men disguised in women's clothing, had succeeded in planting a grenade-like device in the armory, but the gunpowder-filled iron pellet failed to ignite. A massive fire in July of that same year, attributed to spontaneous combustion, destroyed the milling shop and dozens of milling machines. Because tables and other pieces of furniture were saturated with oil due to the nature of rifle manufacturing, the building was extremely flammable; firefighters on staff, however, prevented the conflagration from spreading to other buildings.
The Richmond Arsenal
The Confederate states entered the war at a serious disadvantage with respect to armament stores. The decision to locate the capital of the Confederacy at Richmond, Virginia, was partly due to the fact that it was the most industrialized city in the largely agrarian South. One of Richmond's most impressive plants was the massive Tredegar Iron Works, a privately owned foundry that had been making iron locomotive carriages and railroad tracks since the late 1830s. The city was also the site of the Richmond Armory and Arsenal, which had been in operation since the 1790s but operated at a far lower capacity than its Springfield counterpart. The Richmond site turned out an average of 2,130 muskets annually prior to the war, compared to the Massachusetts site's 1,500 muskets every month, but Confederate firepower increased with the help of machinery taken from the burned federal armory at Harpers Ferry (Bilby 1996). The Richmond Arsenal began making the "C. S. Richmond," a .58-caliber musket that would be used by the majority of Confederate infantry.
Responsibility for finding a solution to the serious disadvantage created by the South's minimal supply of the raw materials and skilled artisans necessary for arms manufacturing fell to the Confederate Army's Ordnance Bureau chief, Josiah Gorgas. He ordered a massive material effort to be implemented: Church bells were taken down and melted to provide cannon-making material, and even the stills used by moonshine makers were seized for their copper parts, which were made into musket percussion caps. Because potassium nitrate was crucial to the production of gunpowder, Gorgas sent crews into Appalachian caves to collect saltpeter, its naturally occurring mineral form. He even decreed that chamber pots should be leached of organic nitrogen, a byproduct of human digestion.
The Tredegar Iron Works made cannons for the Confederate Army and outfitted the South's first ironclad warship, the C.S.S. Virginia; at the height of the war, it employed a thousand workers (Heidler, p. 1970). A natural target of enemy spies, it was guarded by the Tredegar Battalion, one of the Confederate side's home-guard units. Like the Richmond Arsenal, the Tredegar Works was forced to employ women and children as the war dragged on and labor shortages worsened; it also made use of slave labor. An article in the July 29, 1864, Richmond Whig reported the beating of a slave by the foreman of the arsenal's smithing shop, who whipped the man after he denied the accusations of other workers that he was stealing copper. The article focused not on the fact that a worker had been whipped, but on whether he had been whipped excessively, and whether his punishment should be a matter for the local magistrate. "We do not advocate negro murder, or cruelty towards negroes, but certainly it is much better when negroes are caught stealing to thrash them soundly than to pester the courts with their cases," the newspaper opined (p. 2).
Spectacular and Tragic Explosions
Rates of injury and even death were high at the Richmond Arsenal because of the dangerous nature of the work done there and the large number of unskilled employees. Indeed, in July 1861 a respected scientist lost his life at the Confederate States Laboratory adjacent to the Arsenal: Joseph Laidley, a chemist working for the Confederate cause, reportedly walked into a building with a lit cigar, though this account was disputed even at the time. "The wooden out-building and the interior one in which the powder was manufactured, were found blown down, and many of the timbers wrenched, twisted and broken in a manner to show the almost inconceivable power of the powder," the Richmond Dispatch reported. "Mr. Laidley was found lying on his back, one of the most horrible objects of mutilated humanity which it is possible to conceive…. [N]othing remained to mark the features of a man, except a pair of whiskers and a portion of the neck" (July 4, 1861, p. 2).
A far deadlier explosion, which left forty-five women and children dead, occurred at the Confederate States Laboratory on March 13, 1863. A report published a week later in the Daily Morning News of Savannah, Georgia, recounted that victims jumped into the river with their clothes aflame, and a boy named Currie "had his clothing burned entirely off, and ran about crying, 'mother, mother!' He soon died." The paper attributed the fire's cause to "the ignition of a friction cannon primer." One of the employees, Mary Ryan, "was working taskwork, filling these [primers] on a board in which they were inserted. Instead of taking them from the board singly, she struck the board upon the bench in her haste to empty them, and the explosion of one of them by the concussion was the dire consequence" ("Explosion at the Confederate States Laboratory Works on Brown's Island," March 20, 1863).
As Union troops neared Richmond and the city was forced to evacuate, Confederate officials ordered troops to dump 25,000 rounds of artillery ammunition into the James River. Reportedly, they also ordered troops to set fire to the Richmond Arsenal, as well as other buildings—though some Virginians later disputed this account. The Tredegar Iron Works survived, but was consumed by flames a century later. The Springfield Armory continued to serve as a main supplier of rifles to the U.S. Army until the Vietnam War.
Bilby, Joseph G. Civil War Firearms: Their Historical Background and Tactical Use and Modern Collecting and Shooting. Cambridge, MA: Da Capo Press, 1996.
"Charged with Cruelty to a Negro." Richmond Whig (VA), July 29, 1864, p. 2.
"The Explosion at the Confederate States Laboratory Works on Brown's Island." Daily Morning News (Savannah, GA), March 20, 1863.
Hattaway, Herman. Shades of Blue and Gray: An Introductory Military History of the Civil War. Columbia: University of Missouri Press, 1997.
Heidler, David Stephen, Jeanne T. Heidler, and David J. Coles, eds. Encyclopedia of the American Civil War: A Political, Social, and Military History. New York: W. W. Norton, 2002.
"Horrible Catastrophe." Richmond Dispatch (VA), July 4, 1861, p. 2.
Whittlesey, Derwent Stainthorpe. The Springfield Armory: A Study in Institutional Development. Ph.D. diss., University of Chicago, 1920.
Wilson, Mark R. The Business of Civil War: Military Mobilization and the State, 1861-1865. Baltimore, MD: Johns Hopkins University Press, 2006.
When hostilities began after the bombardment of Fort Sumter in Charleston Harbor in April 1861, both sides of the conflict geared up for war, and each faced different challenges in accomplishing its goals. Both sides had to build up their navies from modest beginnings, but it was the Northern states that were able to outproduce the Southern ones.
When the Confederate government was formed in February 1861, its new Congress established a Navy Department on February 21. Its biggest challenge was to build a serviceable fleet completely from scratch. Over the course of the war the Navy Department, headed by Secretary of the Navy Stephen R. Mallory (1813-1873), scrambled to accomplish a great deal with limited means. Furthermore, the Confederate Navy competed with the Army for resources and transportation. These issues plagued the Confederate Navy because President Jefferson Davis and most of the rest of the government tended to ignore the navy in favor of the army.
In 1861 the Confederacy acquired by purchase or capture ten ships mounting only fifteen guns among them. After the central government in Richmond began to operate, the various Confederate states handed over another fourteen vessels. Mallory made an effort to buy merchant vessels and convert them to military purposes, but this initial move was meant only as a temporary measure until new warships could be built. The South, however, had only two major shipbuilding facilities in operation at the outbreak of hostilities. The more famous of these was the Gosport Shipyard in Norfolk, Virginia, renamed the Norfolk Naval Shipyard in 1862 after the Union recaptured the facility; the other, in Pensacola, Florida, was primarily a coaling and refitting station. There were also a number of private shipyards in the South, most of them rather modest in size. The majority of these shipyards were in the coastal towns of the South, but river towns like Columbus, Georgia, and Selma, Alabama, became focal points of ship construction as the war lengthened.
Secretary Mallory also sent agents abroad, particularly to Great Britain and France, to secure vessels of war from a variety of shipbuilders. The most famous of these vessels were the commerce raiders Alabama, Florida, and Shenandoah. These ships would wreak havoc on the shipping of the United States.
Unable to compete with Northern industrial capacity, Mallory believed that quality was better than quantity. In a letter to a congressman, the secretary stated that he regarded the "possession of an iron-armored ship as a matter of the first necessity. [The] inequality of numbers may be compensated by invulnerability: and thus not only does economy but naval success dictate the wisdom and expediency of fighting with iron against wood" (Still 1987, p. 8). Ironclad ships became the hope of the Confederate Navy against the Federal Navy, and Congress allotted two million dollars for building them. A number of facilities in the South were able to produce or roll iron, but only eleven were large enough to meet the navy's needs.
Another way to combat the might of the Union Navy was to offer letters of marque and reprisal, which authorized private citizens to arm their vessels and attack enemy shipping as lawful combatants; if these private vessels sank or captured an enemy vessel, prize money would be awarded. Civilians built several types of ships, often experimental in design, in order to take advantage of the letters and possibly become wealthy. The ironclad Manassas and submarines such as the H. L. Hunley fell into this category. Some citizens also built ships as gifts for the navy; one such was David S. Johnson, who built the gunboat Chattahoochee at Saffold, Georgia, on the lower Chattahoochee River. In all, between 1861 and 1865, Southern shipbuilders laid down 150 vessels. The competing interests of the Confederate government, the states, and private citizens, however, hampered the overall effort to build enough warships.
At the time of Abraham Lincoln's inauguration, the United States Navy had ninety ships, of which forty-two were in commission; most were stationed around the globe carrying out missions outside the United States. In the aftermath of Fort Sumter, Lincoln called for a blockade of Southern ports as part of the strategy to strangle the rebellious states into submission. With over three thousand miles of Southern coastline to patrol, the Union needed more ships than it had in 1861, and within a year a vast construction program was under way. About 300 vessels were added to the Union Navy, and these newer vessels started to make the blockade effective. By the end of the war, 418 vessels had been purchased, of which 313 were steamers. More than 200 other warships were built under contract, and over sixty of these were ironclads. These ships made the U.S. Navy one of the largest, most modern, and most powerful navies in the world.
The important shipyards in the North included Philadelphia; Boston; New York City; Portsmouth, Maine; and the Navy Yard in Washington, DC. In addition, there were dozens of private shipbuilders on the coasts, the Great Lakes, and along the Mississippi and Ohio Rivers. The state of Pennsylvania alone could outproduce the entire South. While the captured Merrimack was slowly being transformed into the ironclad Virginia, the shipbuilders and iron makers of New York built the Monitor in about ninety days. The average price of a Union ironclad ended up being about $400,000.
As wide-ranging as were the designs of Confederate ships, Union ships were just as varied, if not more so. There were fast cruisers to chase down the commerce raiders the South unleashed, as well as ironclads, a submarine program, and river vessels. The most successful of the river vessels may have been the double-enders, which were capable of going up- or downriver without having to turn around.
The size of the Federal Navy tipped the scales in favor of the Union during the war. Its fleets not only blockaded the Southern coastline but also attacked shore fortifications at various points, slowly reducing the capacity of the Confederate war effort.
Interestingly, Northern shipbuilders had problems with a scarcity of labor, just as Southern ones did. Many shipyard workers joined the military for patriotic reasons; others went into the Navy hoping to strike it rich with prize money from captures; and later in the war, some shipyard workers were conscripted. In the South, hundreds of qualified carpenters, mechanics, ironworkers, and other workers simply left because they were foreign-born and had no reason to support the rebellion. Those who remained volunteered for the army or were later drafted, and the army was reluctant to allow these individuals to leave the land forces in order to resume their trades. As the war dragged on, increasingly stringent draft measures steadily reduced the exemptions available to the skilled workers the shipyards needed. The U.S. Navy Department worked with the Union Army to deal with these issues far better than the Confederates ever did. In addition, Union shipyards were willing to hire black workers who had fled northward from the Confederate states.
Because of these manpower shortages and the need to complete ships rapidly, wages for Northern workers tended to rise throughout the war, from $1.42 per day in late 1862 to $3.51 in March 1865 (Roberts 2002, p. 135). Even so, labor strikes were fairly common in Eastern shipyards. New York City's shipbuilders suffered through a number of strikes late in 1862 and early in 1863, well before the draft riots of the latter year, and the Boston Navy Yard shut down three times during 1863. Because of a tighter labor market, the Western shipyards experienced very few strikes.
Strikes were also common during the first year of the war for Southern shipbuilders. One occurred in New Orleans in November 1861, when workers pushed for an increase in pay, delaying the construction of several vessels. A more revealing episode came in 1864 at Selma, when workers threatened Commander Catesby ap R. Jones (1821-1877) with a strike unless they were paid ten dollars a day, a wage they had heard the Atlanta shops were paying (Still 1987, p. 73).
The Confederate Navy tried a variety of expedients to complete the work that was required. It offered generous overtime pay, and it shifted workers from one facility to another in hopes of catching up on the backlog of work. A rivalry for workmen developed among these establishments, however, which hampered cooperation when it was needed most. Eventually this caused the Navy simply to take control of many of the works, but even then the deteriorating value of Confederate money exacerbated the problems: when one facility raised its wages, other local producers had to raise theirs or lose their workers. Further compounding the production problem, workers were organized into local defense units. Lieutenant Robert D. Minor (1821-1871), placed in charge of the ordnance works at Richmond, complained that his men were in the field and could not supply ammunition.
Eventually Southern shipbuilders followed the example of their Northern counterparts and put African Americans to work: a cross-section of skilled free workers and unskilled slave laborers labored in Southern shipyards. In the navy's ordnance works in 1865, more than half the workers were African American (Still 1987, p. 69).
In the end a greater industrial capacity and a more stable workforce allowed the Union to build a much larger navy than the breakaway states. Even though the Confederate Navy had several successes, it was eventually restricted to fighting a defensive war, always responding to the initiatives of its Union counterpart. As the coastal cities fell to Union forces, the Confederates adapted as best they could, but in the end it was not enough. Even the commerce raiders, which greatly annoyed Yankee merchants, were only a nuisance to the Union's war effort.
Canney, Donald L. Lincoln's Navy: The Ships, Men and Organization, 1861-65. Annapolis, MD: Naval Institute Press, 1998.
Hackemer, Kurt H. The U.S. Navy and the Origins of the Military-Industrial Complex, 1847-1883. Annapolis, MD: Naval Institute Press, 2001.
Luraghi, Raimondo. A History of the Confederate Navy. Annapolis, MD: Naval Institute Press, 1996.
Roberts, William H. Civil War Ironclads: The U.S. Navy and Industrial Mobilization. Baltimore, MD: Johns Hopkins University Press, 2002.
Still, William N., Jr. Confederate Shipbuilding. Columbia: University of South Carolina Press, 1987.
In the years leading up to the U.S. Civil War, America's transition from an agricultural economy to a nation of industry accelerated. The nation's manufacturing centers, located primarily in Northern cities, were connected to labor markets, suppliers, and wholesalers by an ever-expanding network of roads, canals, rivers, and railroads. Remarking on the mountainous wooded landscape that housed much of the nation's industry, Thomas Kettell wrote in his tract Southern Wealth and Northern Profits that "the mountain torrents of New England have become motors, by which annually improving machinery has been driven" (Kettell 1860, p. 52).
A significant portion of the nation's industrial output consisted of textiles, a word that literally means "that which is woven." During the nineteenth century, textiles were chiefly made from five natural materials—hemp, flax, silk, wool, and cotton—or combinations thereof. Textile mills powered by water or coal sprang up in every Northern state, producing spun thread, yarn, ribbons, and a range of woven fabrics. While the production of these materials had been a household function at the start of the nineteenth century, by the 1830s, family self-sufficiency was relinquished in favor of inexpensive mass-produced goods.
Technological ingenuity drove the Anglo-American textile industry. Following the development of the power loom in England in 1784, William Horrocks engineered mechanized improvements in 1813. Other advancements in textile manufacture included the transition from water power to steam; the use of iron in the carding process; advances in the techniques of bleaching; and the development of the first synthetic dyes in 1856. In a related advance, Elias Howe patented the lock-stitch sewing machine in 1846, and Isaac Merritt Singer patented his own improved sewing machine in 1851. The Singer Sewing Machine Company, which sold machines on installment and provided free instructions to homemakers, quickly became the leader in the home sewing machine business.
Cotton Becomes King
Although the United States produced a variety of raw materials for textiles, cotton quickly eclipsed all others. An early industrialist named Samuel Slater (1768-1835) established one of the first U.S. cotton mills in Pawtucket, Rhode Island, in 1793, the same year that Eli Whitney's cotton gin automated the process of separating cotton fibers from cotton seed. New England retained its concentration of textile mills up until the Civil War, producing almost 70 percent of the nation's total textile output by 1840. According to Treasury Department reports, by the 1850s the Northern states manufactured five times as much cotton goods as the Southern states (Kettell 1860, p. 37).
Silk, another textile produced for both export and local consumption, was also made in the North. The invention of the Jacquard loom in France in 1804 sparked an increase in the use of this fiber. Citing the region's leadership role in the U.S. silk industry, a Philadelphia newspaper boasted in 1858 that "from indisputable data…textile fabrics are produced here on a scale which constitutes this city the great centre of that production for the whole United States" (North American and U.S. Gazette, March 5, 1858).
While the wool industry in the United States never achieved the export capacity of the cotton industry, it did keep pace with new technology and thus satisfied local demand. As Samuel Slater's biographer, George S. White, commented in 1836, England's leading position in the textile industry—particularly its share of the woolen market—might well be eclipsed by the United States, "where ingenuity and enterprise eminently mark the national character" (White 1836, p. 222).
Because of its versatility, raw cotton quickly established itself as one of the nation's primary exports. By 1834, the Southern cotton crop accounted for one-half the total cotton grown worldwide. Between 1835 and 1858, overseas demand—primarily from England—increased on average by 12.5 percent per year (Kettell 1860, p. 38); by 1860, cotton accounted for almost half of all U.S. exports. As an editorial in the New York Herald reported that same year, the Southern states exported four and a half million bales of raw cotton, a quantity worth $25 million "before the merchant, the mariner, or the manufacturer had put a hand to it to double, triple and quadruple its value" (July 26, 1860, p. 4).
While U.S. cotton growers provided sufficient quantities of raw materials to keep factories in operation, quality also factored largely into the demand for Southern cotton. The highly desirable long-staple Sea Island cotton (with filaments of 1-1/8 inch in length and over) was grown only in Georgia, while New Orleans produced an especially soft and silky variety of cotton that could be had nowhere else. Goods produced in "the colonies" were "preferred to English, by reason of their superior texture; and also, that they shrunk much less in the process of printing," reported the Boston Daily Advertiser, quoting a London source (March 21, 1876).
England needed large amounts of raw materials in order to keep its urban industrial base growing. With regard to cotton, it found a source in the American South, and abolished import duties in order to make transatlantic trade more attractive. As a writer noted in the London Times, "the importation of cotton into this country [England]…is so large and so steady that we can steer our national policy by it; it is so important to us, that we should be reduced to embarrassment if it were suddenly to disappear" (Kettell 1860, p. 39).
Although cotton cultivation had been attempted—sometimes unsuccessfully—in India, Jamaica, Australia, China, and Africa, the quality of cotton grown in these areas proved to be inferior to that of the Southern United States. In addition, Southern growers had an efficient and cost-effective labor force: slaves. Ironically, while the slavery issue was hotly debated in the English Parliament, and some called for a ban on anything other than "Free Labour Cotton"—cotton harvested without employing the "domestic institution" of slave labor—the country's demand was so great that none but slave labor could fulfill it. The year after England repealed its import duty on raw cotton, the country's demand for plantation-grown cotton fiber increased sixteen-fold (Kettell 1860, p. 41).
"The jump which the consumption of cotton in England has just made [following the repeal of the import duty] is but a single leap, which may be repeated indefinitely," the Times writer predicted. "There are a thousand millions of mankind upon the globe, all of whom can be most comfortably clad in cotton. Every year new tribes and new nations are added to the category of cotton wearers." With cotton such a "universal necessity," England "must continue to hope that the United States will be able to supply us in years to come with twice as much as we bought of them in years past" (Kettell 1860, p. 41). Although demand from England remained strong, the moral questions swirling around the slavery debate in Parliament cast a pall over trade relations with the Southern states.
Related to the increasing demand for U.S. textiles were developments in fashion. The years just prior to the U.S. Civil War saw the advent of the hoopskirt or cage crinoline, which was introduced in 1857. This marvel of engineering—a metal cage supported at the waist and ingeniously hinged to allow for movement and collapsibility—encouraged stylish women to wear ever-more voluminous skirts, some of which reached 15 feet in circumference. Silk taffetas in plaid and striped patterns, as well as iridescent or "shot" versions—available in ever-increasing combinations of vivid hues following the invention of synthetic coal tar dyes in 1856—also gained popularity among more affluent ladies (Tortora and Eubank 1998, p. 302).
Among men, a new demand for cotton came with the 1850s introduction of blue jeans, a garment made from heavy cotton denim devised by the German-born entrepreneur Levi Strauss (1829-1902) in San Francisco to address the needs of miners for sturdy work overalls during the California gold rush. Such other items of clothing for men as frock coats, trousers, and vests, were now factory-made and easier to obtain. Manufactured in standard sizes rather than made to order, some vests and trousers featured hidden buckles that allowed their fit to be adjusted, as well as manufactured hooks and eyes, snaps, and buttons (Tortora and Eubank 1998, p. 302).
While the South exported its luxurious long-staple cottons to overseas markets and Northern textile mills, the garments worn by the men, women, and children at work in the cotton fields were crafted of cheap, rough-textured cotton or wool obtained from manufacturers in Rhode Island or Europe. Most slaves were issued two outfits per year, one for warmer months and one for cooler weather. A coarse white homespun cloth frequently used in garments worn by slaves in both the South and in the West Indies became known as negro cloth (Bishop 1866, p. 339).
Textiles in the Union
The Southern states were a primary market for Northern manufactured goods prior to the outbreak of the Civil War. Of the $60 million worth of purchases estimated to have been made by Southern consumers in 1860, over a third consisted of boots and shoes, while textiles came in a close second (Kettell 1860, p. 60). "The shipowners and the manufacturers of New England, the merchants and mechanics of New York, and the manufacturers and miners of Pennsylvania and New Jersey, all draw no small portion of their daily wages and profits from the stream that rises in the cotton fields of the South," asserted a New York Herald editorial in the summer of 1860, addressing this trade between North and South (July 26, 1860, p. 4). The outbreak of the Civil War in 1861 interfered initially with this profitable trade. In addition to leaving Northern businesses holding trade losses totaling $300 million, the war disrupted the North's supply of raw cotton and labor (Bangor Daily Whig & Courier, July 7, 1864).
While Northern textile mills experienced a temporary setback in 1860, their initial losses were mitigated by the new orders that poured in from the Union Army. The Civil War generated an immediate and substantial demand for ready-to-wear uniforms. In the Union Army alone, over 1.5 million uniforms were required each year between 1861 and 1865, and the cloth for all these uniforms was a product of the Utica Steam Woolen Company, of upstate New York (Coates 2002, p. 103). Because textile mill machinery was already operated in large part by women, the transfer of men from the factories into the Union Army also affected wartime industrial output in the North far less than it did in the South.
In fact, the war accelerated the spread of mechanization and the efficiencies of the factory system. The number of sewing machines used by Northern factories doubled between 1860 and 1865, and shoe production increased as a result of the development of a machine for sewing the uppers to the soles (Depew 1895, p. 571). To supply the textiles needed for uniforms, tents, and other supplies, smugglers provided Southern cotton while wool came from New England and the Western states. To meet demand, Northern mills also turned to Britain and France, where English woolens and Egyptian and Indian cotton were readily available. This transatlantic trade was made easier because such Northern ports as New York City were already central to the textile export business, having eclipsed such Southern ports as Charleston years before.
Also affecting the Northern textile industry during wartime was the Morrill Act. A controversial measure supported by Northern Republicans and spearheaded by the Vermont congressman Justin Smith Morrill (1810–1898), the Act established a tariff wall designed to protect Northern industrial manufacturing from foreign competition. The Morrill tariff, which went into effect on April 1, 1861, imposed a protective tax of between twenty and fifty percent on imported silk, cotton, woolens, and iron. Although the tariff provoked some manufacturers in England to shift their support to the Confederacy, its impact was so minimal that two further tariffs were enacted during the war as a way to fund the war effort. Because of the favorable balance of trade resulting from the importation of raw cotton by the North, French and English markets continued to do business with the United States despite the costs on their products imposed by the Morrill tariff (Boston Daily Advertiser, March 21, 1876).
Textiles in the Confederacy
Textile manufacturing in the South, more specifically the manufacture of cotton fabrics, had increased dramatically during the 1850s. Such companies as the Charleston Cotton Manufacturing Company, which began business in the summer of 1850, inspired one business journal to note that "our Southern friends [seem]…destined to become largely interested in fostering at home those branches of industry to which their great staple owes its importance and value" (North American and United States Gazette, July 2, 1850). Despite this expansion and the advocacy of such individuals as the South Carolina businessman William Gregg, the industry was still in its infancy when war broke out in 1861.
As Edward A. Pollard explained in his The Lost Cause, "the South entered the war with only a few insignificant manufactories of arms and materials of war and textile fabrics. She was soon to be cut off by an encircling blockade from all those supplies upon which she had depended" (Pollard 1866, p. 132). With President Abraham Lincoln's proclamation in April 1861 ordering naval commanders to blockade Southern ports, the importation of foreign goods to the South slowed dramatically. Apart from a small amount of goods smuggled in by blockade runners, Southern stores quickly found their shelves emptied of inventory.
The slogan "Cotton is king" reflects the heady exuberance of many in the South following the formation of the Confederacy on February 4, 1861. Representatives of the city of Columbus, Georgia, even published an open invitation to Northern "men of reason" in the New York Herald. While regretting that "circumstances have made it necessary to dissolve the government of the United States," the writers proclaimed that "the boasted shuck and hay trade of the Northern States dwindles into utter insignificance compared to this great staple," cotton. "We invite…all the manufacturing and commercial men of the North, who are our friends, to leave the bleak republican North. Unite with us; make our homes your homes, and you shall share the prosperity and happiness in store for us" (April 9, 1861, p. 2).
Unfortunately for the residents of Columbus, Georgia, and elsewhere throughout the South, prosperity and happiness were not the end result of secession. Although the much-hated Morrill tariff applied only to goods imported through Northern ports, the Confederate president Jefferson Davis's policy of free trade was undermined by the antislavery debates ongoing in England. Like England, the rest of Europe "entirely misapprehended the controversy between the Northern and Southern States of the Union," according to the London Times (March 5, 1861, p. 2). Observing the tensions between North and South only weeks before the firing upon Fort Sumter drew the new Confederate States of America into war, the Times writer maintained that "the slavery question… has been merely introduced as a blind… and the real point of contention lies in the national tariff." Southern interests, in fact, "have only one object, which is to get the highest price for the greatest quantity of cotton" (March 5, 1861, p. 2).
While the South had great quantities of cotton, it had few mills to process it, resulting in clothing shortages. Homespun clothes for civilians quickly became the norm. In order to reproduce anything even remotely resembling the crinoline, fashion-conscious Southern women relied on their wits and the textiles to be found in their homes to maintain their wardrobes. Like Scarlett O'Hara in Margaret Mitchell's 1936 novel Gone with the Wind, they made use of everything from window curtains to bed hangings. Skirts and blouses were cut from aprons, shawls, and other large spans of cloth, while sleeves became tighter so that new sleeves could be cut out of fuller existing sleeves. Kid gloves were replaced by silk or lace mitts, new hair ribbons were a rarity, and white summer dresses were no longer trimmed with French lace. Women also patched their clothing with scraps cut from discarded garments. As Elzey Hay wrote in Godey's Lady's Book, recalling Southern fashion during wartime, "Before the blockade was raised all learned to wear every garment to the very last rag that would hang on our backs" (Hay 1866, p. 32).
Perhaps fortunately, most American fashion magazines were published in such Northern cities as Philadelphia and Boston. With the circulation of these periodicals now restricted to the North, Southern women felt less pressure to conform to the latest styles. "We knew very little of the modes in the outer world," Hay recalled (Hay 1866, p. 32). "Now and then a [fashion magazine]…would find its way through the blockade, and create a greater sensation than the last battle." According to Hay,
…the finest traveling dress I had during the war was a brown alpaca turned wrong side out, upside down, and trimmed with quillings [small rounded ridges in a piece of cloth] made from [an] umbrella cover. I will venture to say that no umbrella ever served so many purposes or was so thoroughly used up before. The whalebones served to stiffen corsets and the waist of a homespun dress, and the handle was given to a wounded soldier for a walking stick. (Hay 1866, p. 32)
As the supply of textiles continued to diminish throughout the South, prices correspondingly went up. In the bonnet market, for example, a well-connected milliner could pay the going rate for a simple bonnet, refashion it, and then sell it for four times her investment (Hay 1866). For the average Southern woman, however, wartime required a return to home industry. In addition to their own fashion needs, the wives and mothers of Confederate soldiers had to find a way to outfit their loved ones. Hand looms and spinning wheels were retrieved from attics, and the womenfolk set about transforming King Cotton into cloth. A popular song of the era, reportedly written by Carrie Belle Sinclair, captures the spirit of the times. "The homespun dress is plain, I know,/ My hat's palmetto, too; But then it shows what Southern girls/ for Southern rights will do," are some of the lyrics to "The Homespun Dress." "We scorn to wear a bit of silk,/ A bit of Northern lace,/ But make our homespun dresses up,/And wear them with a grace" (Silber and Silverman 1960, p. 68).
After Reconstruction, in the New South era, the cotton industry would resume its development, fueled by inexpensive water power, low taxation, and boosterism by the likes of Henry Grady. By 1880, over 14,000,000 acres were under cultivation in the South (Steele 1885, p. 306).
BIBLIOGRAPHY

Bishop, J. Leander. A History of American Manufactures from 1608 to 1860. 3 vols. Philadelphia: Edward Young & Co., 1866.
Coates, Earl J. Don Troiani's Regiments and Uniforms of the Civil War. Mechanicsburg, PA: Stackpole Books, 2002.
"Cotton and the Constitution: The Relations of Politics, Industry, and Trade." New York Herald, July 26, 1860, p. 4.
Depew, Chauncey M., ed. One Hundred Years of American Commerce: A History of American Commerce by One Hundred Americans. 2 vols. New York: D.O. Haynes & Co., 1895.
"Factories at the South." North American and United States Gazette, July 2, 1850, p. B3.
Hay, Elzey. "Dress under Difficulties; or, Passages from the Blockade Experiences of Rebel Women." Godey's Lady's Book, July 1866, p. 32.
Kettell, Thomas Prentice. Southern Wealth and Northern Profits, as Exhibited in Statistical Facts and Official Figures: Showing the Necessity of Union to the Future Prosperity and Welfare of the Republic. New York: George W. & John A. Wood, 1860.
Letters from the 44th Regiment M.V.M.: A Record of the Experience of a Nine Months' Regiment in the Department of North Carolina in 1862-3. Boston: Boston Herald, 1863.
Lord, Daniel. The Effect of Secession upon the Commercial Relations between the North and South, 2nd ed. New York: New York Times, 1861.
"The Morrill Tariff in Europe." Daily Morning News (Savannah, GA), March 27, 1861.
"Our Columbus Correspondence." New York Herald, April 9, 1861, p. 2.
Pollard, Edward A. The Lost Cause: A New Southern History of the War of the Confederates. New York: E. B. Treat & Co., 1866.
Silber, Irwin, and Jerry Silverman, compilers. Songs of the Civil War. New York: Columbia University Press, 1960.
Steele, Joel Dorman. A Brief History of the United States. New York: A.S. Barnes, 1885.
"Textile Fabrics: Their Production and Distribution." North American and United States Gazette, March 5, 1858, col. B.
"The Textile Trades: From the London Hour." Boston Daily Advertiser, March 21, 1876.
Times (London, England), March 5, 1861, as reported in the Daily Morning News (Savannah, GA), March 27, 1861.
Tortora, Phyllis, and Keith Eubank. Survey of Historic Costume, 3rd ed. New York: Fairchild Publications, 1998.
White, George S. Memoir of Samuel Slater: The Father of American Manufactures, 2nd ed. Philadelphia, PA: privately printed, 1836.
Pamela L. Kester
Industrial production is the backbone of economic output in almost all countries, and over the past decades manufacturing production has grown in most economies. The industrial sector is dominated by the production of a few major energy-intensive commodities, such as steel, paper, cement, and chemicals. In any given country or region, production of these basic commodities follows the general development of the overall economy: rapidly industrializing countries have higher demands for infrastructure materials, while more mature markets have declining or stable consumption levels. The regional differences in consumption patterns (expressed as consumption per capita) will fuel further growth of consumption in developing countries.

In these "heavy" industries, energy is a very important production cost factor alongside labor and raw materials, which drives the effort to achieve higher energy efficiency (see Table 1). Markets in the industrialized countries show a shift toward more service-oriented activities and hence non-energy-intensive industries. Because of the great difference in energy intensity between energy-intensive industries and all others, changes in the output shares of these industries can have a major impact on total industrial energy use. Many commodities (e.g., food and steel) are traded globally, and regional differences in supply and demand will influence total industrial energy use. Production trends also depend on the regional availability of resources (e.g., scrap) and capital. Finally, manufacturing energy use depends on the energy efficiency with which these economic activities are carried out. In this article we assess trends in industrial energy use and energy intensities, followed by a discussion of energy services, uses, and industrial technologies.
GLOBAL MANUFACTURING ENERGY USE
In 1990, manufacturing industry accounted for 42 percent (129 exajoules, EJ) of global energy use. Between 1971 and 1990, industrial energy use grew at a rate of 2.1 percent per year, slightly less than the world total energy demand growth of 2.5 percent per year. This growth rate has slowed in recent years and was virtually flat between 1990 and 1995, primarily because of declines in industrial output in the transitional economies of Eastern Europe and the former Soviet Union. Energy use in the industrial sector is dominated by the industrialized countries, which accounted for 42 percent of world industrial energy use in 1990. Industrial energy consumption in these countries increased at an average rate of 0.6 percent per year between 1971 and 1990, from 49 EJ to 54 EJ. The share of industrial sector energy consumption within the industrialized countries declined from 40 percent in 1971 to 33 percent in 1995. This decline partly reflects the transition toward a less energy-intensive manufacturing base, as well as continued growth in transportation demand, resulting in large part from the rising importance of personal mobility in passenger transport.
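Growth figures like these follow from the standard compound annual growth rate formula. A minimal arithmetic sketch using the industrialized-country numbers above (note that the EJ endpoints in the text are rounded, so the computed rate only approximately matches the quoted 0.6 percent):

```python
def cagr(start, end, years):
    """Compound annual growth rate between two observations."""
    return (end / start) ** (1.0 / years) - 1.0

# Industrialized-country industrial energy use: 49 EJ (1971) to 54 EJ (1990)
rate = cagr(49.0, 54.0, 1990 - 1971)
print(f"{rate:.2%} per year")  # roughly 0.5% per year
```

The same function reproduces the other rates in this section when applied to their respective endpoints.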
The industrial sector dominates in the economies in transition, accounting for more than 50 percent of total primary energy demand, the result of the emphasis on materials production, a long-term policy promoted under years of central planning. Average annual growth in industrial energy use in this region was 2.0 percent between 1971 and 1990 (from 26 EJ to 38 EJ), but consumption then dropped by an average of 7.3 percent per year between 1990 and 1995.
In the Asian developing countries, industrial energy use grew rapidly between 1971 and 1995, with an annual average growth rate of 5.9 percent (from a base of 9 EJ in 1971). Industry also accounted for the greatest share of primary energy consumption, between 57 percent and 60 percent. The fastest growth in this sector was in China and in other rapidly developing Asian countries. Growth in other developing countries was slightly lower.
The nature and evolution of the industrial sector vary considerably among developing countries. Some economies that are experiencing continued expansion in energy-intensive industry, such as China and India, show relatively unchanging shares of industrial energy use. In other countries, such as Thailand and Mexico, the share and/or growth of the transportation sector dominates. Many smaller countries have remained primarily agrarian societies with modest manufacturing infrastructure.

Table 1.

| Sector | Energy Intensity (primary energy) | Energy Costs (share of production costs) | Energy Intensity (primary energy) | Energy Costs (share of production costs) | Energy Intensity (primary energy) | Energy Costs (share of production costs) |
|---|---|---|---|---|---|---|
| Iron & Steel | 30.5 GJ/t | 7% | 27.8 GJ/t | 11% | 25.4 GJ/t | 8% |
| Pulp & Paper | 43.1 GJ/t | 6% | 42.7 GJ/t | 6% | 32.8 GJ/t | 6% |
| Cement | 7.3 GJ/t | 40% | 5.2 GJ/t | 36% | 5.4 GJ/t | 33% |
| Primary Aluminum | N/A | 14% | 17.6 MWh/t | 19% | 16.2 MWh/t | 13% |
| Petroleum Refining | 6.2 GJ/t | 4% | 4.3 GJ/t | 3% | 4.5 GJ/t | 3% |
ENERGY INTENSITY TRENDS
In aggregate terms, studies have shown that technical efficiency improvements of 1 to 2 percent per year have been observed in the industrial sector in the past. Between 1975 and 1983, during and after the years of major oil price increases, U.S. industrial energy intensity declined by 3.5 percent per year. Between 1984 and 1994 industrial energy intensity declined by less than 1 percent per year on average (Brown et al., 1998). Figure 1 gives an overview of economic energy intensity trends in the industrial sector in industrialized countries.
The trends demonstrate the capability of industry to improve energy efficiency when it has the incentive to do so. Energy requirements can be cut by new process development. In addition, the amount of raw materials demanded by a society tends to decline as countries reach certain stages of industrial development, which leads to a decrease in industrial energy use. Accounting for trends in structural shift, material intensity, and technical energy efficiency, and for their interactions, can be extremely difficult.

To understand trends in energy intensity it is important to analyze the structure of the industrial sector. Industrial energy use can be broken down into that of the energy-intensive industries (e.g., primary metals, pulp and paper, primary chemicals, oil refining, building materials) and the non-energy-intensive industries (e.g., electronics and food). Reduction of energy intensity is closely linked to how structure, structural change, and efficiency improvement are defined. Decomposition analysis is used to distinguish the effects of structural change from those of efficiency improvement. Structural change can itself be broken down into intra-sectoral change (e.g., a shift toward more recycled steel, or from steel to aluminum within the basic metals industry) and intersectoral change (e.g., a shift in output from heavy industry toward light industry). A wide body of literature describes decomposition analyses and explains the trends in energy intensities and efficiency improvement. Decomposition analyses of the aggregate manufacturing sector exist mainly for industrialized Western countries, but also for China, Taiwan, and selected other countries, including those in Eastern Europe. The results show that different patterns exist for various countries, which may be due to specific conditions as well as differences in driving forces such as energy prices and other policies in these countries. More detailed analyses at the subsector level are needed to understand these trends better.
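To make the idea of decomposition concrete, the following sketch performs a simple Laspeyres-style decomposition on hypothetical two-subsector data. The shares and intensities below are invented for illustration only; real decomposition studies use detailed sector accounts and often Divisia-type indices instead.

```python
# Decompose a change in aggregate energy intensity into a structure effect
# (shifting output shares) and an efficiency effect (changing subsector
# intensities). All numbers are hypothetical.

# (output share, energy intensity in MJ per dollar of output) per subsector
base = {"steel": (0.30, 25.0), "electronics": (0.70, 2.0)}
later = {"steel": (0.20, 22.0), "electronics": (0.80, 1.8)}

def aggregate_intensity(sectors):
    """Output-share-weighted average energy intensity."""
    return sum(share * intensity for share, intensity in sectors.values())

i0 = aggregate_intensity(base)
i1 = aggregate_intensity(later)

# Structure effect: change the shares, hold base-year intensities fixed.
structure = sum(later[s][0] * base[s][1] for s in base) - i0
# Efficiency effect: change the intensities, hold base-year shares fixed.
efficiency = sum(base[s][0] * later[s][1] for s in base) - i0
# The two effects do not sum exactly to the total change; the remainder
# is the interaction (residual) term characteristic of Laspeyres indices.
residual = (i1 - i0) - (structure + efficiency)

print(f"total change: {i1 - i0:.2f} MJ/$")
print(f"structure: {structure:.2f}, efficiency: {efficiency:.2f}, "
      f"residual: {residual:.2f}")
```

In this toy example both the shift away from steel and the improvement within each subsector reduce the aggregate intensity, illustrating why the two effects must be separated before attributing an observed decline to "efficiency."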
Changes in energy intensities also can be disaggregated into structural changes and efficiency improvements at the subsector level. In the iron and steel industry, energy intensity is influenced by the raw materials used (i.e., iron ore, scrap) and the products produced (e.g., slabs, or thin rolled sheets). A recent study on the iron and steel industry used physical indicators for production to study trends in seven countries which together produced almost half of the world's steel. Figure 2 shows the trends in physical energy intensity in these countries, expressed as primary energy used per metric ton of crude steel. The large differences in intensity among the countries are shown, as well as the trends toward reduced intensity in most countries. Actual rates of energy efficiency improvement varied between 0.0 percent and 1.8 percent per year, while in the case of the restructuring economy of Poland, the energy intensity increased.
ENERGY SERVICES AND ENERGY EFFICIENCY
Energy is used to provide a service, such as producing a ton of steel or lighting a specified area. These are called energy services. Energy efficiency improvement entails providing these services using less energy. About half of industrial energy use is for specific processes in the energy-intensive industries. In addition, various general energy conversion technologies and end uses can be distinguished, such as steam production, motive power, and lighting. Hence, energy use in manufacturing can be broken down into the various uses that provide a variety of services. A common breakdown distinguishes energy use for buildings, processes, and utilities such as boilers; the boilers provide steam and hot water to the processes and the buildings. Because of the wide variety of industrial processes, we limit our discussion to two energy-intensive sectors—the iron and steel and the pulp and paper industries—as well as boilers, to illustrate important cross-cutting energy-consuming processes in industry.
Iron and Steel Industry
The first recorded use of iron goes back to 2500–2000 B.C.E., and the first deliberate production of iron began in about 1300 B.C.E., in small furnaces fired with charcoal. Evidence of such furnaces has been found in Africa, Asia, and Central Europe. The relatively low temperatures in these furnaces led to low-quality iron, and the slag had to be removed by hammering. High-temperature processes began to be introduced in Germany in about 1300 C.E.; the design of these furnaces is essentially the same as that of modern blast furnaces. The furnaces still used charcoal, and the first use of coke was reported in the United Kingdom in 1718. The higher strength of coke allowed larger furnaces to be built, increasing energy efficiency. By 1790 coke ironmaking accounted for 90 percent of British iron production. Between 1760 and 1800, energy use declined by about 2 percent per year, mainly through the use of steam engines permitting higher blast pressures. During the nineteenth century, coke demand was further reduced by 1 percent per year. The development of the modern blast furnace after World War II resulted in an annual reduction of energy intensity of 3 to 4 percent, due to the use of improved raw materials, ore agglomeration, larger blast furnaces, and higher air temperatures. Today the blast furnace is the main process used to make iron, and it provides the largest raw material stream in steelmaking.
Steel is made by reducing the carbon content of the iron to levels below 2 percent, which reduces the brittleness of the material and makes it easier to shape. The first steelmaking process was invented in 1855 by Henry Bessemer and was in commercial operation by 1860. In the Bessemer converter, air was blown through the hot iron, oxidizing the carbon, a principle still followed in modern steelmaking processes. In the United States the last Bessemer converter was retired in the 1960s. In the late nineteenth century the open-hearth furnace (OHF), or Siemens-Martin furnace, was invented; it uses preheated air and fuels to oxidize the carbon and melt the steel. This process is currently found only in developing countries and in Eastern Europe; the United States was among the last industrialized countries to phase out the OHF. In the 1980s the dominant process became the basic oxygen furnace (BOF), which uses pure oxygen instead of air. The BOF process was developed in Austria in the 1950s. Its productivity is much higher than that of the OHF, as is its energy efficiency. An alternative process is the electric arc furnace (EAF), which is mainly used to melt scrap. The performance of EAFs has improved tremendously, and fuel and oxygen are starting to be used besides electricity. In the future the BOF and EAF processes are expected to follow similar developmental paths.
Liquid steel is cast into ingots or slabs and shaped in rolling mills into the final product. Although most energy use is concentrated in ironmaking and steelmaking, reduced material losses and productivity gains in casting and shaping (e.g., continuous casting and, currently, thin slab casting) have contributed to dramatic increases in the energy efficiency of steelmaking.
Today, the U.S. iron and steel industry is made up of integrated steel mills, which produce pig iron from raw materials (iron ore, coke) using a blast furnace and steel using a BOF, and secondary steel mills (minimills), which produce steel from scrap steel, pig iron, or direct reduced iron (DRI) using an EAF. The majority of steel produced in the United States comes from integrated steel mills, although the share of minimills is increasing, having grown from 15 percent of production in 1970 to 40 percent in 1995. There were 142 operating steel plants in the United States in 1997, of which 20 were integrated steel mills and 122 were minimills. The integrated mills are most often located near, or with easy access to, the primary resources; in the United States, for example, they are concentrated in the Great Lakes region, near supplies of coal and iron ore and near key customers such as automobile manufacturers.
The worldwide average energy intensity of steelmaking, expressed in gigajoules (GJ) per metric ton of steel, varies widely among countries and plants (see Figure 2). Today the most energy-efficient processes would use 19 GJ per metric ton for integrated steelmaking and 7 GJ per metric ton for making steel out of scrap. Analyses have shown that many technologies exist that could improve energy efficiency further. For example, in the United States the potential for energy efficiency improvement is estimated at 18 percent using proven and cost-effective practices and technologies. New technologies under development could lower the energy intensity of steelmaking considerably. Smelt reduction in ironmaking would integrate the production of coke and agglomerated ore with ironmaking itself, reducing production costs and energy use. The development of direct casting techniques that do away with rolling would increase productivity while reducing energy use further. Combined, these technologies could reduce the energy intensity of primary steelmaking to 12.5 GJ per metric ton of steel and that of secondary steelmaking to 3.5 GJ per metric ton, reductions of 34 percent and 50 percent, respectively. In the highly competitive and globalizing steel industry, manufacturers must continuously look for ways to lower their energy intensity and costs.
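The quoted reduction percentages follow directly from the intensity figures given above; a minimal check:

```python
def reduction(current, future):
    """Relative energy-intensity reduction, as a fraction."""
    return (current - future) / current

# Best-practice intensities from the text, in GJ per metric ton of steel:
primary = reduction(19.0, 12.5)    # integrated (ore-based) steelmaking
secondary = reduction(7.0, 3.5)    # scrap-based (EAF) steelmaking
print(f"primary: {primary:.0%}, secondary: {secondary:.0%}")  # 34% and 50%
```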
Pulp and Paper Industry
Paper consists of aligned cellulose fibers. The fibers may come from wood or other crops, or from recycled waste paper. Starting with wood, the fibers must be separated from the rest of the wood by pulping, which can be done chemically, thermally, or mechanically. In the chemical pulping process, chemicals and hot water are used to separate the cellulose from the lignin. The amount of pulp produced is about half the amount of wood used, and chemical pulping results in high-quality paper. In the mechanical process the wood is ground under pressure, separating the fibers from one another. In mechanical pulping the lignin is generally not removed, resulting in a lower quality of paper (e.g., newsprint) but a higher recovery (about 90 percent) of the wood used. Chemical pulping consumes large amounts of steam to heat the water and concentrate the chemical by-products; however, recovery of the by-products for recycling in the process can actually produce sufficient steam for the whole paper mill. The mechanical process uses large quantities of electricity, although some processes can recover steam from the grinding process. Chemical pulp can be bleached to produce white paper. Waste paper is pulped by mixing it with water, after which the ink is removed and the pulp refined. Paper recycling reduces the energy needs of the pulping process. Waste paper use in the production of paper varies widely, due to the different structures of the industry in different countries.
While energy efficiency improvement options do exist in the pulping step, greater opportunities exist in the chemical recovery step. The most common pulping process in the United States is the Kraft process, which produces black liquor as a by-product. The chemicals are currently recovered in a recovery boiler that combusts the dissolved lignin. Because of the high water content of the black liquor, the efficiency of the recovery boiler is not very high; the steam raised is used to generate electricity in a steam turbine and to supply the processes. Gasification of black liquor would allow the generated gas to be used at a higher efficiency, which would make a Kraft pulp mill an electricity exporter (Nilsson et al., 1995).
In papermaking the pulp is diluted with water at a ratio of about 1:100, then screened and refined. The solution with the refined fibers (or stock) is fed into the paper machine, where the water is removed. In the paper machine the paper is formed into a sheet by dispersing the stock over a wire screen. At the end of the forming section, 80 percent of the water has been removed; the rest is removed in the pressing and drying sections. Although only a small share of the water is removed in the drying section, most of the energy is used there. Hence, energy efficiency efforts aim to reduce the water content entering the dryer by increasing water removal in pressing. In a long nip press the pressing area is enlarged, and the larger pressing area results in extra water removal. New technologies under development aim to increase drying efficiency considerably. One such technology, impulse drying, uses a heated roll to press out most of the water in the sheet; this may reduce the steam consumption of the paper machine by 60 to 80 percent.
Table 2.

| Energy Efficiency Measure | Typical Energy Savings (%) | Payback Estimate (years) |
|---|---|---|
| Reducing Steam Leaks | 3-5% | |
| Insulating Steam Pipes | 5-10% | 0.3-1.7 years |
| Condensate Return | 10-20% | <1 year |
| Process Integration & Heat Recovery | 5-40% | 2- |
| System Operation and Maintenance | | |
| Decentralization Steam Supply | <40% | <4 years |
| Combustion Air Preheating | <12% | <5 years |
| Boiler Feed Preheating | 2-10% | <4 years |
| New Low-NOx Boiler Type | >5% | n.a. |
| Monitoring and Control | 1-4% | <3 years |
The pulp and paper industry uses approximately 6 to 8 EJ globally. Because energy consumption and intensity depend on the amount of wood pulped, the type of pulp produced, and the paper grades produced, there is a great range in energy intensities among the industrialized countries of the world. In Europe, energy use for papermaking varied between 16 GJ and 30 GJ per metric ton of paper in 1989. The Netherlands used the least energy per metric ton of paper, largely because most of its pulp was imported. Countries such as Sweden and the United States have much higher energy intensities because of the larger amount of pulp they produce; Sweden and other net exporters of pulp also tend to show a higher energy intensity. Energy intensity is also influenced by the efficiency of the processes used. Many studies have shown considerable potential for energy efficiency improvement with current technologies (Worrell et al., 1994; Nilsson et al., 1995; Farla et al., 1997), such as heat recovery and improved pressing technologies.
Cross-Cutting: Steam Production and Use
Besides the energy-intensive industries, many smaller and less energy-intensive, or light, industries exist, including the food processing, metal engineering, and electronics industries. In light industries energy is generally a small portion of total production costs, and a wide variety of processes is used. Generally a large fraction of energy is used for space heating and cooling, in motors (e.g., fans and compressed air), and in boilers. Industrial boilers are used to produce steam or to heat water for space and process heating and for the generation of mechanical power and electricity; in some cases a boiler serves a dual function, such as the cogeneration of steam and electricity. The largest uses of industrial boilers by capacity are in the paper, chemical, food production, and petroleum industries. Steam generated in a boiler may be used throughout a plant or a site. Total installed boiler capacity (excluding cogeneration) in the United States is estimated at nearly 880 gigawatts, and total energy consumption for boilers in the United States is estimated at 9.9 EJ.
A systems approach can substantially reduce steam needs, cut emissions of air pollutants and greenhouse gases, and lower the operating costs of a facility. Such an approach assesses options throughout the steam system, incorporating a variety of measures and technologies (Zeitz, 1997), and can help to find low-cost options. Improved efficiency of steam use reduces steam needs and may reduce the capital outlay for expansion, reducing emissions and simplifying permitting at the same time. Table 2 summarizes various options to reduce losses in steam distribution and to improve system operation and the boiler itself. In specific cases, the steam boiler can be replaced almost entirely by a heat pump (or mechanical vapor recompression) to generate low-pressure steam; this replaces the fuel used for steam generation with electricity, and the resulting emission reductions depend on the type and efficiency of power generation.
Another option for reducing steam-system energy use is the cogeneration of heat and power (CHP) based on gas-turbine technology, which can substantially reduce the primary energy needed for steam making. Low- and medium-pressure steam can be generated in a waste heat boiler using the flue gases of a gas turbine. Classic cogeneration systems are based on a steam boiler and a back-pressure turbine; these systems have relatively low efficiency compared to a gas turbine system. Steam-turbine systems generally have a power-to-heat ratio between 0.15 (40 kWh/GJ) and 0.23 (60 kWh/GJ) (Nilsson et al., 1995). The appropriate power-to-heat ratio depends on the specific energy balance of the plant as well as on energy costs. A cogeneration plant is most often sized to the steam load of the plant, exporting excess electricity to the grid. The cost of installing a steam turbine system depends strongly on the capacity of the installation, while gas-turbine-based cogeneration plants are relatively cheap. In many places (e.g., the Netherlands and Scandinavia) gas turbine cogeneration systems are standard in paper mills. Their power-to-heat ratio is generally higher than that of steam turbine systems; aero-derivative gas turbines may reach 70 to 80 kWh/GJ. Aero-derivative turbines are available at low capacities, but the specific costs of gas turbines decrease sharply with larger capacities.
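The two ways of expressing the power-to-heat ratio used above, a dimensionless ratio versus kWh of electricity per GJ of steam, are connected by the conversion 1 kWh = 3.6 MJ; the pairings quoted in the text (0.15 with 40 kWh/GJ, 0.23 with 60 kWh/GJ) hold to within rounding. A small conversion sketch:

```python
MJ_PER_KWH = 3.6  # 1 kWh of electricity = 3.6 MJ

def power_to_heat_ratio(kwh_per_gj_steam):
    """Dimensionless power-to-heat ratio from kWh electricity per GJ of steam."""
    return kwh_per_gj_steam * MJ_PER_KWH / 1000.0  # MJ -> GJ

for kwh in (40, 60, 80):
    print(f"{kwh} kWh/GJ -> ratio {power_to_heat_ratio(kwh):.2f}")
```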
POTENTIAL FOR ENERGY EFFICIENCY IMPROVEMENT
Much of the potential for improvement in technical energy efficiencies in industrial processes depends on how closely such processes have approached their thermodynamic limit. There are two types of energy efficiency measures: (1) more efficient use in existing equipment through improved operation, maintenance or retrofit of equipment and (2) use of more efficient new equipment by introducing more efficient processes and systems at the point of capital turnover or expansion of production. More efficient practices and new technologies exist for all industrial sectors. Table 2 outlines some examples of energy efficiency improvement techniques and practices.
A large number of energy-efficient technologies are available in the steel industry (see Table 3), including continuous casting, energy recovery, and increased recycling. Large technical potentials, ranging from 25 to 50 percent, exist in most countries. New technologies are under development (e.g., smelt reduction and near-net-shape casting) that will reduce energy consumption as well as environmental pollution and capital costs. In the chemical industry, a few bulk chemicals, such as ammonia and ethylene, account for the bulk of energy use. Potentials for energy savings in ammonia making are estimated at up to 35 percent in Europe and between 20 and 30 percent in Southeast Asia.
Table 3. Examples of energy-efficient technologies and practices by sector

Iron and Steel
- Heat recovery for steam generation, pre-heating combustion air, and high-efficiency burners
- Adjustable speed drives, heat recovery from coke oven gases, and dry coke quenching
- Efficient hot blast stove operation, waste heat recovery for hot blast stove, top-gas power recovery turbines, direct coal injection
- Recovery of BOF gas, recovery of sensible heat of BOF gas, closed BOF gas system, optimized oxygen production, increased scrap use, efficient tundish preheating
- UHP process, oxy-fuel injection for EAF plants, and scrap preheating
- Heat recovery (steam generation), recovery of inert gases, efficient ladle preheating
- Use of continuous casting, "hot connection" or direct rolling, recuperative burners
- Heat recovery, efficient burners in annealing and pickling lines, continuous annealing operation

Chemicals
- Process management and thermal integration (e.g., optimization of steam networks, heat cascading, low- and high-temperature heat recovery, heat transformers), mechanical vapor recompression
- New compressor types
- Adjustable speed drives
- Selective steam cracking, membranes
- High-temperature cogeneration and heat pumps

Petroleum Refining
- Reflux overhead vapor recompression, staged crude pre-heat, mechanical vacuum pumps
- Fluid coking to gasification, turbine power recovery train at the FCC, hydraulic turbine power recovery, membrane hydrogen purification unit to hydrocracker recycle loop
- Improved catalysts (reforming), and hydraulic turbine power recovery
- Process management and integration

Pulp and Paper
- Continuous digester, displacement heating/batch digesters, chemi-mechanical pulping
- Black liquor gasification/gas-turbine cogeneration
- Oxygen predelignification, oxygen bleaching, displacement bleaching
- Tampella recovery system, falling film black liquor evaporation, lime kiln modifications
- Long nip press, impulse drying, and other advanced paper machines
- Improved boiler design/operation (cogeneration), and distributed control systems

Cement
- Improved grinding media and linings, roller mills, high-efficiency classifiers, wet-process slurry dewatering with filter presses
- Multi-stage preheating, pre-calciners, kiln combustion system improvements, enhancement of internal heat transfer in kiln, kiln shell loss reduction, optimized heat transfer in clinker cooler, use of waste fuels
- Blended cements, cogeneration
- Modified ball mill configuration, particle size distribution control, improved grinding media and linings, high-pressure roller press for clinker pre-grinding, high-efficiency classifiers, roller mills
Energy savings in petroleum refining are possible through improved process integration, cogeneration, energy recovery, and improved catalysts. Compared to state-of-the-art technology, the savings in industrialized countries are estimated at 15 to 20 percent, and higher for developing countries. Large potentials for energy savings exist in nearly all process stages of pulp and paper production (e.g., improved dewatering technologies, energy and waste heat recovery, and new pulping technologies). Technical potentials are estimated at up to 40 percent, with higher long-term potentials (see above). Energy savings in cement production are possible through increased use of additives (replacing the energy-intensive clinker), use of the dry process, and a large number of energy efficiency measures (such as reducing heat losses and using waste as fuel). Energy savings potentials of up to 50 percent exist in the cement industry in many countries through efficiency improvement and the use of wastes such as blast furnace slag and fly ash in cementmaking.
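The clinker-substitution route to savings in cement can be illustrated with a back-of-the-envelope calculation; the energy intensities and clinker fractions below are assumed round numbers for illustration, not figures from the text:

```python
# Illustrative estimate of energy saved by blended cement: replacing part of
# the energy-intensive clinker with blast furnace slag or fly ash.
# Both intensity constants are assumed round numbers, not source data.

CLINKER_GJ_PER_T = 3.5   # assumed fuel use per tonne of clinker
GRINDING_GJ_PER_T = 0.3  # assumed grinding/auxiliary energy per tonne of cement

def cement_energy(clinker_fraction):
    """Energy (GJ) per tonne of cement at a given clinker-to-cement ratio."""
    return clinker_fraction * CLINKER_GJ_PER_T + GRINDING_GJ_PER_T

portland = cement_energy(0.95)  # ordinary Portland cement
blended = cement_energy(0.65)   # blended cement with slag/fly ash
print(round(1 - blended / portland, 2))  # 0.29, i.e. roughly 29% savings
```

Combined with kiln-efficiency measures, savings of this magnitude make the up-to-50-percent potential quoted above plausible.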
In the United States various studies have assessed the potential for energy efficiency improvement in industry. One study has assessed the technologies for various sectors and found potential economic energy savings of 7 to 13 percent over the business-as-usual trends (Brown et al., 1998) between 1990 and 2010. Technologies like the ones described above (see Table 2) are important in achieving these potentials.
However, barriers may partially block the uptake of these technologies. Barriers to efficiency improvement include unwillingness to invest, lack of available and accessible information, economic disincentives, and organizational barriers. The degree to which a barrier limits efficiency improvement depends strongly on the situation of the actor (e.g., small companies versus large industries). A range of policy instruments is available, and innovative approaches or combinations have been tried in some countries. Successful policy can combine regulations (e.g., product standards) and guidelines; economic instruments and incentives; voluntary agreements and actions; information, education, and training; and research, development, and demonstration policies. Successful policies with proven track records in several sectors include technology development and utility/government programs and partnerships. Improved international cooperation to develop policy instruments and technologies that meet developing-country needs will be necessary, especially in light of the large anticipated growth of manufacturing industry in those countries.
Manufacturing industry is a large energy user in almost all countries. About half of industrial energy use is in specific processes in the energy-intensive industries. On the other hand, various general energy conversion technologies and end uses can also be distinguished, such as steam production, motive power, and lighting. Opportunities and potentials exist for energy savings through energy efficiency improvement in all sectors and countries. Technology development, and policies aimed at dissemination and implementation of these technologies, can help to realize the potential benefits. Technologies do not now, nor will they in the foreseeable future, provide a limitation on continuing energy efficiency improvements.
Ang, B. W. (1995) "Decomposition Methodology in Industrial Energy Demand Analysis." Energy 20(11):1081-1096.
Ang, B. W., and Pandiyan, G. (1997) "Decomposition of Energy-Induced CO2 Emissions in Manufacturing." Energy Economics 19:363-374.
Brown, M. A.; Levine, M. D.; Romm, J. P.; Rosenfeld, A. H.; and Koomey, J. G. (1998). "Engineering-Economic Studies of Energy Technologies to Reduce Greenhouse Gas Emissions: Opportunities and Challenges." Annual Review of Energy and the Environment 23:287-385.
Center for the Analysis and Dissemination of Demonstrated Energy Technologies (CADDET). (1999). CADDET Register on Demonstration Projects. Sittard, Netherlands: Author/IEA.
De Beer, J.; Worrell, E.; and Blok, K. (1998). "Future Technologies for Energy Efficient Iron and Steel Making." Annual Review of Energy and the Environment 23:123-205.
De Beer, J.; Worrell, E.; and Blok, K. (1998). "Long-Term Energy-Efficiency Improvements in the Paper and Board Industries." Energy 23:21-42.
Farla, J.; Blok, J.; and Schipper, L. (1997). "Energy Efficiency Developments in the Pulp and Paper Industry." Energy Policy 25:745-758.
Howarth, R. B.; Schipper, L.; Duerr, P. A.; and Strom, S. (1991). "Manufacturing Energy Use in Eight OECD Countries, Decomposing the Impacts of Changes in Output, Industry Structure, and Energy Intensity." Energy Economics 13:135-142.
IEA. (1997). Indicators of Energy Use and Efficiency: Understanding the Link Between Energy and Human Activity. Paris: Author/OECD.
Jones, T. (1997). "Steam Partnership: Improving Steam System Efficiency Through Marketplace Partnerships." Proceedings 1997 ACEEE Summer Study on Energy Efficiency in Industry. Washington, DC: ACEEE.
LBNL. (1998). OECD Database. Berkeley, CA: Lawrence Berkeley National Laboratory.
Li, J.-W.; Shrestha, R. M.; and Foel, W. K. (1990). "Structural Change and Energy Use: The Case of the Manufacturing Industry in Taiwan." Energy Economics 12:109-115.
Nilsson, L. J.; Larson, E. D.; Gilbreath, K. R.; and Gupta, A. (1995). "Energy Efficiency and the Pulp and Paper Industry." Washington DC: American Council for an Energy Efficient Economy.
Park, S.-H.; Dissmann, B.; and Nam, K.-Y. (1993). "A Cross-Country Decomposition Analysis of Manufacturing Energy Consumption." Energy 18:843-858.
Price, L.; Michaelis, L.; Worrell, E.; and Khrushch, M. (1998). "Sectoral Trends and Driving Forces of Global Energy Use and Greenhouse Gas Emissions." Mitigation and Adaptation Strategies for Global Change 3:263-319.
Prindle, W.; Farfomak, P.; and Jones, T. (1995). "Potential Energy Conservation from Insulation Improvements in U.S. Industrial Facilities." Proceedings 1995 ACEEE Summer Study on Energy Efficiency in Industry. Washington, DC: ACEEE.
Ross, M. H., and Steinmeyer, D. (1990). "Energy for Industry." Scientific American 263:89-98.
Sinton, J. E., and Levine, M. D. (1994). "Changing Energy Intensity in Chinese Industry." Energy Policy 21:239-255.
WEC. (1995). Energy Efficiency Utilizing High Technology: An Assessment of Energy Use in Industry and Buildings, prepared by M. D. Levine, E. Worrell, N. Martin, and L. Price. London: Author.
Worrell, E.; Cuelenaere, R. F. A.; Blok, K.; and Turkenburg, W. C. (1994). "Energy Consumption by Industrial Processes in the European Union." Energy 19:1113-1129.
Worrell, E.; Levine, M. D.; Price, L. K.; Martin, N. C.; van den Broek, R.; and Blok, K. (1997). Potential and Policy Implications of Energy and Material Efficiency Improvement. New York: UN Commission for Sustainable Development.
Worrell, E.; Price, L.; Martin, N.; Farla, J.; and Schaeffer, R. (1997). "Energy Intensity in the Iron and Steel Industry: A Comparison of Physical and Economic Indicators." Energy Policy 25(7-8):727-744.
Worrell, E.; Martin, N.; and Price, L. (1999). "Energy Efficiency and Carbon Emission Reduction Opportunities in the U.S. Iron and Steel Industry." Berkeley, CA: Lawrence Berkeley National Laboratory.
Zeitz, R. A., ed. (1997). CIBO Energy Efficiency Handbook. Burke, VA: Council of Industrial Boiler Owners.
Goods made from raw materials, originally by hand; also those made by machinery.
In antiquity and into the Byzantine Empire, the Middle East was the center of Western civilization and the region from which a wide variety of goods were first made and traded. The settled farming society allowed time for handicrafts, between crop work, and for market days and market towns. Regional trade became established by land caravan, by riverboats, and by coastal vessels that sailed the Mediterranean, the east coast of Africa, and beyond Arabia, into the Indian Ocean.
The ancient Near East was the seat of civilizations that traded with one another—luxury goods for the urban elite and utilitarian items for both urban dwellers and for rural agricultural, herding, and artisan folk. Specialty products included textiles, metals, glassware, pottery, chemicals, and, later, sugar and paper. By the fourteenth and fifteenth centuries, however, Europe had progressed to the point that it was exporting to the Middle East not only high technology goods, such as clocks and spectacles, but refined types of textiles, glassware, and metals. During the following centuries the flow from Europe to the Middle East increased; by the nineteenth century, Europe overwhelmed the region with goods produced cheaply and abundantly by the machinery of the Industrial Revolution, including the railroads and steamships that transported them. The Anglo–Ottoman treaty of 1838 (called the Convention of Balta Liman) fixed import duties to the Ottoman Empire at a low 8 percent. These factors drove thousands of Middle Eastern craftsmen and artisans out of business, but some managed to retain their shops and others found employment in the new textile factories of the late nineteenth century.
World War I exposed the region's lack of industry and, with the achievement of total or partial independence, the various governments began to take measures to encourage development. Around 1930, the Commercial and Navigation Treaties regulating tariffs lapsed, and most countries regained full fiscal autonomy. They immediately raised tariffs to favor local industry. They also promoted manufacturing in various other ways, such as encouraging people to buy national goods and giving such goods preference for government purchases. Moreover, they set up special banking, such as the Sümer and Eti banks in Turkey and the Agricultural and Industrial banks of Iran and Iraq, to promote manufacturing and mining; they also channeled credit through existing banks, such as Bank Misr in Egypt. Local entrepreneurs also became more active in the economic field, including manufacturing. In Egypt, the Misr and Abboud groups set up various industries, and in Turkey, the Iş Bank promoted development. In Palestine, where some European and Russian Jewish immigrants brought with them both capital and skills, some set up factories or workshops in a wide variety of fields.
It is difficult to estimate the rate of industrial growth: In Turkey, between 1929 and 1938, net manufacturing production increased at 7.5 percent a year and mining advanced at about the same pace. In Egypt, the rate of growth was slightly lower and in the Jewish sector of Palestine distinctly higher. In Iran, between 1926 and 1940, some 150 factories were established with a paid-up capital of about US$150 million and employing 35,000 persons. Nevertheless, industry still played a minor role in the basically agricultural Middle Eastern economy. By 1939, employment in manufacturing and mining was everywhere less than 10 percent of the labor force, and in most of the countries it was closer to 5 percent. Industry's contribution to gross domestic product (GDP) was put at 8 percent in Egypt, 12 in Turkey, and 20 in the Jewish sector of Palestine; in the other countries it was lower. Industry still depended on imports of machinery, spare parts, raw materials, and technicians—and there were no exports of manufactured goods. A wide range of light industries, including textiles, food processing, building materials, and simple chemicals, had developed in Egypt, Turkey, Iran, Palestine, and, to a smaller extent, in Lebanon, Syria, and Iraq. In addition, Turkey had the beginnings of heavy industry—iron, steel, and coal. Petroleum production and refining had become important to Iran, Bahrain, and Iraq. Several countries were meeting most of their requirements of such basic consumer goods as textiles, refined sugar, shoes, matches, and cement.
World War II gave great stimulus to Middle Eastern industry. Imports were drastically reduced and Allied troops provided a huge market for many goods. The Anglo–American Middle East Supply Center helped by providing parts, materials, and technical assistance. By 1945, total output had increased by some 50 percent. With the resumption of trade, from 1946 to 1950, many firms were hit by foreign competition, but the governments gave them tariff and other protection, so output continued to grow at about 10 percent per annum from 1946 to 1953. This rate was maintained, and in some countries (like Iran) exceeded through the 1970s, but in the 1980s it fell off sharply because of such factors as the Iran–Iraq War, the Sudanese and Lebanese civil wars, and the 1980s fall in oil prices.
Table 1 shows a breakdown of the structure of Middle Eastern industry. The main industries are still textiles (including garments); food processing (sugar refining, dough products, confectionery, soft drinks, beer); tobacco; building materials (cement, bricks, glass, sanitary ware); and assembly plants for automobiles, refrigerators, radio and television sets, and so forth, with some of the components produced locally. Important new industries have also developed—notably chemicals—including basic products, fertilizers, and various kinds of plastics; basic metals and metal products; and many types of machinery. A particularly rapidly growing branch is petrochemicals, using gases produced in the oil fields or in refineries. Only in petrochemicals, textiles, and food processing does the region's share approach or exceed 5 percent of world output. Similarly, only in phosphates and chromium is the region's share of mineral production significant.
Israel, however, has a large diamond-cutting industry and is a significant exporter of precision instruments. It is also a large exporter of arms, as is Egypt; in the late 1980s each country exported more than US$1 billion worth of weapons; they ranked third and fourth, respectively, among exporters from developing countries, and twelfth and fifteenth, among world exporters of arms. The Arab boycott has, of course, restricted some of Israel's economic pursuits within the region as well as with some international trade.
[Table 1. Value added in Middle Eastern manufacturing (millions of U.S. dollars) and its percentage distribution by branch: food, beverages, and tobacco; textiles and clothing; machinery and transport equipment. Source: World Bank, World Development Report, 1990, table 6; World Development Report, 1986, table 7. Table by GGS Information Services, The Gale Group.]

Today, manufacturing plays an important part in the Middle East's economy, accounting in many countries for 15 to 20 percent of GDP. Industry, in the broader sense, which includes mining (and therefore oil), construction, electricity, water, and gas as well as manufacturing, generally constitutes over 30 percent of GDP. In the major oil nations it is 60 percent or more, usually employing 20 to 30 percent of the labor force (including immigrant labor).
Factors for Low Productivity
With rare exceptions, industries still export very little and survive through government protection. Productivity is low; for example, gross annual value added per worker in 1974 was only US$4,000 to US$5,000 in most countries (compared to $20,000 in West Germany). This is particularly marked in the more capital-intensive industries, such as steel, automobiles, and aircraft. In the late 1970s, in the Turkish state-owned steel mill in Iskenderun, a ton of steel took 72 worker-hours, compared with 5 in the United States and 7 in Europe; in Egypt, annual output per worker in the automobile industry was one car, compared with 30 to 50 in leading Japanese firms. In the more labor-intensive industries, such as textiles, however, physical output per worker is about 30 to 50 percent of European output. Here, very low wages offset low productivity and enable the Middle East to compete. In 1980, hourly wages in the textile industry were US$1 in Syria and Turkey and 40 cents in Egypt, compared to US$8.25 in Western Europe.
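The claim that low wages offset low productivity in labor-intensive branches, but not in capital-intensive ones, can be checked with simple arithmetic. The figures below come from the passage above; note that pairing the 1980 textile wages with the late-1970s steel worker-hours is purely for illustration:

```python
# Unit labor cost = hourly wage * worker-hours per unit of output.

def unit_labor_cost(wage_per_hour, hours_per_unit):
    return wage_per_hour * hours_per_unit

# Steel: even at very low wages, 72 worker-hours per ton costs more than
# 7 high-wage European worker-hours per ton.
print(unit_labor_cost(1.00, 72))   # 72.0 dollars per ton (Iskenderun)
print(unit_labor_cost(8.25, 7))    # 57.75 dollars per ton (Europe)

# Textiles: output per worker ~40% of European levels, wages ~1/8 as high,
# so relative unit labor cost = wage ratio / productivity ratio.
relative_cost = (1.00 / 8.25) / 0.40
print(round(relative_cost, 2))     # 0.3 -- about 30% of the European level
```

The arithmetic matches the text: wages cannot rescue the steel mill, but they make Middle Eastern textiles competitive.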
Low productivity in the Middle East is caused by many factors. First, capital investment per employee is low, although governments have poured large amounts into industry; in the late 1970s the share of manufacturing, mining (including oil), and energy was over 40 percent of total investment in Egypt, Iraq, and Syria, and 30 percent in Iran. In the Gulf region's petrochemical industry, however, capital intensity is high and up-to-date machinery is used. Second, industry is greatly overstaffed; many governments compel firms to take on more workers—to relieve unemployment or for other political purposes. Third, the poor health, education, and housing of workers adversely affect their productivity—but conditions are improving. Fourth, there has been much bad planning, with factories being located far from suitable raw materials or good transport.
General conditions are also unfavorable for industrial development. The region is, on the whole, poor in raw materials. Wood and water have become very scarce. Minerals are generally sparse, remote, and often low grade. Most agricultural raw materials are of poor quality, lacking the uniformity required for industrial processes. The protection given to manufacturers of producers' goods (e.g., metals, chemicals, sugar) creates a handicap for industries that use their products. The main exceptions are natural and refinery gas, which are available almost free of cost, and raw cotton, which is of fine quality. The small size of the local market makes it impossible to set up factories of optimum size and the general underdevelopment of industries prevents profitable linkages among industries; both factors raise unit costs. Although the infrastructure has greatly improved, it still does not serve manufacturing adequately; for example, the frequency of power failures led many firms to install their own generators and transport costs remain high. A dependence on imported machinery, spare parts, and raw materials, although declining, is still great—hence, when a shortage of foreign exchange curtails imports, factories work below capacity, further raising unit costs.
Middle East industry also suffers from a lack of competition. Because of the small size of the local market and the high degree of protection, firms often enjoy a quasi monopoly—and behave accordingly. Finally, a great shortage of industrial skills exists at both the supervisory and foreperson levels. Even more serious is the shortage of managers; this is compounded where the government has nationalized the bulk of industry—as in Egypt, Iran, Iraq, Sudan, and Syria. Here market discipline has been replaced by bureaucratic control, so efficiency has been sharply reduced.
On the whole, then, manufacturing does not make the contribution to the Middle East's economy commensurate with either the efforts or the capital invested in it. Conditions may be expected to improve, however, as the society and the economy continue to develop and as some measure of peace takes hold.
See also Arab Boycott; Balta Liman, Convention of (1838); Commercial and Navigation Treaties; Trade.
Manufacturing Control via the Internet
Manufacturing control via the Internet (e-manufacturing) refers to the process of integrating information and communication networks, as well as Internet-supported robotics, into a firm's production systems, processes, and structures. The firm uses the Internet to link the production equipment and control functions of its manufacturing systems through a series of real-time monitoring tools such as computers and mobile communication devices. E-manufacturing offers many advantages, including ubiquitous accessibility, remote monitoring, real-time communication, and increased production efficiency as a result of information-integrated production systems.
The linkage of manufacturing operations to the global communication infrastructure relies heavily on the Internet's networking technology. The Internet consists of physically networked servers and communication links that relay information between Web servers and client computers. The advent of the Internet and the subsequent advancement of digital technologies have opened new economic frontiers characterized by revolutionary, information-driven economic institutions. Increased access to global information and communication infrastructure has closed the gaps among consumers, manufacturers, and suppliers by lowering political, economic, and geographic barriers.
BRIEF HISTORY OF THE INTERNET
The origins of the Internet can be traced to the military research activities of the United States Department of Defense in the 1960s. In the book Internet Literacy, Fred Hofstetter credits the Department of Defense's Advanced Research Projects Agency (ARPA) with developing the first viable internetwork, ARPANET, in 1969. The main objective of ARPANET was to provide the U.S. military with a communication network that could withstand the disruption caused by enemy attacks. It would accomplish this by relying on sets of networked computers to transmit labeled and addressed packets of information to designated destinations, even if one or more of the computers along the way stopped functioning. Thus, in the event of an enemy attack (such as a massive bombing campaign), the packets of information would automatically be routed through alternative paths to their intended destinations.
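The rerouting idea described above can be sketched in a few lines of code: packets are forwarded hop by hop, and when a node fails, the search simply finds another path. The four-node network and its names are invented for illustration:

```python
# Minimal sketch of ARPANET-style resilient routing: find a path through a
# network of nodes, skipping any that have stopped functioning.
from collections import deque

def find_path(links, src, dst, dead=frozenset()):
    """Breadth-first search for a path from src to dst, avoiding dead nodes."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen and nxt not in dead:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# A tiny network: A can reach D either via B or via C.
links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(find_path(links, "A", "D"))              # ['A', 'B', 'D']
print(find_path(links, "A", "D", dead={"B"}))  # ['A', 'C', 'D'] -- rerouted
```

Real Internet routing protocols are far more elaborate, but the principle, automatic rerouting around failed nodes, is the same one ARPANET was built to demonstrate.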
Commercial use of the Internet dates from the early 1990s, following the liberalization of the National Science Foundation Network (NSFNET) in the United States. The move opened the routing of high-speed Internet traffic to interconnected Internet service providers (ISPs), providing easy access to pioneering online commerce and auction entities such as Amazon, eBay, and PayPal. The Internet has since become a powerful tool for trade, commerce, and manufacturing because of its formidable infrastructure spanning the globe.
THE INTERNET AND MANUFACTURING PROCESSES
The use of computer-based networking in production activities is rooted in the material requirements planning (MRP) framework, which is demand dependent and specifically geared to assembly operations in firms. MRP was developed in the 1960s and remains an important component of industrial manufacturing processes. Computerized MRP has evolved into MRP II, which links and streamlines operations in different departments such as marketing, purchasing, production planning and control, human resource management, and financial accounting.
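The demand-dependent logic at the heart of MRP can be sketched as a bill-of-materials (BOM) explosion: demand for a finished product is translated into gross requirements for every component. The product structure below is hypothetical:

```python
# Minimal sketch of an MRP bill-of-materials explosion (hypothetical BOM).

def explode(bom, item, qty, reqs=None):
    """Recursively accumulate component requirements for qty units of item."""
    if reqs is None:
        reqs = {}
    for component, per_unit in bom.get(item, {}).items():
        needed = qty * per_unit
        reqs[component] = reqs.get(component, 0) + needed
        explode(bom, component, needed, reqs)  # descend into sub-assemblies
    return reqs

bom = {
    "bicycle": {"wheel": 2, "frame": 1},
    "wheel": {"spoke": 32, "rim": 1},
}
print(explode(bom, "bicycle", 10))
# {'wheel': 20, 'spoke': 640, 'rim': 20, 'frame': 10}
```

A full MRP run would also net out on-hand inventory and offset by lead times; this sketch shows only the explosion step.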
MRP II integrates different functions in the firm into a central monitoring and decision system by collecting and relaying data and additional production inputs. The advancements in MRP II have given rise to enterprise resource planning (ERP), which integrates different types of industrial data, processes, and functions into a unified database through comprehensive linkages of software and hardware systems. Unlike MRP and MRP II, ERP has the capacity to link organizational functionalities through multiple systems. Instead of functions such as human resource management, production control, customer relations, financial accounting, and supply chain management existing in independent software applications and individual databases, ERP brings all these functions under one roof to share a single database and software applications.
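The architectural contrast described above, one shared database instead of independent applications, can be sketched as follows; the module functions, field names, and unit price are invented stand-ins for real ERP modules:

```python
# Sketch of ERP's single shared database (hypothetical modules and fields):
# production control and finance read and write the same record, so there
# are no separate copies to reconcile.

shared_db = {"order-1001": {"item": "pump", "qty": 50, "status": "open"}}

def production_complete(order_id):
    # Production-control module updates the shared record in place.
    shared_db[order_id]["status"] = "produced"

def invoice(order_id):
    # Finance module reads the same record; 20.0 is an assumed unit price.
    order = shared_db[order_id]
    return order["qty"] * 20.0 if order["status"] == "produced" else None

production_complete("order-1001")
print(invoice("order-1001"))  # 1000.0 -- both modules saw one record
```

Under MRP II, by contrast, finance and production would each hold their own copy of the order, and keeping those copies consistent is exactly the integration burden ERP removes.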
ERP's ability to streamline workflows, track processes, and improve productivity makes it easy for manufacturing companies to integrate e-manufacturing into the control and management of industrial production processes. Implementing e-manufacturing strategies through existing ERP systems can revolutionize the monitoring and functioning of machine engineering capacity, quality control, material control, and workflow processes.
Firms can use either in-house teams and software applications or outside software vendors and consultants to implement customized e-manufacturing systems. For example, Jain has contracted Rockwell Automation, a leading industrial automation software and service provider in the United States, to manage its real-time automation processes and track the company's manufacturing data.
Companies can also outsource the management of e-manufacturing to industrial automation and software companies that have global presence such as Oracle and IBM.
The twenty-first century has seen an unprecedented increase in the use of telephone modems, Ethernet and wireless connections, cable modems, digital subscriber lines (DSL), and satellite communications to access Web-based services such as e-mail, newsgroups, chat rooms, real-time messaging, and list servers through computers or mobile devices. Manufacturing companies are embedding digital devices and sensors, ranging from micro-scale to macro-scale, in all aspects of production. For example, as Kwon Yongjin and Rauniar Shreepud point out in their 2007 article "E-Quality Manufacturing (EQM) Within the Framework of Internet-Based System," published in IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, manufacturers use advanced tools such as the Ethernet SmartImage sensor and the Internet-controllable Yamaha SCARA robot to maintain continuous correspondence in production processes, with the objective of monitoring and sustaining quality control.
Manufacturing companies are continuously taking advantage of the advancements in Internet tracking and communications technologies to entrench quality control and monitor daily production activities in firms. In addition to using the Internet to remotely monitor and track production processes, companies also use the Internet to diagnose faulty functionalities of equipment and processes in the entire production system. Remote and automated access to manufacturing systems enables operations managers and production line experts to sustain quality control and initiate instant responses to sudden changes in a firm's manufacturing environment.
The most common real-time tools that companies employ in production control are chat rooms and instant messaging (IM), which allow voice calls, file sharing, webcams, information on demand (such as news, weather, auctions, and stock trading), and online status reporting. Leading IM providers include AOL Instant Messaging (AIM), Microsoft's MSN Messenger, Yahoo Messenger, and Skype.
The twenty-first century has also seen increased use of Ethernet video on manufacturing plant floors to monitor operations, streamline coordination, train workers, and control repairs and maintenance. In an Internet article titled "Video via Ethernet Now," Martin T. Hoske notes that in addition to enhancing security during production processes, Ethernet video applications have proved to be effective time-saving tools.
SECURITY AND THREATS TO E-MANUFACTURING
Real-time manufacturing control via the Internet is prone to unavoidable inconveniences beyond an organization's control, such as ISP network failures or time lags caused by network congestion. Moreover, unauthorized access to network systems by hackers, crackers, state intelligence agencies, and other intruders remains the biggest threat to Internet security. Internet security threats come in the form of Internet break-ins, Internet fraud, and message sniffing.
Internet break-ins. Internet break-ins are typically committed by crackers who gain unauthorized access to Web sites to collect private information about individuals, companies, and organizations. Crackers may obtain sensitive information such as credit card numbers and bank account details of individuals, or classified company information such as production formulas, secret codes, and proprietary data. In the United States, federal law treats Internet break-ins as both theft and trespassing; offenders can receive up to a five-year prison sentence for stealing money and up to ten years for fraudulent acquisition of a company's classified information.
Internet fraud. Internet fraud involves the use of Web site tools such as chat rooms, e-mails, or newsgroups by fraudsters to offer services and products that do not exist with the aim of convincing unsuspecting Internet users to transfer money or goods to the fraudsters. The increased prevalence of Internet fraud, particularly in online auctions, has prompted regulatory authorities in the United States to respond by setting up the Internet Fraud Complaint Center (IFCC). Consisting of a partnership between the Federal Bureau of Investigation (FBI) and the National White Collar Crime Center, the IFCC controls and coordinates campaigns against Internet fraud by providing Internet fraud reporting structures and mechanisms for forwarding fraud cases to law enforcement agencies.
Message sniffing. Message sniffing involves intercepting e-mail messages on the Internet with the aim of gaining access to their content. Sniffing targets the routers and gateways that link networks to the Internet backbone; indeed, each computer on a network is itself a gateway vulnerable to such interception. For example, the FBI scans both domestic and international Internet communications in the United States using a customizable electronic sniffing tool called Carnivore.
Carnivore can be installed in one or more ISPs to monitor the Internet traffic in regard to transmissions of e-mail, instant messaging, chat rooms, and newsgroups, and it automatically forwards any suspect communications
to the FBI data repositories. Although the use of Carnivore by the FBI to spy on private and public communications in the United States raises major privacy concerns, the action is fully backed by the USA Patriot Act of 2001 and the USA Patriot Improvement and Reauthorization Act of 2005, which have broadened the authority of U.S. intelligence and counterintelligence agencies to apply Internet-based surveillance systems in investigations.
DATA PROTECTION MEASURES FOR E-MANUFACTURING
So many security threats lurk in communication networks that no company can afford to run an unprotected Internet connection. There are several measures a company can employ to protect its network from unauthorized access by intruders. Password protection, data encryption, firewalls, data filters, and employee training are among the measures that companies can adopt against Internet security risks.
Use of passwords. Passwords enable companies to limit Web site access to authorized users. However, passwords should never be fully trusted, because crackers can use sophisticated software to break them and access private Web site content and e-mail messages.
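Because passwords can be broken or stolen, systems should never store them in plain text. A minimal Python sketch of salted password hashing using only the standard library (the function names and iteration count are illustrative, not drawn from the article):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a slow, salted hash; a stolen hash file is far harder to crack."""
    if salt is None:
        salt = os.urandom(16)  # a unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, stored):
    """Recompute the hash and compare in constant time to resist timing attacks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("s3cret-plant-code")
print(verify_password("s3cret-plant-code", salt, stored))  # True
print(verify_password("wrong-guess", salt, stored))        # False
```

Even if a cracker obtains the stored salt and digest, each guess requires the full slow derivation, which greatly raises the cost of breaking the code.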
Data encryption. Data encryption protects data from crackers and sniffing tools while information is transmitted between computers and network servers. Encrypted messages cannot be read by anyone who does not hold the keys to the encryption codes. Pretty Good Privacy (PGP) is one well-known encryption program; Fred Hofstetter contends that PGP provides a reliable means of encrypting messages because it can run on any brand of computer. Companies can also acquire messaging software such as Mozilla Thunderbird or Microsoft Outlook, which include built-in encryption capabilities.
Firewalls. Firewalls are reliable Internet-security tools because they prevent a company's data from flowing beyond its domain restrictions, in addition to preventing users of other domains from accessing the company's domain. Companies implement firewall restrictions by combining software tools, hardware, and IT security policies that block the movement of restricted data across the company's network and computers.
Firewalls are particularly used to protect a company's intranet from unlimited public access, with the firewall software programmed to regulate access levels between the intranet and the public Internet.
Data filters. Companies can use data filters to scan outgoing and incoming data for certain types of Internet content. Data filters can be applied, for example, when a company seeks to block employees from accessing certain Web sites, such as adult-content or Internet gambling sites. Filters can be set on either the user's machine or the server.
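At its simplest, a data filter of the kind described matches requested URLs against a deny list. A toy Python sketch (the host names and keywords here are hypothetical; a commercial filter would ship curated category lists):

```python
from urllib.parse import urlparse

# Hypothetical deny lists for illustration only.
BLOCKED_HOSTS = {"casino.example.com", "adult.example.net"}
BLOCKED_KEYWORDS = ("gambling", "poker")

def is_blocked(url):
    """Return True if the URL's host is deny-listed or it contains a blocked keyword."""
    host = urlparse(url).hostname or ""
    if host in BLOCKED_HOSTS:
        return True
    return any(word in url.lower() for word in BLOCKED_KEYWORDS)

print(is_blocked("http://casino.example.com/games"))    # True
print(is_blocked("http://supplier.example.com/parts"))  # False
```

The same check can run on the employee's machine or, more robustly, on a proxy server that every outgoing request must pass through.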
Employee training. Employee training through Internet education programs can greatly improve the safety of Internet use in a company and strengthen employees' ability to detect and handle fraud. Employees should be taught not to trust everything they read on the Internet, and should be made aware of the dangers of revealing personal bank account and credit card information, or the company's classified data, to information seekers with concealed identities.
SEE ALSO Enterprise Resource Planning
Hofstetter, Fred T. Internet Literacy, 4th ed. McGraw-Hill, 2006.
Hoske, Mark T. “Video via Ethernet Now.” Control Engineering, January 12, 2007. Available from: http://www.controleng.com/article/CA6510487.html/.
National Institute of Standards and Technology. “Software Tackles Production Line Machine ‘Cyclic Jitters’.” Science Daily, 5 April 2008. Available from: http://www.sciencedaily.com/releases/2008/04/080402101656.htm/.
Smith-Atakan, Serengul. Human-Computer Interaction. Middlesex University Press: Thomson Learning, 2006.
“Three Tiered Web-Based Manufacturing System—Part 1: System Development.” Robotics and Computer Integrated Manufacturing. 23, no. 1, (2007): 138–151. Available from: http://portal.acm.org/toc.cfm?id=J1050&type=periodical&coll=GUIDE&dl=GUIDE&CFID=183639&CFTOKEN=75352989.
University of Wisconsin-Milwaukee. “Merging Control Software with Smart Devices Could Optimize Manufacturing.” Science Daily, 22 May 2008. Available from: http://www.sciencedaily.com/releases/2008/05/080521105255.htm/.
Wischhusen, Molly, Janet Snell, and Jenny Johnson. Diploma in Digital Applications, Book 4. Heinemann, 2006.
Hansen, Wolf-Ruediger, and Frank Gillert. RFID for the Optimization of Business Processes. Wiley, 2008.
Kwon, Yongjin, and Shreepud Rauniar. “E-Quality Manufacturing (EQM) Within the Framework of Internet-Based Systems.” IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, 2007. Available from: http://cat.inist.fr/?aModele=afficheN&cpsidt=19180416.
MANUFACTURING. Rather than undergoing a single, rapid "industrial revolution," manufacturing in America has evolved over four centuries of European settlement. While the first colonists introduced some manufacturing processes to their "new world," manufacturing did not become a vital part of the economy until the achievement of national independence. Over the first half of the nineteenth century, all forms of manufacturing—household, artisanal, and factory based—grew and expanded, and textile manufacturing in particular spawned important new technologies. From the Civil War through the early twentieth century heavy industry grew rapidly, transforming the national economy and the very nature of society. After a period of manufacturing prosperity due, in part, to World War II, heavy industry began to decline and Americans suffered from deindustrialization and recession. The growth of high technology and the service sector in the final decades of the century offered both challenges and opportunities for American manufacturing.
The Colonial Era to 1808
Both of the major early English settlements hoped to establish manufacturing in America. The Virginia Company attempted to set up iron foundries and glass manufactories on the James River while the Puritans built several iron foundries in Massachusetts. As colonization proceeded, however, manufacturing became increasingly peripheral to the economy. With quicker and easier profits to be made from cash crops and trans-Atlantic trade, colonists exerted little effort toward manufacturing. Beginning in the late-seventeenth century, colonial manufacturing was further hindered by mercantilistic restrictions imposed by the English, most notably the Woolen Act (1699), Hat Act (1732), and Iron Act (1750). All three of these acts were designed to limit nascent colonial competition with English manufacturers in keeping with the developing mercantilistic perception that colonies should serve the empire as producers of raw materials and consumers of finished products from the mother country. While large-scale iron and steel manufacturing continued to have a presence in the colonies, most colonial manufacturing would still be performed in the farm household and, to a lesser extent, within craft shops.
It was only after the French and Indian War (1689– 1763) that Americans, propelled by their new quest for independence from England, began to turn toward manufacturing in a systematic way. Colonial resistance to the Sugar Act (1764), Stamp Act (1765), Townshend Duties (1767), and Coercive Acts (1774/1775) all involved economic boycotts of British goods, creating a patriotic imperative to produce clothing, glass, paint, paper, and other substitutes for British imports. Empowered by this movement and increasingly politicized by the resistance, urban artisans began to push for a permanently enlarged domestic manufacturing sector as a sign of economic independence from Britain.
The Revolution itself offered some encouragement to domestic manufacturing, particularly war materiel such as saltpeter, armaments, ships, and iron and steel. But it also inhibited manufacturing for a number of reasons. Skilled laborers, already scarce before the war, were now extremely difficult to find. Wartime disruptions, including the British blockade and evacuation of manufacturing centers such as Boston, New York City, and Philadelphia, further hindered manufacturing.
In the years immediately following the war, manufacturing began to expand on a wider scale. Lobbying efforts by urban mechanics as well as some merchants swayed state governments and later the new federal government to establish mildly protective tariffs and to encourage factory projects, the most famous of which was Alexander Hamilton's Society for Establishing Useful Manufactures in Paterson, New Jersey. New immigrants brought European industrial technologies. The best-known case was that of Samuel Slater, who established some of the new nation's first mechanized textile mills in Rhode Island in the 1790s. But the great majority of manufacturing establishments still relied on traditional technologies to perform tasks such as brewing beer, refining sugar, building ships, and making rope. Moreover, craft production and farm-based domestic manufacturing, both of which grew rapidly during this period, continued to be the most characteristic forms of American manufacturing.
From 1808 to the Civil War
Factory production, particularly in the textile industries, became an important part of the American economy during the Embargo of 1808 and the War of 1812. During these years imports were in short supply due to the United States' efforts to boycott European trade and disruptions caused by the British navy during the war. Economic opportunity and patriotic rhetoric pushed Americans to build their largest textile factories to date, from Baltimore's Union Manufactory to the famous establishments financed by the Boston Associates in 1814 in Waltham and in 1826 in Lowell, Massachusetts. America's first million-dollar factories, they used the latest technologies and employed thousands of workers, many of them women and children. After the war promanufacturing protectionists pushed for high tariffs to ensure that manufacturing would continue to flourish. These efforts culminated with the so-called Tariff of Abominations of 1828, which included rates of 25 percent and more on some imported textiles. Protectionism was a vital part of the Whig Party's American System, consisting of tariffs, improved transportation, and better banking. But after 1832, as Southerners successfully fought to lower tariffs, government protection of manufacturing waned.
During these years the proportion of the workforce involved in manufacturing grew more rapidly than in any other period in America's history, rising from only 3.2 percent in 1810 to 18.3 percent by 1860. Growth in textile manufacturing led the way. Cotton production capacity alone increased from 8,000 spindles in 1808 to 80,000 by 1811 and up to 5.2 million by the dawn of the Civil War. By 1860 the United States was, according to some calculations, the world's second greatest manufacturing economy, behind only England. Spectacular as this growth was, it did not come only from the revolution in textile manufacturing. In fact, American manufacturing was extremely varied. While even Europeans admired American inventors' clever use of interchangeable parts and mechanized production, traditional technologies also continued to flourish. Household production, although declining relative to newer forms, remained a significant element of American manufacturing. Many industries other than textiles, and even some branches of textiles, relied on more traditional processes. Established urban centers such as New York City experienced metropolitan industrialization that relied more on the expansion and modification of traditional craft processes than on construction of large vertically integrated factories on the Lowell model.
From the Civil War to World War II
During the latter part of the nineteenth century the United States became the world's leading industrial nation, exceeding the combined outputs of Great Britain, France, and Germany by 1900. Between 1860 and 1900 the share of manufacturing in the nation's total production rose from 32 percent to 53 percent, and the number of workers employed in manufacturing tripled from 1.31 million to 4.83 million. Heavy industry, particularly steel, played the most dramatic role in this story. Between 1873 and 1892 the national output of Bessemer steel rose from 157,000 to 4.66 million tons. Geographically, the trans-Appalachian Midwest was responsible for a disproportionate amount of this growth. Major steelmaking centers such as Pittsburgh, Cleveland, and Chicago led the way. The combined population of these industrial metropolises grew by more than 2,500 percent between 1850 and 1900. Yet even smaller midwestern towns rapidly industrialized; by 1880, 60 percent of Ohio's population was employed in manufacturing, and ten years later Peoria County, Illinois, was the most heavily industrialized in the United States. To a far lesser extent manufacturing also extended into the New South after the Civil War. Here industries based on longtime southern agricultural staples, such as cotton manufacturing and cigarette making, led the way, along with some mining and heavy industry.
Besides the growth of heavy industry and large cities, this era marked the onset of big business. The railroad industry, which benefited from the ease of coordination offered by large units, set the pace, but it was in the steel industry that bigness really triumphed, culminating in the creation of United States Steel, America's first billion-dollar firm (it was capitalized at $1.4 billion in 1901). By 1904, 318 large firms controlled 40 percent of all American manufacturing assets. Firms grew due to vertical integration (incorporating units performing all related manufacturing functions from extraction to marketing) as well as horizontal integration (incorporating new units providing similar functions throughout the country). Such growth was hardly limited to heavy industry; among the most famous examples of vertical integration was the Swift Meat Packing Corporation, which, during the 1870s and 1880s, acquired warehouses, retail outlets, distributorships, fertilizer plants, and other units that built on its core businesses.
While consumers welcomed the increasing availability of mass-produced goods ranging from dressed meat to pianos, the growth of big industry also worried many Americans. Concerns that the new colossuses would act as monopolies spurred government action, beginning with state measures in the 1880s and the federal Sherman Antitrust Act of 1890, followed by a number of largely ineffectual efforts by federal courts to bust trusts, such as those alleged in the whiskey and lumber industries, in order to keep the market competitive for smaller players. Perhaps more importantly, workers were also frightened by the increasing amount of economic power in the hands of a few industrial giants who were able to slash wages at will. Major labor actions against railroad and steel corporations helped to build new unions such as the Knights of Labor (established 1869), the United Mine Workers (1890), and the American Federation of Labor (1886). In the 1890s there were an average of 1,300 work stoppages involving 250,000 workers per year. Such actions sometimes ended in near-warfare, as in the famous case of the 1892 strike at Carnegie Steel's Homestead, Pennsylvania, plant.
The most important new manufacture of the twentieth century was the automobile. In 1900 the United States produced fewer than $5 million worth of automobiles. Only sixteen years later American factories turned out more than 1.6 million cars valued at over half a billion dollars. Henry Ford's assembly line production techniques showcased in his enormous River Rouge factory transformed industry worldwide. Automobile production also stimulated and transformed many ancillary industries such as petroleum, rubber, steel, and, with the development of the enclosed automobile, glass. Automobiles also contributed significantly to the growth of a consumer culture in the era before World War II, leading to new forms of commuting, shopping, traveling, and even new adolescent dating rituals. While the development of new forms of consumption kept the economy afloat during good times, reluctance to purchase goods such as automobiles and radios during the Great Depression would intensify the economic stagnation of the 1930s.
World War II to 2000
After the fallow years of the depression, heavy industry again thrived during and after World War II, buoyed by defense spending as well as consumer purchases. Due partly to the politics of federal defense contracts and partly to lower labor costs, the South and West experienced more rapid industrial growth than the established manufacturing centers in the Northeast and Midwest. While workers in the Pacific coast states accounted for only 5.5 percent of the nation's manufacturing workforce in 1939, by 1969 they accounted for 10.5 percent of the total. Manufacturing employment in San Jose, Phoenix, Houston, and Dallas all grew by more than 50 percent between 1960 and 1970.
Industrial employment reached its peak in 1970, when 26 percent of Americans worked in the manufacturing sector. By 1998 the percentage had plunged to 16 percent, the lowest since the Civil War. Deindustrialization struck particularly hard during the 1970s when, according to one estimate, more than 32 million jobs may have been destroyed or adversely affected, as manufacturing firms shut down, cut back, and moved their plants. Due to increasing globalization, manufacturing jobs, which previously moved from the northern rust belt to the southern and western sun belt, could now be performed for even lower wages in Asia and Latin America. These developments led some observers to label the late twentieth century a post-industrial era and suggest that service industry jobs would replace manufacturing as the backbone of the economy, just as manufacturing had superseded agriculture in the nineteenth century. They may have spoken too soon. In the boom years of the 1990s the number of manufacturing jobs continued to drop, but increased productivity led to gains in output for many industries, most notably in the high technology sector. Additionally, other economic observers have argued that manufacturing will continue to matter because the linkages that it provides are vital to the service sector. Without manufacturing, they suggest, the service sector would quickly follow our factories to foreign countries. Thus, at the dawn of the twenty-first century the future of manufacturing and the economy as a whole remained murky.
Bluestone, Barry, and Bennett Harrison. The Deindustrialization of America. New York: Basic Books, 1982.
Clark, Victor. History of Manufactures in the United States, 1893–1928. 3 vols. New York: McGraw Hill, 1929.
Cochran, Thomas. American Business in the Twentieth Century. Cambridge, Mass.: Harvard University Press, 1972.
Licht, Walter. Industrializing America: The Nineteenth Century. Baltimore: Johns Hopkins University Press, 1995.
Porter, Glenn. The Rise of Big Business, 1860–1910. New York: Crowell, 1973; Arlington Heights, Ill.: Harlan Davidson, 1973.
Tryon, Rolla M. Household Manufactures in the United States, 1640–1860. Chicago: University of Chicago Press, 1917. Reprint, New York: Johnson Reprint Company, 1966.
Manufacturing Resources Planning
Manufacturing resource planning, also known as MRP II, is a method for the effective planning of a manufacturer's resources. MRP II is composed of several linked functions, such as business planning, sales and operations planning, capacity requirements planning, and all related support systems. The output from these MRP II functions can be integrated into financial reports, such as the business plan, purchase commitment report, shipping budget, and inventory projections. It has the capability of specifically addressing operational planning and financial planning, and has simulation capability that allows its users to conduct sensitivity analyses (answering “what if” questions).
The earliest form of manufacturing resource planning was known as material requirements planning (MRP). This system was vastly improved upon until it no longer resembled the original version. The newer version was so fundamentally different from MRP that a new term seemed appropriate. Oliver Wight coined the acronym MRP II for manufacturing resource planning.
A basic understanding of MRP is essential to understanding MRP II. The following paragraphs begin with a description of MRP before moving on to MRP II.
MATERIAL REQUIREMENTS PLANNING
Material requirements planning (MRP) is a computer-based, time-phased system for planning and controlling the production and inventory function of a firm from the purchase of materials to the shipment of finished goods. All MRP systems are computer based since the detail involved and the inherent burden of computation make manual use prohibitive. MRP is time phased because it not only determines what and how much needs to be made or purchased, but also when.
MRP first appeared in the early 1970s and was popularized by a book of the same name by Joseph Orlicky. Its use was quickly heralded as the new manufacturing panacea, but enthusiasm slowed somewhat when firms began to realize the difficulty inherent in its implementation.
The MRP system is composed of three primary modules, all of which function as a form of input. These are the master production schedule, the bill-of-materials, and the inventory status file. Each module serves a unique purpose that is inter-related with the purpose of the other modules, and produces several forms of usable output.
Master Production Schedule. The master production schedule (MPS) is basically the production schedule for finished goods. This schedule is usually derived from current orders plus any forecast requirements. The MPS is divided into units of time called “buckets.” While any time frame may be utilized, days or weeks are most common. The MPS is also said to be the aggregate plan “disaggregated”; in other words, the plan for goods to be produced in aggregate is broken down into its individual units of finished goods.
Bill-of-Materials. The bill-of-materials is a file made up of bills-of-material (BOM). Each BOM is a hierarchical listing of the type and number of parts needed to produce one unit of finished goods. Other information, such as the routings (the route through the system that individual parts take on the way to becoming a finished good), alternate routings, or substitute materials, may also be contained in the BOM.
A tool known as a product structure tree is used to clarify the relationship among the parts making up each unit of finished goods. Figure 1 details how a product structure tree for a rolling cart might appear on a bill-of-material. This cart consists of a top that is pressed from a sheet of steel; a frame formed from four steel bars; and a leg assembly consisting of four legs, each with a caster attached. Each caster is made up of a wheel, a ball bearing, an axle, and a caster frame.
The BOM can be used to determine the gross number of component parts needed to manufacture a given number of finished goods. Since a gross number is determined, safety stock can be reduced because component parts may be shared by any number of finished goods (this is known as commonality).
The process of determining gross requirements of components is termed the “explosion” process, or “exploding” the bill-of-material. Assuming 100 rolling carts are needed, the example product structure tree can be used to compute the gross requirements for each rolling cart component. In order to produce 100 rolling carts, 100 tops are needed, which would require 100 sheets of steel; 100 leg assemblies, which would require 400 legs and 400 casters (requiring 400 wheels, 400 ball bearings,
400 axles, and 400 caster frames); and 100 frames, which would require 400 bars.
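The explosion arithmetic above follows directly from the product structure tree. A minimal Python sketch (the part names follow the rolling-cart example; the dictionary encoding and function name are illustrative):

```python
# Each entry maps a part to the components needed for ONE unit of that part.
BOM = {
    "cart":         {"top": 1, "frame": 1, "leg assembly": 1},
    "top":          {"sheet steel": 1},
    "frame":        {"bar": 4},
    "leg assembly": {"leg": 4},
    "leg":          {"caster": 1},
    "caster":       {"wheel": 1, "ball bearing": 1, "axle": 1, "caster frame": 1},
}

def explode(part, qty, totals=None):
    """Recursively accumulate gross requirements for every component."""
    if totals is None:
        totals = {}
    for component, per_unit in BOM.get(part, {}).items():
        needed = qty * per_unit
        totals[component] = totals.get(component, 0) + needed
        explode(component, needed, totals)
    return totals

totals = explode("cart", 100)
print(totals["sheet steel"], totals["bar"], totals["caster frame"])  # 100 400 400
```

Running the explosion for 100 carts reproduces the figures in the text: 100 tops and sheets of steel, 400 bars, 400 legs, and 400 of each caster component.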
Inventory Status File. The inventory status file, or inventory records file, contains a count of the on-hand balance of every part held in inventory. In addition, the inventory status file contains all pertinent information regarding open orders and the lead time (the time that elapses between placing an order and actually receiving it) for each item.
Open orders are purchase orders (orders for items purchased outside the firm) or shop orders (formal instructions to the plant floor to process a given number of parts by a given date) that have not been completely satisfied. In other words, they are items that have been ordered, but are yet to be received.
The MRP Process. The MRP logic starts at the MPS, where it learns the schedule for finished goods (how many and when). It takes this information to the BOM, where it “explodes” the gross requirements for all component parts. The MRP package then takes its knowledge of the gross requirements for all component parts to the inventory status file, where the on-hand balances are listed. It then subtracts the on-hand balances and open orders from the gross requirements for components, yielding the net requirements for each component.
The process not only shows how many components are needed but when they are needed in order to complete the schedule for finished goods on time. By subtracting the lead time from the due date for each part, it is possible to see when an order must be placed for each part so that it can be received in time to avoid a delay in the MPS. A manual version of MRP for a part with requirements of 100 in period 3 and 250 in period 6 and with a two-period lead time is shown in Figure 2.
In order for the firm to meet demand on time (the MPS), it must place an order for 25 in period 1 and an order for 200 in period 4. Note that this is an overly simplified version of MRP, which does not include such relevant factors as lot sizing and safety stock.
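The netting and lead-time offset described above can be sketched in a few lines of code. In this illustration, the starting inventory of 75 units and a scheduled receipt of 50 units in period 5 are assumptions chosen so that the planned orders match the example; the actual figure may use different values:

```python
def plan_orders(periods, gross, on_hand, scheduled, lead_time):
    """Net gross requirements against inventory, then offset by lead time."""
    releases = {}
    for t in range(1, periods + 1):
        on_hand += scheduled.get(t, 0)     # open orders arriving this period
        need = gross.get(t, 0)
        if need > on_hand:
            net = need - on_hand           # net requirement in period t
            releases[t - lead_time] = net  # release the order lead_time earlier
            on_hand = 0
        else:
            on_hand -= need
    return releases

# Gross requirements of 100 in period 3 and 250 in period 6, lead time of 2.
print(plan_orders(6, {3: 100, 6: 250}, 75, {5: 50}, 2))  # {1: 25, 4: 200}
```

The output matches the text: an order for 25 released in period 1 and an order for 200 released in period 4. As the article notes, a real MRP run would also apply lot-sizing rules and safety stock.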
EXPANDING INTO MRP II
With MRP generating the material and schedule requirements necessary for meeting the appropriate sales and inventory demands, it became clear that more than the obvious manufacturing resources would be needed to support the MRP plan. Financial resources would have to be generated in varying amounts and at varying times, and the process would require varying degrees of marketing support. Production, marketing, and finance were often operating without complete knowledge of, or even regard for, what the other functional areas of the firm were doing.
In the early 1980s MRP was expanded into a much broader approach. This new approach, manufacturing resource planning (MRP II), was an effort to expand the scope of production resource planning and to involve other functional areas of the firm in the planning process, most notably marketing and finance, but also engineering, personnel, and purchasing. Incorporation of other functional areas allows all areas of the firm to focus on a common set of goals. It also provides a means for generating a variety of reports to help managers in varying functions monitor the process and make necessary adjustments as the work progresses.
When finance knows which items will be purchased and when products will be delivered, it can accurately project the firm's cash flows. In addition, personnel can project hiring or layoff requirements, while marketing can keep track of up-to-the-minute changes in delivery times, lead times, and so on. Cost accounting information is gathered, engineering input is recorded, and distribution requirements planning is performed.
An MRP II system also has a simulation capability that enables its users to conduct sensitivity analyses or evaluate a variety of possible scenarios. The MRP II system can simulate a certain decision's impact throughout the organization, and predict its results in terms of customer orders, due dates, or other “what if” outcomes. Being able to answer these “what if” questions provides a firmer grasp of available options and their potential consequences.
As with MRP, MRP II requires a computer system for implementation because of its complexity and relatively large scale. Pursuit of MRP or MRP II in a clerical fashion would prove far too cumbersome to ever be useful. When MRP and MRP II were originally developed, hardware, software, and database technology were not sufficiently well advanced to provide the speed and computational power needed to run these systems in real time. Additionally, the cost of these systems was prohibitive. With the rapid advances in computer and information technology since the 1980s, these systems have become more affordable and widely available.
CLASSES OF FIRMS USING MRP AND MRP II
MRP and MRP II users are classified by the degree to which they utilize the various aspects of these systems. Class D companies have MRP working in their data processing area, but utilize little more than the inventory status file and the master production schedule, both of which may be poorly used and mismanaged. Typically, these firms are not getting much return for the expense incurred by the system.
Class C firms use their MRP system as an inventory ordering technique but make little use of its scheduling capabilities.
Class B companies utilize the basic MRP system (MPS, BOM, and Inventory file) with the addition of capacity requirements planning and a shop floor control system. Class B users have not incorporated purchasing into the system and do not have a management team that uses the system to run the business, but rather see it as a production and inventory control system.
Class A firms are said to use the system in a closed-loop mode. Their system consists of the basic MRP system, plus capacity planning and control, shop floor control, and vendor scheduling systems. In addition, their management uses the system to run the business. The system provides the game plan for sales, finance, manufacturing, purchasing, and engineering. Management can then use the system's report capability to monitor accuracy in the BOM, the inventory status file, and routing, as well as to monitor the attainment of the MPS and capacity plans.
Class A firms have also tied in the financial system and have developed the system's simulation capabilities to answer “what if” questions. Because everyone is using the same numbers (e.g., finance and production), management has to work with only one set of numbers to run the business.
A further extension of MRP and MRP II has been developed to improve resource planning by broadening the scope of planning to include more of the supply chain. The Gartner Group of Stamford, Connecticut, coined the term “enterprise resource planning” (ERP) for this system. Like MRP II systems, ERP systems rely on a common database throughout the company with the additional use of a modular software design that allows new programs to be added to improve the efficiency of specific aspects of the business.
As the same technological advances that made MRP and MRP II more accessible also improved lean manufacturing and just-in-time (JIT) systems, some firms came to regard MRP, MRP II, and even ERP systems as obsolete. However, research has found that in certain environments with advance demand information, MRP-type push strategies yield better performance in terms of inventories and service levels than JIT's kanban-based pull strategies, and these systems continue to be used by large businesses as well as many medium-sized and smaller ones. In 2007, author Phil Robinson noted that "when properly implemented, an ERP package can be the most cost effective project a company has ever seen."
By the early twenty-first century, MRP and ERP systems were so entrenched in businesses that they no longer provided a source of competitive advantage. In 2005, the authors of Manufacturing Planning and Control for Supply Chain Management pointed out that sustaining competitive advantage would require manufacturing planning and control (MPC) systems to cross organizational boundaries and coordinate company units that have traditionally worked independently. They recommend that organizations begin working in pairs, or dyads, to jointly develop new MPC systems that allow integrated operations; each organization learns as much as possible from one dyad and then leverages what it has learned into other dyads. They termed this approach the "next frontier" for manufacturing planning and control systems.
SEE ALSO Competitive Advantage; Enterprise Resource Planning; Inventory Types; Lean Manufacturing and Just-in-Time Production; Quality and Total Quality Management
Krishnamurthy, Ananth, Rajan Suri, and Mary Vernon. “Re-Examining the Performance of MRP and Kanban Material Control Strategies for Multi-Product Flexible Manufacturing Systems.” International Journal of Flexible Manufacturing Systems 16, no. 2 (2004): 123.
Orlicky, Joseph. Material Requirements Planning. New York, NY: McGraw-Hill, 1975.
Robinson, Phil. “ERP (Enterprise Resource Planning) Survival Guide.” The Business Improvement Consultancy, 2007. Available from: http://www.bpic.co.uk/erp.htm.
Stevenson, William J. Production Operations Management. Boston, MA: Irwin/McGraw-Hill, 2004.
Vollmann, Thomas E., William L. Berry, D. Clay Whybark, and F. Robert Jacobs. Manufacturing Planning and Control for Supply Chain Management. Boston, MA: McGraw-Hill, 2005.
Wight, Oliver. Manufacturing Resource Planning: MRP II. Essex Junction, VT: Oliver Wight Ltd., 1984.
Zhou, Li, and Robert W. Grubbström. “Analysis of the Effect of Commonality in Multi-Level Inventory Systems Applying MRP Theory.” International Journal of Production Economics 90, no. 2 (2004): 251.
Manufacturing and Processing
Restrictions on Manufacturing. The vast majority of colonists worked in the agricultural sector as farmers and planters, yet they were familiar with a wide array of manufactured goods. The colonists themselves had no large factories for manufacturing many of the products they used every day. British regulations forbade most manufactures in the colonies because the authorities wanted to prevent any competition with English industries. But there were other obstacles too, and in the end these probably were more significant. For one, manufacturing required a large labor force. Compared to agriculture, it also demanded a large amount of capital. Both of these factors of production, as economists call them, were relatively scarce and expensive in the colonies. Besides, the colonists could not protect their domestic industries by imposing tariffs on British goods, so the colonial-made products would have had to compete with affordable, well-made, and high-quality ones from Britain. Not surprisingly, most colonists concluded that it was not worth the effort to manufacture such goods on a large scale.
Successes. Even so, there were some significant exceptions. Imperial authorities encouraged the colonists to produce iron although they were only allowed to produce the raw iron, not the finished goods. In 1645 John Winthrop Jr., the son of the governor of Massachusetts Bay, established the first iron furnace in Saugus, Massachusetts. His venture did not last, but many later ones did. By 1775 at least 82 furnaces and 175 forges operated in the colonies, mainly in Pennsylvania, Maryland, and New Jersey. The colonial iron industry was larger than that of England and Wales, and it accounted for a full 15 percent of the total world output. Shipbuilding was another manufacturing success story. The vast supply of timber available in the colonies made shipbuilding there relatively cheaper than in Europe. In the 1770s nearly half of the ships built in the colonies were sold to overseas buyers, sometimes as part of the cargo. Up to 10 percent of workers in Boston and Philadelphia were involved directly in shipbuilding. Colonial merchants also had some success in manufacturing consumer products. Among the most prominent were the four Brown brothers of Providence, Rhode Island. In the early 1760s the Browns produced high-quality spermaceti candles made from the oil of the sperm whale. The Browns packaged these candles in a distinctive box with their company’s logo, one of the earliest examples of a recognizable brand in American history. Colonial manufacturers succeeded in establishing cottage industries—wherein goods are produced in households rather than factories—for earthenware, nails, footwear, and textiles. Women made a large proportion of the textile products, including table linens, blankets, undershirts, shawls, and hosiery. Their contribution was significant: even in the early nineteenth century, the total value of homemade cloth was ten times that of cloth manufactured outside the home.
Artisans. Most colonial manufacturing was not done in factories. Instead it was done primarily in small workshops or households by artisans, farmers, women, children, and slaves. The term artisan encompassed many occupations, including coopers, tailors, cordwainers (shoemakers), weavers, and silversmiths. Men who made their living primarily by doing artisanal work headed from 7 to 10 percent of all colonial households; most lived in villages and towns. They were self-employed, owned their own tools, did their own accounts, and worked at home or in small workshops attached to their homes. All of these shops produced goods in small quantities or made them to order for a few customers. Wives, children, and a few apprentices contributed to the artisan’s work, and sometimes two or more artisans pooled their resources to form a joint workshop. Although few artisans became wealthy, most owned enough property to qualify as voters. Because of their numbers and their ability to influence the outcome of elections, colonial artisans played a larger social and political role than did their European counterparts. Disruptions in local politics during the Revolution gave artisans even greater opportunity to participate in the political process. They formed mechanics committees to discuss and act upon issues that were important to them. In Boston, New York, and Philadelphia artisans organized or joined with other patriots to force reluctant merchants to abide by nonimportation agreements.
COLONIAL MANUFACTURED AND PROCESSED GOODS
Although the colonies depended on the mother country for many of their manufactured products, they engaged in a substantial amount of manufacturing and processing activities themselves. By the 1760s the colonial economy had become large and diversified.
Food and related products:
Fermented and distilled beverages
Other food products
Textiles and textile products:
Other textile goods
Casks and other wooden containers
Masts, spars, and other ship timbers
Pitch, tar, and turpentine
Other forest products
Paper and printed materials:
Newspapers and other periodicals
Other paper products
Chemicals and allied substances:
Other chemical products
Stone, clay, and glass products:
Other stone, clay, and glass products
Iron and steel products
Other metal products
Equipment and apparatus:
Machinery, agricultural and nonagricultural
Source: John J. McCusker and Russell R. Menard, The Economy of British America, 1607-1789 (Chapel Hill: University of North Carolina Press, 1985), pp. 328-329.
Processing. Mills, distilleries, and refineries used some of the colonies’ most advanced technologies to transform raw and semifinished products. Sawmills—the largest of them located in Pennsylvania, Delaware, and New Jersey—cut lumber into boards. Mills processed iron into the pigs and bars that were turned into finished goods by some colonial artisans but mostly shipped to Britain. Tanneries, as well as papermaking and textile establishments, also used mills extensively. Sometimes mills were clustered near sources of water power. Wilmington, Delaware, emerged as a milling center that by the 1790s processed large volumes of cloth, lumber, paper, snuff, cotton, and iron. Distilling and refining also were important processing activities. From the mid-seventeenth century the colonists had distilled molasses imported from the West Indies into rum. Domestic rum was a cheaper alternative to imported rum and brandy, and colonial demand for it remained high over the next century and a half. In 1770 about 140 rum distilleries, most of them run by merchants in the northern port towns, were in operation. Colonial distilleries produced nearly five million gallons of rum that year, or about 60 percent of the 8.5 million gallons that the mainland colonies consumed annually. The colonists also processed large amounts of another West Indian product: muscovado sugar, which they refined into the more costly white sugar that colonial consumers had come to prefer. The colonists used the sugar to sweeten their imported tea, coffee, and chocolate drinks. Some twenty-six sugar refineries were in operation in 1770, and they met about 75 percent of the rising domestic demand. Like the distilleries, most sugar refineries were run by Northern merchants as adjuncts to their West Indian importing business.
Self-Sufficiency. Beginning in the 1760s the colonists tried to decrease their dependence on Great Britain by becoming more self-sufficient in manufacturing. They formed various organizations for this purpose. After the Sugar Act was passed in 1764 New York established the Society for the Promotion of Arts, Agriculture, and Economy. The colonists’ resolve intensified when Parliament imposed the Townshend duties in 1767. Colonial governments and eventually the Continental Congress began offering inducements to support native industry. These included bounties, loans, guaranteed markets at set prices, monopoly privileges, tax exemptions, and land grants. The Americans succeeded best in producing cloth, especially linens and woolens, made mostly by women working at home. In 1775 the United Company of Philadelphia for Promoting American Manufactures was formed to encourage textile production. Philadelphians established a manufactory that became among America’s largest enterprises, eventually employing hundreds, perhaps even thousands, of women. The home manufacture of cloth became a celebrated activity during the early years of the Revolution, and spinning schools were established in cities and villages. In 1769 the women of Middletown, Massachusetts, wove 20,522 yards of cloth. Women in Lancaster, Pennsylvania, produced 35,000 yards. Spinning bees became popular, and entire communities sometimes turned out for these events. Ezra Stiles estimated that the spinning bee held at his house in 1769 drew some six hundred spectators. Newspapers cheered on these patriotic women by referring to them as the “Daughters of Liberty” and reporting on their achievements. At times the newspapers used harsher tactics to encourage production. One newspaper in 1774 lectured women to “cease trifling their time away [and] prudently employ it in learning the use of the spinning wheel.” In the end the value of the formal spinning groups and spinning bees was more symbolic than real. 
Most did not even meet regularly. But they focused public attention on the importance of supporting native industry and allowed many women to make a political statement in support of the Revolution.
War Industries. The colonists tried to manufacture items other than cloth, but many of these could be produced only in households using inefficient tools and methods. Nevertheless several industries succeeded in becoming more permanently productive and efficient, especially when the colonies declared their independence and broke away from imperial restrictions. The war stimulated the domestic manufacturing sector as the demand for war matériel and other products that the colonists could no longer get directly from Britain increased. Armies on both sides bought locally produced items such as shoes—which became a major enterprise in Massachusetts and New Jersey—tents, clothing, and other military supplies. Great Britain prohibited the exportation of gunpowder, firearms, and other military stores, so the Americans had to produce these items locally. Maryland, located far from most of the fighting, became a center for gun making; Connecticut farmers produced saltpeter for gunpowder. In 1777 Congress established an armory in Springfield, Massachusetts. War supplies had to be transported, and this led to some permanent enhancements of the road network. The military demand for munitions stimulated the development of the iron and steel industries, and Americans erected new forges, mills, foundries, and shops. From 1775 to 1783 Pennsylvania alone built at least eleven new forges and furnaces. Along with the improvements in inland transportation the invigorated manufacturing sector allowed Pennsylvanians to enlarge their markets in the South. The war also proved a boon for paper mills because the number of newspapers rose from only thirty-seven in 1776 to more than a hundred by 1789.
Victor S. Clark, History of Manufactures in the United States, 3 volumes (Washington, D.C.: Carnegie Institution of Washington, 1929).
John J. McCusker and Russell R. Menard, The Economy of British America, 1607-1789 (Chapel Hill: University of North Carolina Press, 1985).
The turn toward manufacturing was one of the most notable economic developments of the early national period. From a series of agricultural and mercantile colonies in the 1750s, the United States had begun to evolve into an important manufacturing power by the 1820s.
BEFORE THE TRANSFORMATION
The growth of manufacturing would have been very difficult to predict in the years before the Revolution. Although Benjamin Franklin half-jokingly wrote an English correspondent in 1764, "As to our being always supplied by you, 'tis a folly to expect it," the reality was that during the colonial era the colonists' manufactured goods were supplied primarily from overseas. British mercantilist legislation such as the Wool Act (1699), Hat Act (1732), and Iron Act (1750) was intended to prevent large-scale colonial manufacturing. Colonists for the most part were satisfied with this situation, provided that British merchants continued to pay good prices for American raw materials such as rice, tobacco, wheat, naval stores, and fish. Although some types of commercial manufacturing—such as iron forges—flourished, and some regions, particularly New England, developed a number of manufacturing establishments, the vast majority of Americans were content to stick to agricultural or mercantile pursuits throughout the colonial period. The dearth of manufacturing did not by any means signify a lack of goods; in fact, Americans participated in a consumer revolution during the eighteenth century as the number, types, and quality of imported manufactures grew exponentially.
The American Revolution brought an end to this colonial economic configuration. Most obviously, it destroyed the legal basis of British mercantilism. But several other related developments proved to be at least as significant. During the years of conflict in the 1760s and 1770s, the colonists turned toward economic protest as a means to coerce the British government into repealing obnoxious legislation such as the Stamp Act (1765) and Townshend Duties (1767). The most important weapon in their arsenal was the boycott used against British manufactures. Consequently, nonconsumption of British manufactures and production of domestically made articles became patriotic and profitable, spurring many Americans to begin manufacturing projects of their own. Some of these manufactories were built by individual entrepreneurs, while others, such as Philadelphia's so-called American Manufactory, were the products of patriotic civic committees. The boycotts also politicized for the first time America's artisans, who became very active in urban committees, such as the Sons of Liberty, that took it upon themselves to enforce the boycotts. Finally, the war itself impelled a certain level of economic independence as the British army and navy impeded Americans from importing goods as readily as they had during times of peace.
Enthusiasm for domestic manufactures and economic independence continued to grow after the war, and many sorts of people lobbied the national and state governments to encourage manufacturing. The newly politicized artisans initially led the movement. In most of the major cities they formed umbrella organizations that pushed the states to implement protective legislation. They were most successful in Massachusetts, New York, and Pennsylvania, all of which enacted significant tariffs on foreign manufactures in the years before the ratification of the Constitution. Some merchants also saw the potential profits from manufacturing. In many cities they formed manufacturing societies that sponsored fairly large-scale textile factories to raise interest in the potential possibilities for domestic manufactures. Some, such as the Pennsylvania Society for the Encouragement of Manufactures and the Useful Arts, were briefly profitable. Merchant members of these societies also joined with mechanics to lobby for government encouragement of manufacturing. Finally, a number of agricultural societies also publicized home manufacturing and larger-scale textile manufacturing as a means of stimulating new markets for agricultural products.
The most famous attempt to promote manufacturing during these years, Treasury Secretary Alexander Hamilton's Report on Manufactures (1791), owed much to these efforts. Co-written with Tench Coxe, assistant treasurer of the United States and founder of the Pennsylvania Society for the Encouragement of Manufactures and the Useful Arts, the report urged greater investment in factory production and more government encouragement to manufactures, especially in the form of bounties. Although the report died in Congress, it did spawn the Society for Establishing Useful Manufactures, a multifactory corporation in Paterson, New Jersey, that resembled a larger version of the earlier manufacturing societies, attracted many of the same wealthy investors, and which benefited from a valuable package of incentives from the state of New Jersey.
Technological change and new legal developments were two other factors stimulating manufactures in the early Republic. The industrial revolution was already well under way in England, where factory technologies were zealously guarded. However, new technologies seeped into the United States along with heavy immigration of skilled Europeans—both free men and servants. Samuel Slater, alerted to America's need for industrial technology by the propaganda of one of the manufacturing societies, is perhaps the most famous example of an immigrant who smuggled detailed information into the United States. Slater, credited with establishing modern textile-producing technology in American mills, was not an isolated example; in fact, it was often government policy during the early Republic to encourage such technology piracy. The most important indigenous technological development was Eli Whitney's system of interchangeable parts, which came to be known as the "American System" of manufacturing and which made possible the widespread development of mass production. Additionally, the early national legal system increasingly encouraged manufacturing. Many states offered various forms of pecuniary inducements to manufacturers. Although their exact role is now debated, corporate charters issued by state legislatures encouraged manufacturing companies by providing them a solid legal foundation and, in some cases, state subsidies. Finally, the emerging doctrine of "creative destruction," most famously elaborated in the U.S. Supreme Court's ruling in Charles River Bridge v. Warren Bridge (1837), made it easier for industrial projects to proceed, despite claims from local landowners (often farmers whose lands were flooded by mill dams) that such development impinged on their right to enjoy their own property.
By 1808 a new set of concerns further encouraged manufacturing. The immediate catalyst was the challenge to American shipping by the Napoleonic Wars (1799–1815). President Jefferson's Embargo of 1808 was intended to coerce Britain and France to respect American neutrality at sea. It ultimately failed, but by cutting off all foreign imports it had the largely unintended effect of further encouraging American manufacturing. The War of 1812, which ensued when economic coercion failed, also acted as a continuing incentive for domestic manufacturing by further isolating America from European imports. With the end of the war, many American manufacturers and their political allies forcefully argued for the need to pass new legislation to protect America's emerging factories, resulting in the tariffs of 1816, 1824, and 1828. The last of these acts, sometimes derided as the Tariff of Abominations, proposed to raise many tariffs well above the 25 percent mark and nearly precipitated civil war during the Nullification Crisis of 1832.
All of these factors led to a significant rise in manufacturing by 1830. The most notable sector was textiles. Cotton production capacity, for example, increased from 8,000 spindles in 1808, to 80,000 by 1811, an estimated 350,000 by 1820, and 1.2 million by 1830. The most famous of all the textile projects was the large, vertically integrated factories created in Waltham and Lowell, Massachusetts, by corporations founded by wealthy merchants retrospectively known as the Boston Associates. The Waltham-Lowell factories were typical insofar as they relied on pirated technology and were begun when the War of 1812 offered protection from competing imports. They initially employed large numbers of young farm women from the surrounding rural areas, many of whom lived in company boardinghouses. By 1836 Lowell alone could boast of twenty textile mills employing nearly 7,000 workers, for an average of 350 workers per mill.
Further to the south, Philadelphia also was a major manufacturing center by 1830, but without large, vertically integrated factories. Instead, manufacturing there was characterized by proprietary capitalism, a flexible mixture of small, highly specialized, generally privately owned firms. Well over one thousand workers labored in the thirty-nine Philadelphia textile firms that responded to the census of 1820, for an average of fewer than thirty workers per manufactory. Factories also flourished in the countryside, usually near likely sources of waterpower. For example, Oneida County, New York, lightly settled and almost entirely agricultural in the 1790s, supported twenty-one textile factories producing a total of half a million dollars worth of goods by 1832.
But textile factories, while having a high profile, were only one aspect of the rise of manufactures. The years just after the Revolution witnessed the growth of many sorts of nonmechanized manufacturing establishments such as sugar refineries, ropewalks, and small shoe manufactories. New York City was moving toward "metropolitan industrialization," characterized by growing numbers of nonmechanized manufactories using traditional technologies but often employing wage laborers rather than the traditional configuration of master, journeyman, and apprentice. Home manufacturing grew, too. One contemporary estimated that New England farm families manufactured twice as much in 1790 as they had twenty years earlier. However, by 1820 factory production was beginning to be accepted as the new standard. While the 1810 census of manufactures had included all sorts of manufacturers—nonmechanized, factory, and household—the 1820 census generally assumed that manufacturing would be performed outside the home by wage workers rather than by apprentices or family members.
By the time of the Civil War, the United States would be on the verge of becoming one of the world's largest manufacturing economies. It was not quite there by 1830, but it had advanced a very long way from the dependent, agricultural, colonial economy of sixty years earlier.
Bezís-Selfa, John. Forging America: Ironworkers, Adventurers, and the Industrious Revolution. Ithaca, N.Y.: Cornell University Press, 2004.
Dublin, Thomas. Women at Work: The Transformation of Work and Community in Lowell, Massachusetts, 1826–1860. New York: Columbia University Press, 1979.
Jeremy, David J. Transatlantic Industrial Revolution: The Diffusion of Textile Technologies between Britain and America, 1790–1830s. Cambridge, Mass.: MIT Press, 1981.
Licht, Walter. Industrializing America: The Nineteenth Century. Baltimore: Johns Hopkins University Press, 1995.
Peskin, Lawrence A. Manufacturing Revolution: The Intellectual Origins of Early American Industry. Baltimore: Johns Hopkins University Press, 2003.
Scranton, Philip. Proprietary Capitalism: The Textile Manufacture at Philadelphia, 1800–1885. New York: Cambridge University Press, 1983.
Wilentz, Sean. Chants Democratic: New York City and the Rise of the American Working Class, 1788–1850. New York: Oxford University Press, 1984.
Lawrence A. Peskin
One can trace the origins of modern manufacturing management to the advent of agricultural production, which meant that humans no longer had to wander constantly in search of food. Since that time, people have been developing better techniques for producing goods to meet human needs and wants. With the additional time that more efficient food production made available, people began to develop techniques to produce items for use and trade, and to specialize based on their skills and resources. With the first era of water-based exploration, trade, and conflict, new ideas regarding product development emerged over the course of the centuries, leading to the beginning of the Industrial Revolution in the mid-eighteenth century. The early twentieth century, however, is generally considered to mark the true beginning of a disciplined effort to study and improve manufacturing and operations management practices. Thus, what we know as modern manufacturing management took shape over the course of the twentieth century.
The late 1970s and early 1980s saw the development of the manufacturing strategy paradigm by researchers at the Harvard Business School. This work focused on how manufacturing executives could use their factories' capabilities as strategic competitive weapons, specifically identifying how what we call the five P's of manufacturing management (people, plants, parts, processes, and planning) can be analyzed as strategic and tactical decision variables. Central to this notion is the focus on factory and manufacturing trade-offs. Because a factory cannot excel on all performance measures, its management must devise a focused strategy, creating a focused factory that does a limited set of tasks extremely well. Thus the need arose for making trade-offs among such performance measures as low cost, high quality, and high flexibility in designing and managing factories.
The 1980s saw a revolution in management philosophy and the technologies used in manufacturing. Just-in-time (JIT) production was the primary breakthrough in manufacturing philosophy. Pioneered by the Japanese, JIT is an integrated set of activities designed to achieve high-volume production using minimal inventories of parts that arrive at the workstation "just in time." This philosophy—coupled with total quality control (TQC), which aggressively seeks to eliminate causes of production defects—is now a cornerstone in many manufacturers' practices.
As profound as JIT's impact has been, factory automation in its various forms promises to have an even greater impact on operations management in coming decades. Such terms as computer-integrated manufacturing (CIM), flexible manufacturing systems (FMS), and factory of the future (FOF) are part of the vocabulary of manufacturing leaders.
Another major development of the 1970s and 1980s was the broad application of computers to operations problems. For manufacturers, the big breakthrough was the application of material requirements planning (MRP) to production control. This approach brings together, in a computer program, all the parts that go into complicated products. This computer program then enables production planners to quickly adjust production schedules and inventory purchases to meet changing demands during the manufacturing process. Clearly, the massive data manipulation required for changing the schedules of products with thousands of parts would be impossible without such programs and the computer capacity to run them. The promotion of this approach by the American Production and Inventory Control Society (APICS) has been termed the MRP Crusade.
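The parts "explosion" at the heart of this approach can be sketched in a few lines. The bill of materials and stock figures below are invented for illustration, and the sketch deliberately omits what real MRP systems add: lot sizing, lead-time offsetting, time phasing, and correct netting when an item appears on more than one BOM path.

```python
# Minimal MRP explosion sketch (hypothetical product data): net each item's
# gross requirement against on-hand stock, then cascade the net quantity
# down the bill of materials to its components.

BOM = {  # parent item -> list of (component, quantity per parent)
    "bicycle": [("frame", 1), ("wheel", 2)],
    "wheel":   [("rim", 1), ("spoke", 36)],
}

ON_HAND = {"frame": 20, "wheel": 50, "rim": 5, "spoke": 200}

def explode(item, gross, planned=None):
    """Accumulate planned-order quantities for an item and its components."""
    if planned is None:
        planned = {}
    net = max(gross - ON_HAND.get(item, 0), 0)  # net requirement after stock
    if net == 0:
        return planned                          # stock covers it; stop here
    planned[item] = planned.get(item, 0) + net
    for component, qty_per in BOM.get(item, []):
        explode(component, net * qty_per, planned)  # demand cascades downward
    return planned

orders = explode("bicycle", 100)
# 100 bicycles need 200 wheels; 50 on hand leaves 150 to build, which in
# turn generates gross requirements for 150 rims and 5,400 spokes.
```

Even this toy version shows why the computation was infeasible by hand: every schedule change re-runs the cascade across every level of every product's bill of materials.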
The hallmark development in the field of manufacturing management, as well as in management practice in general, is total quality management (TQM). Although practiced by many companies in the 1980s, TQM became truly pervasive in the 1990s. All manufacturing executives are aware of the quality message put forth by the so-called quality gurus: W. Edwards Deming, Joseph M. Juran, and Philip Crosby. Helping the quality movement along was the creation of the Malcolm Baldrige National Quality Award in 1987 under the direction of the American Society for Quality Control and the National Institute of Standards and Technology. The Baldrige Award recognizes up to five companies a year for outstanding quality management systems.
The ISO 9000 certification standards, issued by the International Organization for Standardization, now play a major role in setting quality standards, particularly for global manufacturers. Many European companies require that their vendors meet these standards as a condition for obtaining contracts.
The need to become or remain competitive during the global economic recession of the early 1990s pushed companies to seek major innovations in the processes used to run their operations. The spirit of business process reengineering (BPR) is conveyed in the title of Michael Hammer's influential article "Reengineering Work: Don't Automate, Obliterate." The approach seeks to make revolutionary, as opposed to evolutionary, changes. It does this by taking a fresh look at what the organization is trying to do, then eliminating non-value-added steps and computerizing the remaining ones to achieve the desired outcome.
Supply chain management applies a total system approach to managing the flow of information, materials, and services from raw material suppliers through factories and warehouses to the end customer. Recent trends, such as outsourcing and mass customization, are forcing companies to find flexible ways to meet customer demand. The focus is on optimizing core activities in order to maximize the speed of response to changes in customer expectations.
Based on the work of several researchers, a few basic operations priorities have been identified. These priorities include cost, product quality and reliability, delivery speed, delivery reliability, ability to cope with changes in demand, flexibility, and speed of new product introduction. In every industry there is usually a segment of the market that buys strictly on the basis of low cost, typically commodity-like products such as sugar, iron ore, or coal. Because this segment of the market is frequently very large, many companies are lured by the potential for significant profits, which they associate with the large unit volumes of the product. As a consequence, competition in this segment is fierce, and so is the failure rate.
Quality can be divided into two categories: product quality and process quality. The level of a product's quality will vary with the market segment at which it is aimed, because the goal in establishing the proper level of product quality is to meet the requirements of the customer. Overdesigned products with too high a level of quality will be viewed as prohibitively expensive. Underdesigned products, on the other hand, will result in losing customers to products that cost a little more but are perceived as offering greater benefits.
Process quality is critical since it relates directly to the reliability of the product. Regardless of the product, customers want products without defects. Thus, the goal of process quality is to produce error-free products. Adherence to product specifications is essential to ensure the reliability of the product as defined by its intended use.
A company's ability to deliver more quickly than its competitors may be critical. Take, for example, a company that offers a repair service for computer-networking equipment. A company that can offer on-site repair within one or two hours has a significant advantage over a competing firm that guarantees service only within twenty-four hours.
Delivery reliability relates to a firm's ability to supply the product or service on or before a promised delivery due date. The focus during the 1980s and 1990s on reducing inventory stocks in order to reduce cost has made delivery reliability an increasingly important criterion in evaluating alternative vendors.
A company's ability to respond to increases and decreases in demand is another important factor in its ability to compete. It is well known that a company with increasing demand can do little wrong. When demand is strong and increasing, costs are continuously reduced because of economies of scale, and investments in new technologies can be easily justified. Scaling back when demand decreases may require many difficult decisions regarding laying off employees and related reductions in assets. The ability to deal effectively with dynamic market demand over the long term is an essential element of manufacturing strategy.
Flexibility, from a strategic perspective, refers to a company's ability to offer a wide variety of products to its customers. In the 1990s companies began to adjust their processes and outputs to dynamic and sometimes volatile customer needs. An important component of flexibility is the ability to develop different products and deliver them to market. As new technologies and processes become widespread, a company must be able to respond to market demands more and more quickly if it is to continue to be successful.
Manufacturing strategy must be linked vertically to the customer and horizontally to other parts of the enterprise. Underlying this framework is senior management's strategic vision of the firm. This vision identifies, in general terms, the target market, the firm's product line, and its core enterprise and operations capabilities. The choice of a target market can be difficult, but it must be made. Indeed, it may lead to turning away business—ruling out a customer segment that would simply be unprofitable or too hard to serve given the firm's capabilities. Core capabilities are those skills that differentiate a manufacturer from its competitors.
In general, customers' new-product or current-product requirements set the performance priorities that then become the required priorities for operations. Manufacturing organizations have a linkage of priorities because they cannot satisfy customer needs without the involvement of R&D and distribution and without the direct or indirect support of financial management, human resource management, and information management. Given its performance requirements, a manufacturing division uses its capabilities to achieve these priority goals in order to complete sales. These capabilities include technology, systems, and people. CIM, JIT, and TQM represent fundamental concepts and tools used in each of the three areas.
Suppliers do not become suppliers unless their capabilities in the management of technology, systems, and people reach acceptable levels. In addition, most manufacturing capabilities are now subjected to the "make-or-buy" decision. It is current practice among world-class manufacturers to subject each part of a manufacturing operation to the question: If we are not among the best in the world at, say, metal forming, should we be doing this at all, or should we subcontract to someone who is the best?
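The strategic question above is broader than cost alone, but the cost side of a make-or-buy decision reduces to a break-even comparison between in-house fixed-plus-variable cost and the subcontractor's price. A hypothetical sketch, with all figures invented for illustration:

```python
# Hypothetical make-or-buy break-even analysis (all figures invented).
fixed_cost_make = 120_000.0   # annual cost of in-house tooling and overhead
variable_cost_make = 14.0     # per-unit cost to make in-house
price_buy = 20.0              # per-unit price quoted by the subcontractor

# In-house production becomes cheaper once volume covers the fixed cost:
#   fixed + variable * q  <  price * q
break_even_qty = fixed_cost_make / (price_buy - variable_cost_make)
print(f"Break-even at {break_even_qty:,.0f} units per year")

def cheaper_option(qty):
    """Return 'make' or 'buy', whichever costs less at the given annual volume."""
    make = fixed_cost_make + variable_cost_make * qty
    buy = price_buy * qty
    return "make" if make < buy else "buy"

print(cheaper_option(10_000))  # below break-even
print(cheaper_option(30_000))  # above break-even
```

As the article notes, world-class manufacturers weigh capability ("are we among the best in the world at this?") alongside this kind of cost arithmetic, so the calculation is an input to the decision, not the decision itself.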
The main objectives of manufacturing strategy development are (1) to translate required priorities into specific performance requirements for operations and (2) to make the necessary plans to ensure that manufacturing capabilities are sufficient to accomplish them. Developing priorities involves the following steps:
- Segment the market according to the product group.
- Identify the product requirements, demand patterns, and profit margins of each group.
- Determine the order winners and order qualifiers for each group.
- Convert order winners into specific performance requirements.
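The four steps above can be sketched as a simple mapping from product groups to their order winners and qualifiers, and from those criteria to measurable performance requirements. All group names, criteria, and targets below are invented for illustration:

```python
# Hypothetical priority-development sketch (all names and figures invented).
# An order winner is the criterion that wins the sale; qualifiers are criteria
# a firm must meet at a threshold level merely to be considered.
product_groups = {
    "commodity parts": {"winner": "cost", "qualifiers": ["quality"]},
    "custom assemblies": {"winner": "delivery speed", "qualifiers": ["cost", "quality"]},
}

# Step 4: convert each criterion into a measurable performance requirement.
targets = {
    "cost": "unit cost at or below market median",
    "delivery speed": "quote-to-ship within 48 hours",
    "quality": "defect rate under 0.1 percent",
}

def performance_requirements(group):
    """Return (win-on, qualify-on) requirement strings for a product group."""
    profile = product_groups[group]
    win = targets[profile["winner"]]
    qualify = [targets[q] for q in profile["qualifiers"]]
    return win, qualify

win, qualify = performance_requirements("commodity parts")
print("win on:", win)
print("qualify on:", qualify)
```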
It has been said that America's resurgence in manufacturing is not the result of U.S. firms being better innovators than most foreign competitors; they have been strong innovators for a long time. Rather, it is because U.S. firms are proving to be very effective copiers, having spent a decade examining the advantages of foreign rivals in product development, production operations, supply chain management, and corporate governance, and then putting in place functional equivalents that incrementally improve on their best techniques. Four main adaptations on the part of U.S. firms underscore this success:
- New approaches to product-development team structure and management have resulted in getting products to market faster, with better designs and manufacturability.
- Companies have improved their manufacturing facilities through dramatic reductions of work-in-process, space, tool costs, and human effort, while simultaneously improving quality and flexibility.
- New methods of customer-supplier cooperation, which borrow from the Japanese keiretsu (affiliated groups of companies) practices of close linkages but maintain the independence of the organizations desired by U.S. companies, have been put in place.
- Better leadership—through strong, independent boards of directors who will dismiss managers who are not doing their jobs effectively—now exists.
In sum, the last few decades of the twentieth century witnessed tremendous change and advancement in the means of producing goods and the manner of managing these operations, leading to higher levels of quality and quantity as well as greater efficiency in the use of resources. In the new millennium, because of global competition and the expansive use of new technologies, including the Internet, a successful firm will be one that is competitive with new products and services that are creatively marketed and effectively financed. Yet what is becoming increasingly critical is the ability to develop manufacturing practices that provide unique benefits to the products. The organization that can develop superior products, sell them at lower prices, and deliver them to its customers in a timely manner stands to become a formidable presence in the marketplace.
see also Factors of Production
What It Means
Manufacturing involves transforming raw materials into new products using mechanical, physical, and chemical processes. Generally, manufactured goods are made in large quantities by machinery or by manual labor using mass-production techniques. This results in goods that are identical to one another and that are relatively inexpensive to produce. Many goods that modern consumers purchase, from plastic products to tools to mattresses to jelly beans, are manufactured using these methods.
The production, sale, and consumption of manufactured goods form the basis of modern economies. An industrialized country manufactures goods in order to build and maintain the nation’s infrastructure (its roads, bridges, airports, and other public works) and to develop advanced areas of industry, such as information technology and national defense. In order to industrialize, a developing country must first establish manufacturing industries to provide the products it needs to build an infrastructure.
Manufacturing employs many workers. In the United States, manufacturing jobs represent over 10 percent of all employment. In 2006 approximately 14 million U.S. workers had manufacturing jobs. Among the largest manufacturing industries are those that produce iron and steel, textiles, automobiles, aircraft, lumber, and chemicals.
Manufacturing techniques involve a variety of mechanical processes, including machining, forging, casting, and injection molding. Machining is the most important manufacturing process. It involves using a machine tool to remove material from an object (by drilling, turning, milling, or grinding, for example) to create a final product. Machined parts, such as pins and fasteners, are used in many other manufacturing industries. Forging is a process in which heated metal is hammered, rolled, and shaped. Forging produces strong iron and steel parts that are used to make cars, trains, and other manufactured goods. In casting (also called founding), metal is shaped by being poured into a mold made out of sand or other materials. The casting process is used to make a variety of products, including wheels and jewelry. Injection molding is a process used to shape heated plastic by injecting it into a mold. It is used to make a variety of plastic products, such as computer parts, toys, and containers.
When Did It Begin?
Before the American Revolution the economy of colonial America was based largely on agriculture. Some types of commercial manufacturing, such as iron forges, existed and even prospered, but in general the colonies imported manufactured goods from England in exchange for such goods as tobacco, wheat, and fish.
After the American Revolution many Americans began their own manufacturing businesses as they sought economic independence from England. In New York, Massachusetts, and Pennsylvania, artisan-manufacturers pressured the state governments to pass legislation protecting manufacturing. Organizations such as manufacturing societies helped support and fund large-scale textile factories. In 1791 U.S. Treasury Secretary Alexander Hamilton called for greater investment in factory production in his Report on Manufactures. One of the results of the report was the creation of a multifactory corporation in Paterson, New Jersey, which attracted many wealthy investors.
The Industrial Revolution transformed the production of goods in eighteenth-century England, especially textiles, iron, and steel, as well as transportation, and much of the knowledge about factory technologies was brought to the United States by skilled European immigrants. For example, British industrialist Samuel Slater (1768–1835) came to the United States in 1789 and helped establish the manufacturing technology for the nation’s modern production of textiles, which involved the use of water-powered mills.
What eventually came to be known as the “American system of manufacturing” started in the 1790s with the technological innovations developed by U.S. inventor Eli Whitney (1765–1825). Whitney discovered the benefits of using “interchangeable parts” to make guns for the U.S. government. Until then guns had been made by hand, and each was unique. Whitney designed a new gun and the machines to make it. His machines produced many identical (or interchangeable) parts. The parts were put together by workers on an assembly line, a concept also developed by Whitney. Each finished gun was essentially identical to the others. Whitney’s manufacturing methods made it possible to produce better goods more quickly and more cheaply.
More Detailed Information
The manufacture of iron and steel in the United States is located primarily in the northeastern and midwestern states. Ohio has the largest number of iron and steel forging employees. The aerospace, national defense, and automotive industries are the primary purchasers of iron and steel. Agriculture, construction, mining, and industrial-equipment manufacturers also use large quantities of forged steel. Many consumer products are forged, including hardware tools and bicycles.
The manufacturing process of forging heats iron and then presses, hammers, and shapes it. There are several different types of forging processes. Closed die forging, or impression die forging, uses metal blocks called dies to create a specified form. Open die forging hammers metal as it moves between flat dies. Seamless rolled ring forging punches a hole in a thick, round piece of metal and then rolls and squeezes it into a thin ring. Different types of metal can be forged; carbon steel and alloy steel are the most commonly forged metals.
Textile manufacturing has long played an important role in U.S. manufacturing; it was of central importance to the Industrial Revolution in the United States. In its early years, textile manufacturing was simple enough to be conducted under the roof of one factory. Since then it has developed into a large industry encompassing several smaller industries, including those that produce fibers, those that produce cloth and dye, and those that spin yarn and print goods. Approximately one-third of the cloth manufactured in the United States is produced for use in clothing. Textiles are also used to make floor coverings, home furnishings, and cloth goods used in agriculture, construction, and medical supply.
At one time all textiles came from plant or animal sources and were made by hand, but modern textile manufacturing is a complicated process. The production of most textiles involves multiple processes, including the formation of the yarns or threads, formation of the fabric by weaving or by another method, processing the fabric using dyes or other chemicals, and fabrication (the assembly of a garment). Many different mechanized processes in textile production have replaced the manual production methods that were used for centuries. For example, cotton used to be picked by hand, a slow and labor-intensive task. Today machines called cotton gins quickly remove seeds from the cotton fiber.
Chemical manufacturing is an important part of modern industry because chemicals are essential to the production of motor vehicles, paper, electronics, pharmaceuticals, cosmetics, and other products. Chemical manufacturers make synthetic materials (such as polyester), agricultural chemicals (such as fertilizers), paints and adhesives (such as glue), cleaning chemicals (such as soap and cleansers), and pharmaceutical chemicals (such as aspirin). Chemical manufacturers design and operate their factories differently depending on the type of product they make. Companies that produce synthetic materials typically have large plants, while those that manufacture paints and adhesives tend to operate smaller plants. Chemical plants are usually located near other kinds of manufacturing. There are many chemical plants located in the Great Lakes region near automotive manufacturers, on the West Coast near centers of electronic production, and on the Gulf Coast near petroleum and natural gas producers.
The U.S. manufacturing sector has been an important player in the development of the labor movement. Labor unions and trade unions are organizations of employees who unite to improve working conditions, benefits, and pay. In 1938 the Congress of Industrial Organizations (CIO) was formed to represent the basic manufacturing industries, including iron and steel, automobile, rubber, electric and radio, and shipping. Workers and union leaders in the CIO organized strikes, or work stoppages. The most effective work stoppage was the “sit-down strike,” during which workers took over the plants in which they worked until their requests were met. The result was that many of the country’s largest manufacturing firms were forced to negotiate collective bargaining agreements, which set out the terms of employment for members of a labor union and describe such aspects as wages, working hours, health-insurance benefits, and vacation time. One of the greatest achievements of labor unions came in 1955, when U.S. auto manufacturers accepted a union-sponsored plan that guaranteed that workers would receive pay during unemployment.
In the latter decades of the twentieth century, the number of U.S. manufacturing jobs declined because of outsourcing, the practice of contracting out production activities that were previously performed in the firm’s own plants. For example, many manufacturers have outsourced the assembly of their products, recognizing that they do not need to build or purchase plants or equipment if they can have their products assembled at so-called contract manufacturers. In the 1970s contract manufacturers were typically smaller operations that offered a single service, such as the assembly of circuit boards, to a larger firm. Since then they have grown into multiservice specialists that can not only assemble and manufacture products but also design and distribute them. The main reason that manufacturers outsource aspects of production is to save on materials, labor, and expenses.
Although it has many advantages, contract manufacturing has disadvantages as well. Since 2001 the United States has outsourced millions of manufacturing jobs to contract manufacturers located in other parts of the world, often to countries where workers have lower wages and fewer protections, such as negotiated wages, safe working conditions, and health benefits. This is a major concern of international human-rights and labor-issues groups. Also, when a firm employs a contract manufacturer it gives up a degree of control, which can make extensive engineering changes difficult. Contract manufacturing is best for firms that have a steady demand for their products and that do not depend on highly complicated assembly processes or equipment.