The term deindustrialization was first popularized in the United States in 1982 by Barry Bluestone and Bennett Harrison in their book, The Deindustrialization of America. They defined it as widespread disinvestment in industrial capacity, caused mainly by the transfer of productive industrial investment into unproductive and speculative acquisitions, mergers, and foreign investments. Bluestone and Harrison believed that deindustrialization, as they defined it, occurred primarily after 1970, but if one defines it more broadly, as a loss in industrial capacity and even more critically a loss in manufacturing jobs, the phenomenon has a much longer history in American cities.
Even in the late nineteenth century, after only a few decades of industrialism, many smaller cities experienced declining industrial output and employment. Larger cities garnered an increasing share of manufacturing output and employment at the expense of their smaller counterparts. In the railroad age, transportation rates and services favored larger cities. As the extent of mass production increased, larger cities provided economies of scale, larger and more specialized labor markets, and better access to financial resources. And in an age of more rudimentary communications, bigger cities provided superior access to technological innovations and new industrial methods.
In the first half of the twentieth century, urban deindustrialization took new forms. During the 1920s, a number of specialized industrial centers, particularly those in coal mining and textiles, saw the beginnings of long-term loss of jobs and industrial capacity. The textile industry, foreshadowing later developments in other kinds of manufacturing, moved south to areas of lower wages and less urbanization. Some textile cities, such as New Bedford and Fall River, Massachusetts, never recovered from the economic decline that began during the 1920s.
While deindustrialization in the late nineteenth century had occurred mainly in smaller cities, in the twentieth century the most important manifestation of the phenomenon was the movement of industry from larger central cities to suburbs. As early as 1915, Graham R. Taylor in Satellite Cities: A Study of Industrial Suburbs described the shift of factories to fringe areas that was already well under way in many American metropolises. In the 1920s, the proportion of manufacturing employment located in central cities declined in every American city with more than 100,000 residents. During World War II, factories built under the auspices of the Defense Plant Corporation were rarely located in central cities, furthering the trend toward dispersed manufacturing. Industry remained the largest employer in central cities into the 1950s, but in that decade the race away from core areas accelerated. The 16 largest and oldest central cities lost an average of 34,000 factory jobs between 1947 and 1967, while the suburbs of those metropolises gained an average of 87,000. By 1963, more than 50 percent of industrial employment was located in suburbs, and by 1981 almost two-thirds of manufacturing was found there.
A number of factors contributed to the suburbanization of industry during the twentieth century. The growing use of trucks, particularly after the 1920s, ended the absolute necessity of railroad connections and allowed manufacturing to move out of central cities into suburbs where land was cheaper and taxes were lower. Cheaper land was particularly important as assembly-line production, which favored large, one-story factories, became common. The dominance of electric power in factories after World War I also freed them from the constraints of central coal transfer facilities required to generate steam. The widespread ownership of automobiles and the mass movement of Americans to suburbs meant that industry no longer relied on a labor force that walked or rode streetcars to work.
Many central cities tried to reverse the trend of deindustrialization in the 1950s and 1960s by clearing land for light industrial development, using funds made available by the federal urban renewal program. In cities such as St. Louis, Milwaukee, Cincinnati, and Boston, these projects failed to revitalize industry, and the municipalities instead had to sell the land for warehouses and service activities, the latter of which were increasingly dominating urban employment.
After 1970, urban industrial decline, particularly the loss of factory jobs, became more pronounced and spread beyond older central cities to suburban areas and to newer cities in the Sunbelt. A significant part of the job loss resulted from foreign competition. Economies such as those of Japan and Germany had rebuilt their industrial sectors after World War II and, because of this retooling, possessed advanced technological bases. The United States had accepted much of the cost and responsibility for defending these nations, thus freeing their capital and expertise for sophisticated civilian industrial activities. In addition, as American industrial technology and methods spread around the world, American industry found it increasingly difficult to compete with mass-produced goods made in countries where labor received lower wages.
Also contributing to the manufacturing crisis of the 1970s and 1980s was the “paper entrepreneurialism” of the period. Leveraged buyouts, hostile takeovers, “greenmail” defenses, and an overall preoccupation with short-term profits diverted capital and talent from more productive activities. When real investment was made in industry, it often paid for new technologies such as robots that reduced the need for workers in the already shrinking industrial job base.
The 1970s was a particularly devastating decade for the industrial base of America’s largest cities. Losses of manufacturing jobs ranged from 25 percent in Minneapolis to 38 percent in Youngstown, Ohio, and 40 percent in Philadelphia. These trends continued in the 1980s, when even Chicago’s relatively healthy economy lost 33 percent of its factory jobs.
By the middle of the 1990s, American industry had recovered some of these losses. Some metropolises, such as Pittsburgh, Buffalo, Cincinnati, Phoenix, Boston, and San Francisco, had shifted to a high-technology manufacturing base. And some writers described advanced industrial cities, such as Cleveland and Pittsburgh, where “knowledge workers” in advanced service sectors provided command-and-control administration to worldwide industries.
As promising as these developments might have been for the economy as a whole, they did little to stem the national slide in manufacturing employment, which dropped from 35 percent of the workforce in 1947 to 16 percent in 1993. And the metropolises and central cities suffered even heavier losses. New factories, such as those built by foreign investors in industries like automobiles, were almost always built outside of central cities and were usually located outside of metropolitan areas as well. Meanwhile, the drive toward automation continued unabated.
In the late nineteenth century, the industry of the United States was overwhelmingly urban. One hundred years later, few factories could be found in America’s central cities, and the suburbs were also losing manufacturing employment. The loss of relatively well paying manufacturing jobs, particularly in central cities, had become one of the most serious economic and social problems of urban America.