It’s mostly accepted wisdom in policy circles: automation means that manufacturing employment is destined to follow the historical example of farm work, becoming a negligible share of the U.S. workforce. Even those who embrace the Trump administration’s efforts to “onshore” factories say that we should be “honest about the facts,” especially to those Americans still out of work. The erudite Ben Sasse, a senator with a Ph.D. in history, has called the transformation an “irreversible trend.”
On the other hand, the latest Bureau of Labor Statistics (BLS) data show that over the past 12 months, industrial domains created twice as many jobs as did the health-care sector, and IT, retail, leisure, transportation, and government saw little net job creation. As part of the industrial sector, the BLS counts “goods-producing” activities in manufacturing, construction, and mining—all of which saw employment expand over the past year. Manufacturing alone added 25 percent more jobs than did “professional and technical services.”
What gives? The received wisdom is that we, the advanced “postindustrial” nations, live in a time of declining demand for factory work and the “old” manual-centric trades that produce “goods.” So is this recent trend a short-term aberration? Sasse is not the only one prophesying the disappearance of American manufacturing employment; lots of respected analyses say that we’re at the beginning of a long-term trend in which artificial intelligence and robots will ultimately wipe out factory labor-hours. But it is precisely in the long term that the postindustrial trope—the analogy between what happened to agricultural labor and what will happen, we’re told, to factory work—breaks down.
Farm jobs disappeared for a simple reason: technology vastly reduced the labor-hours needed to produce a unit of food, while the growth in demand for food was (and remains) limited. Unlike food, however, the demand for things that we invent and make is essentially unlimited. Demand for food rises with population growth, boosted as the malnourished eventually come to eat about as much as people in developed nations. Once that threshold is reached, though, there is a limit to how much people eat: per capita calorie consumption in even the richest and fattest countries is only about twice the subsistence level.
The big changes in societal consumption are found in the variety and quantity of fabricated things that we own or use. Globally, unmet demand for manufactured items is at least 10 to 100 times greater than present consumption. In emerging economies, per capita ownership of everything from air conditioners and cars to computers and home furnishings is so low that potential demand growth far outstrips anything in agriculture. Thus, even big gains in manufacturing productivity—fewer labor-hours per unit of output—can leave overall factory employment largely unchanged, assuming that global demand keeps growing.
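The arithmetic behind that claim is simple: total factory labor-hours equal units demanded times labor-hours per unit. A minimal sketch, using purely hypothetical numbers chosen for illustration (none come from the BLS data cited above): if productivity doubles, halving the hours per unit, while demand also doubles, total labor-hours are unchanged.

```python
# Illustrative sketch: factory labor-hours = units demanded x labor-hours per unit.
# All numbers are hypothetical, chosen only to show the arithmetic.

def factory_labor_hours(units_demanded: float, hours_per_unit: float) -> float:
    """Total labor-hours needed to meet a given demand at a given productivity."""
    return units_demanded * hours_per_unit

# Baseline: 1 million units, 10 labor-hours each.
baseline = factory_labor_hours(units_demanded=1_000_000, hours_per_unit=10.0)

# Later: productivity doubles (5 hours per unit) while demand also doubles.
later = factory_labor_hours(units_demanded=2_000_000, hours_per_unit=5.0)

print(baseline)  # 10000000.0
print(later)     # 10000000.0 -- same total labor-hours despite the productivity gain
```

This is the key disanalogy with farming: there, hours per unit fell while demand was capped, so the product shrank; for manufactured goods, demand growth can offset the productivity gain.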
Compared with America circa 1870, today’s productivity in both farming and industry is fantastically better. Farming’s share of all employment has declined from 60 percent to 2 percent. But the industrial share of employment has remained remarkably stable over that entire period.
As for the “unexplained” decline in the share of Americans employed in industry over the recent decade, we can’t blame automation, because manufacturers’ IT investment has actually been stagnant. And, over the same period, Japan and Germany experienced less than half the U.S. decline in industrial share of employment. The new factor that pushed many U.S. firms and jobs into “involuntary” offshoring over the past decade was the unprecedented proliferation of regulations covering every aspect of industrial processes and a doubling of federal regulators, which made competitive manufacturing impossible.
The mantra of the postindustrialists is always “this time it’s different.” If America is undergoing an inevitable transition to a post-manufacturing economy, then reining in regulations is irrelevant, they say. Further, as emerging nations modernize, getting richer and buying more things, they’ll increasingly do their own manufacturing anyway.
However, in a technologically sophisticated world where innovation can flourish, novel things will always be invented, creating demand for new manufactured goods. We need look back no further than a decade, to the invention of the iPhone and the hundreds of billions of dollars spent to build the Internet’s infrastructure, which has collaterally enabled all manner of new online services. The earlier advent of commercial jet travel offers another historical lesson: the hundreds of billions of dollars in annual aviation manufacturing support today’s massive vacation and leisure industry. Only 15 percent of airline passengers are business travelers.
It’s nearly impossible for pundits and economists to predict what new products will emerge. But that’s precisely where innovators and engineers come in: they don’t predict the future, they invent it. Before cars were invented, there was no demand for cars—or for paved roads. Before plastics, there was no demand for plastics. Chemical industries employed only 100,000 people early in the 20th century and now employ nearly 1 million Americans.
No one in 1963, during the Kennedy era, predicted the personal computer or the Internet. (Well, almost no one.) No one, at the time of the Chicago World’s Fair of 1893—when the world was afire with innovation—predicted the radio. Yet by 1927, one-third of the money Americans spent on furniture was spent on radios. (Commercialization of home radios drove RCA’s stock up higher and faster than has happened for any tech company in the more recent past.)
Nonetheless, it is possible to predict the future “that has already happened,” as the great management consultant Peter Drucker once said—by looking at what has already been invented but hasn’t yet been made commercially viable. The still-nascent Internet of Things will ultimately need trillions of sensors and “smart” devices, not to mention a new and expanded infrastructure far bigger in scope than the Internet of people requires. Then there’s the emergence of bio-electronics: biologically compatible computers, sensors, and communications. Think of these as smart sensors that you can “tattoo,” implant, or eat, for myriad imaginable and unimaginable diagnostic and therapeutic health functions. Bio-electronics will lead to a manufacturing industry as big as today’s $400 billion semiconductor sector.
Does anyone doubt that there will be more drones, or new classes of video displays? Innovators have already created paper-thin and conformable displays; odds are that these will one day be manufactured on the scale at which windows are made today. And while last year saw the manufacture of only about 400,000 industrial robots, 10 million service robots were sold. Demand will take off as these technologies improve—call it an “irreversible” trend. Useful robots and cobots—those that safely and intuitively work alongside people—will not only improve manufacturing but will also transform retail and security, surgery, physical rehabilitation, elder care, and hospital cleaning. If ever there were a service sector in need of a tech-driven productivity boost, it’s health care. Global robot manufacturing will eventually rival today’s $3-trillion-a-year automobile sector.
Since Joseph Schumpeter coined the phrase “creative destruction” to describe the evolution of an industrial economy, pundits and economists have been preoccupied with the “destruction” side. But it’s the “creative” feature of tech that matters. Demand for manufactured goods is on the cusp of the greatest expansion in history. The scale and nature of that demand can’t be met unless manufacturing productivity and sophistication continue to see rapid gains. Robots, both physical and virtual, will be essential to achieving those gains. In short: the argument that manufacturing employment is destined to become irrelevant is wildly overstated.
Photo by Matt Cardy/Getty Images