“This question of technology leading to a reduction in demand for labor is not some hypothetical prospect,” says Larry Summers, former U.S. secretary of the Treasury and president of Harvard University. “It’s one of the defining trends that has shaped the economy and society for the last 40 years.”

Summers’s view represents conventional wisdom across the political spectrum and around the world, especially with respect to the industrial economy. Factory automation “has already decimated jobs in traditional manufacturing,” said the late physicist Stephen Hawking. The New York Times calls automation “The Long-Term Jobs Killer” and editorializes that “many economists believe that automation has had a much bigger impact” than trade on manufacturing jobs. Speaking just before Brexit, former British Conservative Party leader Iain Duncan Smith attributed global political turmoil in part to “automation and technological change,” to which “so many manufacturing jobs have already been lost,” while Brookings Institution scholar Mark Muro said just after Donald Trump’s election that it was “secretly about automation.”

These ideas seem self-evident to many people because blue-collar workers have been hurting for a long time. Manufacturing employment has fallen by nearly one-third since 2000, and millions of less educated Americans—especially men—have abandoned the labor force entirely. Wages have stagnated since at least 1980; only half of Americans in their thirties are better off financially than their parents were at the same age a generation ago. Most people expect the next generation to fare even worse.

Economists often blame automation. Technology, they argue, allows us to produce more output with fewer people, displacing less skilled workers from high-paying factory jobs into the lower-paying service sector or sending them out of the workforce entirely. Breakthroughs in robotics and artificial intelligence will only accelerate the trend, threatening perhaps the majority of jobs in the decades ahead.

Yet these claims find no support in either the data of recent economic performance or careful analyses of future labor-force trends. If automation were rendering workers obsolete, we would see evidence in rising productivity, major capital investments, and a shift in the ratio of production workers to managerial workers. None of these things has occurred. If technology could render workers obsolete, the radical advancements of past generations should have done it. They did not. If this time is different, we should find evidence that a large share of current workers are uniquely vulnerable to the particular set of technologies on the horizon. We do not.

What we find, instead, is that the industrial economy has stalled. Technology-driven productivity gains have continued as in the past, if a bit slower. But whereas output used to grow at least as quickly, it now grows barely at all. The dynamic has shifted from one in which workers produce more each year, and total output rises, to one in which fewer workers are needed each year to deliver roughly the same output as the year before.

What has changed, and what deserves our attention, is not the trend in automation but rather the dramatic slowdown in output growth. This has its own explanation, one that makes far more sense than the idea that the blessings of rising productivity have suddenly turned wicked: we have pursued a wide range of public policies that harmed the industrial economy broadly—the manufacturing sector, in particular—and the blue-collar workers relying on them. Only by acknowledging that reality, rather than scapegoating technological trends beyond our control, can we begin to make amends.

Mathematically, automation does destroy jobs. If an activity that required ten workers can, with automation, be done by five, then the economy can maintain its prior level of output with five fewer jobs. Automation is just one specific case of a more general rule: every means of increasing the rate of output per worker—measured as “productivity”—can also be understood as reducing the number of workers required to achieve the existing level of output.

This is not, generally, a problem. To the contrary, producing more stuff with less labor—whether through improved skills, more efficient processes, or better tools—is, by definition, how a society achieves economic progress. This means that any time we talk about rising productivity, wages, and standards of living, we are also talking about reducing the need for workers providing the current products and services.

The crucial question is what happens to output as productivity rises. If we achieve the 2.8 percent annual productivity growth that translates to a 100 percent increase after 25 years—the typical worker producing twice as much as a generation earlier—this also means, using the language of the automation debate, that every 25 years, we will destroy half of the economy’s jobs. And that would indeed be the result, if output remained at its initial level. But if output also doubles, then everyone remains working and material living standards can double, too. This is precisely what happened from 1947 to 1972, widely seen as the golden age for American manufacturing and the nation’s middle class. Economy-wide productivity increased by 99 percent; only 50 workers were needed by the end of the Vietnam War to do the work that 100 could complete at the end of World War II. The result was not mass unemployment. Instead, America produced more stuff. The same share of the population was working in 1972 as in 1947, and men’s median income was 86 percent higher.
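The job-count arithmetic in this paragraph is easy to verify. Here is a toy calculation in Python, using the growth rate cited above (the 100-worker baseline is purely illustrative):

```python
# Employment needed = output / (output per worker). At 2.8 percent annual
# productivity growth, output per worker roughly doubles in 25 years.

def workers_needed(output, output_per_worker):
    """Workers required to deliver a given output at a given productivity."""
    return output / output_per_worker

productivity = 1.028 ** 25              # ~1.99: output per worker ~doubles

# Output frozen at its initial level: half the jobs disappear.
print(round(workers_needed(100, productivity)))                  # ~50

# Output doubles along with productivity: employment is unchanged.
print(round(workers_needed(100 * productivity, productivity)))   # 100
```

The same identity cuts both ways: whether a productivity gain reads as "job destruction" depends entirely on what happens to output.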

Increasing output can take different forms. Between 1947 and 1972, workers produced greater quantities of the same goods and services, which were now more affordable for more consumers. U.S. vehicle sales doubled during this period, as did vehicle-miles-traveled per person; more than 85 percent of cars and trucks were still manufactured domestically. Goods and services also improved in quality—vehicles were safer and lasted longer, and some came with power steering and an automatic transmission. And with fewer workers required to produce the output of 1947, many could serve markets in 1972 that hadn’t existed a generation earlier, or that had been much smaller. In 1947, U.S. airlines flew 13 million passengers a combined 6.5 billion miles; in 1972, they flew 191 million passengers 152 billion miles. Fatalities per mile flown, meanwhile, fell more than 95 percent.

This is how growth works. When agriculture mechanized, we didn’t continue living as subsistence farmers while lamenting that nearly all our able-bodied adults were now jobless. We produced more and better food. Then we produced more of other things, too. For that matter, when we introduced hundreds of thousands of ATMs across the country, we did not design welfare programs for the armies of unemployed bank tellers—because bank-teller employment never even declined. As Boston University’s James Bessen has shown, ATMs lowered the banks’ cost of doing business, and they responded by opening more branches.

In theory, modern automation could produce different effects from what it did in the past—if, for instance, the pace of change accelerated to the point that workers became unnecessary in existing roles much faster than they could find new ones. But when the New York Times argues that “this time may really be different” because “over the same 15-year period that digital technology has inserted itself into nearly every aspect of life, the job market has fallen into a long malaise,” it is passing off correlation as causation. If we are to believe that the very process most responsible for economic progress and middle-class prosperity over the past century is now undermining those accomplishments, we should want to see some proof.

In fact, the evidence is plentiful—and it points in the opposite direction. Automation has not spurred accelerating productivity. During the second half of the twentieth century, economy-wide productivity increased by an annual average of 2.1 percent. From 2000 to 2016, the increase was only 1.8 percent, and slowing: from 3.2 percent (during 2000–2005) to 1.9 percent (2005–10) to 0.7 percent (2010–15). In 2016, productivity declined for the first time since the early 1980s.

The manufacturing sector tells a parallel story: no better progress in the early twenty-first century than in the late twentieth, a steadily worsening picture, and catastrophic recent performance. From 2010 to 2016, productivity growth in manufacturing averaged less than 1 percent annually—a period of unprecedented stagnancy.

These data yield slightly different results depending on the particular metrics and time frames used: an industry’s total output or only its “value added”; output per worker or per hour worked; or figures published by the Bureau of Labor Statistics or the Bureau of Economic Analysis. But no view alters the underlying reality that recent productivity growth is no better, and likely much worse, than in the past.

Might the data be misleading? Certainly, measuring output and productivity is tricky. Debate rages, for instance, over how best to account for the rapid improvement in digital products. If the same workers can produce a computer processor this year with twice as many transistors as last year’s, are they twice as productive? Some economists worry that taking such rapid productivity gains at face value overstates the real-world effects. After all, the ability to make faster computers this year does not mean that the previous year’s computers could be made with half the workers. Nor does a computer with twice the processing power offer fully double the value to the typical user.

Meantime, other economists believe that accounting methods developed almost a century ago are understating gains by missing much of the value created by newer types of products and services. What is a smartphone worth, if buying separately its camera, map, encyclopedia, music player, and so forth would have cost thousands of dollars only a decade ago—and a memory chip of comparable capacity would have cost millions of dollars?

Other indicators provide no better support for the claim that automation is spurring disruptive productivity increases. Larry Mishel and Josh Bivens of the Economic Policy Institute show that growth in capital investment generally, and in information technology particularly, has been slowing and stands well below pre-1990 levels. Robert Atkinson and John Wu of the Information Technology & Innovation Foundation likewise find that the rate of labor-market “churn”—employment disappearing from some occupations or appearing in others—stands at an all-time low.

Nor, within the manufacturing sector, are blue-collar jobs under particular pressure. If automation were accelerating, allowing factories to produce more with fewer people on the shop floor, the share of sector employment classified as “production and nonsupervisory” should decline. In the past, that happened. From 1947 to 1972, the production-worker share of the workforce fell more than ten points, from 87 percent to 76 percent—or, put more dramatically, the ratio of line workers to managers fell by half, from 6.8:1 to 3.2:1. By 1982, the share had reached 71 percent—a 2.4:1 ratio. Since then, for an entire generation, there has been virtually no change. In 2016, the share stood at 70 percent.

A final line of evidence comes from econometric studies exploring the statistical relationship between robot installation and employment. A landmark 2015 study from London’s Centre for Economic Performance found that the introduction of industrial robots corresponded to faster productivity growth during 1993–2007, without affecting overall manufacturing employment, though high-skilled workers did see some gains at the expense of low-skilled ones. Another 2015 study, by the authors of the “China Shock” paper showing strong negative effects from exposure to Chinese trade, found that technological change did cause some employment shifts within sectors but did not reduce employment overall. Further, while the effect of trade exposure was increasing over time, the effect of technology seemed to be dissipating.

One prominent study does report a negative relationship between robots and jobs, but the magnitude of its finding is instructive. Released in 2017 by the National Bureau of Economic Research, the paper was headlined in the New York Times as “Evidence That Robots Are Winning the Race for American Jobs” and in the Washington Post as “We’re So Unprepared for the Robot Apocalypse.” The actual effect identified: a loss of 360,000–670,000 jobs over 17 years, of which only half were in manufacturing. This amounts to a loss of only 15,000 manufacturing jobs per year, roughly 0.1 percent of average manufacturing employment during the period. At that rate, robots might, over a century, eliminate as many manufacturing jobs as were, in fact, lost during 2001.
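The back-of-the-envelope math behind that magnitude can be sketched in a few lines of Python (taking the midpoint of the study's range and a rough 15 million average for manufacturing employment are illustrative assumptions, not figures from the study itself):

```python
# Convert the study's headline job-loss range into a per-year figure.
low, high = 360_000, 670_000           # total jobs lost over 17 years
years = 17
mfg_share = 0.5                        # "only half were in manufacturing"

midpoint = (low + high) / 2            # ~515,000 jobs
mfg_per_year = midpoint * mfg_share / years
print(round(mfg_per_year, -3))         # ~15,000 manufacturing jobs per year

# Against a rough 15 million average manufacturing employment:
print(round(mfg_per_year / 15_000_000 * 100, 2))   # ~0.1 percent per year
```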

The difference this time, and the cause of economic distress, isn’t that productivity is rising. It’s that output is not. From 1950 to 2000, while productivity in the manufacturing sector rose by 3.1 percent annually, value-added output grew by 3.6 percent—and employment increased, from 14 million to 17 million. During 2000–2016, productivity rose by a similar 3.3 percent annually. But output growth was only 1.1 percent—and employment fell, from 17 million to 12 million. Even with all the technological advancement of the twenty-first century, had manufacturers continued to grow their businesses at the same rate as in the previous century, they would have needed more workers—a total of 18 million—by 2016.
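The relationship among these growth rates follows from the identity employment = output ÷ productivity. A rough compounding sketch in Python (the annual averages are from the figures above; simple compounding at those averages lands close to, but not exactly on, the actual employment counts):

```python
def project_employment(start_workers, output_growth, productivity_growth, years):
    """Project employment when output and productivity compound at fixed annual rates."""
    ratio = (1 + output_growth) / (1 + productivity_growth)
    return start_workers * ratio ** years

# 1950-2000: output +3.6%/yr outpaced productivity +3.1%/yr, so jobs grew.
print(project_employment(14e6, 0.036, 0.031, 50) / 1e6)   # ~17.8 million

# 2000-2016: output +1.1%/yr lagged productivity +3.3%/yr, so jobs fell.
print(project_employment(17e6, 0.011, 0.033, 16) / 1e6)   # ~12 million

# Counterfactual: 2000-2016 with the old 3.6% output growth -> roughly
# the 18 million workers cited in the text.
print(project_employment(17e6, 0.036, 0.033, 16) / 1e6)
```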

Faced with this reality, efforts to blame job losses on automation require the fatally flawed assumption that output is not supposed to increase. Holding output constant and comparing the past’s lower productivity with the present’s higher productivity will always reveal that employment would be higher but for the productivity gain. This is how, for instance, an oft-cited report from Ball State University reaches the conclusion that 88 percent of manufacturing job losses are attributable to automation: “Had we kept 2000-levels of productivity and applied them to 2010-levels of production, we would have required 20.9 million manufacturing workers. Instead, we employed only 12.1 million.”

But, of course, we’re employing fewer manufacturing workers than if productivity had not increased—and the same could be said about every decade. Using this reasoning, we could say: “Had we kept 1960 levels of productivity and applied them to 1970 levels of production, we would have required 25 million manufacturing workers. Instead we employed 18 million.” Did automation destroy 7 million jobs in the 1960s? Maybe. But this wasn’t a problem, because output rose 62 percent, so manufacturing employment was higher at decade’s end. By contrast, the output of the manufacturing sector in 2016 was only 3 percent higher than in 2006. Single years achieved higher growth than that 32 times during 1950–2000, at least five times during every decade before 2010.
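The counterfactual method can be written out as a one-line formula: hold output at its actual end-of-period level, divide by start-of-period productivity, and call the difference "jobs lost to automation." A sketch in Python, using the figures above (workers in millions):

```python
def jobs_blamed_on_automation(actual_workers, productivity_ratio):
    """Counterfactual workers (today's output at yesterday's productivity)
    minus actual workers. productivity_ratio = end / start productivity."""
    counterfactual = actual_workers * productivity_ratio
    return counterfactual - actual_workers

# Ball State, 2000-2010: 12.1M actual workers vs. 20.9M counterfactual.
print(round(jobs_blamed_on_automation(12.1, 20.9 / 12.1), 1))   # 8.8

# The 1960s by the same logic: 18M actual vs. 25M counterfactual, so
# 7M jobs "destroyed"—even as manufacturing employment rose that decade.
print(round(jobs_blamed_on_automation(18, 25 / 18), 1))         # 7.0
```

Applied to any period of rising productivity, the formula mechanically "finds" job losses, whatever actually happened to employment.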

We want to know what has changed in our economic trends that has dislocated so many manufacturing workers and produced so much distress. The answer is not higher productivity growth; it is slower growth in output.

Of Hair Dryers and Haircuts

Does manufacturing even matter? Christina Romer, former chairwoman of President Obama’s Council of Economic Advisers, once observed that “American consumers value health care and haircuts as much as washing machines and hair dryers.” An emphasis on manufacturing, she worried, derived from “sentiment and history” and “the feeling that it’s better to produce ‘real things’ than services.”

Manufacturing matters for two reasons. First, it remains among the most productive economic activities for less educated workers. As a share of overall employment, manufacturing has fallen below 10 percent, but in 2016, the industry still accounted for more than 20 percent of private-sector jobs in traditionally blue-collar occupations with a median wage above $15 per hour. The manufacturing, construction, and resource-extraction industries combined to provide almost 40 percent of such good-paying, blue-collar jobs.

Second, “tradables”—goods and services that can be consumed far from where they’re produced—are the lifeblood of local economies. Americans take for granted that they can buy whatever products they want from around the world. But how can someone whose work consists entirely of serving others in his community expect a firm halfway around the world to make something for him? Consider the local physician who provides care only to those in his town. He may be well compensated, but he can’t sell his work to the makers of cars or phones halfway around the world or even to the medical-equipment supplier in the next state. He must trust instead that some of his patients produce goods or services that can be sent to those places and, in purchasing his medical services, give him the resources to acquire the goods that he needs.

Or consider the plight of a local economy as a whole. It wishes to receive from elsewhere almost all of its food, medicine, vehicles, electronics, energy, and more. It must send tradables of equal value. Not every individual must do so; most may work in the local services economy—but they cannot all cut one another’s hair.

What happens to a community whose economy does not produce anything that the world wants? It has one export that it can always fall back on: need. Every resident enrolled in a program of government benefits entitles the community to more goods and services from the outside world. For instance, a prominent criticism of recent proposals to cut food-stamp benefits has been that the cuts would harm not just the individual recipients but also the local economies reliant on the outside income. Food-stamp recipients, in effect, are the community’s “exporters.”

Some speak of local health-care systems as bright spots in depressed regions, but such industries usually indicate only that the government’s commitment to health care is the surest way to generate hard currency for these economies. A retiree on the front porch laments to a reporter, “When I was young we had dances at the community centers. Now they have nothing. No work around here unless you are a nurse, or a doctor, or lawyer.” The list is not of especially productive professions, just those for which some government will pay. A common sight in the most dilapidated town is a sparkling occupational-therapy office. The people working there are selling to the nation’s taxpayers their care of the local residents on disability.

The automation question shapes debates about not only past economic trends and public policies but also those to come. President Obama, in his farewell address, warned that “the next wave of economic dislocations . . . will come from the relentless pace of automation that makes a lot of good middle-class jobs obsolete.” The Atlantic’s cover story, “A World Without Work,” introduces its topic thus: “For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving.”

The grandest visions emanate from Silicon Valley, whose technologists believe that they will lead a social transformation. Many have become interested in the policy proposal known as “universal basic income,” by which government would pay a livable stipend to the no-longer-employable masses. “Robots will be able to do everything better than us,” warns Tesla’s Elon Musk. “There is a pretty good chance we end up with a universal basic income, or something like that, due to automation. . . . I am not sure what else one would do.”

Such expectations for automation get four things wrong. First, they magnify the importance of innovations at the cutting edge while taking for granted the equally fundamental innovations of the past. Advanced robotics and artificial intelligence are impressive, but so were the Internet, the computer, the internal combustion engine, and electricity, to say nothing of democratic capitalism and the industrial revolution. Yet, as Northwestern’s Robert Gordon shows, for no period in the past two centuries has growth in American output per person exceeded 2.5 percent. If anything, the innovative leaps of the past benefited from access to the lowest-hanging fruit. The elements of human labor easiest to replace with machines—the processes easiest to optimize—are the ones that get tackled first. In some respects, technological progress accelerates exponentially—the processing power of computer chips has been doubling every year or two for 50 years. But in others, it becomes ever harder to maintain even a constant rate of progress, as operations managers struggle to find yet another 2 percent to 3 percent efficiency gain on top of the last year’s. Accelerating technological progress does not produce accelerating productivity; it is the minimum required to deliver even constant productivity gains.

Second, these grandiose claims ignore the gradual timeline on which transformations inevitably occur. Technology takes time to adapt by fits and starts to real-world conditions. And organizations take time to adapt to new technologies. Thus, for instance, the head of Google’s self-driving-car project acknowledged in 2016 that the company’s goal of fielding a fully autonomous vehicle by decade’s end was implausible—the timeline was more like 30 years. “How quickly can we get this into people’s hands?” he asked. “If you read the papers, you see maybe it’s three years, maybe it’s thirty years. And I am here to tell you that honestly, it’s a bit of both. . . . This technology is almost certainly going to come out incrementally.”

And even after such new technology exists, it takes decades longer to penetrate fully its various applications. Existing vehicles will need to reach the end of their useful life, and new processes will have to be developed to capture the benefits. An automated delivery truck sounds wonderful, until it arrives and no one hops out to unload.

Deployment is always slow. Thomas Edison presented a lightbulb lit with power from his Pearl Street generating station in 1882, but 40 years later, less than 10 percent of the nation’s 6 million farms had electricity. Digital is not inherently faster. Walmart took 18 years to grow from $1 billion to $100 billion in U.S. sales. Amazon, likewise, hit the $1 billion mark in 1999 but took until 2017 to reach $100 billion across North America.

In manufacturing, every year will see extraordinary upgrades in some processes. But firms will have to decide how many areas they want to risk changing at one time and how much capital they can devote to the effort. With each decision, they will have to balance the benefits of investment in the ideal tool for a specific task with the loss of flexibility that comes from committing to that tool and that task. Even the steel industry, a poster child for automation, required 40 years to increase its per-worker output from 260 tons to 1,100 tons—an annual improvement of less than 4 percent. Across the manufacturing sector globally, the Boston Consulting Group reports that the stock of installed robots grew only 2 percent to 3 percent per year during 2005–14—no faster than total manufacturing output.

As Rodney Brooks observed in MIT Technology Review, “I regularly see decades-old equipment in factories around the world. I even see PCs running Windows 3.0—a software version released in 1990.” Anyone who has peered around a checkout counter to see the readout on the cashier’s screen knows this feeling. Brooks continued: “The principal control mechanism in factories, including brand-new ones in the U.S., Europe, Japan, Korea, and China, is based on programmable logic controllers, or PLCs. These were introduced in 1968. . . . I just looked on a jobs list, and even today, this very day, Tesla Motors is trying to hire PLC technicians at its factory in Fremont, California.”

The state-of-the-art Tesla plant has faced severe challenges, missing deadlines and postponing targets. In 2016, the company announced a goal of producing 100,000 to 200,000 of its new Model 3 sedans in the second half of 2017; in fact, it managed just 2,700. AutoWeek notes that, far from being a “temple of lean manufacturing,” the plant produced only about ten vehicles per worker in 2016, across all Tesla models. By comparison, when General Motors and Toyota launched their joint venture on the same Fremont site in the 1980s, they produced 26 vehicles per worker in the first year; by 1997, that rate had reached 74. And yes, despite productivity tripling, the plant had also doubled its workforce.

The ultimate check on the rate of productivity growth is the labor force itself. Firms can adopt new technologies and processes only as quickly as they can train workers to operate and perform them. The automated factory may require a few more maintenance technicians and many fewer line workers, but if those line workers can’t become technicians, where will firms turn? And even if the same workers can be retrained, instilling new skills can take years. If different workers with different capabilities are needed, more fundamental shifts in the labor market must occur. Schools must develop new programs, and talent must seek other career paths; manufacturers will need to attract higher-skill talent from other industries—and those industries may have something to say about that.

The dynamic has already emerged, in the “skills gap” that manufacturers lament. They say that they need hundreds of thousands of new workers with more advanced skills than their existing workforce can supply, a problem they present as a market malfunction. On the contrary, the market is working properly and sending a clear signal: automation will be harder, slower, and more expensive than you may like; and to make it work, you’ll have to design your processes with the capabilities of the nation’s workforce in mind.

This leads to the third misconception about expectations for automation: that many jobs can be automated entirely. Technology often makes incremental improvements to workers’ productivity, generally leading to more and higher-quality output rather than lower demand for their work. One reason people worry that this time may be different is that they believe that robotics and artificial intelligence can now fully replicate human functioning and can therefore substitute for, rather than complement, the worker. Driving the fear of replacement are self-promotional quotes from Silicon Valley moguls about their plans for transforming society, as well as Oxford University’s much-cited 2013 study warning that 47 percent of U.S. jobs are automatable within “a decade or two.” Among the most “computerizable” of 700 occupations, say the study’s authors, are tour guides, real-estate agents, and fashion models. They rate school bus driver as among the easiest jobs to automate, along with baker and rail-track layer, and well ahead of other types of truck and bus drivers.

Oxford’s research usefully illustrates the limitations of such studies. From a tall enough ivory tower, or a heady corner of Silicon Valley, the claim about school bus drivers might seem to make sense. What could be easier than driving a school bus? The route is the same every day, it’s short, and it gets canceled for snow. For parents, though, the idea of locking 20 kids in a self-driving vehicle for half an hour, with no adult supervision, sounds dubious at best.

These misfires point to a more general problem with automation predictions: an abstract description rarely captures the full complexity of any job. Ironically, broad classification of jobs as automatable looks like something that a computer might produce if one took such classification work as an automatable task—rote and lacking in nuance and specificity. When researchers at the Organisation for Economic Co-operation and Development tried focusing on the actual tasks required of various jobs, instead of the occupation description itself, they found that only 9 percent of jobs were automatable. McKinsey & Company, likewise, finds that only 5 percent of jobs could be entirely automated with technology that has at least been demonstrated in labs, whereas 60 percent of jobs could have at least 30 percent of their activities automated; overall, the analysis forecasts that half of tasks could be automated by 2055. This implies that automation could drive a doubling of productivity over 40 years, or less than 2 percent annually.
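The closing arithmetic is easy to check: if half of all tasks are automated over the roughly 40 years to 2055, the implied doubling of productivity compounds to well under 2 percent a year.

```python
# Annual growth rate implied by a doubling of productivity over 40 years.
years = 40
annual_rate = 2 ** (1 / years) - 1
print(round(annual_rate * 100, 2))   # ~1.75 percent per year
```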

Finally, dire predictions ignore the positive. Even stipulating the unprecedented nature of future technological changes, the full implications of those changes extend far beyond automating low-skill jobs—and no law dictates that they should operate on balance to the disadvantage of workers. A tool like 3-D printing, for instance, has the potential to reduce the demand for labor in large factories. But it might also spawn new industries, in which smaller businesses can perform small-scale, highly customized manufacturing with relatively little capital investment. If large factories begin to employ highly skilled and compensated robotics specialists, such specialists will relocate to the communities where the factories operate, changing demographic and economic profiles for the better.

If a truck driver can leave his cab and work from a central control facility, in which he toggles to whichever of numerous trucks requires human guidance at a particular moment, he will likely be happier, healthier, and better paid as well. If e-commerce supplants retail, the countless sales jobs concentrated in high-cost urban centers will be redistributed as logistics, delivery, and customer-service positions. Michael Mandel of the Progressive Policy Institute finds that e-commerce is creating new jobs much faster than retail is destroying them, and those jobs pay better, too. Anyway, employment in the retail sector reached an all-time high in 2017, with the share consisting of nonsupervisory and production workers unchanged from 1997.

By all means, we should continue scanning the horizon for a hypothetical threat that some fear might materialize. But we cannot let those fears derail the real-world discussion about how to fix the problems that we actually have. Historically, economists and policymakers have led the effort to explain that technological innovation is good for workers throughout the economy, even as its “creative destruction” causes dislocation for some. So why, suddenly, are economists so eager to throw robots under the bus? It’s hard to avoid the conclusion that they wish to direct attention away from themselves. Without automation to blame, the stalling of the nation’s industrial economy begins to look like the result of poor policy choices, not unstoppable forces.

A good place to start would be the regulatory frameworks that have ballooned the cost of industrial activity. Studies of the Clean Air Act have found that the stricter dictates imposed on regions with higher pollution levels eliminated hundreds of thousands of jobs, slowed the creation of new plants by 26 percent to 45 percent, and reduced the sector’s profitability by 9 percent. The broader regime of environmental regulation has not only raised costs but also lengthened timelines and created uncertainty over the ability to proceed with projects. It has also raised energy costs and stalled infrastructure upgrades.

In other ways, government has made it harder, riskier, and less profitable to build businesses that employ large numbers of Americans to make things. Aggressive safety requirements harnessed to a costly litigation system might be justifiable in isolation, but not when duplicated by sclerotic unions that pile burdens higher. The decline in unionized manufacturing jobs over the past four decades accounts for more than 100 percent of the total job loss; nonunion manufacturing employment has actually increased. (See “More Perfect Unions,” The Shape of Work to Come, 2017.)

Other misguided policies have eroded the prospective workforce. Repeated expansions of the safety net, extending to programs even for able-bodied men, have not only made a job less necessary but also created penalties for taking one. The education system spurned vocational training in favor of college for all. Hundreds of billions of dollars flowed annually toward the fortunate half of students headed for postsecondary degrees, while society designated the other half as losers and offered them nothing. The cultural imperative to work fell away, and many young people chose simply to stay home.

Finally, any discussion of automation as the dog that has not barked requires acknowledgment of the dog barking loudly: international trade. For all its myriad benefits to consumers, trade drags heavily on domestic manufacturing output when goods previously made in America are instead imported from abroad and no commensurate rise follows in overseas demand for what Americans can make, as is the case with our trade relations with China. The explosion of the U.S. trade deficit since the 1990s reflects growth in American consumption of manufactured goods, but with the associated growth in output occurring far from home.

Nor are jobs lost to trade comparable with those lost to automation. Productivity gains occur bit by bit, year by year, but a plant that shuts down and moves overseas is here today and gone tomorrow. Compare also the situation facing a worker dislocated by trade with that of a worker laid off or never hired because of automation. In the former case, the facility is gone, and the production now occurs somewhere else. In the latter, the facility is still operating, likely producing more output than before. Total demand for labor from the firm and its surrounding ecosystem is likely larger, and if capital has replaced labor, the remaining workers are likely earning more—and some other highly paid professionals may be arriving in town with new demand for services of their own. In which town might it be easier to find a new, good-paying job?

Recall that one study did find a small negative effect on employment from robotics—on the order of 30,000 jobs lost per year. Using a similar methodology but focusing on the effects of exposure to trade with China, researchers found annual job losses five times larger. A reporter telling the story of automation speaks to a laid-off worker at a lunch counter; one telling the story of trade reports from the empty parking lot of an abandoned building. There is no one left to talk to.

In the discussion of what has happened to America’s working class, automation is a distraction. While it can be hard to acknowledge that the problem is instead one of our own making, the news is not all bad. If bad public policies rather than irresistible forces are responsible for the nation’s predicament, then better policy choices that prioritize the needs of the industrial economy can bring better results.



City Journal is a publication of the Manhattan Institute for Policy Research (MI), a leading free-market think tank.
