Economists love technological change because it translates into increased productivity. Rising productivity can then be measured, albeit imperfectly, and feeds directly into economic growth. However, while we can use standard measures to observe some forms of economic progress, other technological and scientific advances do things for humanity that the usual measures of output fail to reflect.

When a new virus attacks the human race, we get our best scientists together and fight back. It’s a horrible battle, but in the end, we win. A century ago, a flu virus killed a substantial percentage of the world’s population. Since then, immunology and virology have advanced by a quantum leap. Had Covid-19 appeared in 1900, it would have been far more devastating—the virus responsible for the 1918 Spanish flu was not identified until 1935. In 2020, science identified the virus right away, sequenced its genome within weeks, and produced vaccines within a year. Technology, more than ever, determines how we live. Yet such triumphs rarely make it to the national income accounts and do not show up in gross domestic product.

One technology that has dramatically advanced in the past century could be called “imitating life”—creating images and sounds that reproduce aspects of reality, making the observer “experience” something that he or she is not actually physically living through. Past attempts to create such a technology rarely pretended to be the real thing: while watching a movie, you knew that it was a movie. It was fake—a virtualization of reality—but if well-done, it worked. The history of imitative technology, much like that of medical technology, follows a “punctuated” pattern: for hundreds of years, the technology is more or less static; then comes a sudden eruption of new knowledge and capabilities—and the world changes irreversibly.

Human attempts to imitate reality go back into the foggy millennia of prehistory. The earliest-known paintings, found on the Indonesian island of Sulawesi, have been dated to 44,000 years ago. Medieval Christians could “witness” the birth of Jesus and the crucifixion from paintings hanging on their church walls, and they surely were moved by these images. The resemblance to reality, however, required suspending disbelief. Depth, space, and human features were rendered with little realism. During the Renaissance, painters in the West got better at mimicking reality: facial expressions became more lifelike, colors more vivid, proportions more realistic.

Most revolutionary was the invention of linear perspective, attributed to the Florentine polymath Filippo Brunelleschi (1377–1446). The basic idea was the “vanishing point”: depth is simulated on a two-dimensional plane by making receding parallel lines converge toward a single point at eye level on the horizon. The idea caught on, and a century or so after Brunelleschi, painting could achieve a far more sophisticated realism. The vivid, seemingly three-dimensional paintings of the Baroque era are far more lifelike than those of the fourteenth century. For a gorgeously vivid illustration, try Kitchen Scene (1605), by the Dutch painter Joachim Wtewael.
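
To make the geometry concrete, here is a minimal sketch, not drawn from the essay itself, of how such a projection can be expressed in Python: a simple pinhole model maps a point (x, y, z) onto the picture plane, and two parallel “rails” receding from the viewer converge on one vanishing point. The function names and coordinates are purely illustrative assumptions.

```python
# Minimal sketch of linear perspective (illustrative only).
# A pinhole projection maps a 3-D point (x, y, z) to picture-plane
# coordinates (f*x/z, f*y/z); parallel lines receding from the viewer
# then converge toward a single "vanishing point".

def project(point, f=1.0):
    """Project a 3-D point onto the picture plane at focal length f."""
    x, y, z = point
    return (f * x / z, f * y / z)

def walk_line(start, direction, steps=5):
    """Project sample points taken ever farther along a straight line."""
    return [project(tuple(s + t * d for s, d in zip(start, direction)))
            for t in (10 ** k for k in range(steps))]

receding = (0.0, 0.0, 1.0)                         # straight away from the viewer
left_rail = walk_line((-1.0, -0.5, 1.0), receding)
right_rail = walk_line((1.0, -0.5, 1.0), receding)

# Both rails drift toward (0.0, 0.0): a shared vanishing point
# at eye level, straight ahead on the horizon.
print(left_rail[-1], right_rail[-1])
```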

The vivid, seemingly three-dimensional paintings of the Baroque era, such as Kitchen Scene (1605), by Joachim Wtewael, are far more lifelike than those of the fourteenth century. (© RMN-GRAND PALAIS/ART RESOURCE, NY)

The nineteenth century saw an explosion of progress in imitative technology. Virtualization of images became simple and realistic with photography. Then, in the twentieth century, technologies emerged that could reproduce both sound and motion, with movies being the ultimate form of imitating reality. Further technical progress made movies increasingly lifelike: colors, resolution, surround sound, even somewhat dubious attempts at creating three-dimensional experiences—all managed to make audiences laugh, weep, or cringe. Music reproduction went through similar stages: listening to a long-play stereophonic recording of a Mozart quartet conveyed the illusion of being at a concert far better than the scratchy 78 rpm recordings of the 1930s. By 1970 or so, the technology of imitating reality had advanced far beyond where it had been in the age of silent black-and-white movies.

But the best was still to come. The digital revolution, full of miracles, has taken imitative technology on a new trajectory. The high resolution of digital-disk movies and music, further augmented by Blu-ray disks and ultrahigh definition, has created a truly lifelike experience. Streaming and high-resolution flat-screen TVs have extended the fake reality beyond movie theaters and concert halls and into the home.

Equally revolutionary, advanced video games let people immerse themselves in an artificial reality, in which they can fight monsters, build cities, and fly spacecraft, all from their living rooms. This simulated world, fully interactive, is often far more colorful and exciting than the actual reality of many individuals. With VR (virtual reality) and AR (augmented reality), the lines between the real world and the world of representation, still quite sharp in, say, a Chaplin movie, get even blurrier. As Kevin Kelly puts it in The Inevitable, “Virtual Reality is a fake world that feels absolutely authentic.” The illusion feels real. Of course, it isn’t. The player may sweat, or become nauseated or euphoric, but is never in real danger.

Is the Great Fake a positive development? Every new technological idea, from the invention of the ax onward, provoked fears about possible misuse, and imitative tech is no exception.

Some of the dangers are obvious. Advanced video games can be addictive and may cause some people to miss out on education, socialization, and work experience. But the downsides of imitative technology seem minor relative to the benefits. Watching and hearing an opera on a high-resolution disk may only be an imitation of being there in person, but the technology makes the experience less expensive and accessible to more people. Violent video games trouble some critics, but they are indisputably less socially harmful than the real thing, and some social scientists, such as Regan Mandryk at the University of Saskatchewan, have found that they relieve stress. Either way, as long as imitative technology is confined to entertainment, its risks seem limited.

Other hazards of a virtual world, however, go beyond the potential problems created by hyperrealistic movies and games. One concern is “deep-fake” videos, in which computer programs realistically imitate the looks and voices of well-known people—to the point where they can be made to say anything. Political opponents can be shown committing an immoral act or saying something outrageous when they have done no such thing. If simulation grows indistinguishable from reality, more and more people will refuse to believe anything. In Stalinist Russia, photographs notoriously were photoshopped (they did not use that term, of course) to edit out (or in) individuals whose status had changed after the picture was taken. The result was a deep societal skepticism about official information. Imagine a Stalinist regime with access to deep-fake technology. It may be closer than we think.

But even this technology could have beneficial uses. Former soccer star David Beckham produced a video to promote global awareness of malaria. Though Beckham spoke in English, an AI algorithm made it seem as though he was saying the words in nine languages. A fake, yes, but for a good cause.

What, exactly, can modern imitative technology do for us that a Caravaggio painting or a Fritz Lang silent movie does not? Thirty years ago, MIT mechanical engineer Thomas Sheridan wrote presciently about virtual reality and the “sense of presence” it could create. “What do the new technological interfaces add, and how do they affect this sense, beyond the ways in which our imaginations (mental models) have been stimulated by authors and artists for centuries?” he asked. He gave three answers that nicely capture the new technology’s capabilities. First is the degree of sensory information: the resolution, the colors, the sound quality, the naturalness of movement, and so on, reach new levels. Second is participation: can the player actually control his actions in the game, react to changes in the environment, and respond to the stimuli it poses or to the statements that simulated figures make? Third is control of the environment: can the player shape not just his own movements but the fake environment itself, so that he commands all the parameters and, in principle, can even play against himself?

Besides entertaining and titillating us, advanced imitative technology could revolutionize teaching. The transmission of knowledge from teacher to student depends on the kind of knowledge imparted. Formal or “codified” knowledge can be taught from books and through formal lectures. Here, imitative technology can easily replace physically present teachers with videos or online sessions, but the experience remains a lecture. Teaching “tacit” knowledge is different. The teacher cannot fully describe and express what has to be learned; he must show the student more directly: “this is how you do it.” In the past, most artisanal knowledge was conveyed that way, passing from master to apprentice. Today, many fields of study—from carpentry to surgery to piano playing—still require personal contact and “hands-on” teaching, as every postdoc who has worked in a lab knows well. Here, VR could do more, producing, for example, fake patients on whom aspiring dentists and surgeons could try out their skills, assisted by high-precision robotics. Military training already uses versions of this kind of VR—it is cheaper, faster, and safer than more traditional training.

Virtual and augmented reality are promising teaching tools because, as Sheridan predicted, they provide a “sense of presence,” supported by student participation. Virtual reality eventually may teach history students about the Roman Empire by making them experience what it was really like to be “a Roman when in Rome,” or feel what it was like to participate in a slave gang, picking cotton. This kind of knowledge is difficult to get from books and even from movies. Direct “involvement” through imitative technology could prove an effective teaching tool.

These new technologies can address other limitations of current teaching. A lecture before a class of 30 students is necessarily one-size-fits-all, and, as every teacher knows, it may bore some students and confuse others. A teaching device using advanced imitative technology can receive feedback from a student, speed up or slow down as needed, discern where that student struggles, and customize the instruction accordingly, with potentially large gains in effectiveness. It sounds good, but as always, a powerful new technology carries risks. Advanced imitative technology could be abused to teach falsehoods, to condition people to become unquestioningly loyal to despots, and to create a conformism destructive to a pluralist society.

The most far-reaching effect of imitative technology could well be telepresence, a term coined by another MIT scientist, Marvin Minsky, in 1980. The ability to act in a place other than where you physically are is a distinctive feature of imitative technology. The advantages of telepresence and its sister concept, telecommuting, looked so obvious and decisive two decades ago that rapid growth seemed inevitable. After all, working from home has clear attractions. It means no commute, hence no more rush-hour traffic jams or packed subways; for many, it means not being confined to a soulless cubicle with no privacy. It means flexibility for working parents.

For the economy as a whole, the efficiency gains are equally obvious. After all, if people spend a third of their time away from their apartment, that space goes underutilized; if they spend only a third of the day in their office, that space is now empty two-thirds of the time. Commercial real estate could be reallocated if less office space is needed. Less driving to work also means less wear and tear on highways, less air pollution and lower carbon emissions, and less space allocated to employee parking. Further, reducing commuting time will indeed increase well-being, even if the gains will not show up directly in productivity calculations, because the time spent commuting is counted in the national income accounting as “leisure” (since the employee is not at work and is not compensated for the hours in transit).

Until about 1750, people earned their living and ate their bread by the sweat of their brows, but almost all worked from home and chose their own hours. This was true not only for farmers and artisans but also for doctors, shopkeepers, and teachers. The industrial revolution disrupted that reality and created, as Karl Marx famously pointed out, a new form of production: “the factory system.” When production grew more complex, more machine-intensive, and more dependent on a fine division of labor, employers concentrated workers in “mills.” Hours became rigid, and workers found themselves crammed into grimy, drafty, and noisy factory halls. After this system achieved dominance in manufacturing, it expanded into large retail stores, office buildings, and schools. The factory system had conquered the workplace for most workers. It was largely concentrated in cities, so workers had to migrate to urban areas if they sought employment. It was one of history’s greatest revolutions.


Telecommuting, which relies on imitative technology, could partly return work to the home. For many years, telecommuting grew slowly. According to the Federal Reserve Bank of St. Louis, the percentage of workers toiling “remotely” rose from 0.7 percent in 1980 to about 3 percent in 2016. There are several explanations. Among them: teleworking hardware needed to improve, broadband had to become more accessible and reliable, and teleconferencing software had to grow more sophisticated.

For a large part of the labor force, then, remote working was not an option—and that remains the case. Under current technology, the best estimates are that 40 percent to 45 percent of workers can conceivably work remotely, though not all may want to. On the whole, it is more highly educated workers who can benefit from telecommuting, though some high-skill professions require in-person presence. A large range of laborers, from waiters and landscapers to proctologists, still have to be physically at their place of work and interact in person with objects or with people. But modern technology is advancing, and as workers and customers get more comfortable with telepresence, new possibilities for remote work will emerge.

At least in this regard, we may eventually see a silver lining in the Covid-19 disaster. Tens of millions of Americans have had to adjust suddenly to remote work. They discovered that if they spent much of the day working in front of a computer monitor in an office, they could do the same from home. Once this realization sinks in, further technological adjustment will occur, and the infrastructure supporting it will expand. With good connectivity, you can live anywhere and do most of what you need to do without squeezing into a cramped urban apartment or fighting through traffic to get to the office cubicle on time. Having a private study where you can do your work becomes more affordable if you live in a more rural area where real estate is cheap, and you might get a nice yard and clean air, to boot. Both work and consumption can be done, in large part, online. Covid-19 is a cruel teacher, but it has concentrated our minds on how we could use imitative technology to beat the Tyranny of Distance.

Imitative technology increases our options, but it does not lock us in. When we go back to a world where we’re not fearful of being infected by others, telepresence will be an addition, not a replacement. If you’re the kind of person who enjoys the watercooler encounter with a gregarious colleague, or taking a client out to lunch, there is nothing to stop you. A hybrid arrangement seems likely to work well: three days a week at home and two days in the office. If employers think that nothing stimulates creativity more than personal contact, they can mandate some degree of that—without necessarily precluding you from doing the rest of the work in your home study.

The shift may not even necessarily cut commutes. One initially perplexing finding of the Federal Reserve Bank of St. Louis study was that office workers who telecommuted some of the time actually drove more miles yearly than their counterparts coming to the office daily. The reason: the telecommuters chose to live farther away and thus had longer, if fewer, drives—but presumably also enjoyed less confined and less expensive housing and chose their hours, so that they were likely better off overall. Imitative technology gives us the flexibility to work in ways that suit us best.

Many bugs—both technical and human—in telepresence work will be fixed. Access to broadband has been uneven, and for many, unreliable connections have been no less nerve-racking than the daily commute. But as broadband gets cheaper, more powerful, and more reliable, and as screen resolutions grow higher, colors more vivid, and sound clearer, teleconferencing and Zoom meetings will slowly become the rule rather than the exception. A “Zoom meeting” is an oxymoron: the meeting is fake. But as long as it is a reasonably good substitute for the real thing, and as the substitute gets ever closer, more people will forgo traffic jams and crowded airports and choose telepresence.

The old way of work won’t entirely vanish. Some places will retain the 9-to-5-in-a-cubicle mode of production. Airplanes will still be assembled in huge facilities. I myself am looking forward to meeting my students again face-to-face in my campus office. But for a growing number of people, working in the office or even the factory may turn from a necessity into an option. Robots will do more of the heavy lifting in plants and warehouses, and because they are computer-controlled, the workers operating them may be far away. Medicine, law, and education will develop new systems of rendering their services, as retail already has.

Not everyone will be happy. Technological progress never comes without costs. The value of commercial real estate and urban rents may fall and never fully recover. Hotels and airlines may have to downsize. Especially for single people, remote work may increase loneliness, already an underdiagnosed plague of modern society pre-pandemic. But again, modern imitative technology can provide some help: social media, at its best, can facilitate some kinds of companionship. Not the same as being together, true, but a substitute.

In the twentieth century, movies were the ultimate form of imitating reality. (UNITED ARTISTS/ALBUM/ART RESOURCE, NY)

Whether telepresence is actually good for firms remains a matter of debate. Marissa Mayer, then-CEO of Yahoo, notably banned telecommuting. Her argument, expressed in a much-publicized 2013 memo, was: “Some of the best decisions and insights come from hallway and cafeteria encounters, meeting new people, and impromptu team meetings. Speed and quality are often sacrificed when we work from home.” Modern research has not unequivocally confirmed this opinion. Personal interaction and brainstorming are historically documented ingredients of creativity. Perhaps by organizing company retreats, firms can make up for some of the lost watercooler effects.

For the average worker, how much personal interaction is needed, given good telepresence technology? Most evidence supports the findings of Stanford’s Nicholas Bloom. In the group he studied (which he acknowledged was not representative), the productivity of remote workers was substantially higher, partly because of the time and energy saved from avoiding the commute and having fewer distractions. Similar results were found by researchers in Europe. Remote workers were more productive, put in more hours, were better motivated, and even increased unpaid overtime hours. Bloom, however, added that there was enough diversity in both the nature of workers and the demands of the job to suggest that Mayer’s insistence on physical presence may not necessarily have been wrong.

The abrupt surge in remote work that the pandemic forced upon us is precisely the opposite of how such a shift should be implemented. The pandemic pushed into telecommuting many employees who would rather not have, and for many, the costs of isolation—compounded by the punishing weight of social distancing—may have outweighed any gains. Moreover, school closures forcing children to stay home can be a major cause of lower productivity (so that productivity data collected this year are suspect). Above all, remote work—precisely because it is most suitable for computer-literate, well-educated workers who are often self-driven and “intrinsically motivated,” in the jargon of economists—has produced a sudden and disastrous sharpening of inequality.

The shock treatment that the pandemic inflicted on modern economies has had terrible consequences. These effects will take years to remedy. All the same, Covid-19 has accelerated telepresence like nothing before. As the MIT report “Work of the Future” pointed out: “Our technologies have been instrumental in enabling us to adapt [to the pandemic] via telepresence, online services, remote schooling, and telemedicine. While they don’t look anything like robots, these remote work tools too are forms of automation.”

As long as imitative technology is added as an option to an employment menu instead of being a necessity, it may do away with some of the more disagreeable features of the factory system and the harshness of urban commutes and business travel. Change is more tolerable if it is introduced gradually and carefully. After all, the factory system took a century and a half to become dominant, and it cannot and should not be eliminated in one blow. Adjustment to the new imitative technology in the workplace will require closing digital divides and refitting many jobs to make them suitable for remote work. It will take decades to adjust and debug the new world that will replace the factory system. It remains to be seen to what extent doing so will lead to a decline in the attractions of urban living and spark an exodus to exurban areas.

Technology often advances through catastrophes, fears, and traumas. Nothing so concentrates the mind as the knowledge that one is to be hanged in a fortnight, quipped Dr. Johnson. Covid-19 has forced us to learn how to organize our lives without getting physically close to one another. For that, imitative technology has been ideal, whether through Zoom, online shopping, or entertainment options. Nobody wants a world without human in-person contact, social or economic. But a world with more options will be a better world.

Top Photo: With virtual reality and augmented reality, the line between the real world and the world of representation has gotten even blurrier. (MARK RIGHTMIRE/MEDIANEWS GROUP/ORANGE COUNTY REGISTER/GETTY IMAGES)
