Data centers in outer space? It makes sense, according to Elon Musk, hence the recent tectonic, trillion-dollar merger announced between his SpaceX and xAI companies. Whether it’s sensible, or just the latest manifestation of AI hype, depends on the answer to a broader question: Is AI itself a boom or bubble?

Despite the already-stratospheric level of hype, promise, and anxiety, most analysts realize that it’s still early days for artificial intelligence. Never mind the anecdotes: surveys, including one from the Census Bureau, make clear that AI is already in use across the economy, but the level of adoption across business sectors, though rising, remains low. Another recent survey, this one querying thousands of CEOs, found that companies are widely experimenting with AI and selectively adopting it. Already, nearly one-third report that AI boosted revenues; one-fourth say that it lowered costs. It’s a start.

Forecasting how much demand there will be for AI services presents the same challenges as forecasting demand for all previous revolutionary technologies. Few imagined, for example, that the automobile’s invention would lead to more registered cars in America than drivers and also stimulate entirely new industries, such as “fast food.” Similarly, few imagined circa 1984 that the PC would lead to more computers than humans, or that e-commerce would emerge as a transformational economic force.

One has to step back from specifics to see that AI is at least as consequential as the shift from central computers to personal ones, and for similar reasons. The inventions of the personal computer and of personal AI both constitute profound advances in computer accessibility. PCs achieved that with their small size and graphical user interface. AI brings true natural language as the interface, while simultaneously allowing non-experts to extract useful information from complex sources or situations. Projecting future demand for AI is not about guessing the demand for specific hardware but about guessing the demand for useful information in general.

Previous technology revolutions have all yielded new kinds of services, measurable in, say, road-hours, air-hours, viewing hours, hours spent dining out, and so on. All have clear limits. But for information and data? Demand for information is effectively unlimited, and the supply of data is effectively infinite.

We will always want more granular data about everything, from the operations of machines and processes to every feature of nature and our environment, including ourselves. Data, the resource that AI refines, is unique because it can be created by sheer imagination. What do we want to measure and analyze? The answers are unlimited. In scientific domains, for example, a single modern instrument generates more data than existed in the entire world a half-century ago. AI makes sense of all that.

History points to where we’re headed. Back in 1962, it was obvious to the Council on Library Resources—yes, librarians—that computers would change the game. The Council commissioned MIT computer scientist J. C. Licklider, a prescient thinker, to undertake a detailed analysis of a future “neolibrary” that would be “a network.” Licklider estimated that all printed information in all books at that time constituted about 1,000 terabytes (a petabyte) of data. He guessed that computerization would increase that quantity five-fold by the year 2000. Instead, the amount of data expanded 100,000-fold by 2000, and it’s grown another 1,000-fold since then. Expect that history to rhyme with AI, if not repeat outright.
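To make those multiples concrete, here is a minimal back-of-the-envelope sketch. It uses only the figures cited above (the petabyte baseline, the 100,000-fold expansion by 2000, the further 1,000-fold growth since); the unit conversions are the only thing added.

```python
# Back-of-the-envelope check of the growth figures cited above.
# The inputs are the article's numbers, not independent measurements.

PB = 10**15  # bytes in a petabyte

printed_1962 = 1 * PB                 # Licklider's ~1,000-terabyte estimate
forecast_2000 = printed_1962 * 5      # his five-fold projection for 2000
actual_2000 = printed_1962 * 100_000  # the 100,000-fold expansion that occurred
today = actual_2000 * 1_000           # another 1,000-fold since then

print(f"Forecast for 2000: {forecast_2000 / 10**15:.0f} petabytes")
print(f"Actual by 2000:    {actual_2000 / 10**18:.0f} exabytes")
print(f"Today:             {today / 10**21:.0f} zettabytes")
# -> 5 petabytes, 100 exabytes, 100 zettabytes
```

In other words, the world missed Licklider’s forecast not by a rounding error but by many orders of magnitude.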

AI deployment will be accelerated by two factors. The power of the underlying software is improving faster than conventional computing’s did a half-century ago, and AI can build on the already existing cloud rather than wait for new infrastructure. Before AI came on the scene, more than 5,000 data centers were already operating in the United States alone. That the cloud is on track to be the largest infrastructure ever built is not hyperbole; it’s measurable in dollars, material tonnage, and network route-miles.

Which brings us back to orbital data centers. Since the epicenter of AI lies within U.S. companies, one might ask how best to bring AI services to everyone, everywhere. The answer: either build data centers thousands of miles away, often in places that are geographically or politically challenging, or let users elsewhere connect to data centers a few hundred miles away, overhead in low-earth orbit.

Is that even possible? Well, consider that SpaceX already operates what amounts to a huge data center in orbit. Collectively, its 8,000 Starlink satellites contain nearly a half-million computers, powered by solar arrays that together supply about 100 MW. That’s roughly the computer count and power draw of a big earthbound data center. The orbital one is just spread out, its nodes connected by free-space optical links instead of fiber. It’s not hard to imagine adding AI chips to the Starlink architecture.
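For a rough sense of scale, the constellation-level numbers above imply the following per-satellite figures. This sketch simply divides the article’s totals (8,000 satellites, roughly 500,000 computers, about 100 MW of solar) and assumes they are evenly spread, which real hardware won’t exactly match.

```python
# Per-satellite figures implied by the constellation totals cited above.
# Assumes an even spread across satellites; these are not SpaceX specifications.

satellites = 8_000
computers = 500_000   # "nearly a half-million computers"
solar_watts = 100e6   # ~100 MW of solar capacity, constellation-wide

print(f"Computers per satellite: ~{computers / satellites:.0f}")
print(f"Solar power per satellite: ~{solar_watts / satellites / 1e3:.1f} kW")
# -> about 62 computers and 12.5 kW per satellite, roughly one server
#    rack's worth of power, stitched together by laser links rather than fiber.
```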

Of course, Starlink’s computers today are used solely to manage data routing and transmission, not to run AI analysis. But that’s a distinction with essentially no engineering difference. Earthbound data centers use computers both to process and to transport information; notably, the latter function accounts for nearly half of the cloud’s overall electricity use.

But despite “free” solar power, an orbital data center will still cost more. Much more. One reliable estimate, based on Starlink’s track record, calculated that putting a 1-gigawatt data center (the hyperscale benchmark) in orbit would add $35 billion to overall costs. And that assumes SpaceX meets its future operational goals. Even so, given the current eye-watering levels of big-tech spending, the key barrier likely won’t be cost but whether future global AI demand is large enough to create a niche for what orbital AI offers. Google, for one, has already published its own detailed analysis of the costs of an orbital (distributed) data center.
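To put that premium in per-unit terms, here is the arithmetic the estimate implies, taking only the $35 billion and 1-gigawatt figures from the paragraph above.

```python
# The orbital cost premium per watt implied by the figures cited above.
# These are the article's numbers, not a vendor quote.

added_cost_usd = 35e9   # estimated extra cost of putting the data center in orbit
capacity_watts = 1e9    # 1 gigawatt, the hyperscale benchmark

premium_per_watt = added_cost_usd / capacity_watts
print(f"Orbital premium: ${premium_per_watt:.0f} per watt of capacity")
# -> $35 per watt, on top of the cost of the computing hardware itself.
```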

With technology, there’s rarely a one-size-fits-all answer. Commercial helicopters, for example, arrived three decades after fixed-wing aircraft; they offered profound convenience advantages and created new markets and businesses, but they didn’t replace airplanes. Odds are that SpaceX’s vision will, in hindsight, be seen as riding one wave of a far broader revolution.
