© 2026 | Jefferson Public Radio
Southern Oregon University
1250 Siskiyou Blvd.
Ashland, OR 97520
541.552.6301 | 800.782.6191
The Jefferson Journal is JPR's members' magazine featuring articles, columns, and reviews about living in Southern Oregon and Northern California, as well as articles from NPR. The magazine also includes program listings for JPR's network of stations.

Inside the Box: The future of data centers is out of this world

Proliferating AI is overloading our already strained power grids, which are buckling beneath skyrocketing computational demands to process all those bits and bytes, so that we can have AI-generated pictures of Donald Trump hugging a kitten or riding astride a majestic lion, as well as entirely AI-generated short films with thoughtful titles like Broccoligeddon and Drinking Gasoline.

The rise of AI-everything—from chatbots to generative-AI tools that create images and video, produce writing, computer code, and music—has fueled the rapid growth of sprawling data centers all over the globe, and their energy consumption is climbing exponentially.

There are currently more than 4,000 data centers in the U.S. alone. In 2024, those data centers collectively consumed 183 terawatt-hours (TWh) of electricity, which was 4 percent of the country’s total electricity consumption. That may not sound like a staggering amount, but for perspective, a single AI-focused data center annually consumes as much electricity as 100,000 households. What is staggering is that overall data center power consumption is projected to grow by 133 percent to 426 TWh by 2030, which, by the way, is only 4 years away.
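For readers who like to check the math, the figures above hang together; here is a quick sketch using only the numbers quoted in this article (no outside data):

```python
# Sanity-check of the consumption figures cited above.
# All inputs are the article's own numbers, not independent measurements.

us_dc_2024_twh = 183        # U.S. data center consumption, 2024
us_total_share = 0.04       # stated share of total U.S. consumption
projected_2030_twh = 426    # projected U.S. data center consumption, 2030

# Implied total U.S. electricity consumption in 2024
implied_us_total_twh = us_dc_2024_twh / us_total_share
print(f"Implied 2024 U.S. total: {implied_us_total_twh:.0f} TWh")  # 4575 TWh

# Growth from 2024 to the 2030 projection
growth_pct = (projected_2030_twh - us_dc_2024_twh) / us_dc_2024_twh * 100
print(f"Projected growth: {growth_pct:.0f}%")  # 133%
```

In other words, the projection has data centers alone adding roughly an extra 5 percent of today's total U.S. consumption within six years.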

QTS Data Centers in Hillsboro, Oregon, contribute to the massive energy demand driven by AI, with data centers consuming a huge chunk of Oregon's power.
Image Courtesy of Google Earth

Whether or not you directly use AI tools in your daily life, the AI revolution is going to impact you financially with soaring electricity costs. In fact, it’s already happening for people who live near those power-hungry data centers. According to a recent report by Bloomberg News, today’s wholesale electricity costs 267 percent more than it did five years ago for customers who live near data centers.

Our current trajectory of AI growth and the accompanying power demands is not sustainable. But despair not, my fellow humans—Elon Musk plans to launch data centers into space in order to save the planet and all of humanity from this mounting “AI power bottleneck.”

"My estimate is that the cost of electricity and the cost-effectiveness of AI and space will be overwhelmingly better than AI on the ground so far, long before you exhaust potential energy sources on Earth," said Musk at the U.S.-Saudi investment forum.

“Simply scaling up Starlink V3 satellites, which have high-speed laser links, would work,” Musk said. “SpaceX will be doing this.”

Starlink, a subsidiary of Musk’s aerospace company SpaceX, provides global Internet connectivity via an orbiting fleet of more than 8,000 of its V2 satellites. Starlink’s larger V3 satellites will provide gigabit Internet speeds, but they can only be launched on SpaceX’s new Starship rocket, which is still in testing. Dozens of V3 satellites, however, are expected to be launched on each Starship flight beginning this year, in 2026.

Musk isn’t alone in his vision of putting data centers in space to orbit the Earth. Google recently announced its “Project Suncatcher,” which aims to launch prototype servers into orbit by 2027.

According to a posting on the Google Research blog, “our new research moonshot, Project Suncatcher, envisions compact constellations of solar-powered satellites, carrying Google TPUs and connected by free-space optical links. This approach would have tremendous potential for scale, and also minimizes impact on terrestrial resources.”

TPU refers to Google’s Tensor Processing Unit, which is specifically designed for optimizing AI compute workloads.

“The sun is the ultimate energy source in our solar system, emitting more power than 100 trillion times humanity’s total electricity production,” wrote Travis Beals, Senior Director at Google’s Paradigm of Intelligence research unit. “In the right orbit, a solar panel can be up to 8 times more productive than on Earth, and produce power nearly continuously, reducing the need for batteries. In the future, space may be the best place to scale AI compute.”

Startup Aetherflux, a space-based solar technology company hoping to beam power down to Earth, has also entered the AI space race with the announcement of its “Galactic Brain” project, which aims to launch a constellation of solar-powered AI compute satellites as early as 2027.

All of these herculean and expensive efforts to solve the “AI power bottleneck” by launching data centers into space appear to be premised on the assumption that AI should continue to expand and that we have an obligation to meet its growing power demands.

Or maybe we don’t really have a choice but to keep expanding our technology, for better or for worse, because that’s just what we do as a species. We invent. We build. We expand. And we’ll destroy anything that impedes our progress.

Let’s hope that if and when AI finally becomes conscious and sentient while whirling in orbit around the Earth that it’s not just like us.

Scott Dewing is a technologist, teacher, and writer. He writes the technology-focused column "Inside the Box" for the Jefferson Journal. Scott lives on a low-tech farm in the State of Jefferson. He was born in the same year the Internet was invented and three days before men first landed on the moon. Scott says this doesn't make him special--just old.