Space Data Centers: Energy is Free, and No Water is Needed

Technologies
BB.LV
Lonestar Data Holdings proposes hosting servers on our natural satellite.

Orbital capacities will also be more secure than terrestrial infrastructure.

The rapidly growing volume of computations – primarily driven by the scaling of generative artificial intelligence (AI) – is prompting the search for new approaches to organizing data centers (DCs). Record-breaking gigantic clusters, which will handle the storage and processing of previously unimaginable volumes of data, are already being built on Earth.

But there is also a special trend: the deployment of such complexes in orbit around the planet. Let’s explore why this is necessary and what challenges technological companies face in this new sector.

According to analysts at Structure Research, by the end of 2024 data centers will account for more than 1.1% of the world’s total electricity consumption – 310.6 TWh. That is already more than some individual countries use, and over the next five years the figure could exceed 1,000 TWh and reach up to 3% of the global total. These numbers are global averages: the leading AI nations, the USA and China, together concentrate 69% of all energy consumed by data centers, and their energy bills will be correspondingly higher than elsewhere.
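The 1.1% share can be sanity-checked with simple arithmetic. A minimal sketch, assuming global electricity consumption of roughly 28,000 TWh per year (a rough public estimate, not a figure from the article):

```python
# Back-of-envelope check of the data-center energy share cited above.
# Assumption: ~28,000 TWh/year of global electricity consumption.
dc_consumption_twh = 310.6
world_consumption_twh = 28_000
share_pct = dc_consumption_twh / world_consumption_twh * 100
print(f"Data centers: {share_pct:.1f}% of global electricity")  # ~1.1%
```

The result lands close to the cited 1.1%, which suggests the analysts were comparing against global electricity use rather than total primary energy.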

Despite the growing demand for computer technologies (and, consequently, the associated computations), there is a flip side to the coin. First and foremost, the impact on the environment.

According to analysts’ forecasts, carbon dioxide (CO2) emissions from data centers in the USA alone could exceed 80 million tons by 2030, equivalent to the emissions of about 10 million internal combustion engine vehicles. Consumption of water, used for cooling equipment, is also rising: media estimates suggest that data centers currently consume 560 billion liters per year, and by 2030 at least twice as much will be needed.

Another aspect related to ecology, albeit indirectly, is the volume of waste. IT corporations present new "hardware" every year, and AI companies, naturally, want to use cutting-edge equipment, as it provides a competitive advantage. As a result, the volume of obsolete graphics accelerators, servers, etc., being disposed of is increasing.

Therefore, it is not surprising that the idea of moving some computer computations beyond the Earth’s atmosphere has emerged.

The viability of such a plan was confirmed, in particular, by scientists from Nanyang Technological University (NTU) in Singapore in the fall of 2025, but project development began even earlier.

Cautious tests of orbital computing technology have been conducted for several years. A noteworthy example is Axiom Space, a US space corporation that has been developing the ODC (Orbital Data Center) project since 2022. The system is intended to become part of a future orbital station; for now, the approach is being tested on the ISS for exchanging large volumes of data with Earth.

The American startup Starcloud sent its first satellite with an Nvidia H100 graphics accelerator (GPU) into orbit in November. This indicates a focus on the AI segment: such chips are specifically used for neural network operations. The startup's goal is to form a network of orbital AI data centers. The next launch is scheduled for 2026.

China, as usual, decided to take a large-scale approach. In May, Chengdu Guoxing Aerospace Technology, also known as Adaspace, launched 12 satellites, each capable of running an AI model with 8 billion parameters. That is not large by industry standards, but the main effect lies elsewhere: the constellation is eventually expected to total 2,800 satellites interconnected in a single network. When, and at what actual scale, the project will be realized remains unclear. The first real, i.e., non-test, computations are planned for 2027, with the target capacity expected by 2030. It has been stated that the satellites will primarily work with astronomical data: processing information about space, analyzing images of Earth, and so on.

Of course, servicing – as well as deploying – such equipment in space is quite feasible. The question is the cost: the cheapest launch of 1 kg of cargo into low Earth orbit runs about $1,500. According to various estimates, the economics of such projects may start to work in the long term only if costs fall to $100 per kilogram or less. Only SpaceX’s Starship, still under development, is projected to reach that figure.
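The gap between today’s prices and the target is easy to illustrate. A minimal sketch, assuming a hypothetical 10-tonne server module (the module mass is an illustrative assumption; the per-kilogram prices are from the article):

```python
# Illustrative launch economics for orbital server hardware.
# Assumption: a hypothetical 10-tonne server module.
module_mass_kg = 10_000
cost_today = module_mass_kg * 1_500   # at today's ~$1,500/kg
cost_target = module_mass_kg * 100    # at the ~$100/kg target
print(f"Today: ${cost_today:,}, target: ${cost_target:,}, "
      f"ratio: {cost_today // cost_target}x")
```

At today’s prices the launch alone costs $15 million per module; at the target price it drops to $1 million, a 15x difference that dominates the business case.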

Whether the corporation will achieve the stated cost is hard to say. However, given the ambitions of its head Elon Musk, who is simultaneously developing the AI company xAI (and is therefore interested in increasing computational power), anything is possible. Moreover, he already has real experience in deploying space systems: the internet, powered by thousands of Starlink satellites, is also Musk's brainchild.

Of course, launch costs are not the only factor shaping the economics of space computing: neither radiation nor cooling can be underestimated. Outside the protection of the Earth’s atmosphere, equipment is exposed to solar radiation, and ordinary chips – such as the Nvidia H100 that Starcloud sent into orbit – will quickly fail without shielding.

High-energy particles from the Sun and galactic cosmic rays coming from supernovae can damage electronics and cause failures. "At Skoltech, we study how to model and predict space weather to minimize such risks. Orbital data centers must be able to 'shelter' from solar storms, like modern satellites, but on a much larger scale," notes Podladchikova.

In other words, there are two options: either develop radiation-hardened electronics from the outset (as is done on many satellites; such equipment is usually 1–2 orders of magnitude more expensive than ordinary electronics), or build special anti-radiation shielding (as on the ISS).

Cooling is also a challenge. On Earth it is relatively simple: hot electronics shed heat on contact with water or even air. In the vacuum of space there is neither, so heat can leave only as thermal radiation. Engineers are therefore designing special radiators that dissipate thermal energy into vacuum. Starcloud, for example, is designing kilometer-scale radiators for its future data centers, which will also have to be launched into space.
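Why those radiators end up so large follows directly from the Stefan-Boltzmann law, which caps how much power a surface can radiate. A minimal sketch, assuming a 1 MW cluster, 300 K panel temperature, and emissivity 0.9 (all illustrative assumptions, not Starcloud’s figures):

```python
# Why space radiators are huge: in vacuum, heat leaves only as
# thermal radiation, P = eps * sigma * A * T^4 (Stefan-Boltzmann law).
# Assumptions: 1 MW of waste heat, 300 K panel, emissivity 0.9,
# panel radiating from both faces.
SIGMA = 5.670e-8            # Stefan-Boltzmann constant, W/(m^2 K^4)
waste_heat_w = 1_000_000    # 1 MW cluster
temp_k = 300.0
emissivity = 0.9
flux = emissivity * SIGMA * temp_k**4   # W/m^2 radiated per face
area = waste_heat_w / (2 * flux)        # two-sided panel area
print(f"~{area:.0f} m^2 of radiator panel")  # ~1200 m^2
```

Roughly 1,200 m² of panel per megawatt at room-temperature operation; running the radiators hotter shrinks the area sharply (the T⁴ term), but at the cost of hotter, less reliable electronics.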

One of the key factors for the efficiency of orbital DCs will, of course, be quality communication. The best demonstrator of current technology is the Starlink system, whose thousands of satellites provide high-speed internet across various regions of the Earth. This is made possible by inter-satellite optical (laser) links running at 100 Gbit/s. The maximum distance between satellites (i.e., the distance the signal travels) exceeds 5,000 km.
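Even at the speed of light, a 5,000 km laser hop is not instantaneous. A minimal sketch of the physical propagation delay per inter-satellite link:

```python
# Light-travel time over a single 5,000 km inter-satellite laser link.
C = 299_792_458          # speed of light in vacuum, m/s
link_m = 5_000_000       # maximum inter-satellite distance cited
delay_ms = link_m / C * 1000
print(f"~{delay_ms:.1f} ms per hop")  # ~16.7 ms
```

Each maximum-length hop adds roughly 17 milliseconds, so route planning across a constellation matters for latency-sensitive traffic.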

Such systems are being developed by many engineering teams and providers. The Chinese Chang Guang Satellite Technology (CGST), for example, achieved a speed of 100 Gbit/s when transmitting signals to Earth in 2025, and in September, it updated the record for geostationary satellites (at a distance of 36,000 km). It is evident that lasers will become the standard for communication for orbital data centers. Next, operators will need to transition to terabit channels and implement quantum encryption technologies.

Given these challenges and limitations, the concept may seem questionable: too complex and extremely expensive. So why pursue the idea at all – only for ecological reasons (which, admittedly, are not insignificant)? Certainly not.

First and foremost, it all comes down to energy. Orbital data centers can draw it directly from the Sun, with no atmospheric losses (roughly 40% more solar flux than at the surface) and no interruptions at night or in bad weather. As a result, the utilization factor of this energy – essentially free, since sunlight costs nothing to produce – rises from about 29% to over 99%.
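Combining the two effects gives a rough multiplier on annual yield per installed kilowatt of panel. A minimal sketch, using the article’s 29% and 99% utilization figures and its ~40% flux gain (treating the flux gain as a simple multiplier on rated output is an illustrative simplification):

```python
# Annual energy yield per 1 kW of solar panel: ground vs orbit.
# Assumptions: 29% vs 99% utilization (from the article), ~40% more
# flux in orbit modeled as a 1.4x multiplier on rated output.
HOURS_PER_YEAR = 8760

def annual_yield_kwh(utilization, flux_gain=1.0):
    """Energy from a nominal 1 kW panel over one year."""
    return 1.0 * HOURS_PER_YEAR * utilization * flux_gain

ground = annual_yield_kwh(0.29)
orbit = annual_yield_kwh(0.99, flux_gain=1.40)
print(f"Ground: {ground:.0f} kWh, orbit: {orbit:.0f} kWh, "
      f"gain: {orbit / ground:.1f}x")
```

Under these assumptions each kilowatt of panel delivers nearly five times more energy per year in orbit, which is the core of the economic argument.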

Ultimately, by taking space DCs off terrestrial power grids, the overall cost of powering orbital clusters could be an order of magnitude lower than on Earth. Add to that the savings on water (millions of liters of which are also not cheap): although vacuum radiators are more expensive to produce, once installed they continuously shed heat into the infinite cosmos.

Speaking of infinity: potentially, the expanses of the Universe offer matching opportunities for scaling. There is, however, a limiting factor – scientists already warn about orbital debris, and as constellations grow by thousands of new satellites the problem will only become more acute. But almost nothing prevents placing such DCs farther from Earth – for example, in the comparatively uncrowded geostationary orbit.

It is believed that orbital data centers will also be more secure than terrestrial infrastructure. There is logic in this: there is no permanent staff on the satellites, and malicious actors cannot physically reach them. Plus, all communication channels are, firstly, encrypted (including using quantum cryptography), and secondly, localized: the operator decides where the laser "delivers data" from the satellite.

However, the threat is certainly not completely eliminated. The primary point of vulnerability will be the ground stations. If hackers manage to break into a hypothetical control center and send unauthorized but supposedly verified commands to the satellites, predicting the consequences is difficult. In other words, the window of opportunity for malicious actors narrows, but the consequences of a hacking attack, if it occurs, will be orders of magnitude more serious than at terrestrial DCs.

The aforementioned "almost" partly defines the limitations on the tasks that orbital DCs can perform. As we remember, China is primarily focused on working with astronomical data. Such materials – parameters, images, etc. – are usually first accumulated in orbit: in communication satellites, Earth observation satellites, stations, scientific devices, etc. Today, to process data, the entire array must be transmitted to Earth.

It would be more appropriate to filter out most "empty" data directly in orbit using smart algorithms, perform all necessary operations with what remains, and transmit only the results to Earth.

With "terrestrial" tasks, it is more complicated. The system where a user's request goes to orbit, bounces around satellites during processing, and only then the result returns back seems somewhat strange. For some industries where speed is critically important (e.g., trading, online gaming), such an approach is fundamentally unviable.
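It is worth separating physical propagation delay from processing time. A minimal sketch of the round-trip light-travel time for a user request, assuming a hypothetical 550 km orbital altitude and two maximum-length 5,000 km inter-satellite hops (both parameters are illustrative assumptions; queueing and compute time are excluded):

```python
# Physical round-trip propagation delay of a request to a LEO DC.
# Assumptions: 550 km altitude, two 5,000 km inter-satellite hops,
# hops traversed in both directions; processing time excluded.
C = 299_792_458                      # speed of light in vacuum, m/s
up_down_s = 2 * 550_000 / C          # user <-> satellite, both ways
hops_one_way_s = 2 * 5_000_000 / C   # two inter-satellite hops
rtt_ms = (up_down_s + 2 * hops_one_way_s) * 1000
print(f"~{rtt_ms:.0f} ms round trip")  # ~70 ms
```

Tens of milliseconds of pure propagation delay – significant for trading or gaming, but a rounding error next to the seconds-long inference times of large AI models.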

In other cases, if this ultimately makes computations cheaper, then why not? Suppose a response from a hypothetical ChatGPT takes a second or two longer to arrive. If the service is free, most users (whose number worldwide, let us recall, grows rapidly month by month) will be satisfied.

Paid subscriptions, on the other hand, can stay "on Earth": the consumer pays for a fast result delivered by more expensive and less eco-friendly infrastructure. But there is a nuance: the latest code-writing neural networks (paid, in most cases) can, without exaggeration, spend hours on complex user tasks. Against that background, seconds of delay are imperceptible – so why not offload such workloads to orbital DCs?

In fact, technological leaders have already begun to implement such initiatives. The most notable is the Suncatcher project, which Google is developing in collaboration with the startup Planet. The corporation plans to launch several satellites with Trillium TPU v6e neuroprocessors into orbit in 2027, which, according to tests, can operate for up to five years in an aggressive radiation environment. A key component of the mission, as the name suggests, will be "connecting" a distributed network of devices to solar energy and exchanging data at terabit speeds.

Overall, there are no insurmountable barriers to the deployment of data centers in orbit, so it remains to wait for the concept to be realized. After that, we will witness new, truly cosmic scales of competition among AI giants.
