Data centers at sea and in orbit: do they make practical sense?

    Boring reinforced-concrete data centers are no longer in vogue: IT companies are trying to place them on and under water. There are even rumors of space data centers - we could not pass up such a technologically beautiful phenomenon.



    The rapid growth in demand for computing resources has led to the construction of a huge number of data centers. The equipment installed in them consumes megawatts of electricity and generates heat. Energy efficiency, which nobody thought about at the dawn of data center construction, has become the main issue: today engineers increase power density per rack, reduce PUE (the ratio of total facility energy consumption to IT equipment consumption), and engage in other technological shamanism. Industry leaders are increasingly looking toward exotic solutions. Today we will try to figure out which of these ideas are visionary and which are simply mad.
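To make the PUE metric mentioned above concrete, here is a minimal sketch. The formula itself (total facility power divided by IT power) is standard; the kilowatt figures are made-up examples.

```python
# PUE = total facility power / IT equipment power.
# An ideal data center would have PUE = 1.0 (all power goes to IT load).
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

# Illustrative numbers: a facility drawing 1500 kW overall with 1000 kW of IT load
print(pue(1500, 1000))  # 1.5
```

A PUE of 1.5 means that for every watt reaching the servers, half a watt is spent on cooling, power conversion, and other overhead.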


    Data center in space


    Delivering cargo off the planet is expensive, hiring qualified personnel to service equipment in orbit would be problematic, and data transfer is difficult: satellite links cannot match the low latency of fiber-optic lines. There is also the problem of cosmic radiation, from which sensitive electronics must be shielded. Funnily enough, dissipating the heat generated by IT hardware is harder in space than on Earth. Laymen call space cold, but it is mostly just empty, apart from a meager scattering of atoms, electromagnetic radiation, and assorted elementary particles. A vacuum does not conduct heat, so the only way to shed heat into the environment is electromagnetic radiation - that is, huge heated radiators glowing in the infrared range.
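The scale of those radiators can be estimated from the Stefan-Boltzmann law, P = εσAT⁴. The sketch below is a deliberately rough back-of-the-envelope calculation: it ignores absorbed sunlight and Earth's infrared flux, and the emissivity and panel temperature are assumed illustrative values, not figures from any real project.

```python
# Radiator area needed to reject heat in vacuum purely by thermal radiation,
# per the Stefan-Boltzmann law: P = emissivity * sigma * A * T^4.
# Simplification: absorbed solar and planetary flux are ignored.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiator_area_m2(heat_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    return heat_w / (emissivity * SIGMA * temp_k ** 4)

# Rejecting 1 MW from a radiator panel held at 350 K (about 77 C):
print(round(radiator_area_m2(1e6, 350)))  # ~1306 m^2 of radiating surface
```

Over a thousand square meters of panels for a single megawatt helps explain why megawatt-class capacities in orbit are out of the question.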


    Nevertheless, enthusiasts keep proposing space data centers, though megawatt capacities are out of the question. There are many conspiracy theorists among space fans - the belief is that data in orbit would be out of reach of governments and other reptilians (we would argue about the reptilians). In 2016, Los Angeles-based Cloud Constellation was actively courting investors, promising to put a whole petabyte of data into orbit by 2019 on its own network of server satellites and communication satellites. The deadline has passed, but there is still no sign of an orbital constellation. That did not stop Cloud Constellation from raising $100 million in 2018.



    StartX ConnectX plans to launch a satellite network to store crypto wallets and other private data off the planet, and the UK Space Agency has allocated more than £4 million to create a demonstration supercomputer in space. It is too early to talk about other worlds: humanity barely manages to send automated probes there, let alone data centers. And why would they be needed, when a radio signal travels from Mars to Earth in 3 to 22 minutes, depending on the relative positions of the planets? Less outlandish proposals mostly concern space communications, but for now extraterrestrial data centers remain science fiction, not counting the still-unrealized projects of startups.
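The 3-to-22-minute figure above is simply light travel time. A quick check, using approximate closest and farthest Earth-Mars distances (the exact values vary from opposition to opposition):

```python
# One-way radio delay to Mars at the extremes of the Earth-Mars distance.
# Distances are approximate round numbers, purely for illustration.
C = 299_792_458  # speed of light in vacuum, m/s

def one_way_delay_min(distance_km: float) -> float:
    return distance_km * 1000 / C / 60

print(round(one_way_delay_min(54.6e6), 1))  # closest approach (~54.6M km): ~3.0 min
print(round(one_way_delay_min(401e6), 1))   # farthest separation (~401M km): ~22.3 min
```

A round trip therefore takes from 6 to 45 minutes - hopeless for any interactive service.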


    On water and under water


    Let's descend from airless space to the depths of the sea. Last summer, Microsoft launched the second phase of Project Natick, whose goal is to create a modular underwater data center. Experiments with submerged servers have been under way since 2013, and in 2018 a prototype the size of a shipping container was lowered to the seabed off the coast of Scotland. It holds 12 racks with 864 servers. Assembling powerful data centers from such modules near large cities may (in theory) be cheaper than on land. Water has high thermal conductivity and high heat capacity, which simplifies the cooling problem. No difficulties are expected with data transmission channels either: the technology of laying submarine fiber-optic lines was worked out long ago.




    Engineers test Project Natick servers and cooling systems at Naval Group in France. Photo: news.microsoft.com


    Microsoft also has ideas for powering the IT load with renewable energy sources: tidal turbines, wave energy converters, and even wind turbines installed on shore. Not that there are no problems: the sea is rather damp, and electronics do not like dampness. Clearly, the data center modules must be sealed and fault-tolerant, since they will have to run for a long time without maintenance. To prevent corrosion, the interior of the containers is filled with nitrogen.


    The folks at Google do not dive as deep, but they swim quite confidently. So far, admittedly, only on the Internet: in pictures and videos. The idea of placing a data center aboard a ship is not new; it has been popping up in various forms for a decade. Some confine themselves to a barge off the coast, others want to sail in neutral waters to avoid trouble with the laws of different countries. One can recall the US company Nautilus Data Technologies, which developed a prototype of a floating data center. Google engineers explored the problem for a long time, but the corporation then cooled on the idea of sea voyages. That did not stop it from patenting, in 2017, a data center pulled across the waves by kites, which are also supposed to generate electricity.



    The main idea here is roughly the same as with the submerged servers: using sea water for cooling. At the same time, floating data centers remain serviceable, which is a definite plus. It is hard to say how promising such projects are, but they are entirely feasible at the current level of technology. Floating data centers could be used, for example, in countries with a warm maritime climate and a shortage of land for construction. Besides, it is a good way to repurpose old ships.


    Freecooling in the Arctic


    We return from the depths to dry land and take a look at classic data centers. Given the trend toward higher power density per rack and the shift of large IT corporations to so-called hyperscale computing, heat removal is becoming a serious problem. It can be mitigated somewhat by raising the temperature range considered acceptable for IT equipment, but the traditional engineering infrastructure (cooling and uninterruptible power) can still consume 30 to 50% of a data center's energy. The main expense item here is powerful air-conditioning compressors, so a logical step is to abandon them, at least partially, by applying one of the existing free cooling (freecooling) schemes.


    Without delving into technical details, there are two options: direct single-circuit cooling of the machine rooms with outside air, with partial recirculation in winter, and double-circuit systems with air recirculation in the machine rooms plus recuperative devices. In the first case, heated air from the hot zones is vented to the atmosphere, while filtered street air is supplied to the cold zones. Sometimes it also has to be dried or humidified, and the filters require regular replacement. If it is too cold outside, partial recirculation is activated.
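The mode selection described above can be sketched as a toy control function. The temperature thresholds here are hypothetical round numbers chosen for illustration; a real system would use setpoints from the equipment vendor's environmental specifications.

```python
# Toy mode selector for direct (single-circuit) free cooling.
# Thresholds are assumed illustrative values, not vendor setpoints.
def cooling_mode(outside_c: float) -> str:
    if outside_c < 5:
        # Too cold: mix warm return air back in to avoid overcooling
        return "freecooling + partial recirculation"
    if outside_c <= 22:
        # Filtered outside air goes straight to the cold zones
        return "freecooling"
    # Too warm outside: fall back to mechanical cooling
    return "mechanical cooling"

print(cooling_mode(-10))  # freecooling + partial recirculation
print(cooling_mode(15))   # freecooling
print(cooling_mode(30))   # mechanical cooling
```

Real controllers also track humidity and filter pressure drop, but the basic logic is this simple three-way switch.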


    The second scheme involves two circuits: the data center air circulates in the internal one, while outside air is fed through the external one. Heat transfer takes place in a recuperative device. There are different options here too, so we will focus on the most interesting one: the rotary heat exchanger, also known as the Kyoto cooling wheel. Its principle of operation is very simple: a massive metal wheel rotates slowly, transferring heat from the internal circuit to the external one.
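The heat a wheel like this moves can be estimated with the standard heat-exchanger effectiveness model, Q = ε·ṁ·c_p·(T_hot − T_cold). The effectiveness, air flow, and temperatures below are assumed illustrative figures, not data from any specific product.

```python
# Heat transferred by a rotary (Kyoto) wheel, using the standard
# effectiveness model for heat exchangers. All inputs are illustrative.
CP_AIR = 1005.0  # specific heat of air at constant pressure, J/(kg*K)

def wheel_heat_kw(effectiveness: float, mass_flow_kg_s: float,
                  t_hot_c: float, t_cold_c: float) -> float:
    return effectiveness * mass_flow_kg_s * CP_AIR * (t_hot_c - t_cold_c) / 1000

# 20 kg/s of 35 C return air against 10 C outside air, 75%-effective wheel:
print(round(wheel_heat_kw(0.75, 20, 35, 10)))  # ~377 kW transferred
```

The key point: the only electricity consumed is for fans and the wheel's slow rotation, which is why the scheme is so efficient.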



    Unfortunately, free cooling cannot be used year-round because of strict requirements on the outside temperature: when it gets too high, mechanical cooling switches on automatically. Still, in northern regions freecooling keeps the compressors off up to 80% of the time, averaged over the year. The most efficient rotary heat exchangers can cut the cooling system's share of energy consumption to 5–7% of the data center's total, which is an excellent result.
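To see what the 80% figure buys, here is a back-of-the-envelope blend of the two regimes. The 6% freecooling share comes from the 5–7% range above; the 35% compressor-hours share is an assumed figure for illustration only.

```python
# Annual cooling share as a weighted average of the two operating regimes.
# 0.06 = mid-point of the 5-7% freecooling share; 0.35 is an assumption.
free_fraction = 0.80   # fraction of the year with compressors off
share = free_fraction * 0.06 + (1 - free_fraction) * 0.35
print(round(share, 3))  # ~0.118, i.e. roughly 12% of annual energy on cooling
```

Even under these rough assumptions, the blended figure is far below running compressors year-round.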


    Another logical move is to relocate data centers to high latitudes. Alas, it only partially works. Data centers in the Arctic have not really taken root because of the enormous number of difficulties involved in building and operating them. In regions so remote from civilization there are problems with power supply and communication channels, as well as with qualified personnel - few specialists are eager to go twist the tails of polar bears. Funnily enough, free cold can also turn out expensive because the air is too dry: if it is not humidified, a strong static charge builds up on electronic components. Companies do build data centers in northern regions, of course, but they stay out of the polar ice and still prefer the vicinity of large cities.


    We warm the house, not the atmosphere


    The heat generated by data centers used to be simply dumped into the atmosphere, and when there got to be too much of it, people thought about putting the by-product to peaceful use. The idea of useful heat recovery is obvious but not so easy to realize, due to unstable demand for heat and a number of other reasons. Because of high outside temperatures, cooling problems peak in summer - exactly when the population's demand for heat is minimal. Nevertheless, there are quite a few such projects in the West, and one was launched by Yandex: its data center heats a small Finnish town.


    Summary and Conclusions


    High ideals aside, building and operating a data center means large capital and operating costs that businesses would like to reduce. We can immediately discard the space theme, since it is incompatible with cost reduction for the foreseeable future. Offshore projects are possible, but they have not yet left the embryonic stage. It turns out that the only realistic way to cut costs is to improve traditional data centers, above all by increasing their energy efficiency through freecooling and useful heat recovery. For the coming years this will be the main trend in data center development; after that, scientists and engineers may give us something less prosaic. Science fiction sometimes becomes reality.

