Friday format: Unusual data center solutions



    / photo Dennis van Zuijlekom CC

    The ever-increasing load on computing systems is forcing companies to look for new ways to design data centers. In this article, we have put together an overview of several of the most unusual data center solutions.




    Underground Data Centers


    Modern data centers consist of many thousands of servers that process confidential user information. To ensure data security, companies are taking unexpected steps, for example, building data centers underground. The advantages of this approach include high deployment speed: there is no need to construct a dedicated building, and installation work can be carried out in any weather.

    An area of 1.5 square kilometers can be made ready for operation in 60 days. In addition, underground facilities are extremely resistant to natural disasters, a very useful advantage in tornado-prone regions.

    One example is the brainchild of Iron Mountain, located at a depth of 67 meters in an abandoned mine covering 145 acres near Pittsburgh. Underground spaces are mostly cool: in the Iron Mountain mine, the temperature stays below 14 degrees Celsius, which saves on cooling. For every 1 kW of server power there is 0.56 kW of cooling, whereas in classic data centers a 1:1 ratio is considered the norm.
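    The cooling ratio quoted above translates directly into a quick efficiency estimate. A minimal sketch; the 1 MW IT load is a hypothetical figure chosen for illustration, not from the article:

```python
def total_power_kw(it_load_kw, cooling_per_it_kw):
    """Total facility power: IT load plus proportional cooling load."""
    return it_load_kw * (1 + cooling_per_it_kw)

# Iron Mountain: 0.56 kW of cooling per 1 kW of servers
underground = total_power_kw(1000, 0.56)  # ~1560 kW total for 1 MW of IT
# Classic data center: 1:1 cooling ratio
classic = total_power_kw(1000, 1.0)       # 2000 kW for the same IT load

savings = 1 - underground / classic
print(f"{savings:.0%}")  # prints "22%"
```

    In other words, the mine's natural cooling cuts total facility power by roughly a fifth for the same server load.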

    Because the abandoned mine is located far from the city, Iron Mountain was also able to save on electricity. The company now pays 5.5 cents per kWh, while data centers in large cities pay from 10 to 17 cents for the same amount of electricity. Buying electricity at high voltage and transforming it in-house helps as well: where classic data centers take power at 480 V, Iron Mountain takes it at 4160 V.

    A similar underground data center is located on the Norwegian island of Rennesoy. Created by Green Mountain, it occupies 21,000 square meters and goes 100 meters deep. The space was previously used as a warehouse for NATO military equipment.
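    Iron Mountain's electricity rates above make the savings easy to quantify. A back-of-the-envelope sketch; the 10 MW load and the 13.5-cent city rate (midpoint of the quoted 10-17 cent range) are assumptions for illustration:

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost_usd(load_kw, cents_per_kwh):
    """Yearly electricity bill in dollars for a constant load."""
    return load_kw * HOURS_PER_YEAR * cents_per_kwh / 100

LOAD_KW = 10_000                       # hypothetical 10 MW facility
mine = annual_cost_usd(LOAD_KW, 5.5)   # Iron Mountain's reported rate
city = annual_cost_usd(LOAD_KW, 13.5)  # midpoint of the city range

print(f"${city - mine:,.0f} saved per year")  # prints "$7,008,000 saved per year"
```

    At data-center scale, a few cents per kilowatt-hour add up to millions of dollars a year.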

    The trend was picked up by LightEdge Solutions, a cloud computing and colocation company. In 2014, it invested $58 million in the creation of the world's largest underground complex, with an area of 18 square kilometers.

    Another impressive example, described by Wired, is a server farm located in an underground technology center in the Swiss Alps. Swiss Fort Knox consists of two separate data centers housed in Cold War-era bunkers, interconnected by redundant fiber-optic communication channels.

    Multistage protection systems for the network and IT equipment reduce the risk of failure to a minimum, while climate control and power supply systems reduce dependence on the outside world: the center can operate autonomously for many weeks.

    A distinctive feature of this data center is its “impregnability”: a thick layer of rock and a three-ton door protect it from technological disasters, while security systems and a guard service operating to military standards protect it from terrorist threats.

    It is worth noting that the data centers presented above enlist the help of Mother Nature: water from underground sources is used to cool the computing modules, and Green Mountain draws water at a temperature of 8 degrees Celsius from the fjord.

    Working equipment heats up considerably, and the power consumption of individual racks reaches 30-35 kW, so heat removal is one of the most important problems.

    DCD Intelligence reports that if the heat generated by data centers around the world were collected, it would be enough to heat the UK. Not wanting to waste that much energy, the Russian company Yandex decided to put the generated heat to use: not long ago, a water pre-heating station fed by the heat of the company's data center was opened in the Finnish city of Mäntsälä.

    The Mäntsälä site currently runs the first of four planned phases of Yandex servers, each with a capacity of 10 MW. Air heated by the servers is driven by fans to the pre-heating station, raising the water temperature from 30 to 60 degrees Celsius.
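    From these figures one can estimate how much water a single 10 MW phase can heat. A simple sketch using the specific heat of water; the calculation itself is mine, not from the article:

```python
C_WATER = 4186  # specific heat of water, J/(kg*K)

def water_flow_kg_s(power_w, delta_t_k):
    """Mass flow of water that a given heat load can raise by delta_t_k kelvin."""
    return power_w / (C_WATER * delta_t_k)

# One 10 MW server phase heating water from 30 to 60 degrees Celsius
flow = water_flow_kg_s(10e6, 60 - 30)
print(f"{flow:.1f} kg/s")  # roughly 80 kg of water per second
```

    Assuming all server heat reaches the water, each phase can warm on the order of 80 liters of water per second across that 30-degree span.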

    As for Russia, introducing such a technology is problematic because of high energy losses in heating networks, but such projects do exist. In the Irkutsk region, several companies (En+ Group, Huawei, CDS, Lanit) and the local authorities plan to build a data center whose server heat will be transferred to the local thermal power station.



    / photo Sam Howzit CC

    Data Centers Under Water (and on Water)


    Data center designers are trying to make the most of what nature offers. Back in 2008, Google received a patent for a floating data center. The project envisioned placing a data center 5 km off the coast; this approach eliminates the need for a separate building and exempts the owner from property taxes.

    However, Google is in no hurry to bring the idea to life. Instead, the startup Nautilus Data Technologies took the first step in this direction.

    Arnold Magcale and Daniel Kekai, the company's co-founders, created a network of floating platforms to provide equipment hosting services. In their view, this design will protect data centers from natural disasters and make them mobile: if necessary, a platform can be moved from place to place.

    Traditional data centers use a huge amount of water for cooling; for example, the US National Security Agency's data center in Utah consumes 1.5 million gallons (5.6 million liters) of water per day. The specialists at Nautilus decided to change the concept and brought the data center onto the water, developing an original cooling system that draws water directly from under the barge and returns it to the reservoir.

    The first commercial Nautilus data center is now being built at the Mare Island Naval Shipyard. The project's founders are convinced that working at a military base will help them protect their technology; subsequent data centers are also planned for military shipyards.

    The project's main advantage is rapid deployment. The floating data center accommodates 800 server racks and can be made ready for operation in 6 months, anywhere in the world with a suitable body of water.

    According to Daniel, the data center at the Mare Island shipyard is just a prototype. Grid electricity is currently the main power source, but management plans to build next-generation barges with fuel cells, which would make the data center autonomous. What remains is to figure out how to interest conservative customers, who prefer traditional data centers, in this technology.

    While Nautilus Data Technologies is building a data center on the water, Microsoft decided to dive in headfirst. The company is working on a project called Natick and has already developed a prototype capsule that can withstand the pressure of hundreds of meters of water.

    “When I first heard about it, I thought: ‘Water, electricity, why do this at all?’” says Ben Cutler, a Microsoft designer involved in the Natick project. “But the more you think about it, the more clearly you see the point.”

    The Microsoft team is confident that the massive introduction of such technologies will reduce the deployment time for new data centers from 2 years to 90 days.

    Larry Smarr, a physicist and data processing specialist, is confident that this approach has a future. He notes that cloud vendors have been looking for suitable locations for data center installations for years in the hope of benefiting from the environment.

    Putting a data center under water solves the problem of cooling the servers. Moreover, such capsules can be placed close to cities that sit next to large bodies of water, which would speed up web services by reducing the latency that causes Internet users so much inconvenience.
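    The latency argument is easy to quantify: light in optical fiber travels at roughly two-thirds of its vacuum speed, so distance translates directly into delay. A minimal sketch with hypothetical distances:

```python
FIBER_SPEED_M_S = 2e8  # light in fiber: roughly 2/3 of c

def round_trip_ms(distance_km):
    """Propagation-only round-trip time over fiber, in milliseconds."""
    return 2 * distance_km * 1000 / FIBER_SPEED_M_S * 1000

# Hypothetical distances: a capsule just offshore vs. a far-away inland site
print(round_trip_ms(100))   # 1.0 ms
print(round_trip_ms(2000))  # 20.0 ms
```

    Moving the servers from a distant inland site to just offshore shaves tens of milliseconds off every round trip, before routing and processing delays are even counted.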

    The first experimental vessel of the Natick project was named Leona Philpot, after a character from the Halo game series. The data center was tested in the Pacific Ocean, a kilometer off the coast, from August to November 2015.

    The system was packed with pressure, humidity, motion and leak sensors in order to better understand the possible causes of equipment failure. Not a single failure occurred during the entire test period, which allowed the team to extend the experiment and even successfully run some commercial workloads in the Microsoft Azure cloud during further testing.

    During the experiment, Microsoft's specialists monitored not only the state of the electronic systems but also the surrounding underwater environment. Acoustic sensors were used to measure the noise emanating from the capsule's turbines and its effect on marine life. It turned out that the sound of the device is drowned out by even the slightest noise of fish swimming past.

    Having successfully completed the first stage of the experiment, the research team plans to develop a next version of the capsule three times larger (Leona Philpot is 2.5 meters in diameter). At this stage, the division plans to bring in a group of alternative-energy specialists to participate in the design.

    The company plans to generate electricity using underwater currents, which will reduce the release of heat into the seawater. So far, engineers have managed to ensure that there is no measurable temperature change just a few inches away from the capsule.

    It is premature to draw conclusions about the feasibility of this bold project, let alone the timing of putting the first facilities into operation. At the moment, the Natick project is at the research stage, and it is not yet clear whether the concept will be adopted by Microsoft or other cloud service providers.

    In addition to developing underwater data centers, Microsoft is laying underwater cables across the Pacific between China, South Korea, Taiwan, Japan and the West Coast of the United States. The network, called New Cross Pacific, will increase data transfer rates.

    Another transatlantic cable, Hibernia Express, was put into operation in September 2015 and runs between Canada, Ireland and the UK. So far, Microsoft is the only company that has decided to spend $300 million on laying the AEConnect cables.

    The future network, more than 5,400 kilometers long, is designed to transmit data at 100 Gb/s, and Microsoft will use this transatlantic network to meet the growing demand for broadband and to support cloud services.

    Dave Crowley, Microsoft's Managing Director for Global Network Operators, said that of the 230 submarine cables in the world, very few are capable of supporting 100 Gb/s coherent transmission technology.



    / photo Arthur Caranta CC

    Data Centers of the Future


    Looking at underwater and surface data centers and cables running across the Atlantic, it seems the future described in many cyberpunk novels has already arrived. Another confirmation is the data tower concept created by Marco Merletti and Valeria Mercuri.

    For its designers, the project answers the question of what an environmentally friendly data center should look like. The main task is to explore ways of turning natural conditions to advantage in order to simplify maintenance and reduce its cost.

    Despite the tower's apparent futurism, it could be built with modern technology. According to the architects, the project is essentially a giant three-dimensional cylindrical motherboard. All elements are attached on the outside, while the inside remains empty, forming the air duct of the cooling system. Fans are planned to create a natural-draft effect, supplying fresh cold air from outside and removing warm air through the void in the center.

    The cooling of the tower is also facilitated by its planned geographical location. Weather conditions in Iceland are ideal for hosting data centers, and due to the cheap electricity generated by hydroelectric stations, the cost of maintenance will be low.

    The data tower consists of 24 pillars forming the skeleton of the building, on which blocks of computing equipment are mounted. The equipment blocks protruding from the walls hold from 4 to 8 standard server racks each, which makes the design scalable. According to Mercuri and Merletti's estimates, the data tower can accommodate up to 400,000 servers.

    In conclusion, it is worth noting that it is precisely such ambitious projects that push the rapidly developing IT industry forward. Perhaps in the near future machine rooms will be deployed in all the caves of the world, server towers will rise at the poles, and the expression “deep Internet” will take on a literal meaning.

    P.S. In our blog on Habré we try to share not only our own experience with 1cloud, our virtual infrastructure service, but also to cover related areas of knowledge. Do not forget to subscribe to updates, friends!
