Chicken-coop data centers and working in Antarctica: a selection of unusual data centers

    Today we decided to tell you about several unusual data centers built in different parts of the world. How do you keep a data center running in the harsh conditions of Antarctica? And why do Yahoo's data centers look like chicken coops? More on that below. / photo Krishna CC




    The “coldest” data center on the planet


    By the standards of today's data centers, the Ice Cube Lab is neither the largest nor the most powerful: it has 1,200 cores and three petabytes of storage. One feature, however, sets it apart from the rest - it sits in Antarctica at the Amundsen-Scott South Pole Station.

    The data center is connected to the neutrino detector of the IceCube observatory, which consists of arrays of optical sensors sunk more than a kilometer deep into the ice. The system lets scientists register the light flashes produced by neutrinos from various astronomical phenomena - data that is needed, among other things, to study dark matter.

    Ice Cube's IT staff consists of only a few people, most of whom come to the station only in the summer to carry out planned work; the rest of the support is done remotely. Most communication with the center goes over the Iridium satellite network, where the data rate is a mere 2,400 bps.

    At such a low speed, the team has to resort to a number of tricks: compressing mail attachments, multiplexing connections, and even using wireless servers to link the field stations. Still, not everything is so bleak: for eight hours a day, NASA's GOES-3 satellite (a former weather satellite) provides the station with a one-megabit channel.
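    To get a feel for why these tricks matter, here is a back-of-the-envelope estimate of transfer times over both channels. This is a minimal sketch: the attachment size and the 4:1 compression ratio are made-up examples; only the 2,400 bps and ~1 Mbit/s figures come from the article.

    # Rough transfer-time estimates; protocol overhead is ignored.
    IRIDIUM_BPS = 2_400        # Iridium channel from the article
    GOES3_BPS = 1_000_000      # ~1 Mbit/s channel via GOES-3

    def transfer_hours(size_bytes: int, bps: int) -> float:
        """Transfer time in hours for a payload of size_bytes."""
        return size_bytes * 8 / bps / 3600

    attachment = 5 * 1024 * 1024        # hypothetical 5 MB mail attachment
    compressed = attachment // 4        # hypothetical 4:1 compression

    print(f"Iridium, raw:        {transfer_hours(attachment, IRIDIUM_BPS):.1f} h")
    print(f"Iridium, compressed: {transfer_hours(compressed, IRIDIUM_BPS):.1f} h")
    print(f"GOES-3, raw:         {transfer_hours(attachment, GOES3_BPS) * 60:.1f} min")

    A single uncompressed 5 MB attachment would tie up the Iridium channel for almost five hours; compression brings that down to about an hour, and the GOES-3 window handles it in under a minute.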

    Running a data center in Antarctica means accounting for a number of local peculiarities. One of the main problems is near-zero humidity. Because of it, employees working near the servers have to wear antistatic vests and make sure that all grounding requirements are met.

    Low humidity also ruins the magnetic tape cartridges used for storing and transferring data - they literally begin to crumble. Engineers have to humidify them slightly, periodically placing the cartridges in the station's greenhouse. The staff even considered installing a humidifier in the server room, but abandoned the idea because of possible condensation problems.

    Another challenge is cooling the data center. It would seem you could simply "open the door" and let the cold solve the problem. However, the outside temperature can drop to -70 degrees Celsius, which would disable all the equipment in short order. In these conditions, air temperature is managed by a dedicated ventilation system (with no air conditioning) that regulates the flow of cold air from outside. Even so, station employees say that in extreme conditions this system sometimes freezes too.
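    The article does not describe the control logic itself, but conceptually such a system comes down to a damper that meters in just enough of the -70 C air. Below is a minimal sketch of that idea in Python - not the station's actual software; the setpoint, deadband, and step size are invented for illustration.

    TARGET_C = 18.0      # desired server-room temperature (hypothetical)
    DEADBAND_C = 2.0     # tolerance before the damper moves (hypothetical)

    def damper_position(room_temp_c: float, current_pos: float) -> float:
        """Return a damper opening in [0.0, 1.0]: opening wider lets in
        more cold outside air, closing keeps recirculated warm air."""
        error = room_temp_c - TARGET_C
        if abs(error) <= DEADBAND_C:
            return current_pos                   # within tolerance: hold
        step = 0.05 if error > 0 else -0.05      # open if too warm, close if too cold
        return min(1.0, max(0.0, current_pos + step))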

    Despite all the difficulties, the team is proud to keep the 150 Ice Cube Lab servers running near the South Pole with availability above 99.5%.
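    For context, a quick calculation shows what a 99.5% availability level allows in absolute terms:

    HOURS_PER_YEAR = 365 * 24

    for availability in (0.995, 0.999):
        downtime_h = HOURS_PER_YEAR * (1 - availability)
        print(f"{availability:.1%} uptime -> up to {downtime_h:.0f} h of downtime per year")

    At 99.5%, that is a budget of roughly 44 hours of downtime per year - modest by commercial standards, but respectable for a site whose staff can only fly in during the summer.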

    Last year, the Ice Cube Lab team published a short video tour of the data center. You can watch it at the link.

    One of the greenest data centers


    LEED is a private green building certification program launched back in 1998. Buildings that comply with LEED standards save up to 25% on energy, and their owners can also receive tax benefits from the state.

    Today, about 25 data centers around the world hold LEED Platinum status - the highest level in the program. One of the first to receive it, back in 2009, was Citi's 2-hectare data center in Frankfurt.

    The Citi data center combines several energy-saving technologies. The building is designed so that 65% of the time the cooling system runs on fresh outside air. Reverse osmosis reduces sediment in the cooling towers used to chill water, and rainwater is additionally used to irrigate the green spaces around the data center.

    All operational waste is sent for recycling, and the focus on "environmental friendliness" extended to construction as well: construction waste was kept out of the city landfill, and careful design of the facility cut the total length of the necessary cabling by 250 kilometers.


    / photo Open Grid Scheduler CC

    Cube data centers


    When we talk about data centers, the image that usually comes to mind is a huge room full of server racks and equipment. There is, however, a more compact alternative - the modular data center. Such data centers most often come as units with their own cooling system, which let you quickly deploy exactly as much IT infrastructure as you need.

    For example, the transportation company Ecotality, using Instant modular data centers, was able to move to a new office in 8 weeks, saving $120,000 on data center floor space and cutting cooling costs by 65%.

    Some companies (such as IBM and Elliptical) "pack" servers and supporting equipment into shipping containers. In this form, data center modules can be conveniently carried by truck or ship over long distances in a short time.

    The advantage of "portable" data centers is that they can be placed closer to customers or to data sources. One example is the Nautilus project, whose creators put the equipment on barges. This concept not only allows water to be used for cooling the centers, but also makes them significantly more mobile.

    Yahoo Coops


    Yahoo's new data center project is called the Yahoo Compute Coop. This is not a joke or a marketing ploy: the data centers really do look like chicken coops, and the point is not just the external resemblance. The elongated coop shape, together with an additional roof section, creates natural ventilation - warm air rises and is vented out through special openings.


    Such a data center can be cooled with plain outside air. When the outdoor temperature is between 21 and 29 degrees Celsius, air enters the data center through adjustable louvers (1) in the building's walls. A ventilation unit (2) then sends it through a mixing chamber (4) into the server room (3). In this mode the air is not cooled further, only filtered.

    Fans mounted on the server racks blow warm air into the inner corridor (5), from where natural convection carries it up into the "attic" (6). It is then led out through the adjustable louvers at the top (7).

    If the outdoor temperature is above 29 degrees, the air is first cooled in the mixing chamber. If it is below 21 degrees, part of the hot air that has already passed through the server room is routed back into the mixing chamber to warm the intake.
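    The intake logic described above boils down to three modes. Here is a minimal sketch of it in Python - not Yahoo's actual control software; the 21-29 degree thresholds come from the article, while the mode names are invented for illustration.

    def intake_mode(outdoor_temp_c: float) -> str:
        if outdoor_temp_c > 29:
            return "cool"         # chill outside air in the mixing chamber first
        if outdoor_temp_c < 21:
            return "recirculate"  # mix in hot server-room exhaust to warm the intake
        return "filter-only"      # pass outside air through with filtering only

    for t in (-5, 25, 33):
        print(f"{t} C outside -> {intake_mode(t)}")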

    Yahoo did not arrive at this design right away. The company began building its own data centers back in 2007, but those did not stand out in any way - ordinary rooms with active air cooling. The next project was called YTC (Yahoo Thermal Cooling): server fans blew hot air into a dedicated enclosed space, from which it was forced out with the help of an intercooler.

    The "chicken coop" data center is the third iteration of Yahoo's project to optimize cooling systems in large facilities with tens of thousands of machines. The company has already built several data centers to the new plan; one of the first opened in Lockport, New York. Its power usage effectiveness (PUE) is 1.08, which is comparable to European data centers that exploit a cold climate to cool their racks.
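    PUE is the ratio of a facility's total power draw to the power consumed by the IT equipment alone, so 1.08 means only 8% of overhead goes to cooling, power distribution, and the like. A quick illustration - the IT load figure below is hypothetical; only the 1.08 value comes from the article:

    def pue(total_kw: float, it_kw: float) -> float:
        """Power usage effectiveness: total facility power / IT power."""
        return total_kw / it_kw

    it_load_kw = 1_000.0                  # hypothetical IT load
    overhead_kw = 0.08 * it_load_kw       # at PUE 1.08, overhead is 8% of IT load
    print(pue(it_load_kw + overhead_kw, it_load_kw))   # -> 1.08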

    In addition, the corporation has registered about 3,000 patents related to the technology and intends to sell it to other large firms.


