
eBay data center and adiabatic humidification
“Data center equipment should be operated at temperatures no higher than +25 °C, and cooling should preferably be provided by chillers or precision air conditioners.” Until recently, thanks to the recommendations of ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers, one of the most respected organizations in the field of refrigeration and air conditioning), this was an axiom. But the cost of electricity for cooling data centers grew along with tariffs and equipment capacity, until cooling systems came to consume 35–40% of all the energy a data center needs to operate.
Intro
The traditional approach to reducing the energy consumption of a refrigeration unit is to search for more efficient refrigerants and fine-tune the system's operating parameters. But that is evolutionary development, essentially a battle for a few extra percent of energy efficiency. Against this background, abandoning condensing units altogether and switching to outdoor air can fairly be called revolutionary. In Murmansk or Norilsk such an approach would be entirely justified. But a free-cooled data center in a hot desert looks, to the layman, like a non-obvious marketing stunt invented for the sake of "green" fashion and other phenomena that remain obscure to us.

What is even more surprising is that this solution has little to do with marketing: the unconventional approach is driven primarily by economic and technical considerations. The Mercury data center in Phoenix, Arizona, belongs to eBay, a company familiar to all of us. eBay chooses locations for its sites so as to minimize latency for users around the world, because roughly $2,000 worth of deals are made on eBay every second, which means it is vital for the company's services to remain available 24/7/365. Geographically, then, the Phoenix data center is well placed. Climatically, however... It sits in a hot desert climate, with very hot summers and highs around +50 °C. One might well think that not every refrigeration machine can survive such conditions, let alone free cooling. But the initial requirements, maximum equipment density and maximum processing power per watt consumed by the data center, left no choice: traditional cooling systems would have shattered any dreams of high energy efficiency. After a thorough analysis, the experts of Global Foundation Services (an eBay division) concluded that free cooling was what would best deliver the desired efficiency, and they announced a design competition for a data center in the desert with year-round outdoor-air cooling.
Operating principle
Of course, you cannot simply take a conventional data center design with traditional server racks and convert its cooling system to outdoor air. Equipment built around the unshakable IT standard of +25…+27 °C simply cannot survive such a transition, because free cooling, especially in a hot region, cannot in principle hold that temperature. What is needed is equipment that operates normally at higher temperatures. And such equipment was found in Dell's lineup: modular data centers with Dell PowerEdge servers and rack densities of up to 30 kW.
But what about outdoor air at up to +50 degrees Celsius? The temperature range of the PowerEdge equipment is somewhat wider (up to +45 °C in short peaks), but not that much wider! So a solution that struck most IT specialists as downright wild was applied: adiabatic humidification, that is, cooling that uses the heat of vaporization of water to take heat out of the air. The idea is extremely simple: ordinary water, purified of impurities, is sprayed into the dry hot air (and the air in Phoenix is exactly that) as droplets typically 0.06–0.08 mm in diameter. The specific heat of vaporization of water is 2,260 kJ/kg, and the specific heat capacity of air is 1.006 kJ/(kg·°C), so evaporating one kilogram of water can lower the temperature of roughly 2,200 kilograms of air by one degree. In practice the airflow temperature drops noticeably, by about 7 degrees on average, depending on a combination of factors. The downside of this approach is higher air humidity, and thanks to numerous stereotypes everyone knows that high humidity is death for equipment, meaning faults and premature server failure.
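To make the arithmetic above concrete, here is a minimal sketch in Python. The two constants come from the figures quoted in the text; the airflow and water quantities in the example are my own illustrative assumptions, not data from the eBay project.

# Adiabatic (evaporative) cooling: a back-of-the-envelope energy balance.
# The evaporating water draws its latent heat from the air stream, lowering its temperature.

L_WATER = 2260.0   # latent heat of vaporization of water, kJ/kg (figure from the text)
CP_AIR = 1.006     # specific heat capacity of air, kJ/(kg*°C) (figure from the text)

def temperature_drop(air_mass_kg, water_mass_kg):
    """Temperature drop of an air mass when a given mass of water evaporates into it completely."""
    return (water_mass_kg * L_WATER) / (air_mass_kg * CP_AIR)

# 1 kg of water cools roughly 2,250 kg of air by one degree, the "2,200 kg" quoted above:
print(L_WATER / CP_AIR)            # ≈ 2246.5

# Hypothetical example: 3 kg of water evaporated into 1,000 kg of air.
print(temperature_drop(1000, 3))   # ≈ 6.7 °C, close to the ~7-degree average mentioned above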
Numerous studies by industry giants have shown that the humidity fears are overblown: most equipment can withstand both higher temperature and higher air humidity without any harm.

Racks and servers specially designed for such conditions are, of course, tolerant of high humidity as well. Operational experience at the Mercury data center has shown that during short hot spells water evaporation can keep the air temperature at a level acceptable for the data center, while for most of the year adiabatic cooling is not needed at all: Phoenix does, after all, have cooler months. The data center has no "peak-load" or backup cooling systems, so equipment cooling is handled by a setup that is inexpensive and, thanks to the absence of complex units, very reliable.
Nuances
Of course, implementing such a system, and in such an unconventional design at that, involves a host of practical difficulties. It is extremely important to match droplet diameter to airflow velocity: if a droplet falls outside the required size range, or the airflow is too fast, it gets carried out of the zone where heat exchange takes place, and the trick simply does not work. The water must meet fairly strict requirements for hardness and CaCO3 content (8–12 hardness degrees, where 1 degree of hardness corresponds to 1 meq/l of CaCO3), and the pH should not exceed 7, otherwise the components of the cooling system will be subject to corrosion. There are also less obvious problems: for example, what do you do with the water that does not evaporate, and how and where do you collect it?
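As a rough illustration of how these requirements might be checked, here is a small Python sketch. The parameter names and the 50 mg/l-per-meq CaCO3 conversion are my additions for clarity; only the 8–12 hardness-degree window and the pH limit of 7 come from the text.

# Sanity check of make-up water against the requirements mentioned above.
# 1 hardness degree is taken as 1 meq/l; 1 meq/l of CaCO3 corresponds to about 50 mg/l.

MG_PER_MEQ_CACO3 = 50.0  # equivalent mass of CaCO3, mg per meq

def water_ok(hardness_deg, ph):
    """True if the water falls inside the hardness and pH window quoted in the text."""
    return 8.0 <= hardness_deg <= 12.0 and ph <= 7.0

def caco3_mg_per_l(hardness_deg):
    """Approximate CaCO3 concentration, mg/l, for a given hardness in degrees (meq/l)."""
    return hardness_deg * MG_PER_MEQ_CACO3

print(water_ok(10, 6.8))      # True: inside the stated window
print(caco3_mg_per_l(10))     # 500.0 mg/l of CaCO3 at 10 hardness degrees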
Profit
Once these difficulties were overcome, this cooling system, so unconventional for data centers, delivered enviable energy efficiency. The PUE (Power Usage Effectiveness, total facility power divided by IT equipment power) on an August day came to 1.043, meaning that auxiliary equipment, including the cooling system, consumes only about four percent of the data center's energy even in summer; in winter it is even less, with PUE around 1.018. Condensing systems based on chillers or DX air conditioners are far less efficient: for them a PUE of around 1.3 counts as an achievement. Even on the hottest days, the "free" cooling system keeps the servers running reliably and stably. Remember, this site belongs to eBay. If there had been the slightest doubt about the effectiveness and stability of this solution, a company whose life depends on the availability of its sites would never have gone for it. Yet the Mercury data center, with an area of 12,600 square meters and a capacity of 4 MW, has been operating for more than a year.
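For reference, the PUE arithmetic behind these figures is shown in the short Python sketch below. The load numbers are hypothetical, chosen only to reproduce the quoted ratios; they are not actual metered loads of the Mercury site.

# PUE = total facility power / IT equipment power.

def pue(total_kw, it_kw):
    return total_kw / it_kw

def overhead_share(pue_value):
    """Fraction of total facility power spent on everything other than IT equipment."""
    return (pue_value - 1.0) / pue_value

print(pue(4172, 4000))                  # 1.043: e.g. 4,000 kW of IT load plus 172 kW of overhead
print(round(overhead_share(1.043), 3))  # ≈ 0.041, i.e. about four percent of total consumption
print(round(overhead_share(1.3), 3))    # ≈ 0.231 for a chiller/DX system with PUE 1.3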
Interestingly, this cooling system, combined with placing the data processing modules on the roof of the data center, makes it possible not only to cool them efficiently but also to add computing power quickly when needed. Using special cranes, one and a half thousand servers can be lifted onto the roof in twenty minutes; they are then hooked up to power and water, and within an hour they are in operation. The data center can quickly expand its capacity to 6 MW and has the infrastructure in place to grow to 12 MW. By the standards of modern data centers, 12,600 square meters is not much, but that kind of power and density is already serious.
Using free cooling together with adiabatic evaporative cooling in a "hot" data center is a bold, unconventional, non-obvious, yet already proven solution.

Of course, precision air conditioners and chillers are not going anywhere, and raising the average air temperature in air-cooled data centers calls for caution. But if ASHRAE, in its 2011 recommendations, recognizes equipment classes A3 (up to +40 °C) and A4 (up to +45 °C), and eBay is already running such equipment, then there is no reason to fear humidity or elevated temperature simply because they exist and are rumored to get along poorly with servers. Properly chosen equipment, an effective cooling system and well-organized monitoring: those are all the secrets of ultra-efficient data centers, whose share is likely to grow in the coming years, including in our country.
Closer to home
Where do such conclusions come from? The reason is simple: Federal Law 261 (FZ-261) sets a fairly tight framework for all serious consumers of resources, along with ambitious energy-efficiency targets: a 40% improvement by 2020. Switching to natural refrigerants and installing new thermostatic valves will not get you there; moreover, for all the sizable investment involved, it does not always yield any tangible savings. But moving to a fundamentally different paradigm of data center cooling, using outside air, means those same tens of percent and, given constantly rising energy prices, substantial savings on data center operation. Money, the growing power of server hardware, and the regulatory documents that have been multiplying like mushrooms after rain since FZ-261 appeared: this is what will push data centers in our country toward such solutions as well.