Data centers as art: data centers on the ocean floor

    We are starting a series of articles on data centers that, in their own way, can be considered works of art. This is an extended version of an article we published earlier, The most unusual data centers: data centers as art, supplemented with as many interesting details as possible, conversations with experts, and comments from people who sometimes hold the opposite opinion.


    The oceans are cold, dark, and filled with kinetic energy, which makes them an appealing place to build data centers. A little over a year ago, Microsoft sank a rack of servers enclosed in a sealed metal container into the ocean. When the capsule was raised to the surface a few months later, it was covered with algae and shellfish, as if it had spent not months but years underwater. Yet the servers inside had kept working, cool and dry. Is this the future of data centers? The Microsoft engineers working on Project Natick believe it is.

    The origin of the idea


    It all started in 2013, when Sean James, a Microsoft employee who had worked on introducing new technologies into data centers and had previously served on a U.S. Navy submarine, proposed placing server farms under water. The idea raised doubts among some of his colleagues, though not in his own mind.


    Sean James (Cloud Infrastructure & Operations Team)

    In his view, implementing the idea would not only reduce the cost of cooling the machines, a huge expense for many data center operators (cooling accounts for about 40% of the total energy bill), but also cut construction costs, simplify running these installations on renewable energy sources, and even improve their performance.

    Together with Todd Rawlings, another Microsoft engineer, James circulated an internal document promoting the concept. It explained how building data centers under water could help Microsoft and other cloud providers meet phenomenal growth in demand while maintaining a high degree of environmental friendliness.


    Natick Team: Eric Peterson, Spencer Fowers, Norm Whitaker, Ben Cutler, Jeff Kramer (from left to right)

    In many large companies such outlandish ideas would probably die quietly in the bud, but not at Microsoft, where researchers already have a history of solving problems vital to the company in innovative ways, even when a project lies far outside Microsoft's core business and the company does not yet have the relevant implementation experience. The key to success is that Microsoft assembles teams of engineers not only from among its own specialists but also from partner companies.

    Four people formed the core of the team entrusted with testing James's far-reaching idea. Work on the project began in August 2014; it soon became known as Natick, simply because the research team liked to name projects after towns in Massachusetts. Just 12 months later, a prototype of the data center was ready to be lowered to the bottom of the Pacific Ocean.

    Difficulties in the implementation of the project


    There was no shortage of difficulties in implementing the Natick project. The first problem, of course, was keeping the space inside a large steel container dry while it was immersed in water. Another problem was cooling: the team had to figure out how best to use the surrounding seawater to cool the servers inside. Finally, there was the question of dealing with barnacles and other forms of marine life that inevitably cling to any submerged vessel, a phenomenon familiar to anyone who has kept a boat in the water for a long time. Clinging crustaceans and the like were a problem because they interfered with heat transfer from the servers to the surrounding water. At first these problems seemed daunting, but they were gradually solved, often with time-tested solutions from the marine industry.


    Installing a server rack in the Natick project container

    But why go to all this trouble? Cooling the servers with seawater certainly lowers cooling costs and can improve operation in other ways as well, but submerging a data center involves obvious costs and inconveniences. Does it really make sense to put thousands of servers on the seabed? We believe it does, and for several reasons.

    The first is that it would give Microsoft the ability to increase capacity quickly, where and when it is needed. Capacity planning would become less burdensome, since facilities would no longer have to be built long before they are really needed in anticipation of future demand. For an industry that spends billions of dollars a year building an ever-growing number of data centers, fast response times can translate into huge cost savings.

    The second reason is that underwater data centers can be built faster than land-based ones, which is easy to understand. Today, the construction of every such facility is unique. The equipment may be the same, but building codes, taxes, climate, labor, electricity, and network connectivity differ everywhere, and these variables affect how long construction takes. We also see their influence on the performance and reliability of data centers: identical equipment shows different levels of reliability depending on where the data center is located.



    As envisioned, Natick would consist of a set of "containers": steel cylinders, each of which might eventually hold several thousand servers. Together they would form an underwater data center a few kilometers from the coast, at a depth of between 50 and 200 meters. The containers could either float above the seabed at some intermediate depth, moored to the ocean floor by cables, or rest on the seabed itself.

    After a data center module is deployed, it will remain in place until it is time to replace the servers it contains, or perhaps until market conditions change and it is decided to move it somewhere else. It is a truly sealed environment: system administrators work remotely, and no one can step in to fix a problem or replace a piece of equipment while the module is operating.



    Now imagine applying a just-in-time production process to this concept. The containers could be built at a factory, fitted with servers, and prepared for shipment anywhere in the world. Unlike land, the ocean provides a very uniform environment wherever you are, so no per-site configuration of the containers would be required, and we could install them quickly wherever computing capacity is in short supply, gradually growing the underwater installation to meet performance requirements. Natick's goal is to be able to bring a data center online at a coastal site anywhere in the world within 90 days of the decision to deploy.

    Reducing latency for users


    Most new data centers are built where electricity is inexpensive, the climate is cool enough, land is cheap, and the facility does not bother people living nearby. The problem with this approach is that data centers very often end up far from population centers, which slows server responses to requests because of increased latency.

    For interactive applications involving online user interaction, these delays can matter and, at high values, cause real problems. We want web pages to load quickly and video games like Minecraft or Halo to be responsive and lag-free. In the coming years, more and more highly interactive applications will appear, including those powered by Microsoft HoloLens and other augmented reality technologies. So the servers really do need to be closer to the people they serve, which is rarely the case today.

    This may seem surprising, but almost half of the world's population lives within 100 km of the sea. Placing data centers offshore near coastal cities would therefore bring them much closer to users than they are today.
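
    As a rough illustration of why proximity matters, here is a minimal sketch that estimates the best-case round-trip propagation delay over optical fiber as a function of distance. The fiber speed and the distances are illustrative assumptions, not figures from the project, and real latencies are higher because of routing, queuing, and processing.

```python
# Rough estimate of how distance to a data center translates into network latency.
# Assumption (not from the article): signals in optical fiber travel at roughly
# 200,000 km/s (about 2/3 of the speed of light in vacuum); routing and queuing
# delays are ignored, so real latencies will be higher.

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed as km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Idealized round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Compare a coastal placement with progressively more remote sites (assumed distances).
for distance in (100, 1000, 4000):
    print(f"{distance:>5} km away -> at least {round_trip_ms(distance):.1f} ms round trip")
```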

    Savings on cooling


    If that rationale is not enough, consider the savings on cooling. Historically, cooling systems in data centers consumed nearly as much energy as the servers themselves, because mechanical cooling systems were used.

    Recently, many data center operators have switched to free cooling, using air taken from the environment: instead of mechanically chilling the air, they simply use filtered outside air. This greatly reduces the energy cost of cooling, which drops from roughly 50% of total consumption to somewhere between 10% and 30%. However, in some places the ambient temperature is too high or varies too much with the seasons, so such data centers can only be effective at high latitudes, far from population centers, which increases latency for users.
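
    To make these percentages concrete, here is a minimal sketch (mine, not from the article) of how a cooling share of total energy maps to the common PUE metric (Power Usage Effectiveness = total facility energy / IT energy), under the simplifying assumption that cooling is the only overhead on top of the IT load. The 3 percent line anticipates the Natick prototype result described later in the article.

```python
# Simplifying assumption (mine, not the article's): everything that is not cooling
# is IT load, so cooling_share is cooling energy as a fraction of total energy.

def pue_from_cooling_share(cooling_share: float) -> float:
    """PUE when cooling is the only overhead and makes up cooling_share of total energy."""
    return 1.0 / (1.0 - cooling_share)

for label, share in [("mechanical cooling (~50%)", 0.50),
                     ("free cooling, best case (10%)", 0.10),
                     ("free cooling, worst case (30%)", 0.30),
                     ("Natick prototype (~3%)", 0.03)]:
    print(f"{label:32s} -> PUE ~ {pue_from_cooling_share(share):.2f}")
```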



    Moreover, data centers that rely on free cooling can consume quite a lot of water, since evaporation is often used to slightly cool the air blown through the servers. This is problematic in drought-prone areas such as California, or where a growing population is depleting local aquifers, as is the case in many developing countries. And even where water is abundant, adding it to the air makes electronics more susceptible to corrosion.

    The Natick architecture addresses all of these problems. Inside the data center container are standard computer racks fitted with heat exchangers that move heat from the air into a heat-conducting, heat-tolerant liquid similar to ordinary water. This liquid is then pumped to heat exchangers on the outside of the container, which in turn transfer the heat to the surrounding ocean. The cooled liquid then returns to the internal heat exchangers to repeat the cycle.
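
    As a back-of-the-envelope illustration of this closed loop, the sketch below estimates how much water-like coolant must circulate to carry a given IT load out to the external heat exchangers. The container load and the allowed temperature rise are illustrative assumptions, not Natick figures.

```python
# Heat balance for the closed coolant loop: power absorbed = mass_flow * c_p * delta_T.
# The load and temperature rise below are assumed example values, not project data.

SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K), for a water-like coolant

def coolant_flow_kg_per_s(it_load_watts: float, delta_t_kelvin: float) -> float:
    """Mass flow needed so the coolant warms by delta_t_kelvin while absorbing it_load_watts."""
    return it_load_watts / (SPECIFIC_HEAT_WATER * delta_t_kelvin)

it_load = 240_000.0   # assumed 240 kW of servers in one container
delta_t = 10.0        # coolant allowed to warm by 10 degrees C, assumed
flow = coolant_flow_kg_per_s(it_load, delta_t)
print(f"~{flow:.1f} kg/s (roughly {flow:.1f} L/s) of coolant for a {it_load/1000:.0f} kW load")
```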

    Of course, the colder the surrounding ocean, the better this scheme works. To reach cold seawater even in summer or in the tropics, the equipment containers need to be placed deep enough. For example, at a depth of 200 meters off the east coast of Florida, the water temperature never rises above 15 °C at any time of year.

    Prototype


    Testing of the Natick prototype, named "Leona Philpot" (after an Xbox game character), began in August 2015. The installation was submerged to a depth of only 11 meters in the Pacific Ocean near San Luis Obispo, California, where the water temperature stays between 14 and 18 °C throughout the year.

    The server rack used in the prototype underwater data center module was filled with both real servers and dummy loads that consume the same amount of energy as the servers

    This 105-day experiment showed that the servers could be kept as cool as they are with mechanical cooling, but with remarkably low energy overhead: only 3 percent. That is significantly lower than in any production or experimental data center known today.

    In addition, there was no need for on-site staff, parking spaces, panic buttons, or security guards. The container was filled with an oxygen-free atmosphere, which makes fire impossible, and the equipment can be managed by Microsoft employees from a comfortable office. All water vapor and dust were also removed from the container's atmosphere. This made the environment very friendly to electronics, minimizing problems with heat dissipation and connector corrosion.

    A look into the future: future data centers will be much longer than the prototype, and each will contain a large number of server racks. The electronics will be cooled by transferring waste heat to the surrounding seawater through internal and external heat exchangers.

    Protecting the environment and providing energy


    Microsoft is committed to protecting the environment. For example, the company meets its electricity needs with as much renewable energy as possible, and the remaining carbon footprint still has to be offset. In keeping with this philosophy, the company plans to deploy future underwater data centers near coastal renewable energy sources, whether an offshore wind farm or some marine energy system that harnesses the power of tides, waves, or currents.



    Such energy sources are usually abundant on the continental shelf, which means equipment could be placed close to people while having access to plenty of green energy at the same time. Just as land-based data centers have spurred the construction of new renewable energy infrastructure, something similar could happen with underwater data centers. Their presence would likely stimulate the construction of new facilities supplying clean energy not only to the underwater data centers but to the local population as well.

    Another factor to consider is that conventionally generated electricity is not always readily available, especially in developing countries. For example, 70 percent of the population of sub-Saharan Africa has no access to the power grid at all. So if you want to build a data center and make its services available to such a population, you will probably also need to supply it with electricity.



    Typically, electricity is transmitted over long distances on high-voltage lines, at 100,000 volts or more, to reduce losses, but the servers ultimately run on the same low voltage as your home computer. Stepping the grid voltage down to levels suitable for server hardware usually requires three separate conversion stages. You also need backup generators and plenty of batteries to keep things running during a power outage.
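
    To see why such high voltages are used, here is a hedged illustration: for a fixed delivered power, the line current falls as the voltage rises, and resistive losses fall with the square of the voltage. The power level and line resistance below are made-up example values, not figures from the article.

```python
# Resistive transmission loss: current I = P / V, loss = I^2 * R,
# so loss scales as 1 / V^2 for a fixed delivered power and line resistance.
# POWER and RESISTANCE are illustrative assumptions.

def line_loss_watts(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

POWER = 10e6        # 10 MW delivered, assumed
RESISTANCE = 5.0    # ohms of line resistance, assumed

for voltage in (11_000, 110_000):
    loss = line_loss_watts(POWER, voltage, RESISTANCE)
    print(f"{voltage/1000:>5.0f} kV -> {loss/1e3:8.1f} kW lost "
          f"({100 * loss / POWER:.2f}% of delivered power)")
```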

    Placing underwater data centers near offshore energy sources would let engineers simplify this infrastructure. First, they could connect to a lower-voltage grid, eliminating some of the voltage conversions on the way to the servers. Second, connecting the servers to independent power sources such as offshore wind or tidal turbines builds in fault tolerance. This reduces both electrical losses and overall investment in the data center architecture, while protecting against local power outages.

    An added advantage of this approach is that the only thing touching land would be the one or two fiber optic cables used to carry data to and from the underwater data center.

    Questions and research


    The first question almost everyone asks about this idea is: how will the electronics stay dry? The truth is that keeping the inside of a server container dry is not difficult. The marine industry learned to keep equipment dry in the ocean long before the first computers appeared, often in far harsher conditions than anything this project will encounter.



    The second question, raised early on, was how to cool the servers most efficiently. The project team investigated a number of exotic approaches, including special dielectric fluids and phase-change materials, as well as unusual coolants such as high-pressure helium gas and supercritical carbon dioxide. While these approaches have advantages, they also bring big problems.

    The researchers continue to study exotic cooling materials, but for the near future there is no real need for them. Water cooling with radiator-style heat exchangers, as used in the Natick project, provides a very economical and efficient cooling mechanism that works perfectly well with standard servers.

    Biological fouling and environmental influences


    More important, in our view, is that an underwater data center will attract marine life, forming an artificial reef. This process of colonization by marine organisms, called biofouling, begins with single-celled creatures, followed by slightly larger organisms that feed on them, and so on up the food chain.

    After 105 days in the water, parts of the equipment capsule were coated with marine life as a result of the biofouling phenomenon familiar to yachtsmen

    When the researchers lowered the Natick prototype to the bottom, crabs and fish gathered around within a day and began settling into their new home. The researchers were pleased to have created a home for these creatures, so one of the main design challenges became how to preserve this habitat without compromising the underwater data center's ability to keep the servers inside sufficiently cool.

    In particular, it is well known that biofouling of the external heat exchangers would interfere with heat removal from their surfaces. So the researchers began investigating various anti-fouling materials and coatings, and even active deterrents using sound or ultraviolet light, in the hope of making it harder for organisms to take hold. Although the heat exchangers can be cleaned physically, relying on such interventions would be unwise given the goal of simplifying maintenance as much as possible.



    Fortunately, the Natick container's heat exchangers stayed clean during the first deployment, even though it took place in a very challenging environment: shallow water close to shore, where ocean life is most abundant. Fouling nevertheless remains an area of active research, and the researchers continue to study it, focusing on approaches that will not harm the marine environment.

    Equipment failure, security and environmental impact


    The biggest problem the researchers encountered during testing was equipment failure and the inability to swap out a particular server in a rack, or a hard disk, network card, or other component of a particular server. Hardware failures have to be handled remotely and autonomously. Active work on this is therefore under way even in Microsoft's ordinary land-based data centers, to improve failure detection and make it possible to recover from failures without human intervention. The same methods and experience will be applied to Natick data centers in the future.

    What about security? Is the data reliably protected from virtual or physical theft when it sits under water? Absolutely. Natick will provide the same encryption and other security guarantees as Microsoft's land-based data centers. And since no one will be physically present near the equipment containers, Natick will use sensors to monitor environmental changes, including reporting any unexpected visitors.



    You may also wonder whether the heat generated by an underwater data center would adversely affect the local marine environment. Probably not. Any heat generated by the Natick container is quickly mixed into the cool water and carried away by the current. A few meters downstream of the Natick container, the water is warmer by at most a few thousandths of a degree.
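
    As a rough plausibility check of that claim (using assumed values, not project measurements), the sketch below dilutes a container's waste heat into the mass of seawater that a gentle current sweeps past it every second.

```python
# Average warming of the water carrying the heat away: delta_T = P / (mass_flow * c_p),
# where mass_flow = density * cross_section * current_speed.
# Heat load, mixing cross-section and current speed are illustrative assumptions.

SEAWATER_DENSITY = 1025.0        # kg/m^3
SEAWATER_SPECIFIC_HEAT = 4000.0  # J/(kg*K), approximate for seawater

def downstream_warming(heat_watts: float, cross_section_m2: float, current_m_per_s: float) -> float:
    """Average temperature rise (kelvin) of the water sweeping past the container."""
    mass_flow = SEAWATER_DENSITY * cross_section_m2 * current_m_per_s
    return heat_watts / (mass_flow * SEAWATER_SPECIFIC_HEAT)

# Assumed: a 240 kW container, heat mixed over a 200 m^2 cross-section, 0.25 m/s current.
dt = downstream_warming(240_000.0, 200.0, 0.25)
print(f"Average warming of the passing water: ~{dt * 1000:.1f} millidegrees C")
```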

    The environmental impact will therefore be very small. This matters, because many more underwater data centers will be built in the future, and people may well begin working under water to support and maintain them.

    Conclusions


    At first glance, the idea of running data centers on the cold ocean floor looks very attractive: a capsule filled with server racks requires constant, precise temperature control. Why not hand that job over to the cold ocean floor, where the water temperature is close to 0 degrees Celsius?

    The Natick team working on this solution certainly thinks so. Their idea is to place data centers in the ocean near the coast: "more than half of the Earth's population lives within 200 km of the coastline," says one of the project's engineers in the project video, listing the arguments for placing servers in the ocean near the shore:



    The main goal is to shorten the distance between users and the data they request; to be able to deploy cloud infrastructure in self-contained containers within 90 days at the right location anywhere in the world, which saves on data delivery and brings the data closer to users; and to save on cooling and power, making data centers more environmentally friendly.

    Will it work? Certainly. It already has; the only question is when underwater data centers will be deployed on an industrial scale. Today Microsoft operates more than 100 data centers around the world hosting more than a million servers, and spends more than $15 billion a year on IT infrastructure, so data centers may well start being built under water in the near future.



    This article was published with the support of the hosting provider ua-hosting.company, so we would like to take this opportunity to remind you of our promotion:

    VPS (KVM) with dedicated drives (a full-fledged equivalent of entry-level dedicated servers, from $29) in the Netherlands and the USA, free for 1-3 months for everyone, plus a 1-month bonus for geektimes

    Remember that your orders and support (your cooperation with us) make it possible to publish even more interesting material in the future. We would be grateful for your feedback, your criticism, and any orders. ua-hosting.company - happy to make you happier.
