VKontakte data center

    Everyone knows that the heart of VKontakte is the Singer House on Nevsky Prospekt. Today we will tell and show you where its brain is located and what it looks like: the ICVA data center.

    image

    How to build a data center?


    A data center is a combination of several infrastructure systems that ensure the reliability and fault tolerance of server and network equipment.

    You can't just pile up a bunch of servers and switches: you have to create and maintain optimal operating conditions for them. If you want to build your own data center, you will need:

    • Power supply system. Everything is clear here: servers run on electricity, and there are a lot of them, so an ordinary 220 V outlet is unlikely to be enough.
    • Cooling system. Even a gaming graphics card in a high-end computer needs a powerful cooler - to say nothing of hundreds and thousands of high-performance devices.
    • Structured cabling system (SCS). Something has to tie all the elements into a single whole. You will need a lot of cables and a passionate love of laying them pedantically.

    These are the basic "life support" systems - the bare minimum needed just to power up the equipment. But a truly full-fledged data center needs more. Namely:

    • Fire extinguishing system. It is important to ensure that a random spark does not turn your brand new data center into ruins.
    • Access control and management system (ACS). Don't leave the doors open to all comers.
    • Monitoring system. You should find out in time if something went wrong.
    • Security alarm. In case someone decides to use a crowbar instead of a pass.
    • Video surveillance system.

    We're sure you'll end up with an excellent data center. In the meantime, let's see what ours looks like.

    Welcome to ICVA


    Why ICVA - where does the name come from? ICVA was a research center for high-voltage apparatus that used to occupy the building and worked for the benefit of the power industry. We inherited a dystopian-looking hangar with ceilings at fifth-floor level and mysterious rooms with meter-thick walls.

    There are 640 racks in four machine rooms - more than 20,000 servers and more than 200 switches, routers and DWDM systems with a total capacity of over 4 Tbit/s. An ASR9000 router with serial number 1 is installed here: at one time it was the first commercial installation of this device anywhere in the world.

    At peak, the data center generates more than 1 Tbit/s of external traffic. More than ten of the largest international providers and traffic exchange points, as well as about 40 large Russian operators, are connected to our DWDM systems.

    image

    The first machine room. Perhaps your favorite video lives right here.

    Power supply


    All elements of the power supply system have at least N+1 redundancy. Literally across the street from the data center is the Vostochnaya substation, which feeds the data center through two 6 kV inputs. From there, power passes through a distribution substation and an automatic transfer switch over two independent inputs. Here's what it looks like in a diagram (for simplicity, with one machine room of the four):

    image

    Power supply circuit in normal mode

    Each node is duplicated and normally operates at half load. In the event of a failure, power reaches the machine room bypassing the failed section. For example, suppose we lose one 6 kV input:

    image

    Power supply circuit during an input accident
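    The half-load rule can be sketched as a simple capacity check: the load must survive the loss of any single feed. This is an illustrative helper with made-up numbers, not VK's actual tooling.

    ```python
    def n_plus_one_ok(feeds_kw, load_kw):
        """Check that the load survives the loss of any single power feed.

        feeds_kw: capacities of the independent inputs (kW)
        load_kw:  total load of the machine room (kW)
        """
        return all(
            sum(feeds_kw) - f >= load_kw   # remaining feeds must carry everything
            for f in feeds_kw
        )

    # Two inputs, each normally at about half load: losing either one is fine.
    print(n_plus_one_ok([1000, 1000], 900))   # True
    print(n_plus_one_ok([1000, 1000], 1100))  # False: one feed alone can't carry it
    ```

    This is why every node "normally operates at half load": each half must be able to carry the whole room on its own.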

    If things get really bad and the inputs from the grid can't be counted on, the uninterruptible power supplies come into play. Their job is to power the machine rooms for the short time it takes the diesel generator sets to start.
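    The sizing logic behind such a UPS bank is simple arithmetic: usable battery energy divided by load gives the ride-through time, which only needs to exceed the generator start-up time by a comfortable margin. All figures below are hypothetical - the article doesn't state the real capacities.

    ```python
    def ups_runtime_minutes(battery_kwh, load_kw, usable_fraction=0.8):
        """Rough ride-through time of a UPS battery bank.

        usable_fraction accounts for depth-of-discharge and inverter losses.
        """
        return battery_kwh * usable_fraction / load_kw * 60

    # e.g. 200 kWh of batteries feeding a 400 kW hall:
    print(round(ups_runtime_minutes(200, 400), 1))  # 24.0 minutes - ample
    # margin for generators that take well under a minute to start
    ```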

    image
    Uninterruptible power supplies

    image
    And these are their batteries - they look a lot like car batteries ...

    image
    ... except that they fill several large rooms

    Diesel generator sets (DGU) keep the data center alive during a prolonged outage or planned work on the power supply system. Besides the fuel tanks built into the gensets themselves, a large-capacity automatic container refueling station is installed. Fuel from its tank is fed to all the gensets automatically; the supply is sized for at least a day. If needed, a diesel fuel truck will arrive within two hours.
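    "At least a day" of autonomy follows from tank volume and burn rate. The numbers below are purely illustrative - the article only gives the one-day figure.

    ```python
    def fuel_autonomy_hours(tank_litres, gensets, litres_per_hour_each):
        """How long the shared fuel tank keeps all generator sets running."""
        return tank_litres / (gensets * litres_per_hour_each)

    # e.g. a 20,000 l tank feeding 4 gensets burning ~200 l/h each:
    print(fuel_autonomy_hours(20_000, 4, 200))  # 25.0 hours - just over a day,
    # comfortably covered by a fuel truck that arrives within two hours
    ```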

    image
    Diesel Generator Sets

    Each server and each switch is connected to two power inputs. As a rule, modern equipment comes with this capability from the manufacturer. For servers with only one input, power is duplicated using a device like this:

    image
    Digital Energy static transfer switch with manual bypass

    Cooling system


    For equipment to live comfortably in the machine rooms, a certain temperature range must be maintained. That's why companies around the world increasingly build their data centers somewhere near the Arctic Circle: in such conditions you can cool the servers with outside air. This is called free cooling, and the approach is rightly considered the most energy-efficient (why spend energy cooling warm air when you can take cold air to begin with?).

    We also use free cooling, though with some reservations. Despite the legendary St. Petersburg coolness, in summer the air temperature still sometimes rises above the coveted 20-25 °C and has to be cooled further. In winter, on the contrary, the air is too cold to use directly. Besides the risk of simply overcooling the servers, a temperature swing shifts the dew point - and condensate is contraindicated for this kind of equipment. On top of that, with this scheme air comes in from the street, which means it also has to be filtered.
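    The dew-point concern above can be made concrete with the standard Magnus approximation: given temperature and relative humidity, it tells you below what surface temperature condensation starts. The sample conditions are made up for illustration.

    ```python
    import math

    def dew_point_c(temp_c, rel_humidity_pct):
        """Dew point via the Magnus approximation (valid roughly -45..60 degC)."""
        a, b = 17.62, 243.12
        gamma = math.log(rel_humidity_pct / 100) + a * temp_c / (b + temp_c)
        return b * gamma / (a - gamma)

    # Street air at 2 degC / 80 % RH condenses just below -1 degC, so it must be
    # mixed and warmed carefully before it meets cold equipment surfaces.
    print(round(dew_point_c(2, 80), 1))
    ```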

    Free cooling is used in one of the four machine rooms; in the others, the cooling system works according to the classical scheme, using precision air conditioners.

    image
    Indoor units for precision air conditioners

    image
    External units of precision air conditioners

    Cold air from a mixing chamber or air conditioner is fed through a raised floor or duct into the so-called "cold aisle" - an isolated space between the front faces of two rows of racks. Here it is:

    image
    Cold aisle of the machine room

    On the other side, the heated exhaust air enters the "hot aisle" - and from there it is sent back to the indoor units of the air conditioners for freon cooling. This keeps clean (dust-free) air circulating in the machine room.

    image
    Hot aisle of the machine room

    Structured cabling system


    Kilometers of carefully laid cables. No words needed here.

    image

    image

    image

    image

    image

    image

    Fire extinguishing system


    Our data center is equipped with a gas fire suppression system. The gas (freon) is stored under pressure in cylinders. In the event of a fire, a signal from a sensor in the machine room actuates the valve, and the gas rushes through the pipes to the seat of the fire.

    image
    Cylinders with Freon

    image
    Pressure Gauge

    Monitoring


    All of the data center's health indicators are monitored in real time: temperature (from equipment sensors and in the rooms), power supply, and load on the network equipment. The data are shown on the duty engineers' displays and watched by automation. If something goes wrong, the monitoring system itself sends engineers a message about the problem (via VKontakte and SMS).
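    At its core, such monitoring is threshold checking over a stream of sensor readings. This is a minimal sketch: the sensor names and limits are invented, and send_alert() stands in for the real VKontakte/SMS delivery channels.

    ```python
    # Hypothetical acceptable ranges per sensor: (low, high)
    LIMITS = {"cold_aisle_temp_c": (18, 27), "ups_load_pct": (0, 80)}

    def check(readings, limits=LIMITS):
        """Return a human-readable alert for every reading outside its range."""
        alerts = []
        for sensor, value in readings.items():
            lo, hi = limits[sensor]
            if not lo <= value <= hi:
                alerts.append(f"{sensor}={value} outside [{lo}, {hi}]")
        return alerts

    def send_alert(msg):  # placeholder for the SMS / VK message channel
        print("ALERT:", msg)

    for msg in check({"cold_aisle_temp_c": 29, "ups_load_pct": 55}):
        send_alert(msg)  # only the overheated cold aisle fires an alert
    ```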

    image
    Power supply monitoring

    image
    Cooling system monitoring

    Access Control System and Security


    Only employees can enter the premises; every door is fitted with an electronic lock and an access card reader. ICVA is guarded around the clock, and every room is under video surveillance.

    To summarize


    ICVA has an excellent location: just a few kilometers from VKontakte's home city center and right next to a reliable source of electricity.

    Equipment modernization and energy-efficiency work here never stops. PUE (Power Usage Effectiveness), or the energy efficiency ratio, is a key metric for evaluating a data center. It is calculated as the ratio of all the energy consumed by the data center to the actual consumption of the servers and network equipment. As the definition makes clear, the PUE of an ideal data center in a vacuum is 1.0. ICVA is not an ideal data center in a vacuum, but we are systematically working to bring this figure down.
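    The PUE definition above is a one-line ratio; the figures in the example are illustrative only, not ICVA's real consumption.

    ```python
    def pue(total_facility_kwh, it_equipment_kwh):
        """PUE = total energy drawn by the data center / energy used by IT gear.

        1.0 is the unreachable ideal; real sites add cooling, UPS losses, etc.
        """
        return total_facility_kwh / it_equipment_kwh

    # Illustrative figures only:
    print(round(pue(1_400_000, 1_000_000), 2))  # 1.4 - every kWh of IT load
    # costs another 0.4 kWh of infrastructure overhead
    ```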

    The team of ICVA staff and VKontakte network engineers does everything so that every day you can enjoy your favorite videos, look through friends' new photos, and never have to think about the complex infrastructure behind it all.

    If you are a first-class specialist, you have a chance to get to know our data center better by joining the team. We are hiring a head of the data center test laboratory and system administrators.

    image

    P.S. Articles about the technical side of VKontakte can also be read on our blog in Russian and English. You can ask the author a question in the official community.
