Data storage and processing through 2020: expected directions and main trends

The development of IT infrastructure is a relentless process that has been gaining momentum and will certainly continue to do so. Each time network engineers master a new level of standards in their work, new horizons open up, and reaching them becomes the next challenge and the substance of their daily work. For IT companies to function successfully, it has always been critical to accurately identify goals, directions of development and the most relevant trends, because the IT sphere sits at the very forefront of technological progress and is especially receptive to innovation. In ancient times, people turned to oracles and seers for a glimpse of the day ahead; today that role has been taken over by professional associations and their analytical reports.

One such report was recently presented by the IT community AFCOM (Association for Computer Operation Management). The subject of this full-scale study was the functioning of data centers. According to AFCOM representatives, the report is a real event, since it covered the widest range of specialists involved in data center operations. The responses of network engineers, managers, programmers and owners of provider companies give a clearer picture of where the industry stands now and where it will be in the next 3-4 years. The article below presents the main points of the report, which claims to be the most representative and most relevant among all similar surveys conducted in 2015.

The future belongs to mega data centers

Year after year we hear about the huge data centers being built by such leaders of the IT industry as Amazon, Facebook, Google, Yahoo and IBM, clearly establishing a new trend. This situation will continue in the coming years, which in turn will lead to the dominance of these giant sites over the smaller data centers familiar to us. As the era of giants arrives, the total number of data centers will steadily decline, and the rapidly growing volume of traffic will flow to the largest infrastructure nodes.

“According to our estimates, by 2018 72.6 percent of all hosting services will be provided from mega data centers, and these facilities will account for 44.6 percent of the total server floor area. As of 2013, mega data centers accounted for only 19.3 percent of the total area of all sites,” said Richard Villars, head of the analytical department at IDC (International Data Corporation).

While IDC's exact percentages can be treated with a degree of skepticism, industry experts have no doubts about the trend toward data center consolidation itself. It is not hard to see why progress took this particular path: its origins become clear from the comments of representatives of the two opposing camps, operators of mega data centers and network engineers running small server platforms at enterprises and smaller companies.

“The services provided by mega data centers will in most cases be cheaper and significantly better than those of their small counterparts,” says Adam Kramer, vice president of development at Switch SUPERNAP, uncompromisingly. “Our company has bet not just on giant data centers, some of which are already operating successfully; our Fiber SuperLOOP project has become a genuine breakthrough in understanding what the IT infrastructure of the future will look like. With a complex of mega data centers connected by optical fiber with a network delay of no more than 7 milliseconds at our disposal, we gain enormous potential for instantly scaling capacity for our customers, redistributing peak load and providing redundancy, which allows us to leave our competitors far behind.”

The position of big business representatives is not difficult to understand, but it came as a surprise that operators of small server platforms largely agreed with it. Most of the operators surveyed, though not as categorically as Adam Kramer, were united in the forecast that over the next ten years most small server platforms in Europe and North America would cease to exist. Instead of the usual arguments about the flexibility of small data centers and the cheapness of in-house server rooms, the emphasis was placed somewhat differently.

“Nowadays, a company network isolated from the world is the exception rather than the rule, usually dictated by exceptional security requirements. Employees being able to reach the company's network resources from anywhere in the world is already a given. Naturally, any self-respecting company can provide such functionality on its own, but, as practice shows, every year this makes less and less sense. Growing network loads, the criticality of 24/7 data availability and acceptable technical support response times (SLAs) clearly drive up the bills for maintaining corporate networks, while at the same time we see an ever-growing variety of offers from hosting companies whose prices become more affordable every year.”

According to a Uptime Institute survey conducted last year, only 18 percent of large companies decided to acquire their own server infrastructure; in earlier years this share was much higher, reaching 46 percent.

“Another important factor in the coming mass migration from small server platforms to large data centers is the software environment. If you want to use a ready-made corporate software product, you will increasingly find that it cannot be run at 'server room' scale, because such products now place quite demanding requirements on network topology. While large companies can discard ready-made software solutions and afford to build their own working shells for online services, for small companies, which are exactly the ones a small server room could serve well, developing their own platforms is already utopian, and overpaying for their own 'iron' under such conditions becomes harder and harder to justify.”

IT infrastructure is going green

Since the creation of data centers where a single rack can consume up to 40 kW is always a tangible event for local energy grids, designing one's own generating capacity is becoming a necessary step. But why does the choice increasingly fall on renewable energy sources? The question of how relevant renewables really are for data centers is rather controversial: IT equipment consumes energy produced by solar panels with the same appetite as energy produced by burning "non-ecological" coal. Yet experts are united in their expectations: an ever closer interweaving of the IT industry and green technologies is coming. There are two subjective, but no less important, factors behind this prospect: administrative and marketing. The USA, Japan and European countries offer very serious incentives, from subsidies to outright bans, pushing companies toward renewable and CO2-free energy sources. From a marketing standpoint, in the modern world not being "green" can be very risky for a business. The risk comes both from organized groups and from the general mood of potential customers: organizations like Greenpeace periodically publish "lists of shame" of companies producing greenhouse gases and nuclear waste, and can easily undermine the operation of an existing data center or the construction of a planned one.

“The managers of large providers have no doubts about the need to use renewable energy sources; the only obstacle to adoption is the high cost of upgrading existing facilities,” says Matt Stansberry, Content Director at Uptime Institute. “Apple, Facebook and Google are now considered the greenest companies; however, do not forget that green status does not make them the most efficient users of energy resources. To a certain extent, the public is being misled. A data center running the most efficient heat removal systems can be significantly kinder to the Earth's natural resources than another data center, even one entirely powered by renewable energy sources.”

Unlike Uptime Institute employees, representatives of provider companies investing heavily in giant solar, wind and geothermal power plants report their successes in "greening" their infrastructure with undisguised enthusiasm, and this situation will apparently only strengthen in the coming years.

“The place our designers chose for the Switch SUPERLOOP is truly unique: it allows us to efficiently use not only solar, wind and geothermal energy, but also to cover 93 percent of our cooling system's water needs with reclaimed industrial water already used by other consumers,” Adam Kramer confirmed the general opinion of his colleagues. “We have already signed a contract to supply our project with energy from a future solar power plant with a design capacity of at least 100 MW. The new station will be located near Las Vegas.”

Optimization and planned cost reduction

The days of weak financial discipline in the IT departments of enterprises and scientific organizations have long sunk into oblivion. Every year the magic of blinking LEDs and the rhythmic hum of fans bewitches company management less and less, and as a result IT engineers increasingly face budget cuts in their departments and the struggle to improve performance indicators.

As a survey by 451 Research shows, of 1,200 respondents only 17 percent said their companies had completely given up supporting their own network infrastructure, moving it to major data centers and thereby removing the main problems of maintaining their own server capacity. At the same time, according to the same report, 49 percent of respondents said their companies actively use both internal and external data storage and processing, which is already a great help in analyzing and finding ways to optimize the network structure. As these figures show, the drive for greater operational efficiency affects a great many IT professionals, engineers and managers alike.

“When it comes to small data storage facilities, our research shows very stable budgets allocated to these units. Despite dramatically increasing demands and loads on the IT infrastructure, company executives are unwilling to significantly revise the IT cost line. Unlike engineers managing small server rooms, their colleagues in large data centers face the need to increase IT infrastructure efficiency for rather different reasons, the main one being the fierce competition in the services market,” explains Tim Jefferson, communications officer at 451 Research.

On this topic, the energy consumption of modern IT equipment serves as a very telling parameter. If in the 1990s the average power consumption of a server rack was reckoned at about 1 kW per square meter, this figure now starts at 6 kW and above, which puts forward new requirements for the supporting server room infrastructure.
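To put that growth in perspective, here is a back-of-the-envelope sketch of how rack power density translates into total facility load. The floor area and PUE (Power Usage Effectiveness) values below are illustrative assumptions for the sake of the calculation, not figures from the survey:

```python
# Back-of-the-envelope sketch: how rack power density drives total facility
# load. Floor area and PUE are illustrative assumptions, not survey data.

def facility_load_kw(density_kw_per_m2: float,
                     floor_area_m2: float,
                     pue: float = 1.5) -> float:
    """IT load from power density, scaled by an assumed PUE to account
    for cooling and power-distribution overhead."""
    it_load_kw = density_kw_per_m2 * floor_area_m2
    return it_load_kw * pue

room = 100  # m2, a hypothetical small server room

print(facility_load_kw(1.0, room))  # 1990s density (1 kW/m2): 150.0 kW total
print(facility_load_kw(6.0, room))  # today's figure (6 kW/m2): 900.0 kW total
```

The point of the sketch: a six-fold jump in density is not something that can simply be "scaled up"; cooling and power distribution have to be re-engineered around it.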

Since the need for improvement is clear in general terms (smaller, simpler, more economical), let us look at the main optimization areas that industry experts identified in the polls.

Radical abandonment of old solutions in favor of new ones

For small data storage facilities at enterprises, significant optimization may consist in partially or even completely abandoning the operation of "internal" storage in favor of handing this task over to large data centers. As the respondents noted, within a "server room" of some tens of square meters the return on small-scale solutions is often reduced to zero, which cannot be said of the same solutions at the scale of large data centers.


Accurate calculation of the IT infrastructure market's needs at the data center design stage is a key component: a shortage of capacity is as fatal for an effective business as an oversupply of it.

Innovations in construction

The use of the most modern materials and advanced technologies makes server hardware operation an order of magnitude more efficient. The cost of cooling network equipment falls every year thanks to the organization of cold and hot aisles between rows of server racks, free cooling of racks with outside air and, of course, heat removal by immersing equipment in special fluids.
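As a rough illustration of the free-cooling idea mentioned above, the sketch below selects a cooling mode from outside air temperature. The threshold values and mode names are invented for illustration; real facilities use far more inputs (humidity, load, filtration state) than this:

```python
# Simplified free-cooling mode selection: use filtered outside air when the
# climate allows, fall back to mechanical chillers otherwise.
# Thresholds are illustrative assumptions, not values from any real facility.

def cooling_mode(outside_temp_c: float,
                 supply_setpoint_c: float = 24.0,
                 margin_c: float = 4.0) -> str:
    if outside_temp_c <= supply_setpoint_c - margin_c:
        return "free-air"      # outside air alone can hold the setpoint
    elif outside_temp_c <= supply_setpoint_c:
        return "mixed"         # outside air assists the chillers
    else:
        return "mechanical"    # chillers carry the full load

for t in (5.0, 22.0, 30.0):
    print(t, cooling_mode(t))
```

Logic like this is why the hours of full mechanical cooling per year depend so strongly on the site's climate, which is one reason data center placement has become a design decision in its own right.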

“Thanks to our work modernizing the data center in California, the time during which the cooling system runs at full capacity has been reduced to just 210 hours per year. For 83 percent of the time, the modernized data center is ventilated exclusively by atmospheric air, which naturally undergoes preliminary filtration. In addition, our specialists organized load distribution so that customers with the most demanding projects are deliberately placed in zones of intensive cooling,” said Ron Pepin, Executive Director of NaviSite, a company whose main business is optimizing existing data centers.

Ron's words are confirmed by many survey participants, who noted the tightening of all kinds of monitoring and, as a result, of control over sometimes the most insignificant parameters of data center operation.

Holistic vision

Building data centers not as outposts cut off from the world but as puzzle pieces that complement it has also helped optimize operating costs. Dozens of data centers are already in operation whose excess heat goes to the needs of offices and urban infrastructure, thereby returning part of the money spent to their owners.

Changes in the daily tasks of engineers maintaining the IT infrastructure

According to the survey participants, in the next 3-4 years employees maintaining corporate server infrastructure will feel especially sharp changes in their daily tasks. This trend is already evident, driven by companies' ever more frequent use of cloud services. As the digital life of companies increasingly migrates from server rooms to data centers, network engineers are gradually moving away from the "iron" part of the infrastructure and engaging more and more with its software component.

“When dealing with a hybrid structure of in-house server capacity and capacity moved to data centers, the problem of debugging and linking this whole economy into a single ecosystem becomes a serious challenge for many system administrators, demanding new knowledge and experience from them,” says Jeff Klaus, a general manager at Intel. “Monitoring the temperature in the server room, backup power, laying communications: these and similar physical processes are ceasing to be the main work of system administrators at enterprises. Since organizing the software component, monitoring and debugging it often does not require a specialist's constant physical presence on site, this type of service will increasingly move to outsourcing.”

As for the staff of large data centers, some respondents suggested that changes in everyday tasks await this category of IT employees as well. The changes will affect not so much technical specialists as customer relations managers. Faced with the new reality of clients building complex setups by combining their own facilities with data centers, customer service managers will increasingly have to answer difficult technical questions. As one of the interviewed specialists noted, “Soon we will face a situation where even the best 'sales people' lack the specific knowledge of how software shells interact and how communication is established among scattered local IT nodes.”

Cadres decide everything

As for the personnel question, the results of the survey of senior managers and owners of provider companies were very interesting and somewhat unexpected. Almost all respondents indicated that companies providing data center services are currently experiencing serious staff shortages, which will only grow by 2020. What surprised was not the shortage itself, a problem the whole IT industry suffers from, but the figures: 81 percent of managers face problems finding qualified personnel, and the time needed to find an employee to fill a current vacancy is excessively long.

There are several main reasons for the staff shortage. First, of course, is the growth of production capacity, which requires ever more highly qualified specialists whose training takes more than a year. Second, progress keeps accelerating, giving rise to entirely new kinds of vacancies for which specialists have to be drawn from all sorts of related industries.

Mega data centers like those built by Switch SUPERNAP are a very typical example of hundreds of new vacancies opening in a very short time amid existing labor market problems. Since growth in the commissioning of giant data centers is expected by 2020, the problem of staffing these nodes of the network infrastructure becomes enormous. As for the company mentioned, its management sees the problem and takes its solution very seriously, establishing contacts with educational institutions in the state of Nevada.

“We work closely with the state educational system. Institutes, colleges and even schools are all places where we select our future employees. Thanks to the planning of our activities, we can formulate a request for young specialists who, after graduation, will be able to find their first job in our company,” noted Adam Kramer, vice president of Switch SUPERNAP.

“Architects, environmentalists, heat and power engineers, water supply engineers and many others are specialists with whom data centers previously had nothing in common, and without whom it is now simply impossible to imagine the work of a large data center,” said Karsten Scherer of the analytics company TEKsystems.

A very favorable situation is taking shape for data center employees in terms of personal development prospects. The staff shortage has become a real gift of fate for existing workers. As many of the rank-and-file employees surveyed noted, their companies often offer free continuing education courses, the opportunity to obtain educational certificates at company expense and very attractive opportunities for rapid career growth with salary increases.

“Nearly 72 percent of the heads of provider companies we interviewed stated that they are planning or already implementing targeted training programs to equip their specialists with the new knowledge needed for the successful operation of data centers,” said Ted Lane, founder of the research firm Foote Partners. “All the managers we interviewed also confirmed the logic that a specialist who has completed training and obtained certification is fully entitled to claim a salary increase.”


The survey results presented above were collected in the developed countries of Europe and North America, and they are most relevant for that particular region. At the same time, they may be of great interest to employees and customers of hosting services in the post-Soviet space. The trends described are quite universal and relevant not only for the "Western" world but for everyone else, because they describe the industry's growing pains, which are encountered to one degree or another on every continent. As the material above shows, over the next 3-4 years the data storage and processing industry will be most affected by: the enlargement of data centers, "greening", cost optimization, a certain shift in production tasks and a shortage of qualified specialists. All of these problems are relevant for us as well.
