The penguin, virtualization and $23 billion: how and why cloud technologies changed the IT world forever



    Every few years, a new technology or paradigm appears on the IT market that radically changes the business models of most companies.

    For example, 25 years ago that paradigm was the massive PC boom, whose chief beneficiary was Microsoft. 15 years ago, the spread of broadband Internet turned Google and Yandex into billion-dollar companies. 10 years ago it was the explosion of interest in mobile development and the beginning of Apple's stellar era. Now we are, perhaps, in the midst of the era of the clouds. Under the cut, we explain how and why this era arrived.

    It is important to emphasize: in all these cases we are talking about mass products around which businesses are built, not about some technology that merely landed at the center of the hype. For example, many developers work with blockchain or virtual reality, but these technologies have not yet produced products used daily by millions of consumers.

    For comparison: according to Gartner, the global market for cloud services will reach $23.6 billion this year, and by 2021 it will nearly triple, to $63 billion! What underpins such an optimistic scenario? And how do the clouds attract both mass consumers and market leaders?

    Multi-core and broadband


    Formally, the era of cloud technology began in 2006, when Amazon first introduced its cloud platform. The concept itself, however, is not new: the idea of delivering services to users over the network was voiced back in 1993 by Eric Schmidt, then a little-known technology executive at Sun Microsystems (yes, that Eric Schmidt!).

    It was Sun that first proclaimed: "The network is the computer!"

    The advantages of this approach are obvious. The user does not need to install anything (or almost anything): simply connect to the desired node through a browser and receive the service. This is exactly how modern cloud services such as Yandex.Disk, Google Docs and Dropbox work. Twenty-five years ago, however, such a service existed only in the boldest fantasies. It became reality only because two important trends coincided.

    First, the Internet became truly mass-market. By 2010 the number of broadband users in Russia, China, the United States and the EU was already counted in the hundreds of millions, and the massive rollout of 3G and 4G networks finally connected billions of users to the virtual world.

    Second, over the course of a few years hardware manufacturers managed to cut prices significantly while increasing performance. For example, 2004-2005 marked the dawn of the multi-core era: within a year IBM, Sun Microsystems, Intel and AMD all presented dual-core solutions. Such processors can handle several streams of instructions at once, so their throughput is higher than that of single-core models.

    Less equipment was now needed to obtain the required processing power, while electricity consumption barely grew, so overall costs fell. Since electricity is the key cost item for data centers, this gave them a real opportunity to improve energy efficiency. And because customers pay only for the resources they actually consume, instead of buying equipment, running their own server rooms and hiring and training highly qualified staff as in the traditional model, cloud technologies became a serious incentive for business customers to "move" into data centers.

    As a result, many consumers, especially small and medium-sized businesses, lost the incentive to build their own data centers and run operations outside their core business. Why do something yourself if you can buy the service at a reasonable price from a professional provider?

    The Many Faces of the Penguin


    Although price is certainly one of the key factors in choosing a product, the move to the clouds is not only about cost. Cloud services have another advantage for which developers and professional consumers love them so much.

    That advantage is flexibility. Clouds are built to give developers access to a virtually unlimited pool of computing resources and memory. Thanks to this, scaling and administering "the clouds" becomes an easy task.

    The secret of this flexibility is virtualization: technology that abstracts software from the hardware it runs on. The boom in virtualization solutions is closely tied to Linux, the Unix-like operating system that the Finnish programmer Linus Torvalds originally wrote for his own computer. He published his work in a public repository and, without realizing it, launched the key product of the future cloud industry.

    The first official version, Linux 1.0, was released in 1994, and the Linux trademark was registered a year later, in 1995. The Linux mascot is Tux, a penguin drawn in 1996 by the programmer and designer Larry Ewing. Today Tux is a symbol not only of Linux but of free software in general.

    Because the Linux kernel code was open from the start, thousands of developers around the world began adapting it to their needs and constantly proposing new features. As a result, Linux has spawned a multitude of distributions: operating systems that use the Linux kernel but differ significantly in their sets of utilities and applications. There are now more than 600 Linux distributions, about half of which are actively maintained and regularly updated. The best known are Ubuntu, Debian and Manjaro.



    Linux is also widely used at Sberbank and Sbertech. Sbertech mainly relies on three distributions: Red Hat Enterprise Linux, CentOS and Ubuntu.

    Beyond its openness, Linux contained many successful architectural decisions. For example, unlike Windows, the OS was built around a multi-process model from the start. In this model virtually every task becomes a separate process that can be scheduled onto one of the processor cores, improving both hardware utilization and the speed of the application itself.
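    To illustrate the idea (a minimal Python sketch of our own, not tied to any specific Linux internals; the workload function is a made-up placeholder), here is how independent tasks can be spread across cores as separate processes:

```python
# Minimal sketch: spreading independent CPU-bound tasks across cores.
# heavy_task is a placeholder workload invented for this example.
from multiprocessing import Pool, cpu_count

def heavy_task(n: int) -> int:
    # Each call runs in its own process, which the OS scheduler
    # can place on any available core.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(heavy_task, [10**6] * cpu_count())
    print(f"{cpu_count()} cores busy, {len(results)} results")
```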

    Although Linux never managed to defeat Windows in the battle for personal computers, it won the war on another front. Linux is free, so it is most often used as a server operating system, which predetermined its mass adoption and popularity among developers. Distributions come in different flavors: some are known for high stability, others for supporting the latest versions of programs and libraries. One of the standard stable distributions for the corporate sector is Red Hat Enterprise Linux. It is famous for its high reliability and is actively used at Sberbank, although it ships with older versions of software and libraries.

    Having captured the server OS market, the Linux ecosystem began to grow rapidly. The variety of distributions set only one of the vectors of the operating system's development. Because different servers could run different OS versions, and because some applications written for Windows had no Linux analogues, a whole range of hypervisors appeared in the Linux ecosystem.

    Hypervisors are special programs that let you run another, virtual system on top of the host operating system. Applications installed in that virtual system have no idea that they are not running directly on the hardware, or what hardware they are actually working with. From this seemingly simple and even amusing fact follows a huge number of unexpected and promising possibilities.
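    To give a feel for how this looks in practice, here is a minimal sketch (assuming a Linux host with KVM/QEMU and the libvirt-python bindings installed) that asks the hypervisor which guest machines it is running:

```python
# Minimal sketch: querying a KVM/QEMU hypervisor through libvirt.
# Assumes libvirt-python is installed and qemu:///system is reachable.
import libvirt

# Open a read-only connection to the local hypervisor.
conn = libvirt.openReadOnly("qemu:///system")

# Each "domain" is a guest virtual machine managed by the hypervisor.
for dom in conn.listAllDomains():
    state, _reason = dom.state()
    running = state == libvirt.VIR_DOMAIN_RUNNING
    print(f"guest: {dom.name()}, running: {running}")

conn.close()
```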

    Virtualization effect


    Virtualization abstracts the application from the "iron", the physical hardware, and from the host OS. This makes it possible to pool physical computing resources into logical blocks that are not tied to specific hardware and are at the same time logically isolated from each other.

    In essence, a cloud service provides the user with a virtual machine on which they can build and deploy almost any application without thinking about whether it is compatible with the hardware it runs on.



    "If we run an application on an ordinary cluster of several servers, the administrator has to monitor its operation constantly. If the application's load grows and the servers cannot cope, another physical server with a configured operating system and application must already be standing by. So dynamically changing capacity takes time, and for a critical task that time may simply not exist," the expert explains.

    Virtualization lets you move one level up, giving developers and system administrators enormous room for experimentation.
    Developing applications for cloud platforms can be called an art of its own, one that every programmer is now forced to master. Unlike classical architectures, cloud applications must scale close to linearly. Achieving this takes well-thought-out design of the interactions between the main components of the application. Take any number of servers with different operating systems, join them under a single orchestration system (for example, Kubernetes), and all the machines become one cluster. From the consumer's point of view a "supercomputer" appears, with the combined RAM and processor power of the servers it is made of.
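    As a small illustration (a hedged sketch using the official Kubernetes Python client; the deployment name "web-app", the namespace and a working kubeconfig are assumptions for this example), scaling such a cluster can be a single API call:

```python
# Minimal sketch: scaling a Kubernetes deployment from code.
# Assumes `pip install kubernetes` and a valid ~/.kube/config;
# the "web-app" deployment in "default" is hypothetical.
from kubernetes import client, config

config.load_kube_config()   # read cluster credentials
apps = client.AppsV1Api()

# Ask for 10 replicas; the scheduler spreads the pods across
# whatever physical nodes currently make up the cluster.
apps.patch_namespaced_deployment_scale(
    name="web-app",
    namespace="default",
    body={"spec": {"replicas": 10}},
)
```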

    For example, on top of a running cluster you can create machines with new functionality, and you will not have to hire additional engineers to monitor and support the new virtual machines.

    The result is not only flexibility but also higher equipment utilization: processor power and memory are used dynamically and no longer stand idle.

    Money in the clouds


    The combination of low prices and the flexibility that virtualization technology offers in meeting customer needs has turned the clouds into the most promising format for delivering IT services.

    One of the first corporate uses of public and hybrid clouds was as test environments. Demand for hardware, for example for load testing, is always intermittent: once a new version of an application is ready, the test bench is needed urgently, but when the tests are over it stands idle. Here cloud technologies come to the rescue: they let you quickly create the necessary test environment for exactly the time it is needed, without overpaying for subsequent equipment downtime.
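    For illustration (a minimal sketch using AWS EC2 via boto3; the AMI ID and instance type are placeholders, and configured AWS credentials are assumed), an on-demand test bench could be provisioned and torn down like this:

```python
# Minimal sketch: an on-demand load-testing bench on AWS EC2.
# Assumes configured AWS credentials; the AMI ID is a placeholder.
import boto3

ec2 = boto3.resource("ec2", region_name="eu-west-1")

# Spin up a small pool of machines only when a build needs testing.
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image
    InstanceType="t3.large",
    MinCount=1,
    MaxCount=4,
)

# ... run the load tests against the new build here ...

# Tear the bench down so idle hardware is no longer billed.
for inst in instances:
    inst.terminate()
```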

    The second use of the clouds is handling peak loads. Servers connected at just the right moment ensure stable processing of user requests and a high quality of service.

    The third application is organizing the distributed work of development teams. For programmers located in different parts of a country or the world to collaborate productively, each of them needs access to a shared development environment, and that is exactly what modern cloud platforms provide.

    Private clouds, meanwhile, have been used in the corporate environment almost since the US National Institute of Standards and Technology first formulated the definition of "cloud computing"; they are a logical evolution of virtualization technologies.

    Today every large high-tech company has cloud services in its portfolio. Microsoft develops its Azure platform, Google has the Google Cloud Platform, and Amazon runs a separate business called Amazon Web Services. Broadly, the industry's main business models can be divided into three categories, depending on the service provided to the consumer.

    The first option is to provide infrastructure as a service (IaaS, Infrastructure as a Service). In this case the client independently builds and manages its IT infrastructure in the cloud: creates virtual networks, adds virtual equipment (servers, storage, databases), installs the operating systems and application software needed for work, and so on. The best-known IaaS solutions are Amazon CloudFormation, Google Compute Engine and Windows Azure.
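    To make the IaaS model concrete, here is a hedged sketch (the one-server template and stack name are invented for illustration; boto3 and AWS credentials are assumed) in which the client describes infrastructure as a template and lets the provider build it:

```python
# Minimal sketch: IaaS via Amazon CloudFormation and boto3.
# The template (a single virtual server) is an illustrative stub.
import json
import boto3

template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "TestServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-0123456789abcdef0",  # placeholder
                "InstanceType": "t3.micro",
            },
        }
    },
}

cf = boto3.client("cloudformation", region_name="eu-west-1")
cf.create_stack(
    StackName="demo-infrastructure",  # hypothetical stack name
    TemplateBody=json.dumps(template),
)
```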

    The second common format of provider-consumer interaction is platform as a service (PaaS, Platform as a Service). Here the cloud provider gives the user access to operating systems, database management systems, and development and testing tools. The consumer gains the means to independently create, test and operate software, while the entire underlying information infrastructure (computer networks, servers and storage systems) is managed by the provider.

    The third form of interaction is software as a service (SaaS, Software as a Service). In this case the provider supplies users with ready-made software. All data is stored in the cloud, and the user needs only a web browser to access it. This format requires no extra spending on installing and configuring software, as IaaS and PaaS do. Fees are usually tied to the number of licenses leased. The most striking example: this is how Microsoft sells subscriptions to its Office 365 package.

    Although cloud computing has already won the attention of corporate and private users alike, it should be remembered that clouds have their drawbacks and risks. One of them is data confidentiality: the periodic scandals caused by leaks or breaches of cloud-hosted databases make many consumers wary of cloud services.

    The second risk is losing the connection to the cloud data center. However smart the algorithms running on the servers, they are still powerless against "an accidental shovel strike on the cable."

    For banks, both of these risks are particularly acute, because customer trust is the foundation of their business. That is why the largest banks and financial companies prefer to build their own cloud systems with very high SLA levels.

    The data centers they build resemble secret bunkers: several independent cable routes for network access are laid to the building, and an uninterruptible power supply system is installed. Specialized ventilation systems maintain the required temperature and humidity inside, even during temperature surges and voltage drops.

    Sberbank has its own private cloud. Its hardware base is the "South Port" data center and the data center opened in Skolkovo a year ago, which is designed to the Tier 3 reliability level. The bank plans to build a third data center to ensure that it always stays in touch with its customers.
