Shared Hosting vs Clouds

Let's compare shared hosting and cloud services.
To begin with, let's recall a curious and fairly fundamental story from economics. In the 1970s, George Akerlof, a future Nobel Prize winner, described an innovative economic model. He called it “The Market for Lemons,” making it clear in advance what it would be about.

So what did he do?

He examined the used car market. Then, as now, it offered not only cars whose owners had cherished and looked after them, but also cars in rather poor condition. It is the latter that are called “lemons.” The whole model rested on one important fact: the buyer never knows for certain whether he is buying a lemon or a good, solid car.
In America, “lemon” originally meant a bad used car; later the term came to describe anything whose quality you cannot be sure of.

Logically, the problem has a fairly simple solution: list ALL the parameters of a potential purchase that actually concern the buyer, not just the mileage but also, say, the remaining engine life. The fewer parameters we pin down, the more likely the purchase will fail to satisfy us.

The same rule works very well with shared hosting, for example. Lots of memory in the plan? Slow SATA drives come bundled with it. SSD hosting? Then the processor is an Atom. And so on through other unpleasant combinations.
If we do not spell out the selection criteria that actually matter to us, then, by the “law of lemons,” we get junk on exactly the criteria we forgot.
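One way to dodge the forgotten-criteria trap is to look at the hardware you actually received rather than at the plan description. Below is a minimal, Linux-only sketch that prints the CPU model as the kernel reports it and whether each block device is really non-rotational; the paths are standard procfs/sysfs locations, but the “model name” field is how x86 kernels label the CPU, so treat this as an illustration rather than a universal check.

```python
# Minimal Linux sanity check: which CPU did we actually get, and is the
# disk behind the "SSD hosting" label really non-rotational?
from pathlib import Path

# CPU model exactly as the kernel reports it (x86-style field name)
for line in Path("/proc/cpuinfo").read_text().splitlines():
    if line.startswith("model name"):
        print("CPU:", line.split(":", 1)[1].strip())
        break

# /sys/block/<dev>/queue/rotational: 1 = spinning disk, 0 = SSD/NVMe
for dev in sorted(Path("/sys/block").iterdir()):
    flag = (dev / "queue" / "rotational").read_text().strip()
    print(f"{dev.name}: {'HDD (rotational)' if flag == '1' else 'SSD/NVMe'}")
```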

Clouds

How does this threaten us when it comes to cloud services?


The parameters that should really matter to us are the CPU clock speed, bus speed, memory type and size, and ECC support. But what do we see in the descriptions? The amount of disk space (with no IOPS figure), the number of cores (which cores?), the amount of RAM... We see only the “marketing” parameters, which in practice decide almost nothing. That means most head-on comparisons of cloud services (by price list, for example) are doomed to failure or deception. Here it is, the victory of marketing in IT: we stopped looking at the substance of the cloud and started measuring marketing numbers instead.
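Since the listed numbers reveal so little, one pragmatic answer is to benchmark the machine you actually get. The sketch below is a rough illustration, not a standard benchmark: the loop size, file name, and block counts are arbitrary values chosen here, and the two scores only make sense when comparing runs on different machines.

```python
# "Measure, don't trust the price list": time a fixed CPU-bound loop and a
# burst of fsync'd 4 KiB random writes so two "2 cores / 4 GB" offers can
# be compared by what they actually deliver.
import os
import random
import time

def cpu_score(iterations=2_000_000):
    """Single-core integer work per second (bigger is better)."""
    start = time.perf_counter()
    acc = 0
    for i in range(iterations):
        acc += (i * i) % 7
    return iterations / (time.perf_counter() - start)

def io_score(path="bench.tmp", ops=500, block=4096, blocks_in_file=256):
    """Synchronous 4 KiB random writes per second (a crude IOPS proxy)."""
    buf = os.urandom(block)
    with open(path, "wb") as f:
        f.write(b"\0" * block * blocks_in_file)   # ~1 MiB working file
        f.flush()
        start = time.perf_counter()
        for _ in range(ops):
            f.seek(random.randrange(blocks_in_file) * block)
            f.write(buf)
            f.flush()
            os.fsync(f.fileno())                  # force the write to disk
        elapsed = time.perf_counter() - start
    os.remove(path)
    return ops / elapsed

print(f"CPU: {cpu_score():,.0f} loop iterations/s")
print(f"Disk: {io_score():,.0f} fsync'd 4 KiB writes/s")
```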

So if we start choosing by the principle “more RAM for less money,” we walk straight into a trap set in advance. But why would a cloud hide its real hardware?

Consider an ISP as an example. Say I'm a regional provider with a dedicated 1 Gbit/s channel. I sell it to twenty users, to each as a full gigabit (or to fifty, if I'm lucky). As long as they don't all fire up a torrent capable of saturating the whole channel at the same time, nobody will notice that their “personal” gigabit has long been communal. Shared hosting does the same thing with disk space and memory: far from every site is being viewed all the time, so one server can be sold six or seven times over.
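To put numbers on that, here is the back-of-the-envelope arithmetic for the figures above (the 1 Gbit/s channel and fifty customers are the example from the text, not measured data):

```python
# Oversubscription arithmetic for the "communal gigabit" example.
uplink_gbps = 1.0          # the provider's real dedicated channel
customers = 50             # each one is sold "their own" gigabit
sold_per_customer = 1.0    # Gbit/s promised to every customer

oversubscription = customers * sold_per_customer / uplink_gbps
safe_average_use = 1.0 / oversubscription

print(f"Oversubscription ratio: {oversubscription:.0f}:1")
print(f"Average use must stay below {safe_average_use:.0%} of the promised "
      f"bandwidth before anyone notices")
# -> 50:1 and 2%: the scheme only breaks when enough customers saturate
#    their "personal" gigabit at the same moment.
```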

The “cloud” works in roughly the same way. Switching to abstract units lets us lose sight of the resources actually being consumed and start measuring the world “in parrots,” that is, in arbitrary units. The same scheme allows one server to be sold to us several times over. And that gives us yet another unknown variable when working with the cloud.
We do not know how much hardware the cloud really has or how heavily it is loaded right now. We do not even know how many real resources we ourselves consume.
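One of the few things we can measure from inside a Linux VM is the hypervisor “steal” time: the CPU time our neighbours on the same host took while our VM wanted to run. A consistently non-zero value is a hint that the server really was sold several times over. This is a Linux-specific sketch, and the five-second sampling window is an arbitrary choice.

```python
# Estimate CPU "steal" time from /proc/stat: the share of time the
# hypervisor ran someone else's VM while ours was ready to run.
import time

def cpu_counters():
    with open("/proc/stat") as f:
        fields = f.readline().split()      # aggregate "cpu" line
    values = [int(v) for v in fields[1:]]
    # order: user nice system idle iowait irq softirq steal guest guest_nice
    return sum(values), values[7]

total1, steal1 = cpu_counters()
time.sleep(5)
total2, steal2 = cpu_counters()

steal_pct = 100.0 * (steal2 - steal1) / max(total2 - total1, 1)
print(f"CPU steal over the last 5 seconds: {steal_pct:.1f}%")
```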

Partly aware of the weakness of such a system, most cloud providers try to tie customers to themselves with additional services. Use the Amazon cloud, for example, and you instantly become a user of their load balancer and so on. Dedicated servers are the same at every provider, hence low margins and heavy competition. Clouds, for now, remain each provider's own.
