Force majeure, or how people lost their data

    The old joke goes: admins are divided into those who do not yet make backups and those who already do. For most people, awareness of the need for backups comes only after a major loss of personal data. And despite the abundance of sad stories about how people lost everything, many still keep hoping that someone will make backups for them. As a reminder of why this approach is wrong, I want to give a few examples of how people lost their data completely unexpectedly, or came very close to it.



    My own story of a great loss happened 7-8 years ago. At the time I had a couple of small sites, and a forum attached to one of them. The sites did not use a database and lived purely as files, so I had a local copy of them. The forum was another matter... Its last backup had been made when I switched engines, about a year and a half before the sad events. The server where I was hosted had four disks combined into a RAID5 array for reliability. At some point one of the disks started to fail. Yes, the RAID5 array remained operational and valiantly kept chugging along. But the load on the surviving drives became critical. The database did not last long...

    While the engineers dragged their feet instead of quickly installing a new disk, a second drive departed for a better world. The gap between the failures was only 2-3 days. Out of youth and inexperience, even knowing about the problem with the first disk, I calmly waited for it to be replaced. As a result, I lost the forum database and became wiser for the future. I suspect that if not everyone has a story like this, then at least many do.

    There are many reasons and ways to lose data, and they differ in how predictable they are. Some are more or less expected: system failure, hacking, administrator error. There are also cases of dishonesty, when hired administrators in a conflict would withhold access to the data or damage it. But there are also situations that hardly anyone expects, yet which bring far more significant data loss.

    Fire


    Perhaps the most common of the unexpected causes of data loss. Despite all the fire protection measures, data centers have burned, are burning, and will keep burning; the only question is the scale. In a high-end data center, each server rack has its own fully isolated enclosure with an independent cooling and fire suppression system. Even if something catches fire, the flames will not get out of the rack.

    But in some data centers racks are absent as such, which is why those facilities burn very quickly: nothing stops the fire from spreading through the hangar. I think many people remember the fire at hosting.ua, when customers lost not only their main sites but also the backups stored on neighboring servers.



    Through the broken window, the photograph shows that the data center used "warehouse-style" equipment placement, which helped the flames spread. Incidentally, "vermicelli" cable routing helped another data center burn down quite thoroughly as well.



    Storing backups in the same data center as the production servers has failed people more than once. I came across a post from January 2008 by a man who watched in horror as a data center in the USA burned down with both his production and his backup server inside. A little over two years later the same thing happened to the customers of the Ukrainian data center, and that is when I started making backups to an independent data center in another country.
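    As an illustration of the off-site approach, here is a minimal sketch of such a backup in Python: it dumps a database with mysqldump, compresses the dump, and copies it over SSH to a server in a different data center. The database name, paths, and the backup host are hypothetical placeholders, and a real setup would also rotate old copies and periodically test-restore them.

```python
#!/usr/bin/env python3
"""Minimal off-site backup sketch: dump a database and copy the dump
to a server in another data center over SSH. The database name, paths,
and the remote host below are hypothetical placeholders."""

import datetime
import gzip
import subprocess
from pathlib import Path

DB_NAME = "forum"                                     # hypothetical database name
BACKUP_DIR = Path("/var/backups/db")                  # local staging directory
REMOTE = "backup@backup.example.org:/srv/backups/"    # off-site host (placeholder)


def dump_database() -> Path:
    """Write a gzipped mysqldump of DB_NAME into BACKUP_DIR and return its path."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    target = BACKUP_DIR / f"{DB_NAME}-{stamp}.sql.gz"
    dump = subprocess.run(
        ["mysqldump", "--single-transaction", DB_NAME],
        check=True, capture_output=True,
    )
    with gzip.open(target, "wb") as fh:
        fh.write(dump.stdout)
    return target


def ship_offsite(local_file: Path) -> None:
    """Copy the dump to the remote data center; key-based SSH auth is assumed."""
    subprocess.run(["scp", str(local_file), REMOTE], check=True)


if __name__ == "__main__":
    ship_offsite(dump_database())
```

    Run something like this from cron on a schedule; the essential point is simply that the copy ends up outside the data center where the production server lives.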

    A fire can happen anywhere, and no matter how much reliability a data center promises you, play it safe. In July 2012, an explosion and fire in Canada hit infrastructure serving a large amount of government data (driver's licenses, car registrations, hunting and fishing licenses, as well as medical information - histories, treatment plans, and so on). Fortunately, the backups survived. And in August 2013, a fire in India destroyed servers holding the personal data of 1.2 billion citizens, collected as part of a government project.

    Flood


    On October 29-30, 2012, Hurricane Sandy reached the US coast. The data centers of New York and New Jersey braced for the blow: they stocked up on fuel for their generators, arranged emergency deliveries, and mentally prepared the on-duty crews to live at work for 3-5 days. In short, they quickly prepared for the blackout that often accompanies a hurricane. What they were not prepared for was flooding.

    In many data centers in the flood zone, the backup generators, their fuel tanks and pumps, and in some places the communication equipment as well, were located in the basement levels. When the high water came, all the engineers on duty could do was shut the equipment down cleanly and switch off the generators. The level of the incoming water can be judged from the photograph of a hall in one of Verizon's data centers.



    Incidentally, this is not the only case of flooding. In September 2009, heavy rain left the lower equipment in the server racks of the Vodafone operator in Turkey under water, and in July 2013 the cooling systems failed at a technical site in Toronto hosting about a hundred and fifty different providers, due to heavy rains and the power outages that came with them.

    "Masks show"


    Seizure of equipment "for investigation" or the shutdown of part of it by decision of government authorities is another possible cause of data loss. More often this concerns large projects. Residents of Ukraine remember the fate of Infostore, ex.ua, and the popular online store Rozetka. In Russia the same fate befell the file hosting service iFolder.ru, whose servers were disconnected as part of a search for unnamed evidence in a crime committed by an unidentified person (wording from the press).

    But do not assume you are safe just because you have a small site with a small hosting provider. In our not-very-lawful countries anything can be carted away. There have been cases where, as part of an investigation into some pornography case, authorities seized the servers of a small hosting provider that had only two or three of them in total - and kept them for a long time. Unfortunately, we are not in Europe yet, where during an investigation they usually take the hard drives for a day, copy off all the information, and return them.

    Dishonest partners


    Such cases are extremely rare, but they do happen. In 2010, a conflict between the companies "Makhost" and "Oversan-Mercury" led to a large number of servers being disconnected from the network. Naturally, each company tried to prove it was right and blame the other, but that was no consolation to the customers whose sites went down.

    The causes can be even more exotic: military operations or special regimes imposed by the state, terrorist attacks, earthquakes (although in seismically unstable zones special technologies are used that improve the equipment's chances of survival). I think that if you dig through the press thoroughly enough, you can find real cases for at least some of these situations too.

    I invite readers to share in the comments their own experiences and the situations that taught, or should have taught, them to make backups. And to those whom life has not yet taught, let me remind you that the safety of your data matters first and foremost to you, and it is you who must ensure and control it, rather than counting on a provider, a data center, or higher powers.
