Improving search quality, or Yandex's subjectivity?

    We have all long heard from Yandex about the continuous improvement of search quality: the AGS-17 and AGS-30 algorithms, the fight against doorways, and so on. Many rejoice at the "cleaner results"; many mourn their departed networks of junk sites, which brought good profit in the not-so-distant past. But are these algorithms really so good? Do they make mistakes? How does Yandex "clean up" the Runet?

    I am developing the "Kvartirka" service, a network of sites offering a catalog of apartments for daily rent in 24 cities of Russia. The purpose of the project is to simplify the interaction between apartment owners and their potential customers. Owners post their rental offers with prices, photos, and so on. Visitors can choose suitable accommodation from a variety of options and contact the owners directly. The project makes this chaotic market more civilized: owners find their customers, and those looking to rent an apartment no longer have to pay commissions to countless agencies or buy a "pig in a poke".

    The project was implemented on 24 domains (one domain per city). From each site you can switch to another city (the links are there), but to avoid problems with Yandex's draconian algorithms, the city-selection block was wrapped in a noindex tag.
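The post does not show the actual markup, but Yandex's non-standard `<noindex>` tag is typically written in comment form so the page still validates for other parsers. A minimal sketch of what such a city-selection block might look like (the class name, city list, and URLs here are illustrative, not taken from the real sites):

```html
<!-- <noindex> is Yandex-specific and not valid HTML, so it is usually
     written inside HTML comments; Yandex still honors it there. -->
<!--noindex-->
<div class="city-select">
  <a href="http://example-city1.ru/">Moscow</a>
  <a href="http://example-city2.ru/">Saint Petersburg</a>
  <!-- ...one link per city domain... -->
</div>
<!--/noindex-->
```

Note that `noindex` only excludes the enclosed text from indexing; the links themselves may still be followed, which is why `rel="nofollow"` is sometimes added to such cross-domain links as well.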

    However, problems could not be avoided.

    It all started back in the fall, when 5 cities came under the AGS-17 filter. Only the main pages of those sites remained in the index. Upon discovering this, a letter was written to Yandex, which received a template response:
    Not all pages of sites known to Yandex search are included in the index and
    ranked highly. Their inclusion and position depend on the quality of the site and its
    content from the point of view of users, and many other factors. The set of
    criteria reflecting the quality of a site also includes the placement of
    SEO links on its pages, which we consider bad practice that harms search users.

    It is worth noting that there are no SEO links on the site and never have been. The reason for the filter is more or less obvious: 24 domains with identical design cannot get past Yandex's vigilant filters. But what is this "point of view of users"? How did they determine that users disliked the sites? Did they personally talk to each one? Yandex.Metrica was not installed on the sites.

    But since the main pages, which attracted a large number of visitors from Yandex, were still in the index, the filter did not significantly affect the project as a whole.

    Yandex dealt the project another major blow on February 15, 2010: 6 sites dropped out of the index completely. It did not look like a ban; the sites could still be submitted via addurl without errors. But Yandex's response was stunning:
    After analyzing and classifying the pages of your site, our algorithms decided
    not to include it in the search. Please note that not all sites
    known to Yandex search are included in the index and ranked highly. Their inclusion
    and position depend on the quality of the site and its content. The
    algorithm's decision may be influenced by the use of search spam, the presence
    on the site of pages intended for the indexing robot rather than for reading
    by users, the placement of non-unique information, and other factors.

    What rules did the site violate? Where is the spam, the content for the indexing robot, and the other abominations? Yandex's answer was this:
    We do not enter into correspondence regarding specific technical techniques used on a site
    or the addresses of substandard pages. Please read
    our recommendations carefully:

    The recommendations were studied carefully, but, of course, the sites did not fall under any of the listed violations. Another letter was written, in which I quoted and commented on Yandex's rules, asking them either to answer on the merits or to lift the sanctions on the sites. The answer was:
    If you develop your site for users and place
    unique and useful information on it, it will surely appear in the search.

    Wonderful! "Go I know not where, fetch I know not what."
    The site is quite self-sufficient and convenient; we often received positive feedback about our work, both from apartment owners and from visitors. What unique and useful information does the site still lack, if it already fully meets visitors' needs?

    Some statistics for one of the domains:

    As you can see, dropping out of the Yandex index did some damage to traffic, but not enough to shut the doors. The numbers speak for the quality of the audience: the average visit lasts more than 5 minutes, with almost 4 page views per visitor. For such a service these are good figures; five minutes is enough to choose a suitable offer and make a decision.

    Why is a service that happens to run on 24 domains necessarily of poor quality? This is just a way of organizing the work, nothing more. Yandex, in all likelihood, treated it as duplication and hung the "substandard" label on it.
    How is a site's usefulness to users determined? A second look at the resource by a moderator? Does Yandex hire the world's best marketers and seat them in the search department and support service?
    Why should useful and reasonably high-quality resources, purely by matching a set of extremely subjective parameters, fall under filters designed to wipe junk sites and doorways out of the search results?

    In my opinion, a search engine should make it possible to find every resource created for people. Killing doorways and junk sites is one thing; throwing normal "white" sites out of the index just because the filters and the "Platons" of support considered the resource "substandard" and its content uninteresting to users is quite another.

    UPD: The sites have returned to the index =)
