A home-grown service for collecting site positions in search results

I think most beginner webmasters, like me, have faced the task of determining a site's position in the search results for certain keywords.

The first sane solution I found was the site allpositions.ru, but for some reason its data differed from what clients were shown, or from what I saw myself when browsing the results. The differences were usually minor (1-3 positions up or down), but they were always there.

A closer look at the ranking algorithms showed that a site's position in Google and Yandex results depends not only on the search engine's domain, but also on the user's browser language, location, IP, and, it seems, even the moon phase. So the position for a given query can differ from user to user, and the best you can measure is an average value.
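Since each sampled search only yields one user's view of the results, averaging over several samples is the natural approach. A minimal sketch of that idea (the sample data and function names here are hypothetical; in practice each result list would come from a separate search request made with a different IP, location, or browser profile):

```python
# Estimate a site's average position from several SERP samples.

def position_of(domain, results):
    """Return the 1-based rank of `domain` in a list of result URLs, or None."""
    for rank, url in enumerate(results, start=1):
        if domain in url:
            return rank
    return None

def average_position(domain, samples):
    """Average the domain's rank over the samples where it appeared at all."""
    ranks = [r for r in (position_of(domain, s) for s in samples) if r is not None]
    return sum(ranks) / len(ranks) if ranks else None

if __name__ == "__main__":
    # Three hypothetical samples of the same query from different vantage points.
    samples = [
        ["https://a.example/", "https://mysite.example/", "https://b.example/"],
        ["https://mysite.example/", "https://a.example/", "https://b.example/"],
        ["https://a.example/", "https://b.example/", "https://mysite.example/"],
    ]
    print(average_position("mysite.example", samples))  # (2 + 1 + 3) / 3 = 2.0
```

The hard part in reality is not the averaging but collecting the samples themselves, which is exactly what the services discussed below do.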

If you only need position statistics for one or two sites, the service solves the problem perfectly, but in my case collecting statistics for 100 domains turned out to be quite expensive (~$136).

The best option I found is a-parser.com. It is also paid, plus you need to buy proxies every month (~$110) and pay for a DigitalOcean droplet (~$20), but in the end, besides tracking positions for my sites and my competitors', I solve many other SEO tasks with it. I think it is must-have software if maintaining semrush, ahrefs, wordtracker, and similar services is too expensive for you.

In order not to violate the hub's rules, I will only post a link to the solution of the positions problem here:


It can be tested in the demo: a-parser.com/pages/demo
