SEO for Google in 2018: everything new is well-forgotten old

    The days when updates to Google's search algorithms rolled out in large batches and bore lovingly chosen zoological names are behind us. Last fall a company representative casually remarked that the algorithms are now adjusted several times a day, and the general public is notified of only a small fraction of these changes. This was to be expected: as artificial intelligence gains strength, development of the ranking system accelerates and becomes less discrete.



    But with this approach, it becomes harder for the average site owner to separate the wheat from the chaff. To feel more confident, I decided to conduct a brief review of the innovations and growing trends that will affect search engine optimization in 2018, along with the recommendations that make sense in their context. Below you will find the five main trends.

    Fast and long


    UX manuals do not stress the importance of fast page loads for nothing, but as it turns out, speed matters not only to impatient users. Just the other day, information appeared on the Google blog that the company intends to factor this parameter into ranking starting in July of this year. The official statement is reassuring, promising that the crackdown will affect only the most negligent, but the community suspects the requirements will be quite stringent. While the average visitor, according to statistics, abandons a slow page after six seconds, the optimal load time for Google, according to one of its representatives, is two to three seconds. To check how your site is doing on this front, official sources recommend tools like the Chrome User Experience Report, Lighthouse, and PageSpeed Insights. Many webmasters opt for AMP, which speeds things up significantly, but before following their example it is worth weighing the downsides of that decision.
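    For those who want to automate such checks, here is a minimal sketch that queries the PageSpeed Insights API for a performance score. It assumes the public v5 endpoint, the `requests` package, and the response layout shown in the comments; treat all of these as assumptions to verify against the current documentation, not a definitive recipe.

```python
# A minimal sketch: ask the PageSpeed Insights API how fast a page is.
# Assumes the public v5 endpoint and the `requests` package; the exact
# endpoint version and response layout are assumptions, so verify them
# against the current API documentation before relying on this.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def page_speed_score(url: str, strategy: str = "mobile") -> float:
    """Return the Lighthouse performance score (0.0 to 1.0) for `url`."""
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy})
    resp.raise_for_status()
    data = resp.json()
    # The performance category score sits inside the Lighthouse result.
    return data["lighthouseResult"]["categories"]["performance"]["score"]

if __name__ == "__main__":
    print(page_speed_score("https://example.com"))
```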

    Another time-based indicator that RankBrain pays close attention to, according to an official statement by Nick Frost in September, is session duration on the site. The logic here is transparent: the longer a user stays on a site, the more relevant the information presented there, and the higher the site should be placed in the search results. According to SearchMetrics statistics, the average session duration for pages in the top ten of the SERP is three minutes and ten seconds. With such figures, it is perhaps not surprising that position in the search results also correlates with text volume: pages containing more than 2,000 words achieve the best results.
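    If you want a quick sanity check of your own pages against that 2,000-word statistic, a rough word count takes a few lines of Python. This is a sketch assuming the `requests` and `beautifulsoup4` packages; the threshold itself is just the correlation quoted above, not a rule.

```python
# Rough count of a page's visible words, as a sanity check against the
# ~2,000-word statistic quoted above. Assumes requests + beautifulsoup4.
import requests
from bs4 import BeautifulSoup

def visible_word_count(url: str) -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()  # drop markup that never renders as visible text
    return len(soup.get_text(separator=" ").split())

if __name__ == "__main__":
    print(visible_word_count("https://example.com"))
```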

    Voice Queries


    About 40% of users now talk to Google daily through their phones: speech recognition has reached a level of accuracy where it genuinely saves time and simplifies things, and it is gradually becoming a full-fledged alternative to the keyboard. Experts predict that within the next three years the numbers of voice and typed queries may even out, so you can start adjusting now. The good news is that no radical restructuring is required yet; it is enough that your key queries fit the canons of live conversation. If you can read them aloud without feeling like a glitching robot ("download free torrent HD 1080 movies"), then things are most likely not so bad.



    An example of a Featured Snippet (extended description)

    It goes without saying that in this situation queries formulated as questions ("what is a transaction", "how to take a screenshot") will grow in popularity; it is also natural that the results for them will feature Featured Snippets (specially formatted extended descriptions), which the assistant reads out first. As a result, the value of this microformat increases even further. There is no exact instruction on how to join the circle of lucky pages Google selects for extended descriptions, but according to experts, the following characteristics improve the chances (a code sketch of such a check follows the list):

    • clearly structured text with distinct semantic blocks;
    • conciseness and information density;
    • a direct occurrence of the query in the text.
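    As a hedged illustration, here is what an automated check for that structure might look like: given a page and a target query, verify that some heading contains the query and that the paragraph immediately after it answers concisely. The heuristics and the 60-word limit below are illustrative assumptions, not Google's actual selection criteria.

```python
# Illustrative heuristic: does the page contain a heading with a direct
# occurrence of the query, followed by a short, dense answer paragraph?
# These checks are assumptions about "snippet-friendly" structure, not
# Google's actual criteria. Assumes the beautifulsoup4 package.
from bs4 import BeautifulSoup

def snippet_friendly(html: str, query: str, max_answer_words: int = 60) -> bool:
    soup = BeautifulSoup(html, "html.parser")
    for heading in soup.find_all(["h1", "h2", "h3"]):
        if query.lower() not in heading.get_text().lower():
            continue
        answer = heading.find_next_sibling("p")  # block right after the heading
        if answer is None:
            continue
        words = answer.get_text().split()
        if 0 < len(words) <= max_answer_words:
            return True
    return False

if __name__ == "__main__":
    sample = "<h2>How to take a screenshot</h2><p>Press PrtScn, then paste.</p>"
    print(snippet_friendly(sample, "how to take a screenshot"))
```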

    Personalization of search results


    The drive to adapt search results to the interests of a specific user can be considered a constant in the development of search services, and Google is no exception. In its attempt to guess what exactly a person wanted to see when entering a query, the company collects, carefully stores, and feeds RankBrain any information about the user and their actions: what is indicated in their profile, what they searched for before, which pages they lingered on, which device they search from. The main goal of processing this whole mass of data is to uncover the user's intent, place it in the context of established habits and preferences, and select from the candidate results those most relevant to that picture.

    All of the above is achieved through deep semantic analysis of page content. This means that the simple times, when it was enough to add a couple of exact-match keyword occurrences and tune the keyword density, are a thing of the past. Now RankBrain parses the entire text to verify from its lexical composition that the source really knows what it is talking about: that is, that the keywords do not stand alone but are surrounded by a context of words the algorithm associates with the relevant topic.

    In addition to the relevance of the content, the algorithm checks whether what the site offers matches the user's basic intent: to learn, buy, download, calculate. On the whole this plays into the hands of site owners: pages that give visitors different kinds of value (for example, a site for rock climbing fans and an online equipment store) need not worry about competing with each other for a place in the results. In theory the robot should sort them into different categories on its own, but to be safe you can focus on keywords that name the visitor's target action (a toy sketch of such bucketing follows below).
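    As a toy illustration of keying on the target action, here is a sketch that buckets queries by intent using trigger words. The categories and word lists are invented for the example and have nothing to do with Google's internal taxonomy.

```python
# Toy illustration: bucket queries by the visitor's target action using
# trigger words. The categories and word lists are invented examples and
# are not Google's internal taxonomy.
INTENT_TRIGGERS = {
    "transactional": {"buy", "order", "price", "rent"},
    "download": {"download", "install", "torrent"},
    "informational": {"what", "how", "why", "guide"},
}

def classify_intent(query: str) -> str:
    words = set(query.lower().split())
    for intent, triggers in INTENT_TRIGGERS.items():
        if words & triggers:
            return intent
    return "navigational"  # fallback when no trigger word matches

if __name__ == "__main__":
    for q in ("buy climbing rope", "how to tie a figure eight", "climbing club"):
        print(q, "->", classify_intent(q))
```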

    Finally, thorough profiling of the user's background, coupled with growing mobile traffic, leads to what is called hyperlocalization: building the SERP with a strong reliance on the user's geolocation. Accordingly, companies whose services have at least some geographic anchoring will not go wrong if they start emphasizing it in their semantic core.

    Mobile versions take priority


    Talk of Google paying more attention to mobile versions dates back to 2016, but judging by the announced timelines, the plans will soon become reality. A clarification is needed here: the state of the mobile site will not merely be taken into account in the assessment; it will become the determining parameter, even if the user searches from a desktop computer. The mobile version of a site is now treated as the primary one.

    So what should you do? First of all, of course, properly adapt the site for phones. Google emphasizes that it is better to have no mobile version at all (in that case the robot will calmly move on to analyzing the desktop version, and no harm done) than to make one carelessly. Besides technical characteristics, a page's compliance with UX principles, and above all proper markup, will also affect its ranking. In addition, experts advise making sure that the content of the desktop and mobile versions matches: carefully optimized texts will be of little use if they are cropped to save space on a small screen (a parity check is sketched below).
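    A minimal way to spot-check that parity is to fetch the same URL with desktop and mobile User-Agent strings and compare the visible text. This is a sketch assuming the `requests` and `beautifulsoup4` packages; the User-Agent strings are examples, and the 0.9 ratio threshold is an arbitrary assumption.

```python
# Sketch: compare the volume of visible text served to desktop vs. mobile
# clients. The User-Agent strings are examples and the 0.9 threshold is an
# arbitrary assumption. Assumes requests + beautifulsoup4.
import requests
from bs4 import BeautifulSoup

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (Linux; Android 8.0; Pixel 2) Mobile"

def visible_words(url: str, user_agent: str) -> int:
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(" ").split())

def content_parity_ok(url: str, threshold: float = 0.9) -> bool:
    desktop = visible_words(url, DESKTOP_UA)
    mobile = visible_words(url, MOBILE_UA)
    return desktop == 0 or mobile / desktop >= threshold

if __name__ == "__main__":
    print(content_parity_ok("https://example.com"))
```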

    Link weight: from quantity to quality


    Link exchanges, aggregators, bulk link buying: a whole era in SEO, which in 2018, after a long decline, will probably finally come to an end. Gary Illyes shed light on Google's position on the link question last September with this simple statement:

    “In short, if you publish high-quality content and it is actively cited on the Internet (and I'm not talking only about links, but also mentions on social networks or simply discussions of the brand and all that crap), then you're doing great.”

    In other words, the point is not that link mass has lost weight as a ranking factor (according to official representatives, it remains the leading parameter, alongside content); rather, mentions without links are gradually reaching a comparable level of significance. The same algorithms Google uses for semantic analysis of content allow it to track whether others are talking about your company and, if so, in what tone. Accordingly, it will be harder to offset a bad reputation on the Web with a large number of "bare" links, and, conversely, companies that are discussed with enthusiasm, especially on reputable sites, can earn a lot of points even if no one links to the official website. Thus, SEO here begins to blend with good old PR, and the recommendations are essentially the same: get people talking about you.

    By the way, discussion should be encouraged not only on third-party resources but also on your own website, even though many consider the comment section a relic of the past. It turns out that Google values comments even more highly than activity on social networks.

    So, on the whole, it seems 2018 holds nothing shocking for us: good content is still at a premium, and black- and gray-hat promotion methods are still met with a rebuff. However, the effectiveness of the technologies behind these rules is steadily growing, and the traditional game of cat and mouse is losing its meaning: it is more profitable to follow the rules than to try to circumvent them. Mobile focus, fast loading, more specific and detailed search queries, and compliance with the basic requirements for content and brand reputation: these are the things the SEO community should tune into in the near future.
