Take a look at your page through the eyes of Googlebot

Posted by John Mueller, Google Analyst
Webmaster Level: Any

The View as Googlebot feature in Webmaster Tools helps you understand how Googlebot sees your page. Server headers and raw HTML are useful for diagnosing errors and the after-effects of hacking, but they can be hard to interpret, and webmasters are often at a loss when debugging such problems. To help in these situations, we have improved the feature: it can now render the page using the same algorithm that Googlebot uses.

How the scanned page is displayed
When processing a page, Googlebot locates and fetches all the external files associated with it: typically images, style sheets, JavaScript files, and other resources embedded via CSS or JavaScript. The system uses these to render the page the way Googlebot sees it.
The View as Googlebot feature is available in the Crawl section of your Webmaster Tools account. Note that fetching and rendering a page may take some time. Once it is complete, hover over the row with the desired URL to view the result:
for the regular Googlebot
for Googlebot for smartphones

Processing resources blocked in the robots.txt file
When rendering the page, Googlebot respects the directives in your robots.txt file. If they block access to certain resources, those resources are not used for the preview; the same happens when a resource's server does not respond or returns an error. The relevant data is available in the Crawl Errors section of your Webmaster Tools account, and a complete list of such failures is also shown after the preview image is generated.
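To see whether a given resource URL would be blocked before Googlebot renders it, you can check it locally. The sketch below uses Python's standard `urllib.robotparser`; the rule set, domain, and paths are hypothetical examples:

```python
# Check whether Googlebot may fetch a resource under a given robots.txt.
# The rules and URLs below are hypothetical examples, not from a real site.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /assets/js/",
]

parser = RobotFileParser()
parser.parse(rules)

# A script under /assets/js/ is blocked, so it would be
# excluded from the rendered preview.
print(parser.can_fetch("Googlebot", "https://example.com/assets/js/app.js"))  # False

# The page itself is not blocked.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

Running this check over a page's embedded resources gives you a quick list of candidates for the blocked-resource warnings described above.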
We recommend giving Googlebot access to all the embedded resources used by your site or its layout. This makes the View as Googlebot feature more useful, lets the robot detect and correctly index your site's content, and helps you understand how your pages are crawled. Some resources, such as social-network buttons, analytics scripts, and fonts, usually do not affect the page layout, so crawling them is not necessary. For more information on how Google analyzes web content, see the previous article.
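One way to follow this recommendation is to explicitly allow crawling of the resource directories that affect rendering while keeping non-layout resources blocked. A minimal robots.txt sketch, with hypothetical directory names:

```
User-agent: Googlebot
# Allow embedded resources that affect how the page renders.
Allow: /css/
Allow: /js/
Allow: /images/
# Non-layout resources, such as analytics scripts, may stay blocked.
Disallow: /analytics/
```

The exact paths depend on where your site stores its style sheets, scripts, and images.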
We hope this improvement helps you troubleshoot design issues on your site and discover resources that, for one reason or another, Google cannot crawl. If you have questions, contact us in the Google Plus webmaster community, or search for an answer in the Google Webmaster Help Forum.
