Google indexes pages


Fast and reliable indexing is essential to the success of any web project. Let us review how Google indexes pages, what a ‘bot’ is, and which types of bots exist.

General information

A bot is a piece of server-side software whose main purpose is to check your website for updates, crawl all of its pages, add them to the search engine's database, and make the site appear among the search results of that search engine.
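To illustrate the idea, below is a minimal sketch of such a bot written in Python (the start URL is only a placeholder, and real search engine crawlers are far more sophisticated): it downloads a page, records it in its "database", extracts the links and queues them for later visits.

from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    # Collects the href attribute of every <a> tag on a page.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, limit=50):
    # Visit pages breadth-first, staying on the start domain.
    domain = urlparse(start_url).netloc
    queue, indexed = [start_url], set()
    while queue and len(indexed) < limit:
        url = queue.pop(0)
        if url in indexed:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except OSError:
            continue
        indexed.add(url)                      # "add the page to the database"
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == domain:
                queue.append(absolute)        # schedule the linked page for a visit
    return indexed

if __name__ == "__main__":
    for page in crawl("https://www.example.com/"):
        print(page)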

The Google search engine has several known bots: Adsensebot, Freshbot and DeepCrawl.

Adsensebot is mainly intended for webmasters who use AdSense contextual advertising on their sites. When the site is updated (a new page, product or text is added), the JavaScript in the AdSense code notifies Adsensebot, which visits the page within 10 - 15 minutes and indexes its content. This is necessary so that relevant ads can be placed on the site's pages.

Freshbot is the "popularity" bot: it visits the most popular and most frequently viewed pages of a site. How often it comes depends on how frequently the site is updated and how many visitors it receives. For instance, gigantic platforms such as eBay or Amazon are visited by Freshbot every 10 minutes or even more often, while an average site with up to 500 visitors per day is crawled once every 4 - 10 days. This bot follows all of the website's links and pages and places them in the search engine database, after which they are checked by DeepCrawl.

DeepCrawl is the most complex bot: it checks all the data that Freshbot has placed in the database, and even its name shows how deep that check is. This is necessary to get as many pages as possible into the full Google index. It visits a site once a month, or sometimes twice if it is a large web store or a news portal.

Sometimes crawling is complicated by so-called ‘trash’ pages in its results. For example, the Supplemental filter may demote category pages and search or filtering results. This hurts the site's ranking and often keeps bots from crawling the useful pages. To avoid that, add a few lines to your robots.txt file (within a User-agent group), for example:

User-agent: *
Disallow: */tag/*
Disallow: */author/*
Disallow: */page/*
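
If you want to check what these rules actually block, the small Python sketch below imitates the wildcard matching applied to Disallow patterns (the sample paths are invented for illustration; this is not Google's own implementation):

import re

def blocked(path, pattern):
    # Translate a robots.txt pattern into a regular expression:
    # '*' matches any sequence of characters, everything else is literal.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match(regex, path) is not None

rules = ["*/tag/*", "*/author/*", "*/page/*"]
samples = [
    "/blog/tag/sale/",        # tag archive       -> blocked
    "/blog/author/admin/",    # author archive    -> blocked
    "/shoes/page/3/",         # paginated listing -> blocked
    "/shoes/red-sneakers",    # regular product   -> allowed
]
for path in samples:
    verdict = "blocked" if any(blocked(path, rule) for rule in rules) else "allowed"
    print(path, verdict)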


You may also use the GoMage Advanced Navigation extension, which allows you to hide product filtering and search results from indexing and thus significantly reduces the number of pages caught by the Supplemental filter.

Also have a look at the other Magento extensions that will help you increase sales in your Magento store.