Post by account_disabled on Feb 24, 2024 6:59:36 GMT
An announcement has arrived from Google on the official Search Central blog, and its subject is Googlebot and a tool that was included in Search Console. Googlebot is the crawler Google uses to explore the internet: it examines websites and collects the information that may be needed when a user performs a search. It is therefore a completely harmless tool with a very specific purpose: indexing what is on the pages of your WordPress site.

Googlebot has changed: the official announcement

Googlebot learns and improves, and Google marks the progress by retiring a tool. But its work can affect the quality of the experience that users then have of the indexed content. Until now that problem had been contained thanks to a dedicated tool whose settings could be adjusted.
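Even without the retiring tool, one crawl-control mechanism stays firmly in the site owner's hands: a robots.txt file at the site root, which tells Googlebot which paths it may crawl. Below is a minimal sketch for a typical WordPress site; the domain and sitemap URL are placeholders, not values from the announcement:

```text
# Hypothetical robots.txt at https://example.com/robots.txt
User-agent: Googlebot
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing: a blocked page can still appear in results if other sites link to it, so pages you want out of the index need a noindex directive instead.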
But according to what Google announced, that tool is no longer needed, because the bots have become much better at doing their job without interfering with user traffic. So let's examine the announcement and try to understand what the consequences could be for your work once the tool is actually shut down.

More polite Googlebots: the Crawl Rate Limiter Tool is retiring

With a post on the Google Search Central blog it was announced that, starting from January 8, 2024, the Crawl Rate Limiter Tool will no longer be active. The tool arrived about fifteen years ago at the request of website managers who sometimes had to deal with a rather unpleasant situation: their sites, or more precisely the servers the sites ran on, were hit by such a high volume of crawler requests that traffic and site maintenance suffered.
How to manage indexing and crawling on the pages of your WordPress site

Crawling has changed: how will your work change now?

The tool allowed you to limit how many crawler requests could reach a site and its pages. Despite the excellent idea, however, the tool was rather slow: it usually took around 24 hours for a limit to take effect, and the limit expired after 90 days, so it had to be entered again. All of that work is now about to be shelved. In fact, the announcement speaks of "improvements" made to the "crawling logic and other tools available to publishers" that have made this particular tool obsolete. The official post itself acknowledges that the Crawl Rate Limiter Tool was rather slow, while the new technology behind Googlebot can now automatically sense when a server risks exceeding its optimal capacity and slow down the frequency of its visits accordingly. The change therefore aims to simplify the lives of website managers.
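The automatic slowdown described above relies in part on standard HTTP signals: Google documents that Googlebot reduces its crawl rate when a site starts answering with 500, 503, or 429 status codes. A server can therefore still influence crawl frequency by returning such a response when it is overloaded. The sketch below is a minimal, hypothetical illustration of that idea; the load metric and threshold are assumptions you would replace with your own monitoring:

```python
# Sketch: decide how to answer a crawler request based on server load.
# Googlebot is documented to back off automatically when it receives
# 500, 503, or 429 responses, so an overloaded server can use 503 plus
# a Retry-After header to ask crawlers to slow down.
# The load value here is a hypothetical 0.0-1.0 metric (e.g. CPU or
# queue depth); wire in your own measurement.

def crawl_response(current_load: float, max_load: float = 0.8):
    """Return (status_code, headers) for an incoming crawler request."""
    if current_load > max_load:
        # Over the threshold: signal temporary unavailability and
        # hint that the crawler should retry in two minutes.
        return 503, {"Retry-After": "120"}
    # Under the threshold: serve the page normally.
    return 200, {}

# An overloaded server tells the crawler to back off:
print(crawl_response(0.95))  # -> (503, {'Retry-After': '120'})
# A healthy server serves the request:
print(crawl_response(0.30))  # -> (200, {})
```

Sustained 503s will eventually cause pages to drop from the index, so this signal is only appropriate for genuinely temporary overload.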