Google is considering lowering its page crawl rate to save computing resources. The topic was discussed by Googlers John Mueller, Martin Splitt, and Gary Illyes in the latest episode of the Search Off the Record podcast.
According to SEO specialists and site owners, Google has been crawling and indexing pages less frequently over the past year. The Googlers noted that the company is striving to make crawling more environmentally friendly by saving computing resources. Because crawling and indexing happen out of sight, site owners rarely consider how they affect the environment.
Computing in general is not environmentally neutral. Bitcoin mining, for example, has a measurable environmental impact, especially when the electricity comes from coal-fired power stations or other high-emission sources.
“We’ve been carbon-free since 2007 or 2009 (can’t remember exactly), but that doesn’t mean we can’t reduce our environmental footprint even further. Crawling is one of those things…,” said Gary Illyes.
Crawling can be made more environmentally friendly by reducing the number of content checks. Google performs two types of crawling:
- crawling to discover new content,
- crawling to check already indexed content for updates.
The search engine is considering reducing the second type of crawl, which rechecks existing pages for updates.
SearchEngines reports that there is no need to revisit the same URL too often; large, well-known sites are the exception.
For example, the Wall Street Journal constantly updates its homepage with new content, so that page should be crawled frequently for changes. Its About Us page, however, is rarely updated, so it makes little sense for a search engine to repeatedly re-crawl something that does not change.
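The idea of revisiting fast-changing pages often and static pages rarely can be sketched as a simple adaptive scheduler. This is only an illustrative heuristic, not Google's actual algorithm: the function name, parameters, and the linear scaling between minimum and maximum intervals are all assumptions made for the example.

```python
from datetime import timedelta

def next_recrawl_interval(observed_changes, observed_crawls,
                          min_interval=timedelta(hours=1),
                          max_interval=timedelta(days=30)):
    """Pick a recrawl interval from how often a page changed across
    past crawls (an illustrative heuristic, not Google's real logic)."""
    change_rate = observed_changes / max(observed_crawls, 1)
    # Frequently changing pages (e.g. a news homepage) get short
    # intervals; static pages (e.g. an "About Us" page) drift toward
    # the maximum, so they consume fewer crawl resources.
    span = max_interval - min_interval
    interval = max_interval - span * change_rate
    return max(min_interval, min(max_interval, interval))

# A homepage that changed on every one of the last 30 crawls:
print(next_recrawl_interval(observed_changes=30, observed_crawls=30))
# An "About Us" page that never changed across 30 crawls:
print(next_recrawl_interval(observed_changes=0, observed_crawls=30))
```

Under this sketch, a page that changes on every visit is rechecked at the minimum interval (one hour), while a page that never changes is pushed out to the maximum (thirty days) — exactly the kind of reduction in refresh crawling the podcast describes.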
NIXSolutions notes that the Google team is so far only considering this idea. Whether the search engine will actually reduce crawl frequency is not yet known.