Google will stop supporting unpublished and unsupported rules in the robots exclusion protocol, effective September 1, the company said. In practice, that means Google will no longer honor the noindex directive when it is listed within a robots.txt file.
“In the interest of maintaining a healthy ecosystem and preparing for potential future open source releases, we’re retiring all code that handles unsupported and unpublished rules (such as noindex) on September 1, 2019. For those of you who relied on the noindex indexing directive in the robots.txt file, which controls crawling, there are a number of alternative options,” the company said.
Google listed the alternatives site owners perhaps should have been using anyway: noindex in robots meta tags, 404 and 410 HTTP status codes, password protection, Disallow rules in robots.txt and the Search Console Remove URL tool.
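To make the first of those alternatives concrete, here is a minimal sketch, using only the Python standard library, of a page that asks crawlers not to index it via the two documented noindex mechanisms that remain supported: a robots meta tag in the HTML and an X-Robots-Tag HTTP response header. The handler class name and port are illustrative, not part of Google's announcement.

```python
# Minimal sketch: serving a page with both supported noindex signals.
# Assumes a simple local test server; names and port are illustrative.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<!doctype html>
<html>
  <head>
    <!-- Supported alternative: robots meta tag -->
    <meta name="robots" content="noindex">
    <title>Not for the index</title>
  </head>
  <body><p>This page asks search engines not to index it.</p></body>
</html>"""

class NoIndexHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Supported alternative: X-Robots-Tag header, useful for non-HTML files
        # (PDFs, images) where a meta tag cannot be embedded.
        self.send_header("X-Robots-Tag", "noindex")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), NoIndexHandler).serve_forever()
```

Note that a Disallow rule in robots.txt only blocks crawling; for a page to be reliably dropped from the index, the crawler has to be able to fetch it and see one of the noindex signals above, or the URL has to return a 404/410 or sit behind authentication.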
Google has been looking to make this change for years, and with its push to standardize the robots exclusion protocol, it can now move forward. Google said it “analyzed the usage of robots.txt rules,” focusing on unsupported implementations of the internet draft, such as crawl-delay, nofollow and noindex. “Since these rules were never documented by Google, naturally, their usage in relation to Googlebot is very low,” Google said. “These mistakes hurt websites’ presence in Google’s search results in ways we don’t think webmasters intended.”