Google says that for the past 25 years the Robots Exclusion Protocol (REP) has only ever been a de facto standard, which has created considerable uncertainty for webmasters and crawler-tool developers.
Google has now announced that it is leading an effort to make REP an official industry standard, and as part of that effort it has open-sourced its own robots.txt parser. The source code is hosted on GitHub under the Apache License 2.0. The parser is a C++ library for parsing and matching rules in robots.txt files; it has been in production use for about 20 years and contains code written in the 1990s.