Google's Push for New Protocols to Give Web Publishers Control Over AI Access to Their Content

Google is calling for a discussion to explore new protocols that would give web publishers choice and control over how their web content is accessed and used by artificial intelligence.

The tech giant argues that web publishers should be able to regulate how their content is used on the internet. Because existing mechanisms such as robots.txt were designed in a pre-AI era, Google contends that additional protocols need to be discussed. Today, website administrators use a robots.txt file to tell search engine crawlers which sections of a site may be crawled and which should be left alone, as the sketch below illustrates.
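For context, robots.txt is a plain-text file served at a site's root (e.g. https://example.com/robots.txt) that groups rules by crawler user agent. The sketch below is a minimal, hypothetical example: the `User-agent` and `Disallow` directives are real robots.txt syntax, but `ExampleAIBot` is an illustrative token, not an actual AI crawler name.

```
# robots.txt, served at the site root

# Default rules for all crawlers: allow everything
# except the /private/ section of the site
User-agent: *
Disallow: /private/

# Hypothetical AI crawler token (illustrative only):
# exclude it from the entire site
User-agent: ExampleAIBot
Disallow: /
```

A crawler follows the most specific group that matches its user agent, so under these rules general crawlers could index everything outside /private/, while the hypothetical AI bot would be excluded entirely. Google's point is that this all-or-nothing, crawler-by-crawler model predates AI and may not capture the finer-grained choices publishers now want.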

Google wants the conversation to include stakeholders from across the internet industry and the AI community. The company is seeking a broad range of perspectives and is encouraging academia, among other sectors, to take part in the forthcoming discussions.