The Automated Content Access Protocol (ACAP) is a joint venture between the World Association of Newspapers (WAN), the European Publishers Council (EPC), the International Publishers Association (IPA) and the European Newspapers Association (ENPA), which its backers claim would allow search engines to better recognise the terms and conditions of specific publishers' sites and pages.
Rob Jonas, Google's head of media and publishing partnerships in Europe, told the Guardian Media Summit today he was satisfied with the performance of the existing robots.txt terms and conditions protocol used by Google.
Representatives from WAN have previously urged Google - and other major search engines - to participate in the project, describing their lack of involvement as 'somewhat troubling'.
Jonas said that while Google was involved in working groups on the ACAP project, the company was not yet looking to implement the system.
"The general view is that the robots.txt protocol provides everything that most publishers need to do. Until we see strong reasons for improving on that, we think it will get everyone where they need to be," said Jonas, in his keynote address to the conference.
His views echoed those of Dan Crow, product manager of crawl systems at Google, who told Journalism.co.uk last June that robots.txt was sufficient as an industry standard and should not be replaced by ACAP.
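For context, robots.txt is a plain-text file placed at the root of a site that expresses coarse, advisory allow/disallow rules for crawlers; it is this limited vocabulary that ACAP's backers argue falls short of conveying full terms and conditions. A minimal sketch, with hypothetical paths:

```
# Illustrative robots.txt — the paths below are hypothetical examples
User-agent: *              # rules for all crawlers
Disallow: /archive/        # ask crawlers not to fetch this section
Allow: /archive/index.html # Allow is a later extension, honoured by Google

User-agent: Googlebot      # rules for one named crawler
Disallow: /print/
```

Compliance is voluntary on the crawler's part, and the format says nothing about how fetched content may be used once indexed.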
Timesonline is so far the only publisher to have implemented the new protocol, following its official launch in New York last November.