Screenshot of Google News UK
Google has again refused to adopt a technology that a consortium of publishers' groups claims could end disputes between news providers and search engines over content use.

The Automated Content Access Protocol (ACAP) is a joint venture between the World Association of Newspapers (WAN), the European Publishers Council (EPC), the International Publishers Association (IPA) and the European Newspaper Publishers' Association (ENPA), which they claim would allow search engines to better recognise the terms and conditions of specific publishers' sites and pages.

Rob Jonas, Google's head of media and publishing partnerships in Europe, told the Guardian Media Summit today that he was satisfied with the performance of robots.txt, the existing protocol Google uses to read publishers' terms and conditions for crawling.
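For context, robots.txt is a plain text file placed at the root of a website that tells search engine crawlers which paths they may or may not fetch. A minimal illustrative example (the paths shown are hypothetical):

User-agent: Googlebot
Disallow: /subscribers/
Allow: /news/

ACAP's backers argue that this format is too coarse, and the project publishes additional, more granular permissions in the same file, for instance directives intended to express how long content may be indexed or displayed, which robots.txt alone cannot convey.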

Representatives from WAN have previously urged Google - and other major search engines - to participate in the project, describing their lack of involvement as 'somewhat troubling'.

Jonas said that while Google was involved in working groups on the ACAP project, the company was not yet looking to implement the system.

"The general view is that the robots.txt protocol provides everything that most publishers need to do. Until we see strong reasons for improving on that, we think it will get every one where they need to be," said Jonas, in his keynote address to the conference.

His views echoed those of Dan Crow, product manager of crawl systems at Google, who told Journalism.co.uk last June that robots.txt was sufficient as an industry standard and should not be replaced by ACAP.

Times Online is so far the only publisher to have implemented the new protocol since its official launch in New York last November.
