
TimesOnline has become the first publisher to implement the Automated Content Access Protocol (ACAP), a technology that could end clashes between news publishers and search engines over content use.

ACAP version 1, which is being unveiled today in New York, is a system that allows content publishers to embed information in their websites detailing access and use policies in a language that search engines and other automated tools can understand.

"TimesOnline has implemented ACAP to show its support for ACAP and its goals," Dominic Young, director of editorial services at News International, told Journalism.co.uk.

"The aim is to allow more sophisticated relationships between websites and the machines which access them. At the moment that makes the focus search engine crawlers…in time, as ACAP develops, we hope it will enable new and creative ways of doing this to the benefit of our readers, the search engines and ourselves."

Since the mid-1990s, robots.txt files have been the most common permissions system in use across the web, and some have questioned the need to move from this existing system to a new standard.

The desire for greater flexibility and control over content seems to be the key element. Developers say ACAP will include instructions that give publishers the power to set limits on how long content can be indexed and on how much of an article aggregators such as Google, MSN and Yahoo can display.
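As a rough sketch of what such limits might look like in practice, the snippet below assembles a few ACAP-style permission lines. The directive names and qualifiers (time-limit, max-length) are assumptions modelled on the flavour of the ACAP 1.0 draft, not syntax confirmed in this article.

```python
# Illustrative only: hypothetical ACAP-style permissions expressing an
# indexing time limit and a cap on how much of an article may be displayed.
# Directive names and qualifiers are assumptions, not confirmed ACAP syntax.
acap_permissions = "\n".join([
    "ACAP-crawler: *",
    "ACAP-allow-crawl: /news/",
    "ACAP-allow-index: /news/ time-limit=7days",          # drop from the index after a week
    "ACAP-allow-present-snippet: /news/ max-length=200",  # cap the displayed extract
])
print(acap_permissions)
```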

However, Mr Young added, the project was not about blocking access to content but about creating new opportunities and an environment in which new models for online content publishing could emerge.

"ACAP is not about a battle, it's about creating tools that don't exist yet online, [the absence of] which makes it hard to be as creative and diverse about the ways content owners do business online as they do offline," he said.

News International, the Newspaper Association of America, Reuters, Fairfax Business Media, the European Newspaper Publishers Association and the Associated Press are all ACAP members.

Agence France-Presse, Independent News & Media, Media 24 and the Nordic Sanoma Corporation were all participants in the 12-month pilot process.

Google and the other leading search engines refused to become part of the ACAP pilot project, instead preferring to watch from the shadows.

It wasn't until nearly nine months into the 12-month pilot that a significant search engine, Exalead, joined the project.

Details of the pilot scheme and further development of the technology were unveiled at a conference in New York today, along with information on how the system works. According to the developers, ACAP can initially work by modifying robots.txt files to contain additional permissions information (robots.txt files communicate a website's permissions to web crawlers, telling them which parts of a site may be indexed by search engines).
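A minimal sketch of that "extend robots.txt" approach, under the assumption that ACAP directives simply sit alongside the classic ones (the ACAP- prefixed field names follow the style of the published ACAP 1.0 draft, but treat the specifics as illustrative):

```python
# Sketch of an ACAP-extended robots.txt. A crawler that predates ACAP
# ignores fields it does not recognise, which is why piggybacking on the
# existing file is the simplest first step. The ACAP- lines are assumptions
# modelled on the ACAP 1.0 draft, not confirmed syntax.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

ACAP-crawler: *
ACAP-disallow-crawl: /private/
ACAP-allow-crawl: /news/
"""

def split_directives(text: str):
    """Separate classic robots.txt lines from ACAP-prefixed extensions."""
    classic, acap = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        (acap if line.lower().startswith("acap-") else classic).append(line)
    return classic, acap

classic, acap = split_directives(ROBOTS_TXT)
print("classic:", classic)  # what any robots.txt-aware crawler sees
print("acap:", acap)        # what an ACAP-aware crawler additionally honours
```

An older crawler reads only the classic directives and behaves as before, while an ACAP-aware crawler honours both sets, which is the backward compatibility the developers describe below.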

Developers acknowledge that, at first, not all web crawlers will be able to read the new system, so modifying an existing one is the simplest form of implementation.

However, with this approach a further modification will be required at a later stage to ensure that specific permissions are understood by web-crawling technologies.

Alternatively, ACAP allows publishers to embed permissions information directly in a piece of web content, rather than only in the robots.txt file.
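The article does not spell out how per-item embedding works; one plausible shape is a meta element carried in the page itself. The sketch below is purely hypothetical, reusing the invented permission vocabulary from the earlier examples.

```python
import re

# Hypothetical sketch: ACAP permissions carried on the content itself via an
# HTML meta element. The meta name and the permission vocabulary are invented
# for illustration; the article does not specify the embedding mechanism.
ARTICLE_HTML = """\
<html>
  <head>
    <title>Example story</title>
    <meta name="ACAP" content="allow-index, allow-present-snippet max-length=200">
  </head>
  <body><p>Story text...</p></body>
</html>
"""

def read_acap_meta(html: str):
    # Naive extraction for the sketch; a real crawler would use an HTML parser.
    match = re.search(r'<meta\s+name="ACAP"\s+content="([^"]*)"', html)
    return match.group(1) if match else None

print(read_acap_meta(ARTICLE_HTML))
# -> allow-index, allow-present-snippet max-length=200
```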


Written by

Laura Oliver
Laura Oliver is a freelance journalist, a contributor to the Reuters Institute for the Study of Journalism, co-founder of The Society of Freelance Journalists and the former editor of Journalism.co.uk (prior to it becoming JournalismUK).
