Image by mudeth on Flickr. Some rights reserved.
Once the exact route of Margaret Thatcher's funeral was known, the team at the BBC's user-generated content (UGC) hub, supported by a third-party location-based monitoring platform called Geofeedia, could set up and pre-populate social media monitoring feeds along the itinerary.
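Geofeedia's internals are not public, but the core idea of location-based monitoring can be sketched in a few lines: keep only geotagged posts that fall within a small radius of points along a route. The waypoints and post coordinates below are hypothetical placeholders, not the actual route data.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def along_route(post, waypoints, radius_km=0.5):
    """True if a geotagged post falls within radius_km of any route waypoint."""
    return any(
        haversine_km(post["lat"], post["lon"], lat, lon) <= radius_km
        for lat, lon in waypoints
    )

# Hypothetical waypoints, roughly central London; illustrative only
route = [(51.5138, -0.0984), (51.5129, -0.1054), (51.5112, -0.1190)]

near_post = {"lat": 51.5136, "lon": -0.0990}  # a few dozen metres from a waypooint? no: from the first waypoint
far_post = {"lat": 51.6000, "lon": -0.2000}   # several kilometres away
```

A monitoring feed of this kind would run such a filter over an incoming stream of posts, leaving journalists to do the editorial work of selecting, verifying and clearing rights on what remains.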
As explained in a case study published by Geofeedia, "the team established a workflow through which they could monitor social media platforms in real-time to identify unique and interesting content, engage with specific social media posters to gain permission to republish their user-generated content and then re-post the approved content into a map view for presentation to their broader audience".
The team identified hundreds of highly contextual and relevant user-generated content posts during the procession. Of these, 70 pieces were granted permission for re-publishing and presented to the BBC's online audience in map view. The use of geolocation to handle user-generated content and present it to the audience is one of the trends I analyse in a study I have written for the Reuters Institute for the Study of Journalism, published a few days ago under the title Newsroom curators & independent storytellers: content curation as a new form of journalism.
In the paper, I deal in depth with the process of curating user-generated content, which consists of collecting, verifying, presenting and preserving the material created by users through their smartphones or other devices and then uploaded and shared on the internet.
From the analysis, aside from geolocation, another clear trend is the outsourcing and the automation of one of the main parts of the curation of online content: verification. The flawed coverage by some news organisations – even large ones like CNN – of the Boston Marathon bombings provides a recent example of the challenges related to verifying information in the era of social media.
While UGC sometimes makes it possible to document what is happening in places that otherwise would be out of reach for journalists (such as Syria), sifting through social networks in order to gather valuable information is a time-consuming task and requires skills that are not always to be found in newsrooms. That is why some big players have begun to rely on third-party services to help them to gather and verify the content.
One of the main agencies providing this kind of service is Storyful, based in Dublin but with offices also in New York and Hong Kong. It employs roughly 30 people, both journalists and developers, relying on a combination of algorithms and human skills to spot early warning signs of breaking news on social networks like Twitter and YouTube.
While Storyful's work is undoubtedly valued and trusted, the outsourcing of such a core competence to an external service raises some key questions. Can news organisations that accept such a deal continue to define themselves as "journalistic" tout court? What are the downsides of losing (at least, in part) control of these competencies?
Another phenomenon I discuss in the paper is the emergence of "independent storytellers". What happens when the so-called 'citizen journalists' – often, but not necessarily, activists or tech-savvy people – begin not only to produce their own content, but also to curate the UGC produced by others?
Just a few years ago, the main authoritative online voices outside the mainstream media were those of respected bloggers, whose popularity was based mainly on expressing their opinions in text posts on their websites or on revealing hidden aspects of their country's situation.
The emergence of platforms like Storify makes the "curation" of visual (and textual) content much easier, and affordable to all those willing to devote time and effort to creating their own narration of an event. By collecting pictures, videos, links and other user-generated content posted online, and by tweeting them in a sequence, or combining them in a more sophisticated chronological narration on networks such as Storify or Tumblr, these independent storytellers are often able to provide a perspective different from that of mainstream media: a complementary view in some cases, an alternative one in others.
Not only activists but also other categories of people who are active in the dissemination of news might take advantage of the new curation possibilities: for instance, foreign media outlets that are unable to dispatch correspondents abroad because of cost cuts or because it is impossible to enter certain regions. Alternatively, freelancers or would-be journalists may use the opportunity offered by Storify and other curation tools to showcase and demonstrate their skills to established media organisations, gaining popularity and visibility.
One thing to consider is that this kind of work might reach, in the future, a wider audience, thanks to developments in the field of online translation. In July Twitter launched a new (experimental) tool that automatically translates tweets using Microsoft's Bing translation service. Tweets would appear in the original language, and clicking a small button labelled "translate now" would show the translated tweet in smaller text underneath. At first, the social network began experimenting with posts in Italian, French, Spanish and Arabic.
What might seem just a marginal feature may actually have serious implications for how the future of online news will develop, creating a lot of opportunities for independent storytellers and bloggers to make their voice reach a wider audience. In the future, activists and bloggers from foreign countries could bypass the filter of Western curators and tell the world live what is happening. Of course, reporters on the ground will always be needed as well as editors in the newsroom who can help to make sense of what otherwise could seem like 140 characters, not linked together by a common meaning.
But the "Andy Carvins" of the future will probably operate in a different way: on the one hand, by expanding their range of available sources as the language barrier falls; and on the other, by being subject to much more intense scrutiny, as people compare their narration with that of many more witnesses living in remote countries.
Last but not least, there is the issue of preservation. As projects like the Guardian's Reading the Riots have shown (the London riots, together with Occupy Wall Street, are among the case studies I analyse in the paper), data shared by users online offer an immense treasure trove for in-depth analysis, not only in real time but also long after an event has taken place. However, for this to be possible, the data must be stored, preserved and made available to newsrooms (or to other parties, such as activists or researchers).
Otherwise, because of copyright issues, website closures or other causes, the material could quickly become unavailable. In the study "Losing My Revolution: How Many Resources Shared on Social Media Have Been Lost?", published in September 2012, Hany SalahEldeen and Michael Nelson, two researchers from Old Dominion University (Norfolk, Virginia), analysed six event-centric datasets of resources shared on social media between June 2009 and March 2012. They found that nearly 11 per cent of shared resources were lost within the first year of publishing, and that afterwards the loss continued at roughly 0.02 per cent per day.
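Those figures imply a simple back-of-the-envelope estimate of how much embedded material a liveblog could expect to lose over time. The linear extrapolation below is my own rough model built from the paper's two headline numbers, not the authors' fitted curve:

```python
def fraction_lost(days, first_year_loss=0.11, daily_rate=0.0002):
    """Rough linear model from SalahEldeen & Nelson's headline figures:
    about 11% of shared resources lost after the first year, then about
    0.02 percentage points per day thereafter."""
    if days <= 365:
        return first_year_loss * days / 365  # crude pro-rata within year one
    return min(1.0, first_year_loss + daily_rate * (days - 365))

# Two years on: 0.11 + 0.0002 * 365 = 0.183, i.e. roughly 18% of the
# resources embedded in a liveblog would already be gone
two_year_loss = fraction_lost(730)
```

Even under this crude model, nearly a fifth of the user-generated content underpinning a story would have vanished within two years.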
This is something that should worry news organisations, as more and more coverage of breaking events is done through liveblogs filled with user-generated content that could easily disappear, leaving a "not found" message in place of the embedded video or image.
Another possible cause for concern for publishers is that the data shared online largely do not belong to media outlets: they are assets of technology companies, which could use them to become, more and more, media companies themselves, a process we can already see happening. Twitter's recent hiring of the Guardian's data editor Simon Rogers and of the newspaper's social and communities editor Joanna Geary, the same company's current search for a "head of news", and the imminent move of Hannah Waldram, currently the Guardian's community co-ordinator, to Instagram, are all strong signals pointing in this direction.
Technology companies have already disrupted several sectors, challenging the incumbents: Amazon has done it with book publishers, Skype with phone carriers, and Apple, through iTunes, with the major music labels. Will newspapers and broadcasters, already weakened by the downturn in profits brought by digitalisation, be the next to face competition from newcomers like Twitter, Facebook or YouTube?
Federico Guerrini is a freelance Italian technology journalist. He writes for La Stampa and also contributes to other websites and newspapers. From January to the end of June 2013, he attended a fellowship at the Reuters Institute for the Study of Journalism, where he studied the relationship between journalism and content curation, publishing a paper on the subject.