Using generative AI is a bit like learning to drive a car.

How do new drivers build confidence and skills? Not by heading straight to the M25, but by driving around an empty car park, learning clutch control, changing gears and doing hill starts. Only then are we safe - to ourselves and others - to tackle roundabouts, dual carriageways and motorways. And of course, do not forget the theory lessons too.

That was an analogy heard at yesterday's NCTJ event on using AI in journalism, which focused on practical ways newsrooms are embracing the new tech.

Working backwards

Reuters has been using AI for years to identify market trends and process archived videos. Gen AI is a "different beast" though, says Jane Barrett, news editor, media news strategy, but one the agency is slowly embracing.

It started with trying to develop a proof of concept of what the tech was capable of. Then the news agency developed ethics and standards guidelines, followed by internal training and execution.

There are plenty of concerns: the inability of gen AI to detect quotes, news content being scraped, LLMs (large language models) being black boxes, and made-up facts and inaccuracies (known as hallucinations).

There are also opportunities. It helps to start with audiences, says Barrett, and how their interactions with the brand can be improved. Newsroom workflows can also be improved with the right AI-powered tools.

For example, Reuters' Latin America editor introduced an automated 'first pass edit' to tidy up copy for journalists writing in a second language. It is estimated this saves between 15 and 20 minutes of editing per story. Safeguarding and human accountability are both crucial to this process.

The news agency is also trialling automated translations for clients who want a rough copy quickly. Prominent disclaimers make them aware of the use of AI, and routine spot checks mark translations on accuracy and relevance. If accuracy falls below its BLEU score threshold, the process is switched off and only switched back on once standards have improved.
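Reuters does not describe its scoring setup in detail, but the idea of gating a translation pipeline on a BLEU-style score can be sketched as follows. This is a minimal single-sentence version of the metric (geometric mean of n-gram precisions times a brevity penalty); the threshold value is purely illustrative.

```python
import math
from collections import Counter


def bleu(candidate: str, reference: str, max_n: int = 4) -> float:
    """Simplified sentence-level BLEU: geometric mean of modified n-gram
    precisions (n = 1..max_n) multiplied by a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_ngrams = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        # Clipped overlap: each n-gram counts at most as often as in the reference.
        overlap = sum(min(c, ref_ngrams[g]) for g, c in cand_ngrams.items())
        total = max(sum(cand_ngrams.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Penalise translations shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * geo_mean


BLEU_THRESHOLD = 0.3  # hypothetical quality gate, not Reuters' actual figure


def translation_passes(candidate: str, reference: str) -> bool:
    """Spot-check gate: switch the pipeline off when quality drops."""
    return bleu(candidate, reference) >= BLEU_THRESHOLD
```

In practice a newsroom would score a sample of machine translations against human reference translations and disable the automated feed whenever the rolling score dips below the agreed bar.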

"If you play with what you know, you will learn its limitations quickly," says Barrett.

An everyday example

Eyebrows were raised when Newsquest advertised AI-assisted reporter roles this year to augment the output of local news content.

But the bet paid off. Jody Doherty-Cove, head of editorial AI and leading the initiative, showed how a simple chatbot can speed up the process of a common local newsroom task: creating and sending Freedom of Information (FOI) requests.

The bottom line is that these tools are only as good as the data fed into them. Prompt design - the tasks and commands given to gen AI tools - is especially important "to win back time".

A lot of this can be applied to chatbots (like Bard, Grok, ChatGPT) and image creators (Bing Image Creator, DALL·E 2, Midjourney).

Fundamentally, chatbots can do one of three things: generate something new (provide more information, brainstorm, plan or draft copy), transform information (rewrite, restructure, reformat, translate and answer questions from a dataset) or provide knowledge (answer questions, find content on the web or in a document, execute code).

When it comes to filing an FOI request, journalists can use an AI agent - a custom GPT - to format an email with a set of rules and instructions. You can even automate sending the email - but that part comes with considerable risks. Make sure to command the bot to "just draft the email" or "ask for approval before sending" first.
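The approach above can be sketched as a prompt template. The rules and names below are illustrative, not Newsquest's actual setup - the point is that the instructions are fixed in advance, the journalist supplies only the variables, and the prompt explicitly tells the model to draft rather than send.

```python
from datetime import date

# Hypothetical newsroom rules baked into a custom GPT's instructions.
FOI_RULES = [
    "Cite the Freedom of Information Act 2000.",
    "State the request clearly and number each question.",
    "Ask for the information in a reusable electronic format.",
    "Note the 20-working-day statutory response deadline.",
]


def build_foi_prompt(authority: str, questions: list[str], reporter: str) -> str:
    """Assemble the instruction handed to the chatbot. The first line keeps a
    human in the loop: the model drafts, a person approves and sends."""
    numbered = "\n".join(f"{i}. {q}" for i, q in enumerate(questions, 1))
    rules = "\n".join(f"- {r}" for r in FOI_RULES)
    return (
        "Just draft the email - do not send it; ask for approval first.\n"
        f"Write a Freedom of Information request to {authority} "
        f"on behalf of {reporter}, dated {date.today():%d %B %Y}.\n"
        f"Follow these rules:\n{rules}\n"
        f"Questions to include:\n{numbered}\n"
    )


prompt = build_foi_prompt(
    "Anytown Borough Council",
    ["How many potholes were reported in 2023?",
     "What was the road-repair budget for 2023?"],
    "A. Reporter",
)
```

Keeping the send step manual is the safeguard: however good the draft, a journalist remains accountable for what actually goes to the authority.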

The more precise your commands, the better the tool will be at delivering what you want. "It excels when you are very specific," says Doherty-Cove.

Slowly, but surely

AI itself has become an industry norm, says Gavin Allen, digital and data journalist at Cardiff University. AI-powered analytic tools like Dataminr and Chartbeat are important parts of many newsrooms.

Gen AI will need to tread that same path, but for now, many questions surround its usage and it is not yet clear how it fits into the newsroom. There are some "imminent" and obvious wins when it comes to transcription and translation, says Nadine Forshaw, assistant head of audience at The Sun. The newspaper is also looking at chatbots as the future of its horoscope section, Mystic Meg.

But because the tech is moving so fast, it may be wise to wait and see how it develops. That air of caution comes from Manjiri Carey, news editor at BBC News Labs, whose team is actively looking at ways the tech can remove pain points for both journalists and audiences.

The issue of assumed knowledge and tricky terminology is on her mind: the tech could be used to unpack complex terms in an article without readers resorting to Google, although that product is not ready to be audience-facing just yet. There is also a weather app with synthetic voices for regional accents. Unlike many other news organisations, the broadcaster has not yet published public AI guidelines.

"It would be really easy for us to unleash this technology, but there are still those risks associated with it. We want to use it for the right reasons, not because it's fun or shiny," says Carey. "We want to get the guardrails and guidelines right."

Bridge roles like hers are key, she adds - people with a foot in both editorial and tech teams who can help those departments collaborate.

The overwhelming message is to play, test and learn. But how long can we afford to wait? In the meantime, there are shrinking editorial teams and crumbling business models to think about, notes moderator Joanna Webster, interim managing editor for video and photography at Reuters.

It is a good question without a resounding answer. The best newsrooms can do for now is pick one problem at a time, and see if gen AI can provide a solution that works.
