Journalism isn't the only profession upended by the AI boom that followed ChatGPT's launch in 2022. It has raised big questions in many sectors about everyday workflows, talent management and sustainability.

Your burning questions about AI were fielded by AI futurist Andrew Grill at last week's Newsrewired conference (26 November 2025), where he offered an expert view on navigating the AI landscape.

We edited his responses for brevity with the help of an AI assistant; the text was then reviewed by a human editor.


Q: What impact will AI have on young people entering the workforce, especially those just starting out?

AI trainer Harriet Meyer asking this question to Andrew Grill at Newsrewired on 26 November 2025. Credit: Mark Hakansson / Marten Publishing

Andrew Grill: Many schools are telling students not to use AI, calling it "cheating." But when these students enter the workforce, they're behind. Everyone else has been using these tools. In the future, AI proficiency will be as expected as knowing how to use Word. The real challenge is that education hasn't caught up: we need to teach critical thinking, not just tool use. Encourage your kids to use AI at home, so they become the go-to person for these skills at work. But also make sure they're learning to question and fact-check, not just accept what AI gives them.

Action point: Support young people in developing both AI skills and critical thinking. Understand fresh recruits may need upskilling and training in AI.

Foundational AI for Journalists: a practical bootcamp
15 January 2026 - Build confident, ethical, day-to-day AI workflows — from brief, research and pitching to editing and publication — in one hands-on day

Q: Which industries are leading in AI adoption, and what can media learn from them?

AG: Financial services have used AI and machine learning for years. They're not afraid of it, just focused on regulatory issues. The most successful adopters are nimble, reimagine their processes, and have leadership support. Education is behind because it hasn't changed its methods in decades. Organisations that encourage experimentation and allow teams to test new tools are ahead of the curve.

Action point: Foster a culture of experimentation and process reimagination. Don’t wait for perfect conditions to start testing new AI tools.

Anita Zielina’s five signs of a strong AI leader
AI is not a magic wand that will solve a messy strategy: top newsroom leaders make it okay to fail and provide clarity amongst the chaos

Q: Is your book on AI already out of date? And what should newsrooms worry about in terms of safety and AI?

Journalism safety trainer Hannah Storm asking this question at Newsrewired on 26 November 2025. Credit: Mark Hakansson / Marten Publishing

AG: I anticipated this – my book has its own AI (a GPT) that updates with new information. Most boardrooms are still confused about AI, so the basics remain relevant. For safety, never put confidential information into public AI tools, even if you think it’s private. Push your IT team for tools that work inside your firewall. Use AI as a "fact flagger" to highlight what needs human checking, but don’t blindly trust it. Critical thinking is essential – AI can make things up, so always double-check.

Action point: Establish clear internal guidelines for AI use, prioritising data security and human oversight in all newsroom workflows.

ChatGPT - CuriousGPT
Ask me any question about Andrew Grill’s latest book: Digitally Curious

Q: What’s the worst example of AI you’ve seen recently?

AG: Many organisations roll out tools like Microsoft Copilot and call it "doing AI," but often the tools aren't ready or useful. One law firm I worked with built an expensive internal tool, but it was easier to do the work by hand – so no one used it. Most AI pilots fail because there's no support from the top and processes aren't flexible.

Action point: Before investing in new AI tools, ensure there is leadership buy-in and a clear plan for integration into real workflows.

Andrew Grill’s tips on using AI for writing books, managing talent and mitigating risks
“The main barriers to AI are training, budgets, data quality, and outdated processes. For journalism, it’s about reimagining workflows: use AI as a decision partner to gather news, research, and qualify sources more efficiently”

Q: Will open source Chinese AI and automation lead to a world of leisure in 10 years?

AG: Some believe open source AI will make powerful tools free for everyone, but most companies are cautious and keep these models behind firewalls. We're in an AI bubble – there are more tools than people ready to use them. Robots are advancing, but we’re not close to a world where humans do nothing. People want to interact and work together; robots will get closer to autonomy, but not replace us entirely in the next decade.

Action point: Stay realistic about automation – embrace new tools, but plan for a future where human collaboration and adaptability remain essential.


Written by

Jacob Granger
Jacob Granger is the community editor of JournalismUK
