Sep 20, 2023 - Technology

The right newsroom jobs for AI, and the wrong ones


Illustration: Natalie Peeples/Axios

As newsrooms around the globe begin to harness AI, the New York Times this week posted a job listing for an editor to serve as "newsroom generative AI lead."

My thought bubble: Good luck to the winning applicant! Journalists, like many other professionals, are anxious about the new technology, and this is not going to be an easy job. But I'm generous, so here's a memo with some helpful advice.

Dear future robot wrangler: Many publishers have jumped right in to use generative AI for writing entire articles — a task the technology isn't ready to take on. Far more interesting opportunities lie under the surface.

1. Writing full articles is the most obvious use of generative AI, but not the best, because of all the ways AI still fails at the basics of good journalism.

  • Today's generative AI systems make mistakes, plagiarize, say inane or embarrassing things and make stuff up. The Columbus Dispatch, MSN, CNET, G/O Media (parent of The Onion and Gizmodo) and others have all learned that the hard way with spectacular, very public flops.
  • Even if they get better at sticking to the facts, algorithms can't assess credibility. Today's technology often does a passable job summarizing an interview — but it has no idea if the interviewee is the foremost expert on a subject, or someone who is in over their head, or someone who's promoting an agenda.
  • There are some very limited writing tasks AI can be let loose on today, like corporate earnings stories, which follow a predictable pattern and often depend on a single source. Using older technology, the Los Angeles Times has for years had a bot churn out first takes on earthquakes, telling people what they most want to know — how big the tremor was and where it was centered.

2. The bigger newsroom opportunity is to use AI for other parts of the reporting and writing process.

  • I've been using Otter.ai to record and transcribe notes, and its AI chatbot lets me ask questions of my notes (see review). I'm constantly looking for other tasks it might assist with; I expect AI will soon help me identify sources from my email, for example.
  • AI can also be used to find patterns in vast amounts of data — an organization's own archives as well as large swaths of public information. A reporter might only have time to read through a handful or dozens of speeches to detect shifts in a politician's stances over time; an AI bot could consume every available recording and transcript.

3. Illustrations and some kinds of conceptual graphics are another area of promise for AI. The ability of text-to-image engines to create powerful artwork from a simple prompt is already impressive. Engines can even be trained on an organization's style using past work.

4. Experiment behind closed doors first.

  • The best way to understand the tools is to try them. I get to do that as part of my job, but any newsroom that wants to be relevant in the AI era should be constantly exploring what's possible, scrapping what doesn't meet standards and investing in what does.

5. Be transparent.

6. AI can help the news business — but it also threatens it.

  • Journalism has been slow to adapt to other recent tech shifts, and paid a heavy price, but that needn't be the case with AI.

7. AI will be a huge labor issue, whatever publishers do.

  • Like their entertainment industry counterparts, unions representing journalists have AI front and center on their radar, especially at companies like G/O Media that have already been aggressive in adopting the technology.
  • Journalists are highly skeptical that AI will simply free up skilled reporters, editors and illustrators for bigger tasks. Most newsrooms have been shrinking throughout my 20-plus-year career, and especially in the past decade. It's hard to see how AI could reverse that.

8. Generative AI will make it even harder to combat online misinformation, as we've written extensively.

  • That creates a huge need for humans who can bring fairness and accuracy, as well as much-needed context, to the news. But they will have to do so amid intensifying business pressures that the new technology is unleashing.

9. Be prepared to shift your thinking.

  • A bad use of AI today might make sense tomorrow. I've been covering Silicon Valley for more than 20 years, and generative AI is evolving faster than any other technology I've witnessed.

Go deeper: I spoke on the topic of AI and newsrooms earlier this month at the conference for NLGJA: The Association of LGBTQ+ Journalists. You can view a transcript and replay here.
