Insider's newsroom will start experimenting with AI
Insider plans to begin experimenting with ways to leverage AI in its journalism, its global editor-in-chief Nicholas Carlson told Axios.
Why it matters: "A tsunami is coming," Carlson said. "We can either ride it or get wiped out by it."
- "But it's going to be really fun to ride it, and it's going to make us faster and better."
Details: The company will set up a working group first to test ways to responsibly incorporate AI into its workflow before rolling out a set of AI rules and best practices to the broader newsroom.
- Beginning Thursday, members of Insider's newsroom can apply to be a part of a working group that will test different ways to leverage AI across all of Insider's journalism products.
- The team will include roughly a dozen of Insider's "very experienced, seasoned journalists ... with great judgment," Carlson said.
This team will experiment with using AI-written text in its stories. The rest of the newsroom will be encouraged to use AI to generate story outlines, fix typos, craft headlines optimized for search engines, and prepare interview questions. Staffers are discouraged from putting sensitive information, particularly sourcing details, into ChatGPT.
- "The AI companies employ humans who can see conversations with their bots," Carlson warned in a memo to editorial staff sent Thursday and seen by Axios.
- Carlson said that while he wrote his memo, "There may be like a line or two that it [AI] might have suggested."
How it works: Carlson will continue to encourage those not in the working group to experiment with ChatGPT and AI, but they cannot incorporate results into their work until the working group finalizes its recommendations on responsible AI use.
- "ChatGPT can be helpful for research and brainstorming, but it often gets facts wrong," the memo said. "ChatGPT is not a journalist. You are responsible for the accuracy, fairness, originality, and quality of every word in your stories."
- The memo also warned about ways generative AI can produce misinformation and plagiarize material.
- Carlson said that for now, the company doesn't plan to disclose AI use on each individual article, but will instead list its AI policies on its editorial policies page.
Between the lines: While many publishers are experimenting with different ways to incorporate AI into their work, not all are comfortable publishing articles fully written by AI.
- "I anticipate that the world will have AI-written articles all over the place," Carlson said. "Insider may also. We have to figure it out if that's a good idea or not. The pilot group is going to help us do that."
- Some newsrooms limit the use of AI to certain topics or focus areas, such as Wall Street earnings reports, sports scores or quizzes.
- Carlson anticipates that AI will be leveraged across every beat and every type of content at Insider, to varying degrees.
- "What percentage of journalism from Insider will have the journalist at some point touch AI in the creation of that story? Very high, you know, maybe not 100%, but like, very high."
Be smart: Carlson acknowledged that while generative AI tools like ChatGPT can be great for aiding research and improving sentence structure, "the hallucinations can be a real problem."
- Even if Insider were to eventually publish an entire article written by AI, it would still be required to go through the same editing process that human-created work undergoes.
- "Nothing we publish, will not be vetted and signed off on by a meticulous fact-checking human editor," he said.
The big picture: The sudden rise of generative AI has left newsrooms scrambling to figure out how to protect their work from copyright violations and misinformation.
- "We certainly know we're early," Carlson acknowledged. "We're gonna be careful, and we're gonna monitor it carefully, but I feel good about copyright law."
- Many publishers, including Carlson, believe generative AI companies should have to pay publishers for the content they use to train their algorithms.
The bottom line: "I think that this is a really enriching, powerful new tool. And I want this newsroom to be the best at leveraging it," Carlson said. "And to do that they need to begin experimenting, carefully."