Chicago-based freelance journalist Marco Buscaglia had a week from hell that began with some nasty emails he received on his cellphone early Monday.
In a nutshell, the blowback that hit his inbox at 6 a.m. that morning concerned an article, written earlier this year and published last Sunday, about the top books to read this summer. The article contained major errors: 10 of the 15 books he listed don’t exist at all. Readers immediately spotted the problem.
It’s the kind of calamity that gives many of us in the media shivers.
As it turns out, Buscaglia relied on content generated by artificial intelligence (AI) to write his article. The AI he used, including a tool named Claude, produced bad information.
Buscaglia’s erroneous article was featured in a special “advertorial” section called the Heat Index guide to the best of summer, syndicated by a third party and picked up by two widely read U.S. newspapers, the Chicago Sun-Times and the Philadelphia Inquirer.
It was disastrous for Buscaglia, 56, a 33-year veteran of journalism. He told me in a telephone interview this week from his base in Chicago that he takes full responsibility for what went wrong, admitting that he didn’t do his due diligence by fact-checking the information he gleaned from AI.
“The fact that I completely dropped the ball on this, (not) checking up on it, makes me feel awful and incredibly embarrassed,” he said.
He later added: “I didn’t do the leg work to follow up and make sure all this stuff was legit.”
While he sat in bed reading those awful emails Monday and pieced together what had happened, he felt like a “cartoon character” blasted through the stomach by a cannonball, walking around with a gigantic hole in his middle for the rest of the day.
“I was devastated,” he said. “It’s been a couple of really bad days here, but I can’t say I don’t deserve it.”
Paramount in his thoughts was the notion that what he did fell well below the standards of the Inquirer and Sun-Times.
While AI can be a powerful tool for journalists in some scenarios, as public editor I would say this unfortunate case demonstrates how fraught it can also be. AI output can contain flaws and must be handled with caution.
When things go wrong, as in the books fiasco, it can undermine the media’s credibility in a climate where public trust is already shaky.
The special Heat Index section was produced and licensed by a U.S. operation, King Features, which is owned by the magazine giant Hearst. A spokesperson for the company that owns the Sun-Times said in a statement that the content was provided by the third party and not reviewed by the Sun-Times; those oversight steps will be examined more carefully going forward, and a new AI policy for the Sun-Times is in the works.
Buscaglia said he had used AI before writing his book summaries and was familiar with AI “from a layman’s” perspective: he assumed it was akin to a “glorified search engine.”
It was only after his mishap this week that he delved deeper into how this technology works. He told me that’s when he felt “incredibly naïve” and that he should have known more about AI while using it.
Generative artificial intelligence relies on large language models (LLMs) to create content, such as images, text and graphics. These LLMs are trained on massive amounts of digital data “scraped” from the internet.
Flaws with AI emerge when incidents sometimes called “hallucinations” occur. That’s where AI simply invents facts. This has even caused significant problems in court here in Canada where, in one example, a lawyer relied on legal cases that did not exist.
Full disclosure: the Star uses AI for processes such as tracking traffic to our website. But we have a strict AI policy, both internal and in our publicly accessible Torstar Journalistic Standards Guide.
Among the rules stated: human verification of any AI-generated information or content is always required in our newsrooms. In addition, all original journalism must originate with and be authored by a human. AI “must not be used as a primary source for facts or information.”
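The human-verification rule above amounts to a simple editorial gate: nothing AI-sourced reaches publication until a person has signed off on it. As a rough illustration only (this is a hypothetical sketch, not the Star’s actual system; the `Claim` and `publishable` names are invented here), the logic looks like this:

```python
from dataclasses import dataclass

@dataclass
class Claim:
    """One factual assertion in a draft article (hypothetical model)."""
    text: str
    ai_generated: bool
    human_verified: bool = False

def publishable(claims):
    """A draft clears the gate only when every AI-generated claim
    has been checked by a human editor."""
    return all(c.human_verified for c in claims if c.ai_generated)

draft = [
    Claim("AI-suggested summer reading title", ai_generated=True),
    Claim("Quote gathered in a phone interview", ai_generated=False),
]

print(publishable(draft))   # False: the AI-sourced claim is unchecked
draft[0].human_verified = True
print(publishable(draft))   # True: an editor has confirmed it
```

Note that human-reported material passes through untouched; only the AI-generated items trigger the extra verification step, which is where Buscaglia’s process broke down.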
Since stepping into her role last summer, Nicole MacIntyre, the Star’s editor-in-chief, has spoken publicly about her concerns around AI and its impact on journalism.
“I said then, and still believe, that we must harness the benefits of this technology cautiously, with public trust always at the forefront.
“Since then, I’ve immersed myself in the topic, watching closely as newsrooms around the world experiment with AI. I’ve seen the risks, including some very public missteps that have shaken reader confidence. But I’ve also seen what’s possible when this technology is used responsibly and with purpose,” MacIntyre told me.
She went on to say the Star’s AI guidelines protect our commitment to people-powered journalism.
“With the right guard rails, I’m excited about the possibilities,” she added.