There is so much to learn from an AI flub
“Is ‘AI’ good or bad? Depends on who you ask” was the headline on my April 16 column here.
Right now, I suspect there are some Columbus newspaper readers not calling it “good.”
Advances in artificial intelligence, or AI, continue, and new forms known as “Generative AI” are still picking up steam.
Generative AI is a form of AI capable of generating text and other content in response to user prompts. It’s able to read, write and understand text-based data well.
But as I stated in that April column, despite all the high-tech advances, robots still lack a very important communication skill — the human element.
That quickly came to light recently when the Columbus Dispatch and several of its sister newspapers, all owned by media giant Gannett, began relying on an AI tool to generate high school sports coverage.
Launching this experiment on high school football coverage was their first mistake. Just ask my sports editor how readers react when you mess with their high school football coverage. Seriously.
However, according to CNN — and many, many mocking comments on social media — one AI-generated story published last month in the Dispatch began like this: “The Worthington Christian [[WINNING–TEAM–MASCOT]] defeated the Westerville North [[LOSING–TEAM–MASCOT]] 2-1 in an Ohio boys soccer game on Saturday.”
That page has since been updated, but the technology flubs didn’t end there.
The reports were also mocked on social media for being repetitive, lacking key details, using odd language and generally sounding like they’d been written by a computer with no actual knowledge of sports. (Ahem, it is a robot.)
CNN identified other Gannett outlets in Louisville, Florida and Milwaukee that in recent weeks also published similar AI-written stories featuring identical language: describing “high school football action,” noting when one team “took victory away from” another and describing “cruise-control” wins. In many cases, the stories also repeated the date of the game multiple times in just a few paragraphs.
According to national reports, Gannett has now paused the use of the AI tool to write high school sports stories, and web stories have been updated.
CNN reported that a Gannett spokesperson said the company added hundreds of reporting jobs nationwide while also experimenting with automation and AI to build tools for journalists and add content.
It should be noted the AI tool debacle comes after Gannett in December laid off 6% of its news division.
It also comes as many news outlets grapple with how to handle the rapid advancement of AI technology.
As I said in April, I still have a hard time believing robots soon will be driving (or flying) around, knocking on doors and gathering first-hand information like our reporters do. I also doubt they will be building trust and cultivating news sources that allow them to keep their fake metal fingers on the pulse of community happenings.
However, I was remiss in not pointing out that sports writers, like ours, who stand on sidelines capturing and memorializing the essence and excitement of a high school game in words and photos, probably also cannot be mimicked by a robot.
Sure, AI-generated stories might be grammatically or factually correct, but in many cases — at least thus far in technology development — the human touch still is lacking.
I still believe humans want to read stories with a human element. I suspect eventually AI will be able to fool some of us by “learning” to write “human-interest” stories.
Whether such stories ever will truly rise to that level is yet to be determined.
I still believe in the value of community journalists, like the ones who work here and spend much time speaking to residents and elected leaders in our local community, listening to their stories, and then sorting out what facts and emotions are most meaningful and compelling to tell readers.
Someone please tell me how a computer could know or do that.
And, as the Gannett experiment makes clear, editors — that is, real human beings — are still very much needed to edit stories, whether generated by humans or robots. I would argue strenuously that these stupid AI-generated errors never would have seen the light of day had a human editor read them first.