By Sebastian Grace 
There’s no need to panic about AI in journalism. ChatGPT told me so. 
 
When I asked OpenAI's language model whether AI will replace journalists in the future, the answer went something like this: it's unlikely, because of humans' "unique set of skills," including "critical thinking" and the ability to "conduct interviews." Well said, robot. I agree. But it's not just unlikely. It's impossible. 
 
Arijit Sen, a computational journalist at The Dallas Morning News and an AI Accountability Fellow at the Pulitzer Center, believes that AI will bolster, not debilitate, the craft, improving an industry in desperate need of revitalization. 
 
"It might hurt the churn-and-burn roles, but I think there's still a place for a lot of breaking news," Sen said. “Hopefully, we can educate the higher-ups of news conglomerates, and they can realize it will only make the existing journalists work a lot better. There's no need to eliminate a ton of jobs because this will be a net economic win, and we'll get more in-depth coverage that our community relies on. There's reason for optimism." 
 
In the current furor, fueled by a panic born of misunderstanding, many talking heads envision tumbleweed blowing through (more) newsrooms. Given the context, from Kevin Roose's dystopian account in The New York Times of his conversations with “Sydney,” Microsoft's AI chatbot for Bing, to the Future of Life Institute's open letter signed by tech bigwigs, it is critical to explain why fear of an AI-shaped future for journalism is unjustified. The doom-mongering fails to recognize that AI is a new and exciting frontier for reporting. 
 
Pritish Pahwa, Slate's associate business and tech writer, believes "journalists don't need to worry about ChatGPT taking their jobs." In his assessment, "there are a few things that pump the brakes on whether this stuff can really take over journalism and wholesale replace a ton of writers." He explained, "It is not trained to do things that journalists are supposed to do in their work, which is actually talking to people to get their viewpoints on a specific thing that is not public knowledge." 
 
Technology has always made reporters' jobs more straightforward. Take the collection of campaign finance reports on sites like Open Secrets or automated interview transcriptions with Otter.ai. Artificial intelligence has proven helpful in automating menial news-gathering tasks, like aggregating data, and will continue to do so, allowing many newsrooms to operate more cost-effectively. Politico’s Jack Shafer agrees, writing: “The first newsroom jobs AI will take will be the data-heavy but insight-empty ones that nobody really wants.” 
 
The Associated Press has used AI to expand its corporate earnings coverage to 4,000 companies from 300, and The Washington Post has long been able to cover all D.C.-area high school football games, thanks to its Heliograf bot. As Shafer wrote in his Politico piece, “By deskilling the writing of mundane and everyday stories, AI will free human journalists to ask questions it can’t yet imagine and produce results beyond its software powers.” 
 
Sen detailed where he sees the future application of AI in journalism: "Investigative reporting will only get better. You speed up the reporting process dramatically… ChatGPT can help us distill complicated topics we're trying to understand into targeted simple language." 
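
As a rough illustration of what that distillation step might look like in practice, here is a minimal sketch using the OpenAI Python client; the model name, prompt, and helper function are my own placeholders, not a description of Sen's or his newsroom's actual workflow.

```python
# Minimal sketch: asking a chat model to distill a dense document into plain
# language. Assumes the `openai` package (v1+) is installed and an
# OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def distill(document_text: str) -> str:
    """Return a plain-language summary of a complicated document."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat-completions model works
        messages=[
            {"role": "system",
             "content": "You explain complex documents in simple, plain language."},
            {"role": "user",
             "content": f"Summarize this in plain language:\n\n{document_text}"},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(distill("The municipal bond refunding ordinance authorizes..."))
```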
 
He added, "In the future, you could scan documents, generate numerical representations of the text, and then have AI find similar text across documents, like Command F on steroids. Or you could do entity extraction, where you pull out, say, Mark Zuckerberg from one document and find all the references in all the other documents. There's a lot of potential there." 
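
Both ideas map onto well-established techniques: text embeddings for semantic search and named-entity recognition for pulling out people and organizations. The sketch below uses sentence-transformers and spaCy as stand-in tools; the specific libraries, model names, and sample documents are my assumptions, not Sen's stack.

```python
# Rough sketch of the two techniques described above, using open-source tools.
# Requires: pip install sentence-transformers spacy numpy
#           python -m spacy download en_core_web_sm
import numpy as np
import spacy
from sentence_transformers import SentenceTransformer

documents = [
    "Meta CEO Mark Zuckerberg testified before the committee in 2018.",
    "The city council approved the new zoning ordinance last Tuesday.",
    "Internal emails show Zuckerberg discussed the acquisition months earlier.",
]

# 1. "Command F on steroids": turn each document into a numerical vector
#    (an embedding) and rank documents by semantic similarity to a query.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = encoder.encode(documents, normalize_embeddings=True)
query_vector = encoder.encode(["discussions about an acquisition"],
                              normalize_embeddings=True)
scores = (doc_vectors @ query_vector.T).ravel()  # cosine similarity
best = int(np.argmax(scores))
print(f"Most similar document: #{best} (score {scores[best]:.2f})")

# 2. Entity extraction: pull out named people so every mention of, say,
#    Mark Zuckerberg can be traced across the whole document set.
nlp = spacy.load("en_core_web_sm")
for i, text in enumerate(documents):
    people = [ent.text for ent in nlp(text).ents if ent.label_ == "PERSON"]
    print(f"Document {i}: people mentioned -> {people}")
```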
 
Some long-time detractors overestimate not only the negative impact but the totality of the influence AI will have on journalism; AI will not compound the damage inflicted on the industry in recent history. They point out that software now does jobs humans once did, but it is a false equivalence to treat mass adoption of a technology as the guaranteed extinction of its practitioners. Nor should CNET's recent lazy, mediocre failure to harness the potential of this new technology for the written word be seen as a bad omen. 
 
AI seems to be everywhere, all at once, and journalists must report on it more accurately and responsibly. "I think we're in the moral panic phase about AI, where everyone sees a million scary, negative things that can happen," said Carrie Brown, director of the social journalism program at the Newmark Graduate School of Journalism. 
 
But, she added, "I would prefer more nuance. The introduction of the written word caused massive moral panic back then, and now look where we are. Having that historical perspective would help journalists cover this more responsibly." 
 
Amid the hype and fear, Sen believes the task is to build newsrooms' capacity to think critically and responsibly unpack these complex phenomena. "Stoking the idea that we're going to get Terminator Ex Machina stuff in the next three years is not helpful," he said. Instead, he believes, "We just need as an industry to set best practices and for people to agree on that and build up the level of technical literacy in our newsrooms." 
 
The technological advancement of the journalistic craft through AI will be a particular boon to local newsrooms, paving the way for smaller chains to do the high-impact investigative work that the bottom line had previously squeezed them out of. 
 
In response to those who fear that AI will accelerate job-cutting programs at large, chain-owned newspapers across the country, Pahwa admitted, "There is the possibility that, especially in a lot of newsrooms that these private equity companies own, those executives could see AI as a way to replace almost everything on the local news beat. But I suspect that will ultimately not benefit their business." 
 
He explained, "Yeah, you can cut out all the human employment and try to put in sloppy ChatGPT. But you'll also find the subsequent decline in readership even steeper than that which has already been there because people will not want to go to a site and read robots talking about the people they care about when reading about their community." 
 
“It doesn't necessarily have to be a subtraction,” Brown said. "The ideal would be that it frees people up to do more creative work or work that requires more judgment that only humans can have." 
 
Brown added, "There's a big difference between being able to recognize patterns of words and parse meaning from those. It's fact-checking on the basic side, but on the more important side, it's making decisions about what to include and what to leave out. Humans can understand meaning and emotion in ways that machines can't. That's a big piece of storytelling. There is still a strong need for humans to be involved in that process.” 
 
The impact of AI on a beleaguered local news industry is a vital frontier. Helping community news organizations compete in the new online ecosystem is critical to their survival, which is why, in 2021, the Knight Foundation announced a $3 million initiative to support local news organizations harnessing the power of artificial intelligence. 
 
United Robots is helping smaller newsrooms convert data-heavy work into more time for their contributors to cover the stories that matter. The Swedish news tech company describes itself as "a band of robots writing stories based on any data you like in any language you like.” “We're kind of like a news agency for automation,” Cecilia Campbell, chief marketing officer, explained. She said, “The mission is to empower newsrooms. We automate community information that sits alongside journalism and is part of the local journalistic quality product. It's information readers want.” 
 
News organizations employ United Robots’ technology to create stories on data-heavy topics such as real estate figures or sports scores. “We're trying to fit into where the data is too much for the newsroom to deal with, but where it's value-added information that people want,” Campbell said. She added, “It's robots and reporters working together.” 
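
At its simplest, that kind of robot reporting is structured data poured into editorial templates. The toy example below is my own illustration of the general idea, not United Robots' actual system.

```python
# Toy data-to-text example: turn one row of game data into a routine recap.
def sports_recap(game: dict) -> str:
    """Render a short, formulaic game story from structured data."""
    return (
        f"{game['winner']} beat {game['loser']} "
        f"{game['winner_score']}-{game['loser_score']} "
        f"on {game['date']} at {game['venue']}."
    )


print(sports_recap({
    "winner": "Central High",
    "loser": "Riverside",
    "winner_score": 28,
    "loser_score": 14,
    "date": "Friday night",
    "venue": "Memorial Field",
}))
```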
 
Partnership, then, is critical. Human journalists bring far more to the table than their automated rivals, and seeing the software as antagonistic rather than advantageous is the wrong mindset. As Annie Lowrey wrote in The Atlantic, even if ChatGPT can produce a clear and concise paragraph on AI, it can't interview AI experts and skillfully condense their theorizing into a column like this one. ChatGPT, Lowrey explained, "creates content out of what is already out there, with no authority, no understanding, no ability to correct itself, no way to identify genuinely new or interesting ideas.” 
 
The human capacity for creativity is infinite; the supply of well-constructed sentences for AI to ingest is not. A team of researchers at Epoch AI recently predicted that models will run out of high-quality text to read by 2027. Without new material to train on, AI's much-heralded ascent will come to a premature end. Journalists, though, will remain. 
 
As Pahwa said, "I would hope that it really centers the human factor of journalism: humans who are reporting humanely about other humans and the things these human beings are making, and the impact those things are having on other human beings.” 
 
Humans are the future of journalism in the era of AI. We better believe it. 
 
 
 