
Artificial Intelligence (AI) has become one of the most talked-about topics of the day. One profession still trying to determine how best to use — and not use — this new technology is journalism.
I began my career as a radio news reporter in the 1970s, left broadcasting to work in the federal civil service in the early 1980s, and later launched this local news site in 2012 after retiring from government service. Back in my days behind a microphone, I could never have imagined a tool like AI.
Today, newsrooms of all sizes are grappling with how to navigate this fast-changing technology. My goal in this piece is to explain how I use AI — and how I don’t.
First and foremost, I never ask AI to write stories for me. Every article you see on this site is the result of my own reporting and writing.
That said, I do find AI useful in limited ways. I sometimes use it to suggest headlines, offer alternative wording for a press release, or polish a story I’ve already written. Even then, I don’t always accept what AI produces — and every piece published here is reviewed by a human being: me.
I can’t recall a single instance where I’ve taken AI’s advice completely as-is. Those who design these systems even have a term, “hallucination,” for the times when AI confidently produces something that’s just plain wrong. It happens more often than you might think.
For example, just a few days ago, I asked an AI program for background information about government shutdowns. It responded that no federal shutdown was in place — even though there clearly was one. When I asked again, it apologized and corrected itself.
Experts warn that, as AI grows more sophisticated, it could one day replace human thinking and decision-making. In my humble opinion, though, no AI will ever attend a local government meeting, grasp the nuances of what happens there, and accurately report it for the public. There’s a human element to journalism — context, judgment, empathy — that machines can’t replicate.
I may be proven wrong someday, but for now, I see AI as a useful yet limited tool in the reporting process. Like any new technology, it holds great promise — and real potential for harm.
Remember when social media was first promoted as a great way to stay connected and share ideas? It certainly does that, but we’ve also seen its unexpected downsides.
Across the country, newsrooms are developing policies to guide their use of AI. As for me, I’ll continue to use this technology carefully, ethically, and transparently — keeping human judgment at the heart of everything I publish.