Hoodline Dallas News Site Gives Its AI Articles Fake Human Bylines | Dallas Observer

A New Dallas News Site Gives Its AI-Only Reporters Fake Human Names

Be careful which site you're getting local news from. They might not use real people.
You have to look really hard to find the AI badge next to an author's name on Hoodline. Igor Omilaev/Unsplash
There’s a website filled with news stories about things going on in Dallas, Irving, Plano and Garland. Scroll down a bit and you’ll even see stories discussing happenings in farther-out locales including Perrytown and Crowley. On a recent day we visited, stories covering news in each of those places had hit the website in the previous five hours, and each one had the same byline at the top.

That’s one hell of a productive reporter, for one hell of an active news site, we must admit.

But the reporter in question isn’t human. In fact, it doesn’t seem as though any of the reporters for Hoodline’s Dallas news site are real human journalists. No, the articles, although posted along with author names, are the result of artificial intelligence.

Hoodline’s coverage isn’t confined to Dallas; it extends to more than 25 major American cities, including Austin, San Antonio, San Francisco, Chicago and Los Angeles. Thanks to that reach, more people are starting to notice and ask questions.

“These are not real bylines,” a CNN report from May states. “In fact, the names don’t even belong to real humans. The articles were written with the use of artificial intelligence.”

Take the articles we mentioned at the top of this story, for example; the bylines belong to “Nate Simmons.” But click on a specific Simmons article, and you’ll see a tiny blue badge to the left of the name. In barely legible text, the letters “AI” fill the badge. So if you pay far closer attention to bylines than most readers typically do, you might be able to figure out that something is up. But not many do.

Maybe Hoodline’s stupendously generic, awkward tagline should give it away. “In depth reporting about your home area” is the site’s motto, which, in an accidentally humorous way, reads a lot like the clunky articles that outlets including Sports Illustrated have been caught trying to pass off as human-powered journalism in recent years.

According to the CNN article, Hoodline started populating its sites with AI-generated content in 2023. There is some discussion of the AI-generated content buried rather well on Hoodline’s site, under the “Disclaimer/Use of AI” tab at the very bottom of the page.


“At Hoodline, we sit at the intersection of cutting-edge technology and traditional journalism,” the website disclaimer reads. “Our mission is to bring you hyper-local news that is informative, engaging, and accurate. Our articles are crafted with a blend of technology and editorial expertise that respects and upholds the values of journalism.”

The copy goes on to mention humans in various ways without explicitly saying that humans do not, in fact, write any of the articles. Our favorite instance is “human-centric approach.” There have, it seems, been some steps toward a more honest, transparent effort. Hoodline reportedly used to include AI-generated headshots and fake biographical information for its nonhuman authors, a practice the site has since dropped. Hoodline did not respond to our requests for comment.

“These types of websites are scary because they masquerade as vetted news by trained journalists,” said Tracy Everbach, a journalism professor at the University of North Texas. “But they are propaganda sites. Clearly some people, maybe not a lot of people, will be fooled.”

On top of the possible attempt at deception, there’s the idea that Hoodline’s coverage simply isn’t very substantial. A quick look through the stories on Friday, June 14, around 11 a.m. reveals little more than a collection of empty calories in the form of articles that seem more like press releases than any sort of genuine reporting.

Take a look at what we found, fellow news-reading Dallas-area human:
  • "City Seeks Public Input on Draft Water and Wastewater Impact Fee Study to Guide Infrastructure Development"

  • "Arlington Announces Juneteenth Observance with City Facility Closures and Continuation of Select Services"

  • "Arlington Unveils Splash Pads to Combat Summer Heat With Updated Health and Safety Protocols"

  • "Arlington Encourages Smart Water Use With Tips and Classes During Hot Summer Months"

It seems the bots did a large-scale grab of Arlington City Hall press releases at the end of last week, but that’s just a guess on our part, of course. Now, to be fair, there were also posts that same day that tackled more hard-hitting topics.

One post discussed the Dallas police needing help identifying a murder suspect, and another announced the safe return of a missing elderly man. While both were important topics, to be sure, they amounted to little more than extended social media posts, lacking anything resembling fresh reporting or even a modicum of new information not already released by the DPD through its typical communication channels.

Each of the aforementioned stories had the byline of our friend Nate Simmons. In fact, Simmons is credited with more than 20 stories in the period June 10–14. At first, that indeed sounds like a lot, and some AI-powered assistance would seem to be needed to generate that amount of content. But it takes reading only a couple of the stories to know that ol’ Nate isn’t likely to win a Pulitzer Prize anytime soon for this less-than-hard-hitting coverage.

And before you get around to thinking there might be some sort of calculated rhyme or reason to how Hoodline chooses its AI bylines, we’ll stop you. Zach Chen, the CEO of Impress3, the company that owns Hoodline, told Nieman Lab earlier this month “that Hoodline’s AI personas were generated at random by AI, and that their beats and cities were also randomly assigned.”

In that interview Chen also explained that “a team of dozens of (human) journalist researchers ... are involved with information gathering, fact checking, source identification, and background research, among other things.” Given the superficial level of information many Dallas Hoodline stories offer, it’s difficult to imagine that much human intelligence is employed on a story-by-story basis to the degree Chen wants the public to believe.

As prevalent as AI has become in many aspects of daily life, including in such common activities as typing a text message or email or unlocking our iPhones with facial recognition, its use has yet to reach ubiquity in newsrooms. In a recent survey of more than 3,000 journalists from across the globe, more than half said they are not using generative AI tools such as ChatGPT “at all,” and only 5% admitted to using generative AI “often.”

Everbach, the UNT professor, points to the necessary human element of reporting the news when explaining why relying too heavily on AI in journalism is an unwelcome development.

“Even more than being propaganda, they [AI tools] aren't human, which leaves out so many skills we train our students on: ethics, fairness, accuracy and good writing,” she said. “It is definitely a way to avoid hiring journalists. That way they can offer free, crappy content. Real journalism needs funding to pay talented and skilled writers, editors, visual journalists and designers.”