Tamedia shares 3 stages of automating newsrooms

By Paula Felps

INMA

Nashville, Tennessee, United States


Publishers worldwide are looking at how generative AI is affecting operations, and during Wednesday’s webinar, INMA members heard how Swiss media group Tamedia is automating its news processes.

Timo Grossenbacher, the company’s head of newsroom automation, shared some of the experiences and learnings he and his team have gleaned on their journey from rule-based content generation to generative AI during the session, “From automation to AI: a progress report with Tamedia.”

“It’s important to note that, of course, technology is important to us, but especially the responsible use of technology is important to us,” Grossenbacher said. Because of that, Tamedia has developed internal guidelines on how it will use generative AI. Along with its newsrooms and product teams, it has determined what it will — and won’t — allow AI to do.

“At the same time, we see a huge wealth of possibilities for AI,” he said.  

Exploring the newsroom ecosystem

Along the way, Tamedia discovered automating newsrooms in this digital age has three basic stages:

  1. Sourcing of information: This can be anything from phone calls to doing research with Google to searching archives for stories and information.
  2. Content generation: “This is where you actually write an article, produce a podcast, where you do all these different kinds of journalism that online journalism now encompasses.”
  3. Distribution: “This has gotten more and more important, especially with the advent of social media.” Newsrooms now must think of the best place to distribute something: Will it be better as a tweet or a LinkedIn post, or would it be better received in a newsletter?

While the boundaries between these three stages often blur, Grossenbacher said it is helpful to distinguish between each one.

Tamedia identified the three stages of automating newsrooms.

Playing by the rules

He provided an example of how these stages work: a project Tamedia completed using rule-based content production and personalisation. The company wanted to examine renewable energy and learn how much solar energy was produced. It also wanted to tell readers how their municipality compared to others.

Because Tamedia had data on Switzerland’s solar plants, their production capacities, their geographical coordinates, and the area of all the roofs, it could calculate how much of each municipality’s solar potential was actually being used. Within the article, readers could choose their municipality and receive a personalised article generated according to the rules outlined and fed to the machine by the data team.

“We define the rules, so there’s no machine learning involved,” he said. “The machine doesn’t learn any kind of rules. So that’s very distinct from the modern type of AI and from the machine learning paradigm AI, which generative AI also belongs to.”

Using rule-based text, Tamedia generated 4,290 personalised articles that were written automatically based on the rules it developed.

For this story, Tamedia generated 4,290 personalised articles that were written automatically based on the rules developed. Readers also saw a personalised map and a chart for their municipality: “Basically, we have the data I just showed you, and then we define the rules, and the output is text and graphics.”
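
For readers curious what such rule-based generation looks like in practice, here is a minimal, hypothetical sketch in Python. The field names, thresholds, and wording are invented for illustration and are not Tamedia’s actual rules or data.

```python
# A minimal sketch of rule-based article generation: hand-written rules map
# structured municipal solar data to a sentence. All fields, thresholds, and
# phrasing are invented for illustration, not Tamedia's pipeline.

from dataclasses import dataclass


@dataclass
class MunicipalitySolar:
    name: str
    installed_kwp: float      # installed solar capacity, kilowatt-peak
    potential_kwp: float      # estimated potential from roof areas
    canton_median_pct: float  # median utilisation across the canton, in percent


def utilisation_pct(m: MunicipalitySolar) -> float:
    return 100 * m.installed_kwp / m.potential_kwp


def comparison_phrase(m: MunicipalitySolar) -> str:
    # Hand-written rules: the journalists decide the thresholds and wording.
    pct = utilisation_pct(m)
    if pct >= m.canton_median_pct * 1.5:
        return "well above the cantonal median"
    if pct >= m.canton_median_pct:
        return "slightly above the cantonal median"
    return "below the cantonal median"


def render_article(m: MunicipalitySolar) -> str:
    pct = utilisation_pct(m)
    return (
        f"{m.name} currently uses {pct:.1f}% of its estimated rooftop solar "
        f"potential, {comparison_phrase(m)} of {m.canton_median_pct:.1f}%."
    )


if __name__ == "__main__":
    print(render_article(MunicipalitySolar("Exampletown", 1200.0, 5400.0, 18.0)))
```

Because every rule is explicit, any error in the output can be traced back to a specific rule written by the team.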

Can journalism be automated?

While such data-driven stories have helped with traffic, Grossenbacher still questioned whether journalism itself can be automated.

“Keep in mind that if you look at the whole article from A to Z, then basically only small parts of it are automated or personalised,” he said, pointing to titles, graphics, and charts generated by a machine. “The rest of it is well-written by humans.”

But even those personalised sections have a human touch, he said, because humans wrote the rules that led to that information being collected and included.

“I would say what we do here is not necessarily automating the journalism; we’re automating the personalisation,” he said. “What we’re trying to do is to automate personalisation, distribution, these kinds of things. And in order to do that, we need to do automated journalism in the classical sense of how this word is used in the industry.”


The way journalism is delivered and what audiences receive can be personalised, but journalism itself cannot be fully automated, said Tamedia's Timo Grossenbacher.

Adapting to ChatGPT and generative AI

The emergence of ChatGPT led Tamedia to begin exploring possible use cases for it in the journalism production process. After several brainstorming sessions with its editorial staff, it identified several tasks that newsrooms agreed they would find helpful, such as having the machine offer up titles and ledes for stories — particularly for content being supplied by a third party, such as an agency or a municipality.

“For example, say they want to build a new school, and the municipality is informing the public about that on their Web site. We scrape the Web site and get immediately alerted whenever there’s new content,” Grossenbacher said.

But this content is often boring and can’t be run as a story. “So we slightly change the text. For that, our journalists said it would be really helpful to have an AI that would digest this text and propose a better sounding, more snacky title and lead.”
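
As an illustration only, the sketch below shows how such a title-and-lede assistant might be wired up with the OpenAI Python SDK. The model name, prompt wording, and function are placeholders rather than Tamedia’s actual tooling, and any suggestion would still be reviewed by a journalist.

```python
# A hypothetical sketch: ask an LLM for headline/lede suggestions from scraped
# press-release text. Model choice and prompt are placeholders, not Tamedia's
# actual setup; a journalist reviews every suggestion before publication.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def suggest_title_and_lede(press_release: str, n_variants: int = 3) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You are an editorial assistant. From the press release, "
                    f"propose {n_variants} catchy but accurate headline and lede pairs. "
                    "Do not add facts that are not in the text."
                ),
            },
            {"role": "user", "content": press_release},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    scraped_text = "The municipality plans to build a new school by 2027 ..."
    print(suggest_title_and_lede(scraped_text))
```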

Grossenbacher decided to compare how generative AI would work in his original example of the story on solar plants. Many of the challenges he faced had to do with teaching the machine to find the right information: “The first thing that is wrong, which is quite funny I think, is that it’s speaking of plants you have in your garden, not plants in the sense of solar plants.”

As the team corrected the machine, it began to learn and kept improving its accuracy without being given the rules outlined for the original story. “The machine actually found the rules within this correlation between structured data and the expected output it gave. And that’s why we call it machine learning because the machine learns the rules by itself.”
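
One way to picture this is a few-shot prompt that pairs structured rows with the expected sentences and asks the model to write the sentence for a new row. The sketch below is purely illustrative, with invented field names and example data, and the resulting prompt could be sent to any chat-style LLM endpoint.

```python
# An illustrative few-shot prompt: the model is shown structured rows paired
# with expected sentences and must infer the "rules" itself for a new row.
# Field names and example data are invented, not Tamedia's.

EXAMPLES = [
    ({"municipality": "Aville", "utilisation_pct": 27.0, "canton_median_pct": 18.0},
     "Aville uses 27.0% of its rooftop solar potential, well above the cantonal median of 18.0%."),
    ({"municipality": "Btown", "utilisation_pct": 12.5, "canton_median_pct": 18.0},
     "Btown uses 12.5% of its rooftop solar potential, below the cantonal median of 18.0%."),
]

NEW_ROW = {"municipality": "Cdorf", "utilisation_pct": 19.0, "canton_median_pct": 18.0}


def build_few_shot_prompt() -> str:
    parts = ["Write one sentence per data row, following the pattern of the examples.\n"]
    for row, sentence in EXAMPLES:
        parts.append(f"Data: {row}\nSentence: {sentence}\n")
    parts.append(f"Data: {NEW_ROW}\nSentence:")
    return "\n".join(parts)


if __name__ == "__main__":
    print(build_few_shot_prompt())
```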

One big difference between rule-based content generation and generative AI content development is that “when we define the rules, we are in full control. As long as there’s no rule and no error in the rules, then there will be no error in the output that is generated.”

With generative AI, however, “we really don’t know the rules, so we only have to trust the machine that it comes up with the right rules.”

INMA members who would like to subscribe to my bi-weekly newsletter can do so here.

