OpenAI executive offers a closer look at ChatGPT for news media companies

By Paula Felps

INMA

Nashville, Tennessee, United States


The rapid evolution of artificial intelligence continues to create opportunities and raise new questions within the news media industry.

During this week’s webinar, “Fireside Chat and Q&A With OpenAI,” INMA Product and Tech Initiative Lead Jodie Hopperton was joined by James Dyett, OpenAI’s head of platform accounts, to discuss those opportunities and answer some of those questions.

Since OpenAI launched ChatGPT in November 2022, it has attracted the attention of individuals and businesses alike. This came as a surprise, Dyett said, as the company didn’t expect ChatGPT to appeal to a consumer base.

“We got that totally wrong,” Dyett said. “It turns out that the chat interface really captured the imagination of people and helped them see the power of these models.”

As Hopperton pointed out, AI is often thought of in the newsroom as a content creation tool, but that is a narrow and limiting point of view: “There are so many different steps it can take, from brainstorming … to trying to understand large data sets, headline testing, ideas, summarisation, rewriting for social,” she said. “There’s so many ways we can use AI as an assistant.”

INMA's Jodie Hopperton and OpenAI's James Dyett discussed what AI and ChatGPT can do for news media companies.

Key themes for the coming year

After a year of experimentation and exploration with ChatGPT, the technology will likely come into its own in the coming year as users grow more comfortable with it.

For 2024, Dyett identified three key trends that will move AI’s use further into the business world:

  • Multimodality: “We think, increasingly, these models are going to have integrated image, vision, and audio capabilities so that any interaction with the model won’t necessarily be text to text but can be image to text, audio to text, or image to audio,” he explained. This will create greater integration between the different modalities (the first sketch after this list shows an image-to-text request).
  • Agentic behaviour, or AI acting on behalf of users: This will see large language models (LLMs) moving from a “text-in, text-out” application to taking in audio, text, and images on behalf of the user and then acting on them. It will no longer be just about asking the LLM what it knows from its training; instead, it’s about the model knowing, “Oh, this question means I have to go talk to this piece of software to get the answer,” Dyett said (the second sketch after this list shows such a tool call).
  • Customisation: Companies are focusing on building models unique to their domain that can help complete specific tasks.
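
To make the multimodality point concrete, here is a minimal sketch of an image-to-text request using OpenAI’s Python SDK and a vision-capable model such as gpt-4o. The photo URL and prompt are hypothetical placeholders, not examples discussed in the webinar.

    # Minimal sketch: one request that combines text and an image.
    # Assumes the OpenAI Python SDK; the URL and prompt are placeholders.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "text",
                     "text": "Write a one-sentence caption for this news photo."},
                    {"type": "image_url",
                     "image_url": {"url": "https://example.com/photo.jpg"}},
                ],
            }
        ],
    )
    print(response.choices[0].message.content)  # the model's caption text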
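
The “go talk to this piece of software” behaviour Dyett describes corresponds to what the API calls tool (function) calling. The sketch below is illustrative only: get_circulation is a made-up newsroom lookup, and a real integration would execute the tool and send its result back to the model.

    # Illustrative sketch of tool calling: instead of answering from training
    # data, the model can return a structured request to call external software.
    # The get_circulation tool is hypothetical, invented for this example.
    from openai import OpenAI

    client = OpenAI()

    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_circulation",
                "description": "Look up the latest circulation figures for a title.",
                "parameters": {
                    "type": "object",
                    "properties": {"title": {"type": "string"}},
                    "required": ["title"],
                },
            },
        }
    ]

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user",
                   "content": "What is the circulation of the Daily Example?"}],
        tools=tools,
    )

    # If the model decided a lookup is needed, the call and its JSON arguments
    # appear here; the application runs the tool and returns the result.
    print(response.choices[0].message.tool_calls)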

ChatGPT in the age of fake news

As AI becomes more sophisticated and industries become more comfortable using it, global questions must be considered. The much-discussed challenge of hallucinations, in which a model confidently presents information that is false or unsupported, is a cause for concern, particularly for news media companies. So is ensuring that the information presented is unbiased.

OpenAI is aware of those challenges and is facing them head-on. It recently published its approach to the 2024 elections, explaining how it will prevent abuse, provide transparency on which content is generated by AI, and improve access to accurate voting information worldwide.   

“Our job when it comes to accuracy is to work with media who are producing authenticated verifiable facts and find ways to point the model towards these verifiable facts,” Dyett said. Then, the model will be able to provide accurate information “in a way that works for the journalists doing the hard work to create these facts and surfaces something useful for our customers.”

OpenAI is highly incentivised to get this right and is engaging with the news media industry to ensure that happens, Dyett said: “I think you have a lot more experience on how to sort out what is a high-quality set of facts versus one that might be less reliable,” he said. “And we’re going to have to make some tough decisions.”

However, as a small company with just 800 employees, Dyett said, “it will take us time to build relationships with all of the organisations we want to.”  

Overcoming privacy concerns

One of the biggest concerns companies have about OpenAI is privacy and data security, Dyett said. Many companies are still reluctant to allow its use because of questions about what is done with the data it collects.

For example, if a journalist enters information into ChatGPT, there may be worries that the information will be used to train models or will be stored. Companies also often worry that it doesn’t meet their strict security, data privacy, and compliance requirements.

But those fears can be put to rest, Dyett said, by providing an understanding of how ChatGPT was built — and why.

“We spend a lot of our time just helping our customers understand what’s happening to their data,” he explained. “We don’t train our models on data that’s used for ChatGPT Enterprise, the data isn’t stored, and your data is processed in a secure environment.”

The companies that allow ChatGPT in and are letting employees experiment with it are enjoying positive results, he said: “If you give your employees access to ChatGPT and … really let them figure out how to use [it] in their workflows, you’ll end up with some really creative applications.”

AI and the news environment

AI’s disruption is being felt at varying levels across all industries. But there are some things it cannot do for news media, particularly telling the compelling stories of people.

While AI does an impressive job of collecting data and finding information, “[it] won’t do a very good job of going door to door and asking questions to gather facts about an issue,” Dyett said. “Or if a disaster happens, AI’s not going to be able to knock on people’s doors and get to the bottom of it.”

When it comes to local news coverage, AI still lags in finding accurate information and discovering local stories. That’s something OpenAI wants to tackle to make the models more helpful, but Dyett said it will take time for the company to build relationships with the organisations that can provide the kind of information it needs.

“We are going to need to have local news to create a helpful assistant. How we build the partnerships at scale to do that is something we’re still trying to figure out,” he said, adding that the company is trying to see how it can work with local media: “There are so many different local news sources and we need to find ways to work with all of them.”

