How AI is generating change in newsrooms worldwide
Editor’s note: Charlie Beckett is the Director of JournalismAI, a global initiative of Polis, the journalism think tank at the London School of Economics and Political Science (LSE). JournalismAI is supported by the Google News Initiative.
In 2019, we conducted our first global survey on how newsrooms are using AI in their work. At the time, even some early adopters were at the beginning of the AI integration process. Since then, so much has changed in the world of AI and in the ways media makers approach and use these technologies.
In our latest research report, Generating Change, we share what newsrooms are doing with AI today. We wanted to reach a larger and more diverse group of media professionals this year. Between April and July 2023, we surveyed 105 news organizations from 46 countries about their engagement with AI and associated technologies.
The report serves as a comparative exercise to help us better understand some of the trends we’re seeing around AI in the newsroom. Now, let’s get into the findings:
Almost three quarters (73%) of news organizations surveyed believe generative AI applications, such as Bard or ChatGPT, present new opportunities for journalism.
Around 85% of survey respondents — including journalists, technologists and managers at news organizations — have at the very least experimented with generative AI to help with tasks such as writing code, generating images and authoring summaries.
Some respondents noted AI can help free up capacity for more creative work by handling time-intensive tasks such as interview transcription and fact-checking. Respondents pointed out that generative AI is accessible, requires little technical skill, and has what they described as an ability to understand “context.” This, they say, makes generative AI stand out from other AI technologies, which generally require deep specialist expertise in areas like programming.
Despite these opportunities, respondents recognized the need for any AI-generated content to be checked by a human to mitigate potential harms like bias and inaccuracy. More than 60% of respondents expressed concern about the ethical implications of AI for journalistic values, including accuracy, fairness and transparency, and for other aspects of journalism.
While newsrooms globally contend with challenges related to AI integration, the challenges are more pronounced for newsrooms in the Global South (e.g., Brazil, Russia, India, China and South Africa). Respondents highlighted language, infrastructure and political challenges. They noted how the social and economic benefits of AI tend to be geographically concentrated in northern nations, where there is better infrastructure and easier access to resources.
With 80% of respondents expecting an increased use of AI in their newsrooms, the report’s authors believe this is a crucial opportunity for what they call “good” journalists to do more “human” work with the support of AI.
With these results, we have new questions about the future of AI in journalism. For instance, will relying on generative AI technologies for editorial tasks become an industry norm? Or could it become a widely unacceptable practice? You can download the full report on JournalismAI’s website.