
Leiden researchers call for new guidelines for AI-generated images in journalism

Generative AI presents journalists with new options for image use but also raises ethical questions. Leiden research shows that the use of GenAI images is still limited and calls for guidelines to seize the opportunities without losing sight of the risks.

You can create images in seconds with GenAI, but there is a downside. The research report AI in Beeld shows that AI image generators are not yet widespread in journalism. The technology holds great promise but also presents serious risks. Researchers from Leiden University call for new guidelines for using this technology.

The AI in Beeld report by Leiden researchers Jaap de Jong, Astrid Vandendaele, Maartje van der Woude and Stef Arends is based on 59 interviews with policymakers, editors and experts from the world of journalism. It was created in collaboration with AD, KRO-NCRV Pointer and the Foundation for Regional Public Broadcasters, with financial support from the Dutch Journalism Fund.

Between imagination and responsibility

GenAI images can enrich journalism: as illustration or visualisation, or for topics that cannot be photographed for practical or ethical reasons. Regional media are leading the way. ‘The technology opens up new forms of visual imagery’, say the researchers.

But the risks should not be underestimated. AI models are often trained on datasets containing copyrighted materials and stereotypical imagery. And it is increasingly difficult to tell the difference between AI-generated and real photos. The public does not always realise that images are artificial, even if they are labelled or captioned as such. ‘Transparency is essential but is not a watertight solution’, say the authors.

Recommendations: clear framework and active engagement

To encourage responsible use of AI image technology, the researchers make five recommendations:

  • Draw up editorial guidelines with clear boundaries for GenAI images;
  • Invest in training on AI, ethics and image recognition within media outlets;
  • Increase technical control on image manipulation;
  • Create transparency as needed, for example with labels, metadata or disclaimers;
  • Actively involve the public in the discussion about AI and news production.

‘This is an opportunity for journalism to reevaluate its visual craft’, says Vandendaele. ‘But that will only be possible if it retains control over how, when and why AI is used.’

Read the AI in Beeld research report.
