
How news organisations decide whether a photo is ‘too edited’

AI editing tools are creating a grey area



In the era of artificial intelligence and accessible photo editing, you can’t believe everything you see online. One exception, of course, is usually anything published by a reputable news source.

The foundation of photojournalism lies in its ability to present reality in an authentic and unaltered manner. Digital manipulation poses a significant threat to this core principle, undermining the credibility and trustworthiness of the images distributed by photo agencies. The controversy around a retouched family photograph of the Princess of Wales and her children was a rare glimpse into how publishers deal with this issue.

Agencies such as Getty Images and PA Images play a crucial role in delivering accurate and reliable photographs to the public. These organisations adhere to strict codes of conduct designed to ensure the integrity of the images they distribute. If an image is accepted but later found to violate these guidelines, it is given a “kill order”. It sounds dramatic, but it instantly halts distribution of the image.

The main reason photo agencies cannot accept digitally manipulated imagery is the potential distortion of truth. Manipulated photos can present a skewed version of reality, misinforming the public and eroding its trust in news photography. Many a photographer has been fired for violating this trust.

Photojournalism is a powerful tool for documenting and bearing witness to events around the world. Authenticity is paramount. Even family portraits of public figures become historical documents.

Portraits occupy a grey area in this ethical discussion. They can be staged or directed, with the photographer guiding and positioning subjects, but the press still requires that they remain unretouched. In fields such as fashion and celebrity coverage, where airbrushing is common, the guidelines are looser.

Photo agencies have their own standards for what level of editing is acceptable. AFP says photos and videos “must not be staged, manipulated or edited to give a misleading or false picture of events.” Getty allows some minor changes, such as colour adjustment or the removal of red eye or dust from a dirty lens, but prohibits “extreme” colour or light adjustments.

Several agencies decided to retract the photo of the royals because it did not meet their standards. This does not mean the photo was AI-generated or fake, only that its editing exceeded what they consider acceptable.

Changing tech, changing guidelines

As new technology such as generative AI (which can create photos or videos from a prompt) makes it easier to edit photos and fabricate images, press agencies are starting to discuss how to handle it. The Associated Press states:

We will refrain from transmitting any AI-generated images that are suspected or proven to be false depictions of reality. However, if an AI-generated illustration or work of art is the subject of a news story, it may be used as long as it is clearly labelled as such in the caption.

News organisations are also experimenting with AI-generated text and developing guidelines for it. These guidelines tend to focus on transparency, making clear to readers when artificially generated content is being used.

Some retouching is accepted in photojournalism and fashion photography. Gorodenkoff/Shutterstock

World Press Photo (WPP), an organisation known for its annual photojournalism contest, provides explicit guidelines for submission, updated annually. Photo agencies often align themselves with these principles, recognising the importance of a universal standard for truthfulness in visual reporting.

Due to pressure from photographers and artists who work in more conceptual photography, WPP has added an “open format” category. This welcomes “innovative techniques, non-traditional modes of presentation, and new approaches to storytelling.” The contest organisers considered allowing AI-generated images in 2023, but backtracked after outrage from many photojournalists.

The rise of advanced editing tools and software has made it harder to distinguish between authentic and manipulated images. Fully embracing manipulated imagery in a photojournalism contest would jeopardise the industry’s credibility at a time when trust in journalism is already fragile.

Andrew Pearsall, Senior Lecturer in Photojournalism, University of South Wales

This article is republished from The Conversation under a Creative Commons license. Read the original article.
