Ofcom has issued a stark warning to television services about the use of ‘deep fakes’ in programmes (see Broadcast Bulletin, 3 April 2023). Although the regulator’s new Note to Broadcasters does not refer to the ‘deep fake’ images of the ‘arrest’ of former US President Donald Trump which have been circulating on social media, Ofcom’s warning is clearly linked to them.
Ofcom asks all TV channels to check their compliance processes to ensure they take account of “the potential risks involved in the use of synthetic media technologies to create broadcast content.” This is because of the serious potential risks to broadcasters and their audience posed by ‘deep fakes’.
RISK ONE is mis/disinformation. One challenge here is ‘deep fakes’ used to create fake news or other disinformation that spreads online, which means broadcast journalists and documentary makers face problems in selecting authentic images from online sources. If a broadcast journalist were mistakenly to broadcast ‘deep fake’ pictures in news or current affairs, so that viewers were misled, the consequences would be significant. Ofcom takes a very dim view of such cases and might fine the service. (See Rules 2.2 and 5.1 of the Broadcasting Code.)
It is not only ‘deep fakes’ that cause Ofcom concern. Sometimes footage used in gaming is so realistic that it can be confused with the real thing. This happened several years ago, when the serious ITV documentary Exposure: Gaddafi and the IRA caused a furore after it was revealed to have included video game footage (captioned ‘IRA film 1988’) which the programme falsely presented as genuine footage of the IRA shooting down a helicopter (see Broadcast Bulletin, 23 January 2013).
I personally think the greater risk in this area is of a ‘deep fake’ being presented as genuine to viewers in programmes, or on channels, funded or produced by people whose ideological or religious beliefs are at odds with mainstream opinion and who are attracted to conspiracy theories. Weak compliance on such a channel, combined with a ‘deep fake’ in tune with its prejudices, might well lead to a serious problem.
As the Exposure: Gaddafi and the IRA case shows, the potential harm of presenting ‘deep fakes’ or other synthetic media to viewers as authentic is that it undermines trust in news, current affairs and factual programmes.
RISK TWO is a breach of the Fairness and Privacy rules (Sections Seven and Eight of the Code). Audiences could mistake ‘deep fake’ images of a real person for genuine footage, in a way that causes unfairness to that person or unwarrantably infringes their privacy.
Ofcom’s Note to Broadcasters is timely and helpful, rightly stressing that broadcasters can of course include ‘deep fakes’ or other synthetic material in programmes. They should not be frightened of doing so. BUT if they do, and this is the most important point, they must ensure they comply with the Broadcasting Code.
To avoid problems in this area my compliance advice to broadcasters is straightforward: ‘If you include synthetic media (including ‘deep fakes’) in programmes, take measures to ensure viewers do not think they are genuine images.’