
Fake News: you ain’t seen nothing yet


The above-titled article in The Economist drew Adam’s attention.

At a time when politicians daily cry ‘fake news’ and posit ‘alternative facts’, this piece on recent technology developments raises some interesting questions.

The lead-in is this video clip.

The clip’s creator, Mario Klingemann, noted on the YouTube clip:-

Published on Feb 4, 2017 

Another go at FaceTransfer. Using the face landmarks of another person a generative adversarial network generates a new face, in this case it was trained on the face of Françoise Hardy using various music clips as source material. In order to show the range and limitations of the generated latent face space I’ve added some random camera motion and zoom. There are some weird single frame face jumps in there which I still try to figure out what causes them.

The Economist noted:-

She is asked, by a presenter off-screen, why President Donald Trump sent his press secretary, Sean Spicer, to lie about the size of the inauguration crowd. First, Ms Hardy argues. Then she says Mr Spicer “gave alternative facts to that”. It’s all a little odd, not least because Françoise Hardy (pictured), who is now 73, looks only 20, and the voice coming out of her mouth belongs to Kellyanne Conway, an adviser to Mr Trump.

The video, called “Alternative Face v1.1”, is the work of Mario Klingemann, a German artist. It plays audio from an NBC interview with Ms Conway through the mouth of Ms Hardy’s digital ghost.

The key element here is that Mr Klingemann generated this on a personal computer using a GAN (a generative adversarial network), a form of machine-learning algorithm.
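To give a flavour of what a GAN actually does, here is a deliberately tiny sketch, not Mr Klingemann’s face model, just the adversarial idea stripped to scalars: a one-line generator learns to mimic samples from a target Gaussian by fooling a logistic discriminator. All parameter names and targets are invented for illustration.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Generator G(z) = a*z + b; discriminator D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0                  # generator starts far from the data
w, c = 0.1, 0.0                  # discriminator starts near-blind
lr, batch = 0.05, 64
target_mean, target_std = 3.0, 1.0

for step in range(2000):
    z = [random.gauss(0, 1) for _ in range(batch)]
    real = [random.gauss(target_mean, target_std) for _ in range(batch)]
    fake = [a * zi + b for zi in z]

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    gw = gc = 0.0
    for x in real:
        s = sigmoid(w * x + c)
        gw += -(1 - s) * x / batch
        gc += -(1 - s) / batch
    for x in fake:
        s = sigmoid(w * x + c)
        gw += s * x / batch
        gc += s / batch
    w -= lr * gw
    c -= lr * gc

    # Generator step (non-saturating loss): make D(fake) look real.
    ga = gb = 0.0
    for zi, g in zip(z, fake):
        dg = -(1 - sigmoid(w * g + c)) * w   # d(-log D(g)) / dg
        ga += dg * zi / batch
        gb += dg / batch
    a -= lr * ga
    b -= lr * gb

samples = [a * random.gauss(0, 1) + b for _ in range(10000)]
print("generated mean: %.2f (target %.1f)"
      % (sum(samples) / len(samples), target_mean))
```

The same two-player loop, scaled up to convolutional networks over face images conditioned on another person’s facial landmarks, is what produces the “digital ghost” in the clip.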

This clip shows us a new battlefield between lies and truth. The current push to denigrate major media, or any news one disagrees with, as ‘fake news’ and to counter it with ‘alternative facts’ is having some success worldwide. Trump and Erdogan are major players in this.

Although written media are increasingly viewed with scepticism, images and sound recordings retain, for many, an inherent trustworthiness. GANs are part of a technological wave that threatens this credibility.

Further on the article notes:-

Amnesty International is already grappling with some of these issues. Its Citizen Evidence Lab verifies videos and images of alleged human-rights abuses. It uses Google Earth to examine background landscapes and to test whether a video or image was captured when and where it claims. It uses Wolfram Alpha, a search engine, to cross-reference historical weather conditions against those claimed in the video. Amnesty’s work mostly catches old videos that are being labelled as a new atrocity, but it will have to watch out for generated video, too. Cryptography could also help to verify that content has come from a trusted organisation. Media could be signed with a unique key that only the signing organisation—or the originating device—possesses.
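The signing idea in that last sentence can be sketched in a few lines. A real deployment would use a public-key signature scheme (e.g. Ed25519) so that anyone can verify footage without holding the secret; Python’s standard library has no asymmetric crypto, so HMAC-SHA256 stands in here purely to show the sign/verify shape. The key and the ‘video’ bytes are invented for illustration.

```python
import hashlib
import hmac

# Held only by the signing organisation (or baked into the originating device).
ORG_SECRET_KEY = b"newsroom-signing-key"

def sign_media(data: bytes) -> bytes:
    """Return a tag binding the media bytes to the signing key."""
    return hmac.new(ORG_SECRET_KEY, data, hashlib.sha256).digest()

def verify_media(data: bytes, tag: bytes) -> bool:
    """Check the tag; altering any byte of the media breaks verification."""
    expected = hmac.new(ORG_SECRET_KEY, data, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

video = b"\x00\x01 raw video bytes from the camera"
tag = sign_media(video)
print(verify_media(video, tag))                 # untouched footage verifies
print(verify_media(video + b"tampered", tag))   # doctored footage does not
```

The point is not the particular algorithm but the workflow: a consumer (or a verification lab like Amnesty’s) could check that footage really came from a trusted source and has not been altered since signing.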

The technology raises a number of issues:-

  • moral and ethical
  • political
  • legal

It is likely that in the near future entire recordings and videos will be presented to us as genuine (for example, from a ‘citizen journalist’ claiming to have witnessed an event) when in fact they are entirely artificial, fabricated in somebody’s bedroom on a personal computer.

The consequences could be far-reaching and extremely unsettling.

