
Deepfake attacks

How an AI scam phone call has had a lasting effect on a family.

After explicit faked photos of Taylor Swift went around the world, US politicians have called for new laws to criminalise the creation of deepfake images. The term 'deepfake' describes how artificial intelligence – AI – can be used to digitally alter pictures, audio or video and trick us into seeing or hearing something that is not real.

It is not just the famous who are being targeted. Host James Reynolds hears the story of how a daughter's voice was copied and used to make a scam phone call to her mother.

"She said mom I messed up, and all of a sudden a man said 'put your head back and lay down' and that's when I started to get really concerned that she was either really hurt or something more was going on," Jennifer tells us. "And then she goes 'mom, mom, these bad men have me, help me, help me' and she starts crying and sobbing."

Thankfully her daughter, Brianna, had not been kidnapped but the call has had a lasting effect on the family.

Technology has long made it easier to adjust images, but artificial intelligence provides the means to create completely fake content from scratch. We bring together two women – one in the US and one in Australia – who have had their faces manipulated using AI to produce malicious pornographic images and videos.

A Boffin Media production in partnership with the OS team.

(Photo: Noelle Martin. Credit: Noelle Martin)


23 minutes

Last on

Sun 11 Feb 2024 12:06 GMT

Broadcasts

  • Fri 9 Feb 2024 20:06 GMT
  • Fri 9 Feb 2024 21:06 GMT
  • Sat 10 Feb 2024 09:06 GMT
  • Sat 10 Feb 2024 16:06 GMT
  • Sat 10 Feb 2024 19:06 GMT
  • Sun 11 Feb 2024 00:06 GMT
  • Sun 11 Feb 2024 12:06 GMT