CREDIT: NIESSNER LAB / VISUAL COMPUTING GROUP AT TUM

Information apocalypse: All about deepfakes

VIDEO: Doctored images, fake videos and computer-generated news are easier than ever to create and distribute. Here’s why you should worry and what you can do.


PRODUCED BY HUNNI MEDIA FOR KNOWABLE MAGAZINE

Today, computing power and artificial intelligence have democratized the creation of faked media. Images can be synthesized, videos manipulated, audio mimicked and text written, all by deep-learning neural networks. These can then spread on social media platforms, distorting our shared reality. Digital image forensics expert Hany Farid of UC Berkeley discusses how fake forms of media are created, how they proliferate, the perils they pose and what people can do to guard against them.

Read more: Synthetic media: The real trouble with deepfakes

Video Transcript:

Hany Farid (digital image forensics researcher, UC Berkeley School of Information): “We are in an information apocalypse. And the fact is that the majority of people now get their news from Facebook and from social media. That should freak everybody out.”

“Twenty years ago, my lab started focusing on developing these forensic techniques to authenticate content. At the time, it was a niche field, and then something bizarre and amazing happened over 20 years: The internet exploded, cell phones and mobile devices exploded, social media exploded, and we started seeing mis- and disinformation designed to disrupt democratic elections and sow civil unrest, non-consensual pornography, and small- and large-scale fraud that is having a huge impact on our economy.”

“Photo manipulation is not new. If you look back through history, Stalin famously airbrushed people out of photographs. Mao did it. Hitler did it. They would literally take images of people who were their friends one day and, when those people fell out of favor, make them vanish from the photographs. But when Stalin did it, there were probably 10 people in the world who could do it, so the capability was in the hands of relatively few, and of course we didn’t have the distribution mechanisms that we have today.”

Deepfake image of Kim Kardashian: “I genuinely love the process of manipulating people online for money.”

Hany Farid: “There are four wings of deepfakes: text, audio, image and video. We now have the ability to write a headline, and then hand that headline over to a computer system, and it will write an article — a full-blown article, top to bottom — based on the headline, and it’s really good. And so why should we worry about that? Well, imagine I want to create misinformation campaigns and I can train these things to write articles in a certain voice, and I can carpet-bomb the internet with these things.”
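
The headline-to-article generation Farid describes is driven by large language models. Here is a minimal sketch of the idea, using the open-source Hugging Face transformers library with the small GPT-2 model standing in for the much larger systems he refers to; the headline is invented for illustration.

```python
# A minimal sketch of headline-to-article generation, using the open-source
# Hugging Face `transformers` library. GPT-2 is a small stand-in for the
# larger systems Farid describes; the headline below is a made-up example.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

headline = "Scientists Announce Breakthrough in Fusion Energy"

# The model continues the prompt token by token, producing article-like text.
result = generator(
    headline,
    max_length=200,          # total length in tokens, prompt included
    num_return_sequences=1,
    do_sample=True,          # sample rather than take the single likeliest token
    temperature=0.9,         # higher values make the output more varied
)
print(result[0]["generated_text"])
```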

“In the audio domain, we have technology for synthesizing speech in a person’s voice. So you would record a couple of hours of me speaking, and then you would do what’s called ‘text-to-speech.’ I would type and the computer will synthesize me saying anything you want me to say. We have technology that will synthesize images of people who have never existed. There’s this amazing website out there called thispersondoesnotexist.com, and when you go to that site, true to its name, it will show you an image of a person who doesn’t exist — men, women different races, different ages, glasses, facial hair, incredibly well done. And we’ve already seen the first nefarious uses of that technology in the creation of fake profiles on Twitter, on YouTube, on Facebook and on LinkedIn to either start to promote fake news, to sow civil unrest, to commit fraud. And then perhaps the biggest threat is the fake video.”
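
The voice-cloning workflow Farid outlines, recording a sample of a speaker and then typing text for the computer to say in that voice, is available today in open-source tools. Below is a sketch using the Coqui TTS library and its XTTS voice-cloning model; the model name reflects that library’s documented API, and the file paths are placeholders rather than anything from the interview.

```python
# A sketch of "text-to-speech in a person's voice" using the open-source
# Coqui TTS library and its XTTS voice-cloning model. The reference clip
# and output paths below are placeholders.
from TTS.api import TTS

# Load a multilingual voice-cloning model (weights download on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Clone the voice from a short reference recording and speak new text with it.
tts.tts_to_file(
    text="I never actually said this sentence out loud.",
    speaker_wav="reference_voice.wav",  # a clip of the target speaker
    language="en",
    file_path="cloned_speech.wav",
)
```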

Deepfake image of Jeff Bezos: “Hi. I’m Deepfake Bezos.”

Deepfake image of President Obama: “President Trump is a total and complete dip-(beep).”

Hany Farid: “There are three types of deepfakes today: face-swap, lip-sync and puppet-master. Face-swap, as the name suggests, is where I take a video of me talking and put another person’s face, from about the eyebrows to the chin, onto mine, frame after frame after frame, so it looks like somebody else. The lip-sync deepfake is where you take me talking, the way I’m doing right now, synthesize a new audio track, and simply replace my mouth so that it’s consistent with that audio track. So I will have been saying one thing, but the fake video is me saying something completely different.”

Deepfake images of President Obama: “NBC glad. Why? Fox TV jerks quiz PM.”

Hany Farid: “And the puppet-master deepfake — in some ways, the biggest threat — is where you take a single static image of me, and then you have a puppet master sit in front of the camera and talk and move, and my face gets animated according to their motions. And then of course you add the audio track to that.”

“So you can see why people are concerned about the deepfake video in particular and the audio because you can literally put words into people’s mouths.”

Deepfake image of President Obama: “You see, I would never say these things, at least not in a public address, but someone else would, someone …”

Jordan Peele: “… like Jordan Peele.”

Hany Farid: “Whether that’s a presidential candidate saying something like ‘I’ve launched nuclear weapons’ or something racist or illegal, or whether it’s a CEO saying, ‘Our profits are down 10 percent’ and the stock market moves billions of dollars instantaneously, or anything in between those things.”

Deepfake image of Mark Zuckerberg: “Imagine this for a second: One man with total control of billions of people’s stolen data — all their secrets, their lives, their futures.”

Hany Farid: “How does it work? You actually have two computer systems, two algorithms: One is called the synthesis engine and one is called the detector. The synthesis engine’s job is to generate an image of a person. So here’s what it does: It slaps down a bunch of pixels, just literally random pixels, so if you looked at it, you’d say, ‘This is nothing.’ And it hands it over to the detector, and the detector’s job is to say, ‘Is this a person?’ What it has at its disposal is a bunch of images of actual people, and so it compares them and says, ‘You know what? I can tell the difference.’ And so it goes back to the synthesis engine and says, ‘Nope, this isn’t a person.’ And the synthesis engine modifies a few pixels and sends it back, and they do that a few million, a few hundred million times, and eventually what will happen is the synthesis engine, not entirely by trial and error but somewhat by trial and error, will create an image where the detector says, ‘Yeah, this looks like all these other people,’ and it’s a person who doesn’t exist. That’s it.”

“And the reason we call it ‘synthesis’ is that you are synthesizing out of whole cloth. And that’s true, by the way, whether it’s text, audio, image or video: The same core technology underlies all of it.”
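
What Farid describes is a generative adversarial network (GAN): His “synthesis engine” is the generator and his “detector” is the discriminator. Below is a minimal PyTorch sketch of that adversarial loop, with tiny toy networks and random vectors standing in for real face images; production systems such as the StyleGAN model behind thispersondoesnotexist.com are vastly larger but follow the same pattern.

```python
# A minimal PyTorch sketch of the generator ("synthesis engine") and
# discriminator ("detector") loop of a GAN. Toy networks only; real face
# models use deep convolutional architectures and genuine image data.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64  # toy sizes for illustration

generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, data_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(data_dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),  # probability "this is real"
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(10_000):
    real = torch.randn(32, data_dim)     # stand-in for a batch of real images
    noise = torch.randn(32, latent_dim)  # the "random pixels" starting point
    fake = generator(noise)

    # Detector: learn to label real samples 1 and synthesized samples 0.
    d_loss = (loss_fn(discriminator(real), torch.ones(32, 1))
              + loss_fn(discriminator(fake.detach()), torch.zeros(32, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Synthesis engine: adjust its output until the detector calls it real.
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```

Each pass through this loop is one round of the back-and-forth Farid describes: the detector gets better at spotting fakes, which forces the synthesis engine to produce fakes the detector can no longer distinguish from the real examples.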

“So where do we go as a society, as a democracy, as a group of people who have to have a shared set of facts about what’s going on in the world? We have different opinions, but we have to have a common set of facts. And if everybody can simply start to claim that news they don’t like is fake, and everybody lives in these echo chambers of social media, where they see things that conform to their previously held views, where are we as a democracy? How are we going to understand people who are different from us? How are we going to have a democracy? And therein lies, in my view, the bigger threat of deepfakes: They are part of a continuum, but really we are reaching a point where text and audio and image and video simply cannot be trusted anymore online.”

Deepfake image of President Nixon: “Fate has ordained that the men who went to the moon to explore in peace will stay on the moon to rest in peace.”

Hany Farid: “We’ve looked at how well people can detect manipulated content — we’re terrible at it, we’re just bad. And what’s particularly dangerous here is that, because we have so much exposure to digital images and digital video, we think we’re good at it, but we’re not.”

“Two pieces of advice: Slow down, and have a healthy amount of skepticism. The ‘slow down’ is important, right, because we are really fast online, we’re very quick to respond, because we’re being manipulated. But the other one is that we have to return to the old adage that exceptional claims require exceptional evidence. Do we really have to have a conversation about whether Hillary Clinton is running a child pornography ring out of a pizza joint in DC? I mean, is this really the world that we’re occupying now?”

“Here’s a third one: Delete your Facebook account, delete your Twitter account and get off of YouTube, because honestly the rot that is social media is rotting your brain, it’s rotting our society and it’s destroying our democracy. So if I could pick one thing, I would just get you off of social media, because I think whatever good came out of it has long since passed; it is largely negative now and we should just abandon it. It is a failed experiment, and we should go and try to do something else. By the way, for full disclosure, I get funding from Facebook, so, you know, we’ll see if that grant gets renewed.”
