Hit it toots! And I was kinda like, what an idea! You’re fired! In the past year, a powerful new technology has emerged.
Do you buy a savvy investment in a thriving business? It’s what enables the face-swap videos
you may have seen on social media, but some believe it will have far more
profound implications for politics and society. Software powered by artificial intelligence can now create wholly fake images, video and audio. They’re called deepfakes and they are
increasingly indistinguishable from reality. Which one do you think is fake?
Can I, can I watch them again or, yeah. They literally all look real.
Yeah. I need to watch it again. Can you spot the fake? Three of these
video clips are of the real Barack Obama, one of them was entirely generated
by a computer. Can you spot one? No. Wanna watch it one more
time? One more. All right. To test the current state of this technology, we showed people on the Venice Beach
boardwalk these clips to see if they could spot the deepfake. This is crazy.
This is a totally fake video. Wow. Foremost in all of our minds has
been the loss and the grief. Wow. Couldn’t tell. That’s good. I visited with the families of many of the victims on Thursday. As deepfakes get more realistic, pundits and lawmakers say we need to take action to control this scary new tech. They could be anybody from a transnational group to cyber hoodlums and vigilantes, and we are not ready for it. We are not ready for it. They claim that the rise of deepfakes will bring to an end the very idea of generally accepted truth. Seeing is no longer believing. We are entering into a future where anything can be faked, well then nothing is going to be real. You can literally manipulate the news
as it happens. In this dystopian future, even a 4chan-using basement dweller
could alter the course of international politics. It could have me say things like, President Trump is a total and complete
dipshit. Does it scare you? Yeah. Yeah because we were fooled on a lot of them. Yeah. A recent Pew poll found that 77% of Americans think that government needs to step in and restrict altered or made-up videos or images. Do you think that the government should
make that a crime? Yeah, I do think so. As long as it doesn’t infringe, as long as that doesn’t bleed over for, you know creative laws. But does this technology really represent
an unprecedented assault on truth that requires new restrictions on speech? What the most breathless predictions
overlook is that deepfakes are just the latest extension of a tradition of manipulation that goes back to the earliest days of
captured and reported media. Not only do deepfakes not pose a threat to society, the technology has many exciting potential
applications that could be stymied by legal restrictions. Now it’s true that deepfakes are getting
more realistic and easier to create. Stay woke bitches. Using a single selfie from a user, the Chinese app Zao can replace a celebrity in a movie scene with anyone holding a smartphone. Stanford researchers have developed an algorithm that can edit talking-head interviews as easily as text, subtracting or adding words the subject never really uttered. I love the smell of napalm in the morning. I love the smell of French
toast in the morning. And several companies have unveiled software that can sample just seconds or minutes of someone’s voice. I kiss my dogs and my wife. And then put any words into that person’s mouth that can be typed into a computer. I kissed Jordan three times. They had their faces plastered on other
women’s bodies in pornographic videos. Redditors recently started posting videos that superimposed celebrities into porn scenes using a product called FakeApp, and the short-lived app DeepNude allowed users to virtually undress a celebrity or anyone else. That’s scary cause it’s like, how do you prove that’s not you? You never thought of that, did you? No, I didn’t. But yeah. I think the technology’s definitely scary. Reddit and other sites quickly banned these concoctions, and many deepfake software providers limit the availability of their tools on ethical grounds. But many people we spoke to think only
the state has the power to halt the spread of deepfakes. I’d like to think that if this is a concern for the people, that our government would take the right strides to protect us. So-called deepfakes that enable malicious actors to foment chaos, division, or crisis. What you have is not a threat to our
elections, but a threat to our republic, a constitutional crisis unlike we have ever faced in the modern history of this country. The House is considering legislation
called the Deep Fakes Accountability Act, or the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act, which would make it a crime to post, quote, synthetic media, unless it’s labeled with irremovable digital watermarks. It’s kind of like South Park. I mean, you know, in South Park, that that’s not who it’s being portrayed to be. I don’t know. There is a weird gray area. I do think it should be illegal, but at the same time, people should, Do you think it’s possible to make it illegal without sucking up stuff like that? That satire? Yeah I think so, I think so. But the proposed law would be impossible to enforce without also ensnaring vast amounts of benign and creative speech. The bill is vague as to what should count
as synthetic media, and its response to software that can alter any image is to make it a crime to use software
to remove watermarks. You’re the government and you’re trying to, you know, define what a deepfake is so that you can ban it or you can regulate it. How would you define it? As long as the parties involved are all informed and consensual, then it’s fine. But if I don’t know, cause then we’re going back to the South Park thing where it’s free, it’s free use to, like, let’s say for example, I take this video and I put your face into the Avengers. Should it be banned? Yeah. Yeah. It should be banned because
like I said, you didn’t have my consent, but then isn’t there already laws like that? Yes, there are. Existing defamation, libel, and fraud laws already make it a crime to knowingly lie about what someone said or did to cause them harm. And though civil libertarians debate their constitutionality, revenge porn laws do exist in most states, explicitly banning the posting of sexual content without the subject’s consent. Proposed regulations on deepfakes thus inherently take aim at certain types of speech, regardless of any actual harm. Come on. So how would lawmakers use new powers
to restrict so-called synthetic media? A recent viral video featuring House Speaker Nancy Pelosi put the question to the test. President Trump tweeted out a video
of the House Speaker that was edited. The clips were slowed down to make
the Congresswoman sound intoxicated. Give this president, the opportunity, to do something historic. Facebook marked the clip as doctored and limited its distribution, but when executives refused Pelosi’s demands to remove it from the platform altogether, she called them quote, willing enablers of the Russian interference in our election. That video was not an
AI assisted deepfake, but rather a crude manual manipulation
that some have called a cheap fake. Nonetheless, its virality on social media demonstrates the scale of the challenge we face. The Pelosi clip also demonstrates how lawmakers would quickly take aim at nonsynthetic media, as such low-tech fakes can be just as convincing. So what makes deepfakes a
unique threat to the truth? Some argue that their spread will enable
politicians to dismiss every major caught-on-video moment as a fake. You
can do anything. Whatever you want. Grab ’em by the pussy. He’s now claiming that the infamous Access Hollywood tape is doctored, that this is a fake tape. President Trump’s recent attempt to spin his notorious Access Hollywood appearance as fakery has only fueled this fire. Alarmists conjure a future in which even
scenes of presidential inaugurations could become a battle of
discerning whose video is real. This was the largest audience to
ever witness an inauguration, period. Actually, that already
happened in 2017. In fact, none of these worries are new, nor are the calls for government intervention. Photographers have been convincingly
altering depictions of reality since the invention of the medium, including for political reasons. For example, Stalin’s right-hand man, Nikolai Yezhov, was famously erased from
this photo following his execution. So you’d rather see them police it than people learn to question sources of things? There’s too many dumb people out
there man, I don’t think you can, um, I don’t think you can trust people to
be able to recognize it all the time. Radio was also once considered a medium
for dumb people. Because it was piped directly into American living rooms, some feared that it would drown out the
more sober reporting you might find in newspapers. This occurrence possibly has nothing
to do with the disturbances on Mars? After Orson Welles’ radio broadcast of the
War of the Worlds in 1938 tricked some listeners into believing there
was an actual alien attack, newspapers seized on the incident to rally support behind restrictions on the scary new technology. Similar calls have followed the emergence
of new technologies, from the telegraph to the internet. Fake video going viral on social media. The real image on the right,
Emma tearing up a target poster, not the Constitution. A bill to cut down on Photoshopping of models in magazines. Do you think that this represents sort
of a fundamental shift in manipulation of news? That Photoshop or deceptive editing in the past have not? No, I think it’s the same. The same thing. People still believe Photoshopped
pictures as well. Yeah. Do you think there should be similar
rules for like Photoshop photos? Yeah. I don’t know, that’s a hard one man. Most of the people we spoke with, including those who favor
legal restrictions on deepfakes, have adapted to the existence of photo-editing software and no longer view it as a threat to the nature of truth that requires government intervention. Do you think things like Photoshop photos
should also be regulated in the same way? No. There’s always going to be someone who’s gullible enough to believe it, but usually they’re going to
believe whatever they advocate. And to be honest I wouldn’t care if that
person’s fake, if the news is correct. Cause a lot of times we get the
correct person but the fake news, which would you rather have? It’s horrible for us to have to like
question everything we see on the news or online. It’s like is that real or is
this fake news? But you should question. So I guess the more people know about deepfakes like the better cause then you know to question. The technology behind deepfakes could
allow real-time language translation in video chat. Or give back a voice to people who
have lost theirs to illnesses like ALS. I want to sound like me. It could help actors age up or down for
movie roles and create opportunities for underrepresented, indigent,
or disabled artists. Laws supposedly targeting deepfakes could stymie these and other exciting potential applications. And they could also give powerful people the ability to censor any content they find embarrassing. You want to keep your
money and your freedom. Deepfakes may be the latest front in media manipulation, but like the people we interviewed, most consumers have adapted
to past technological leaps and will continue to do so. And just as technology
enabling deepfakes is advancing, so is the technology that can identify them. Software in development can analyze metadata and the video itself, using AI to detect subtle signs of manipulation. And the company Amber uses cryptography and the blockchain to verify the time and location a video is captured, as well as its content. These tools are hardly foolproof, but they’ll continue to get better.
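(For illustration only: the fingerprint-and-check idea behind services like Amber’s can be sketched roughly as follows. This is a minimal Python stand-in, with an in-memory registry in place of the tamper-evident ledger such services actually use; the function names and data are hypothetical, not any company’s real API.)

    # Illustrative sketch: fingerprint a clip at capture time, then check whether
    # a later copy is bit-for-bit identical. A real provenance service would
    # anchor these records in a tamper-evident ledger rather than a Python dict.
    import hashlib
    import time

    registry: dict[str, dict] = {}  # fingerprint -> capture metadata

    def register_capture(video_bytes: bytes, location: str) -> str:
        """Record when and where a clip was captured, keyed by its SHA-256 hash."""
        fingerprint = hashlib.sha256(video_bytes).hexdigest()
        registry[fingerprint] = {"captured_at": time.time(), "location": location}
        return fingerprint

    def verify_copy(video_bytes: bytes) -> dict | None:
        """Return the capture record if this copy matches a registered original, else None."""
        return registry.get(hashlib.sha256(video_bytes).hexdigest())

    original = b"...encoded video bytes..."
    register_capture(original, "Venice Beach boardwalk")
    print(verify_copy(original) is not None)        # True: unaltered copy
    print(verify_copy(original + b"edit") is None)  # True: altered copy has no record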
Even so, software already exists to spot
realistic fake images made with programs like Photoshop. But the people we talked to said they
still find tried and true media literacy strategies more useful. A lot of people like will send me
something and they’ll be like look, and I’m like that doesn’t seem right. And so if I think it’s
like something weird, I’ll like Google it and
like look it up. Yeah. I do like every other good person.
I Google it. Or I Snope it. Or I call the local drunk down the street who’s going to the barber shop cause he knows everything. But the main thing I do is I will
go and check for other sources. Do you think that you could still do that for deepfakes without having to have it government regulated? Yeah. Yeah. Like it’s there for me to do if I want
to do it and really, really figure out. So Americans may want to reconsider
whether there really is a coming techno-apocalypse that will send
society into a tailspin of lies. Perhaps this brave new world isn’t
really all that different than those that have come before. And whether the
fakes are obvious, and it will be huge, or subtle, foremost in all of our minds,
media consumers can and will adapt. You believe what you want to
believe. You find what you look for. If somebody wants to believe that Hillary
Clinton was in a threesome with Donald Trump and a boa constrictor,
they’ll believe it. It’s up to you to determine whether it’s
fake or true and look a little deeper. [inaudible].


98 thoughts on “We Asked if the Government Should Regulate Deep-fakes”

  1. Maybe not immediately, but anyone caught distributing deep fakes for reasons other than, say, parody or comedy, and passing it off as truth, should be subject to legal punishment.

  2. Deepfakes are overblown. There are potentially some near-term issues that need to be mended, but they can be mended quite easily, and some bigger issues much further down the line, if ever (like edited security cams), but overall it won't be too big of a deal. Faked content has been around ever since writing came to be and ever since photo editing existed, etc.
    Ever since video has been around, there have been lookalikes.
    That said, there should definitely be protections in place, but that doesn't mean anything new needs to be added, because many protections are likely already in place, such as defamation or personal image rights.

    edit: I guess the video already covers this stuff.

  3. The people who claim they are going to google everything they see and hear online and in print are either stupid or lying. We usually reflexively accept anything we agree with or expect to be true without follow-up, unless there is some obvious flaw in it. We research to debunk things we disagree with or that defy our expectations. That's doubly true these days, when most people have their preferred political "side" and back it with all the loyalty we used to reserve for our favorite baseball and football teams. Believing that people will be so media savvy that they won't immediately accept a video of their political opponents fucking up is naive in the extreme. It's akin to thinking that people under communism will suddenly become angels who will stop acting selfishly. It'd be nice, but it's not the humanity that exists in the real world.

  4. Do you really expect the powers that be to regulate themselves? If they pass any law at all, it will have a loophole that allows them to use deepfakes and forbids all else, just like they do with Ponzi schemes, tax evasion, and every other double standard that allows the creation and maintenance of the current caste system.

  5. You can't regulate or control this type of technology. The best you can do is have platforms like YouTube and Facebook run anti-deepfake AI that checks every video that is uploaded. If a deepfake is detected, it flags the video and informs the users watching it that it's a deepfake. This can keep fake news in check while still allowing for parody videos to be made.

  6. "Americans may want to reconsider whether there really is a coming technopocalypse that will send society into a tailspin of lies"

    When did our legislatures become so bored they had to regulate things before they become anything resembling a problem?

  7. "4chan-using basement dweller" Stereotyping at its finest. A libertarian who thinks nothing should be regulated until they get scammed by a product they thought was legit.

  8. You can't prove that a deepfake is fake or that it is not fake. You can't prove that a suspected deepfake is actually a fake.

  9. I think the world took a wrong turn when the mobile phone was created. People don't have as much contact with their loved ones anymore and our social construct has fallen apart. I've seen people texting each other while sitting right next to one another, and it makes me wonder where we go from here. Family structure will be their next target; people divided are a weak force to reckon with, and we're losing the battle to segregation.

  10. There are already laws barring false impersonation and they apply to all mediums: libel, slander, and defamation laws; plus laws against committing fraud and identity theft. Seems like we have all the laws we need to punish bad actors who use any technology to cause harm to a person’s reputation.

  11. But the United States is owned by The City of London. The United States President works for The City of London, so if you get elected, what will you do to tell the owners of The United States Military that you won't go to war? This is why Presidents always promise things, but when they get in office they soon learn they aren't the power.

  12. Thanks for talking about deep nde, it was pretty easy to access. Funny program.

    "The Streisand effect is a phenomenon whereby an attempt to hide, remove, or censor a piece of information has the unintended consequence of publicizing the information more widely." I'm a victim of that effect.

  13. Doesn't matter what the government does. The only option in the near future for the local, state and federal government AND private business will be to ban ANY video as reliable evidence for "fill in the blank" legal and commercial applications. The pathetic side of our Brave New World.

  14. Let the free market protect us. Someone could more than likely invent a browser extension that gives you a warning when a deep fake is detected. Boom – done. Whoever makes this please give me a %.

  15. Celebs are only upset about the porn because they weren't paid and didn't do it. The govt already does this. Now they're telling everyone MKUltra.

  16. Anything the government touches is a disaster, so HELL NO! 2:08 recent polls show that it's 99% certain Hillary will win in a landslide. ….wait…..!?

  17. 10:08 – "it's horrible for us to have to question everything we hear from news" 🤔 wait, you weren't doing that all your life?

  18. This shoulda been titled “should the government regulate deepfakes? We asked donald trump” and it shoulda been a deepfake of donald trump on a toddler like the thumbnail

  19. People believe what they see in the media. Just look at CNN: even though Project Veritas exposed CNN as biased, people still believe it.

  20. Should the government regulate deepfakes? That's like asking a person with Alzheimer's how to get rid of Alzheimer's. Govern means control, ment means mind; government literally means mind control. Dumb fucks vote.

  21. OF COURSE NOT
    if we have this tech now, it means it has existed for at least 20 years. All tech we have is 2nd hand from tech our taxes created for Govts to use against their collective cattle. wake the fk up.

    If anything, this should teach us to believe nothing we see and hear 2nd hand, or to think…….or at least not to have strong opinions with low IQ.

  22. Only the politicians have anything to fear from this. So of course they want to regulate. The fact is if you analyze the video, you will be able to tell.

  23. Seriously? The acronym? Maybe we go see people in person with our own eyes. Media is already using spin and sound bites to create very convincing fake news.

  24. Seal is still a thing. Not to worry.

    Gin,

    Edited:
    Also, the idea of reputation: once a person is found guilty of lying, a reputation mechanism should kick in. That person will be believed less in the future, unless they make an effort considered adequate as proof of atonement.

    Final point: there will always be new ways to lie, yet the solution stays the same, the truth. Only truth is the antidote to ever-changing lies. And free speech is a tool created to help find truth.

  25. Intention to mislead is the key, and what other definition can you put on the CIA, FBI, Democratic party, and a few hundred other denizens of the district of criminals????

  26. @ 5:55: No. Defamation is not a crime, it is a tort. The difference is between the government arresting you for the act or a person suing you for the act. THIS is a distinction that cannot just be overlooked. When the government is involved, we have serious 1st amendment issues. When a private citizen is involved, pull out your checkbook. Yes there are criminal defamation laws, but these laws would clearly bring up the 1st amendment issues and would more than likely not be prosecuted.

  27. Nothing new to me, knowing that software like this has existed since the time people were running XP on their computers. There is no need to heavily regulate the internet with an extra regulation like this…

  28. The governments would never –
    1984
    The Great Leap Forward
    Well, the US government would never –
    Texas – chemical castrations
    California – goodness CA a literal poop area
    Maryland – red flag laws

  29. As if the media wasn't already negatively influencing the masses. Now, any fuktard can spew forth BS, making it seem as though it's legitimate. As with most things having the potential to be positive, it will be utilized in a grotesquely harmful manner. Just as with guns, there are numerous laws already in place; they simply aren't being enforced.

  30. Regulation probably isn't the solution, but protection might be. That would be protection for individuals to have rights to their own image and voice, much the way copyright protects created works. If a media source publishes a deep fake construct without proper clearances, then they would face the possibility of defending that choice in court, with a large financial penalty as the consequence.

  31. "We asked several burnouts on the street if they can tell the difference between a computer image and real people talking. Go figure, they can't. So, likely your dementia stricken grandmother on Facebook won't either."

  32. No. I will discern truth from falsehood on my own. If the government does it for me, what is to stop them from just offering me fake news? Like CNN or MBC? Or worse yet, the Young Jerks?

  33. I already don't believe everything I see or hear unless it's with my own eyes. Cause, you know, movies have talking animals, aliens, and long-dead actors alive again. See also the movie “Wag the Dog”.

  34. Dubbed videos, slow-mo videos, remixes of people, photoshopping, memes, clipped videos, etc. could all be examples of deep fakes in the government's eyes. Many, many videos and content creators could be under attack, and with a biased government they could shut down one political party while allowing the other side to continue spreading propaganda legitimized by the law.

  35. “Some corrupt people may abuse this technology, so let's allow a clearly corrupt group of people to decide who gets to use it for us and declare what is and isn't a deepfake.”

    10/10

    Great idea

    Can not go wrong

  36. What could ever go wrong with government laws? https://www.lancasterguardian.co.uk/news/council-under-fire-using-terror-law-catch-dog-foulers-656715 Fuck, these people need a dose of history; they do not understand the implications of what they are asking. Do not ever be fooled into trusting the government.

  37. I'm more concerned with evidence in court. Video evidence is some of the best ways to prove someone's guilt or innocence. What if someone makes a really good deepfake of you, to either incriminate or embarrass you? At the very least, it should be illegal for any governments to use them in any way, and anyone making a deep fake of someone without that person's consent should have to label it as a deep fake before putting it in the public record – not necessarily with a watermark, though.

  38. I live in a home for the blind.
    Some time last year there was a report of criminals who used AI-based learning to generate synthetic voice-calls in a scheme to steal money from a German bank.
    My first thought when I read about this was, This would make the work of Tape Aids For The Blind so much easier.

    A volunteer spends three hours reading law-texts for a blind law-student; a blind reader rejects an electronically generated voice as it causes distress over long periods of listening.

    This is one small area that deep learning can help and enrich our lives, but I fear that the inevitable kneejerk thou-shalt-nottery is, yet again, going to profit only two groups: the legislators, and the forces that will profit from enforcing those laws; and the criminals who, as in the days of Prohibition, are enabled and empowered by those very laws intended to suppress them.

    (Search "law of unintended consequences".)

  39. Probably the solution to deep fakes is to make sure that all sound/video/photo devices have a cryptographic key associated with [each/some random] frame/s of the data recorded by the device. Then if someone claims that the content is a deep fake, the cryptographic key of the original content can be compared with the cryptographic key of the allegedly fake content to see if they match. The cryptographic key should also include the date of the content, in order to know which content was created first. This cryptographic method for media could be mandated by law.
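    (A rough sketch of how the scheme proposed above could work, for illustration only: the capture device holds a private signing key, signs a hash of each frame plus its timestamp, and anyone with the matching public key can later check whether a circulating copy was altered. Python, using the third-party cryptography package; the key handling and frame bytes are placeholders, not any real device's API.)

    import hashlib
    import time

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # In a real camera or microphone, the private key would live in secure hardware.
    device_key = Ed25519PrivateKey.generate()
    public_key = device_key.public_key()

    def sign_frame(frame_bytes: bytes, timestamp: float) -> dict:
        """Hash the frame together with its capture time, then sign the digest."""
        digest = hashlib.sha256(frame_bytes + str(timestamp).encode()).digest()
        return {"timestamp": timestamp, "digest": digest, "signature": device_key.sign(digest)}

    def verify_frame(frame_bytes: bytes, record: dict) -> bool:
        """Recompute the digest and check it against the device's signature."""
        digest = hashlib.sha256(frame_bytes + str(record["timestamp"]).encode()).digest()
        if digest != record["digest"]:
            return False  # the frame content was altered after capture
        try:
            public_key.verify(record["signature"], record["digest"])
            return True
        except InvalidSignature:
            return False  # the signature was not produced by this device's key

    frame = b"...raw pixel data for one frame..."
    record = sign_frame(frame, time.time())
    print(verify_frame(frame, record))              # True: untouched frame
    print(verify_frame(frame + b"tamper", record))  # False: edited frame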
