August 26, 2019

"In April of this year, a company called Coding Elite exposed an artificial intelligence (AI) program that took a substantial sample of my voice..."

"...which is easily accessible on the YouTube lectures and podcasts that I have posted over the last years. In consequence, they were able to duplicate my manner of speaking with exceptional precision, starting out by producing versions of me rapping Eminem songs such as Lose Yourself (which has now garnered 250,000 views) and Rap God (which has only garnered 17,000) as well as Rock Lobster (1,400 views). They have done something similar with Bernie Sanders (singing Dancing Queen), Donald Trump (Sweet Dreams) and Ben Shapiro, who also delivered Rap God. The company has a model, the address of which you can find on their YouTube channel, which allows the user to make Trump, Obama, Clinton or Sanders say anything whatsoever."

From "The deepfake artists must be stopped before we no longer know what's real/I can tell you from personal experience how disturbing it is to discover a website devoted to making fake audio clips of you — for comic or malevolent purposes" by Jordan Peterson (National Journal).

1. First, "garner." These videos don't just get views. They garner them.

2. How do I know this is really written by Jordan Peterson? It could just be the old-time, shallow fake of putting somebody's name on something that he didn't write.

3. Assuming it is Jordan Peterson, why should I accept what he's saying at face value? It could be the most common fake of all, an insincere opinion. I mean, look at the part I'm quoting above, with all the links to the fake version of him performing cool songs and prodding us to go listen and spend time within the wacky Jordan Peterson phenomenon. What's he really up to? What's it all about? He had that book where he went on about lobsters and now somebody got his voice performing "Rock Lobster." I've got to think this promotes his brand:



4. I can certainly understand wanting to stop deepfake versions of your voice and image (as well as the fraudulent use of your name), especially if it's done to trick people and to damage your reputation. Sometimes it's just having fun and no one is fooled (or is someone always fooled?... maybe you could think JP did "Rock Lobster," sort of like the way William Shatner did "Rocket Man").

5. What should we do about the deepfakes? Can they "be stopped before we no longer know what's real"? An effort to stop them could make the ones that are not stopped more convincing. We need to develop a better sense of what is real and can't count on the government to save us. The government is often the source of fakery. What about Facebook and YouTube and other social media? Should they take down all impersonations (or take down anything the impersonated person wants removed)? I don't know what the answer is here, only that I wouldn't want to stop satire and that there are negative consequences to purporting to protect people from fakery. We need to be the masters of our own mistrust.

176 comments:

The Godfather म्हणाले...

"Don't trust the government" shows great wisdom, O Althouse!

Chris म्हणाले...

Deep fakes will destroy us.

Hagar म्हणाले...

A long time ago they made a film - with Warren Beatty, I think - wherein they had Bill Clinton make a speech he never made, using clips from an actual speech and altering the words he spoke.

rehajm म्हणाले...

I'm Bill McNeal- On crack- I like boys!!!

David Begley म्हणाले...

It is my considered legal opinion that JP can sue for a common law copyright violation of his voice. Harley Davidson trademarked the sound of its engine.

Jaq म्हणाले...

Maybe these people who ridicule the lobster/prozac stuff, on which there has been a lot of research, BTW, are more crypto creationists.

Fen म्हणाले...

Video/audio has always been manipulated. This is just the 3.0 version.

Jaq म्हणाले...

Bring on the EMP.

rhhardin म्हणाले...

Credulity and entertainment are friends.

rhhardin म्हणाले...

Jokes have always needed a chain of custody.

richlb म्हणाले...

"...I wouldn't want to stop satire...."

Well, satire that you approve of.

Laslo Spatula म्हणाले...

My Deep Fake ate my homework.

I am Laslo.

rhhardin म्हणाले...

Artificial intelligence is itself a fake term.

Jaq म्हणाले...

Not ready for prime time. Great source material if you want to do a Canadian accent.

Temujin म्हणाले...

We are entering the times of 'Nothing is as it seems'. Both visually and audibly. There is no way to shut it down, outside of shutting off the internet and dealing only with real people, one on one. In the very near future, we will all be subject to misinformation spoken to us from sources we are sure we know, while they may or may not be real.

The election of 2020 may be the starting point for this use as a weapon. It will be a small trial run. But it will be tried and completed successfully. Between Russians, Chinese, Iranians, Eastern Euros, Western Euros, Israelis, our own government, our own major political party operatives, anarchists/hackers from all over the world, and CNN, who knows where the misinformation will be coming from? But it'll be coming in bushel-loads. And within the barrage of misinformation there will be a tiny experiment in fake visual or audio comments from someone within a campaign. It won't be real, but we'll think it is. And by the time the truth comes out, the damage will have traveled around the world twice.

We've entered the time where our technology has sped past our maturity as a species.

rhhardin म्हणाले...

On the bright side, public apologies can be automated.

Jaq म्हणाले...

Maybe we will finally get to see those hookers peeing on Obama’s bed while Trump does a Bond villain cackle.

rhhardin म्हणाले...

dealing only with real people

what about fake orgasms.

Ann Althouse म्हणाले...

Writing this post got me thinking of the Dylan lyric: "And the princess and the prince/Discuss what's real and what is not/It doesn't matter inside the Gates of Eden."

Shouting Thomas म्हणाले...

Doing business with a fake persona and avatar seems to be the future.

What better way to sidestep discrimination and equality mania?

It seems to me that working within this virtual world of fakes is the beginning skill we must all master.

TRISTRAM म्हणाले...

For a law professor, this seems a bit shortsighted. With this technology, and given the limits of eyewitness testimony, FBI labs falsifying drug tests, and states using quack forensics like bite marks, losing trustworthy audio and video evidence returns us to 'caught in the act' and vigilante justice, because you just won't be able to trust the state's process.

rhhardin म्हणाले...

The first thing automated, back in the 60s, was ELIZA, a psychotherapist program. Secretaries liked to talk to it.

rhhardin म्हणाले...

The real is a subgenre of the fake.

rhhardin म्हणाले...

Peterson's real complaint ought to be that he hasn't signed the fake stuff. He signs his real work.

It's a signature being faked.

rhhardin म्हणाले...

Obviously blockchain has to come in.

rhhardin म्हणाले...

I always post anonymously, but using my real name to keep it simple.

rhhardin म्हणाले...

There's a future in mining reality checks, as they have for bitcoin.

bleh म्हणाले...

I can guarantee there will be deepfakes in the near future of politicians saying the n-word or somesuch. Trump will be the first bigly victim of this, I predict. You saw how badly the “grab ‘em by the pussy” video wounded him. It’s the only time I can remember him apologizing for anything!

And if a politician is unsure whether he really said it, well, that’s damning enough to justify the fakery. No one will care if it’s fake.

Jaq म्हणाले...

It seemed more cut and paste to me. Little improved over that Clinton clip in the movie referenced above. I think all that they use esatz intelligence for is to make a database of the spoken words indexed to written words. You could disprove the above by matching the words to your public speech and noting that precise matches could be found to each of the words, but not in that above order.

There remains work to be done before we can use this to convict Trump of raping young girls. Emotion carries fluidly from word to word. I don’t see that here. This is like the first LED: it was decades before they got to the welder’s torch high beams that we see today.

iowan2 म्हणाले...

Deep fakes are a problem. Note how many people believe President Trump said White supremacists were fine people. That's just a simple matter of news readers repeating a known lie.

So it's only a problem when your moral and intellectual betters tell you it's a problem.

Honestly, deep fake, fake news, what is the difference?

Jaq म्हणाले...

Sorry ersatz intelligence.

Paul Snively म्हणाले...

I'd bet good money the fakes still wouldn't stand up to spectrographic analysis. The human brain will tend to overinterpret the signal, making misidentifying its source a form of availability error. The fast Fourier transform suffers from no such processes.
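A minimal sketch of that kind of spectrographic check in Python, assuming a couple of short 44.1 kHz WAV clips (the filenames real.wav and fake.wav are placeholders); the idea is that band-limiting and splice artifacts from voice synthesis can show up in the high-frequency energy of a spectrogram even when the ear is fooled:

import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def high_band_energy_db(path, band=(8000, 16000)):
    """Mean log power in a high-frequency band, where synthesis artifacts
    (band-limiting, splice discontinuities) often show up."""
    rate, samples = wavfile.read(path)
    if samples.ndim > 1:                      # mix stereo down to mono
        samples = samples.mean(axis=1)
    freqs, times, power = spectrogram(samples, fs=rate, nperseg=1024)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return 10 * np.log10(power[mask].mean() + 1e-12)

for clip in ("real.wav", "fake.wav"):         # placeholder filenames
    print(clip, "high-band energy (dB):", round(high_band_energy_db(clip), 1))

This is only a first-pass screen, not proof either way; a competent faker can match the spectrum, which is why the availability-error point still matters.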

Howard म्हणाले...

Back in the days of my youth white male patriarchy training, we were taught to believe only half of what you see and nothing of what you hear. This deep fake shit is another excuse for pearl clutching by snowflakes.

MikeR म्हणाले...

I just don't know what "stop them" means. Of course there is no way to stop them. In ten years anyone will be able to do this.

Howard म्हणाले...

I suppose it's daunting for those who are troubled by shit and shinola.

traditionalguy म्हणाले...

All publicity is good publicity. Now the fake guys need to do one with the Toronto Terror compelled to speak all of the pronouns he refused to speak.

Peterson is a force of nature.

Laslo Spatula म्हणाले...

People will believe what they want to believe. Deep Fakes just help mitigate the cognitive dissonance.

Progress is simply the improvement of the things we use to lie to ourselves better.

I am Laslo.

Howard म्हणाले...

We're well past peak Peterson. He hasn't said anything new in over a year

Laslo Spatula म्हणाले...

"We're well past peak Peterson. He hasn't said anything new in over a year"

If only this was applicable to Karl Marx.

Shinola: now Gluten Free!

I am Laslo.

Ralph L म्हणाले...

I just watched JP at AEI with C Hoff S and Frum's wife. He mentioned how social media reduces the rejection risk for males since they can interact anonymously. It seems like girls should wise up eventually and wait for the real person, but we can't expect teenage girls to be wise.

traditionalguy म्हणाले...

Peterson has years of lectures on YouTube. Like peak oil, peak Peterson is a myth.

gilbar म्हणाले...

This "Jordan Peterson" guy is OBVIOUSLY a total LIAR!
He 'claims' that he's Never sang the song; BUT, WE'VE HEARD it
LIAR LIAR PANTS ON FIRE!!!

Where's Snopes when we need them?

Jaq म्हणाले...

"We're well past peak Peterson.”

What I can’t understand is why his anodyne opinions got the snowflakes on the left so worked up in the first place.

Shouting Thomas म्हणाले...

I'm surprised that nobody has taken up my thesis...

Being a total virtual fake will be the business platform of the future.

Legal liability is one of the most prominent drivers of this.

Ralph L म्हणाले...

why his anodyne opinions got the snowflakes on the left so worked up

Part of it is similar to TDS, they're hearing a big distortion or making one for sport.

Jaq म्हणाले...

https://www.nytimes.com/2019/08/25/us/politics/trump-allies-news-media.html

Four people familiar with the operation described how it works, asserting that it has compiled dossiers of potentially embarrassing social media posts and other public statements by hundreds of people who work at some of the country’s most prominent news organizations.

Here the New York Times is getting their panties in a twist that people are going to quote their real words, “distorting” them by repeating them and possibly commenting on them, apparently. When your world is premised on carefully crafted lies, the real problem is deep reality.

Operatives have closely examined more than a decade’s worth of public posts and statements by journalists, the people familiar with the operation said. Only a fraction of what the network claims to have uncovered has been made public, the people said, with more to be disclosed as the 2020 election heats up. The research is said to extend to members of journalists’ families who are active in politics, as well as liberal activists and other political opponents of the president.

They are going to use our OWN words against us! In our own voice! Heavens to Murgatroyd! Stage left!

Danno म्हणाले...

Blogger rhhardin said...I always post anonymously, but using my real name to keep it simple.

Sounds like a Yogism. Almost lost my covfefe upon reading it.

Derek Kite म्हणाले...

Peterson has the unfortunate ability to be right. Deepfakes and twitter are either going to destroy the medium or destroy a good number of people. I bet on the people destruction.

By the way Snopes has its tail in a knot over people taking satiric sites seriously because they can't tell the difference.

buwaya म्हणाले...

One possible ultimate outcome is that no video image, nor text, will be trusted.

Which makes Trump's heavy emphasis on personal appearances in large venues rather prophetic. A very large number of people have now seen him in the flesh. This is very rare for a modern celebrity, or politician.

It seemed a very old fashioned method when he started it, but it looks to be working.
None of his opponents seems to have been able to compete in this.

narciso म्हणाले...

Much the way the Times convicted Zimmerman in the press, along with ABC and NBC.

buwaya म्हणाले...

The "journalist" dirt-file seems like what they should expect - they have used these methods against hundreds of their targets. It is natural that someone would fire back.

Turnabout is fair play.

Dear corrupt left, go F yourselves म्हणाले...

CNN and MSDNC could infuse a pile of fake Trump into their already Avanetti-Maddow-Schit cult news.

rhhardin म्हणाले...

You can't trust movie captions either. In Foyle's war there was a caption "(chickens clucking and cows mooing)" when it was chickens clucking and goats bleating.

narciso म्हणाले...

Avenatti seems to be pushing the full Procopius, to beg for hnas favor.

Fernandinande म्हणाले...

Fake news, sigh, it never ends. They're not singing, they're reciting the lyrics; I was curious about an "autotune" effect.

Still, it's a cool advancement; a video version also exists: your voice coming out of your face, saying fake words.

Bob Boyd म्हणाले...

How does Jordan Peterson know he didn't actually try to sing Rock Lobster?
If you ask me, it's a hell of a coincidence that the morning his cover of Rock Lobster came out, he woke up on the floor of a strange apartment to find that someone had drawn a face on his torso with a Sharpie making eyeballs out of his nipples and a mouth from his belly button. Not only that, his eyebrows had been shaved off.

Hagar म्हणाले...

Considering the success they have had with puffing up a petulant quip by Trump in response to their interrupting his lunch with silly questions, it is not hard to imagine someone creating a panic by faking a really outrageous video clip and having it go around the world before it can be retracted.

jnseward म्हणाले...

"Shinola: Now gluten free!" As usual Laslo provides the perfect summation.

Jeff म्हणाले...

What's the big deal? Politicians and advertisers have always lied to us. Every sane person heavily discounts what they say. The media has been lying to us for decades. It seems to have taken the obvious dishonesty of the various anti-Trump campaigns to get people to realize they can't be trusted either.

Although it may seem that there is more lying going on today, maybe all that's happening is that more of it is being exposed and we're getting better at recognizing it for what it is. In the public arena, almost everyone has an agenda they're not telling you about.

Rory म्हणाले...

It's the government fakes we really have to be wary of. Eventually a government is going to doctor a video to incite genocide among its own people. The best defense against it is a citizenry armed to defend itself.

Fernandinande म्हणाले...

(which has now garnered 250,000 views)

"(with 250,000 views)". Peterson is not concise.

buwaya म्हणाले...

"In the public arena, almost everyone has an agenda they're not telling you about."

This was one of Orwell's points, in "1984" and in his essays.
He worked for the BBC in the war. His portrayal of Minitrue is taken from the wartime BBC, not the Germans or Soviets. That BBC, with its global reputation for reliability.

The BBC was always an instrument of the British government, and did exactly as required by the government of the day, or rather those factions of it that controlled the actual governing institutions.

If that was the case with the BBC, then the rest of the institutional news sources anywhere can be assumed to be that much less objective.

Wince म्हणाले...

Deep Fake?

Haven't married women been doing that in the bedroom for generations?

Fernandinande म्हणाले...

Harley Davidson trademarked the sound of its engine.

They tried to but gave up.

Amadeus 48 म्हणाले...

I want to hear Hitler deliver the Gettysburg Address in English at the Nuremberg stadium with American flags dubbed in and George W.Bush, John McCain, Nancy Pelosi, Marianne Williamson, Oprah Winfrey and Bill Maher sitting behind him. And I want it to be believable.

Fernandinande म्हणाले...

narciso said...
utm_source=facebook
fbclid=...
fbclid=...
fbclid=...


Do you include all that extraneous tracking in your links because you're employed by Fecabook?

narciso म्हणाले...

Happy now:
https://www.breitbart.com/europe/2019/08/26/delingpole-amazon-fires-big-fat-nothingburger-fakenews-scare-story/

narciso म्हणाले...

Facebook, like the CPU in Tron, is trying to snuff out resistance.

Brian म्हणाले...

The solution to this will be more artificial intelligence. Someone will write a program to analyze everything someone has posted and see if this is in line with something they WOULD have said. Jordan Peterson is not the kind of person who would sing Rock Lobster; therefore this is a fake.

Sort of a "Snopes AI". Then we'll have the fake Snopes AI. Continuous battle.


It's funny in a way though. I remember that one of the plot points of Robert Heinlein's "The Moon is a Harsh Mistress" is how Mike, the sentient computer, was able to lead the revolution because he could impersonate various officials through video calls. Something no human would be able to do.

Jeff Brokaw म्हणाले...

Here’s what I adopted in 2015-16: assume all news is bullshit to protect myself from lies and unnecessary emotional ups and downs, but also allow for 5-10% error in my filtering process and keep my mind open to the possibility that some or all of that 5-10% might actually be true.

Sounds more complicated than it is. Works great, saves time and energy, positions the media where it rightly belongs, in the “probably b.s.” bucket, and disciplines the mind to stop being a sucker.

They’ve been lying to us for decades, it’s just more brazen and frequent now.


Fernandinande म्हणाले...

"Don't trust the government"

NASA detained a 75-year-old woman selling a tiny moon rock. An appeals court says she can sue (2017)

Her dead hubby had been an engineer, presented with a moon-rock sample in a Lucite paperweight. She needed money and emailed NASA about selling it, so NASA hired a goon who ...

“organized a sting operation involving six armed officers to forcibly seize a Lucite paperweight containing a moon rock the size of a rice grain from an elderly grandmother,” Chief 9th Circuit Judge Sidney R. Thomas wrote for a three-judge panel, and detain[ed] her for two hours in a public parking lot in urine-soaked pants.

MadisonMan म्हणाले...

Obviously what is needed here is Federal Govt oversight. Maybe this should be Cabinet-level. The threat to Democracy cannot be overstated. I think a budget north of $10bn should suffice, and several thousand employees.

Jeff Brokaw म्हणाले...

“In the public arena, almost everyone has an agenda they're not telling you about.“

THIS x 1000.

MayBee म्हणाले...

When there are problems, the human mind begins to create solutions. That's how we advance.

I do believe one solution to this is to find a better way for ordinary people to be able to wipe information about themselves from the internet. Christine Blasey Ford managed to do it. Barack Obama did it as he was getting ready to run for POTUS. It should be a service available to all Americans.

As for deep fake- was what happened to the Covington Catholic kid a deep fake? No, it was just a bad edit with a PR firm and bad intentions. At some point, we need to reinforce the "buyer beware" principle when it comes to news and culture stories.

Fernandinande म्हणाले...


Leonard Nimoy: The Ballad of Bilbo Baggins (Music Video) (1968)

William म्हणाले...

Would it be possible for humans to evolve a form of intelligence and consciousness that is actually dumber than that of humanity? You've got to admire the ingenuity that goes into such a project. Imagine, if you will, a computer that does not solve Fermat's theorem but chooses instead to write scripts for never to be produced Three Stooges movies or Warren economic proposals.

Fernandinande म्हणाले...

LEONARD NIMOY ~ PROUD MARY

brylun म्हणाले...

Can't Peterson sue to protect his brand?

Darrell म्हणाले...

Lots of Althouse images and voice recordings are available for jokers to do with as they please. I'm sure Althouse will be cool with it.

How about you use the latest dying bee video first?

buwaya म्हणाले...

"At some point, we need to reinforce the "buyer beware" principle when it comes to news and culture stories. "

What you really need is to get rid of the MSM monopoly.

Char Char Binks, Esq. म्हणाले...

'Tis better to garner than to get.

glenn म्हणाले...

If you don’t pay attention to them, what difference does it make? Jordan who?

Dust Bunny Queen म्हणाले...

We need to develop a better sense of what is real and can't count on the government to save us.

It is called Critical Thinking. Something that your parents are supposed to help you develop and a skill that your "educators" (so called) are supposed to teach you in your studies.

If you are brought up to accept every utterance by those in "authority" without critically thinking, you are being brought up to be a compliant peon who is easy to control.

That is the goal of public education. Not to learn and think, but to listen and obey.

Ken B म्हणाले...

John Oliver should be worried. His kind of fakery looks old fashioned now. So should Anderson Cooper; CNN might be able to switch to purely automated lying, with no human intervention.

JayBee म्हणाले...

And I am reminded of this.

https://youtu.be/hiuL_rezDZo

MD Greene म्हणाले...

Temujin said: We've entered the time where our technology has sped past our maturity as a species.

So true. We've already demonstrated that we can't understand arguments based on logic, that we value feelings over thoughts, that we are unable to use pronouns or prepositions with precision and that we are pushovers for any internet head fake, be it from China or Snopes.

My prediction is that in the near future we will use Facebook's invaluable store of 1,000+ emojis in all written expressions. Then nobody will care who said what.

mockturtle म्हणाले...

Temujin rightly observes: We've entered the time where our technology has sped past our maturity as a species.

Indeed. Just as bio-engineering research has leaped ahead of ethical considerations and common sense.

Ralph L म्हणाले...

I would not want to be detained by anyone in urine-soaked pants, even three pairs covered by judges' robes.

Nonapod म्हणाले...

The more cunning deep fake creators will probably generate fakes that slightly change what someone actually said in plausible ways. Around here we often complain that major "news" outlets will inaccurately or unfairly paraphrase something said by Trump in a way that makes it seem like he said something he didn't really say (the "fine people" hoax being the most prominent example of this). Now a bad actor could simply tweak a few words and flood the internet with the deep fake video so nobody knows what to believe.

The Cracker Emcee Refulgent म्हणाले...

"This deep fake shit is another excuse for pearl clutching by snowflakes."

And the product of it an excuse for pearl clutching by legions of other snowflakes.
I went to Althouse's elk link to The Guardian yesterday. My God, they're not even trying anymore. No wonder Snopes feels threatened by The Babylon Bee. The Left is spewing self-parody like a snowblower.

Enlighten-NewJersey म्हणाले...

Update the forgery laws to outlaw deep fakes.

DarkHelmet म्हणाले...

Rule of thumb: don't believe anything implausible. Be suspicious of things that are plausible. Assume the government and the media are lying most of the time, and simply wrong the rest of the time. Don't believe anything on social media.


"Everybody's got to believe in something. I believe I'll have another beer."

- W.C. Fields

My name goes here. म्हणाले...

The movie Looker with Albert Finney and Susan Dey imagined this world back in 1981.

Jordan Peterson should sue, but instead of for impersonation, or stealing his visage (or voice or whatever), it should be for lowering his brand because they did not autotune his voice.

If some entrepreneur were smart he would make completely fake models now (they already exist on Instagram) and build up a huge following. Then cash out by making an adult movie with actors doing the actual adult part and then putting the fake model face "deep fake" style on those actors.

Can you imagine if the CIA were competent and they had a deep fake of Chairman Xi saying that "the Hong Kong people were free and we must allow that without letting the rest of the Chinese see that." Or that "Hong Kong should be ceded to Taiwan."

rcocean म्हणाले...

Newspapers already lie about what people say - they just do it in print. Steven King for example. OR Donald Trump. Trump is different because he's POTUS, but anyone else, they can lie and attribute words to them. And when they're called on it, they can just say "oh my bad" or "We typed the manuscript wrong."

rcocean म्हणाले...

I can see Hollywood using this in their TV shows. One actor, two characters. Think of the $$ saved!

mockturtle म्हणाले...
This comment has been removed by the author.
mockturtle म्हणाले...

We, in the age of technology, are being led into a trap. Not that technology is inherently bad but that it has replaced the reality of nature and of our fellow creatures.

rcocean म्हणाले...

Can they resurrect Steve McQueen? CGI McQueen with a computer-generated voice is wonderful in BULLITT II. At your local Netflix.

rcocean म्हणाले...

Mock: for all I know you're a computer-generated character - like everyone else. Is the internet making us more used to artificial communication and less willing to interact face-to-face with real people?

You make the call. Of course, young people think it's always been like this. They have a hard time grasping the pre-internet world.

My name goes here. म्हणाले...

Someone should patent a process to put an encoded timestamp in the non-visible portion of the video.

Only the person that made the video could decrypt the timestamp to show that the video is real. All of the people whose livelihoods depend upon moving pictures will adopt the technology as fast as they can.

So deep fakers would need to fake that encrypted timestamp as well, but without the ability to decrypt it to a known source it would be rejected by Right Thinking People (tm).
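For what it's worth, the standard way to get "only the originator could have produced this, but anyone can check it" is a digital signature over the file's hash and a timestamp, rather than something only the creator can decrypt. A minimal sketch in Python using the cryptography package (the key handling and filenames here are purely illustrative):

import hashlib, json, time
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()   # kept secret by the video's creator
public_key = private_key.public_key()        # published so anyone can verify

def sign_video(path):
    digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
    claim = json.dumps({"sha256": digest, "timestamp": int(time.time())}).encode()
    return claim, private_key.sign(claim)

def verify_video(path, claim, signature):
    try:
        public_key.verify(signature, claim)  # fails if the claim or signature is forged
    except InvalidSignature:
        return False
    digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
    return json.loads(claim)["sha256"] == digest  # file must match the signed hash

A faker could strip the claim, but couldn't forge one that verifies against the creator's published key, which is roughly the "rejected by Right Thinking People" step.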

Ingachuck'stoothlessARM म्हणाले...

Woe to those who call evil good and good evil, who put darkness for light and light for darkness, who put bitter for sweet and sweet for bitter. Isaiah 5:20

obvious satire is 'fact-checked', and fake news is swallowed whole.

The capital you function on in news media is TRUST. You lose some of that every time your readers/viewers figure out you sold a lie to them.

tainted but tolerated now, eventually dismissed altogether--
Truth the first casualty

mockturtle म्हणाले...

Rcocean, my three months in Alaska with no TV and limited internet access has assured me that there is a reality--and a beautiful one, at that. And the people I met, both locals and tourists, mostly foreign, gave me hope for mankind. The media, whether unwittingly or not, are dividing us, and by feeding our hate and mistrust they make us more dependent on them. Our best bet is to ignore them as best we can and trust our own senses--including our 'common' sense.

rcocean म्हणाले...

Mock - great answer.

Meade म्हणाले...

"My Deep Fake ate my homework."

Hey, don't laugh — for many people Deep Fake homework cuisine is a real serious problem.

wildswan म्हणाले...

We can't rely on a new legion of fake experts detecting expert fakes. It may be that these kinds of audio fakes show very clear editing if you look at an audio analysis. Maybe soon there'll be an iPhone app that looks for jerks and joins and other signatures of audio fakes. In the Seventies and Eighties there was an outbreak of art faking in response to rising prices. Technology brought an end to that secret faking (but if you openly fake and just claim your turd is art, you are OK).

But, basically, maintaining the truth really rests on a degree of skepticism and a sense of probability by aware citizens, and it depends on us responding to instant emotional communications by slowing down into rational response time. When MSNBC posts that video of Trump shooting down RBG on Fifth Ave, we need to be ready. The number of echoes is not the measure of validity for what is being said or supposedly said.

Michael म्हणाले...

Perhaps unauthorized use of a person's artificial but convincing image or voice should be a tort (maybe it is?) That would put it somewhere in the area of libel or copyright infringement. If you're doing satire, etc., draw a cartoon or make it otherwise clear that the image or voice is fake.

Hagar म्हणाले...

It does not have to be for anything "political." Perhaps just combined with a scheme for automated buy/sell algorithms on the stock market and creating a selloff situation like we saw last week, and someone could make a bundle of loose money.

Ingachuck'stoothlessARM म्हणाले...

I was a muslim beggar in WWII
I posed as a Jewish mover-& shaker.
During 'Operation Jubilee'
they called me the Dieppe Fakir

PM म्हणाले...

1. The ability of a guilty party to deny something as a deepfake is more alarming.

Jupiter म्हणाले...

"We need to develop a better sense of what is real and can't count on the government to save us. "

Wilbur म्हणाले...

I spent a few seconds watching/listening to the Sanders and Trump videos. They are remarkably poorly done, neither clever nor funny in any way.

People get upset about this? It's like getting upset about the lips moving in a Clutch Cargo cartoon. Not one person in 10,000 would suspect this was really Trump or Sanders voicing these songs.

Ingachuck'stoothlessARM म्हणाले...

by the time Dylan meets his Maker the process will be perfected,

and we can expect a trove of "unreleased material" from Bobby Z.

“Some men who receive injuries are led to God, others are led to bitterness.”

Why did Dylan make up this Melville quote?

Char Char Binks, Esq. म्हणाले...

Peterson has no room to complain, since he stole that voice from Kermit.

Jaq म्हणाले...

I don’t think my wife ever faked an orgasm... She wasn’t quick enough! Ba dump bump!

Rocketeer म्हणाले...

Once again, I have to point out that "garner" is not the same as "get".

You have to be active to "garner". Any ol' lazy-bones can just sit on their ass and "get".

Peterson's precisely switched their proper uses, but garnering is not the same as getting.

buwaya म्हणाले...

Mockturtle,

These things are rarely started by common people.

Quaestor म्हणाले...

The vid of Trump singing the Eurythmics' pointless (s)hit Sweat Dreams is hilarious (certainly more entertaining than the Eurythmics themselves wandering amid cows) — BUT not at all convincing.

narciso म्हणाले...

Why I never:

https://thefederalist.com/2019/08/26/journalists-meltdown-journalism-done/#.XWP_2fADyhU.twitter

Quaestor म्हणाले...

Peterson's precisely switched their proper uses, but garnering is not the same as getting.

Well said. You can get sick; you may get well; you can get out or get in; you can get on your mark and get ready; you can get set. BUT, you cannot garner sick, nor can you garner well, nor garner in or out, etc.

Jaq म्हणाले...

'You have to be active to "garner". Any ol' lazy-bones can just sit on their ass and "get”.’

Bzzzzt! Thanks for playing, but you are wrong. With reasoning like this I bet you didn't get a perfect score on your SAT.

Lincolntf म्हणाले...

Sounds very fake to me. I get that it's his voice, but the A.I. has a ways to go.

LA_Bob म्हणाले...

After reading Althouse's take and then the comments, I'm coming around to the idea that Peterson is over the top here.

JFK thought Vaughn Meader's imitation of him sounded more like brother Teddy.

I sent an eCard (through American Greetings, I think) that used an imitation of Trump's voice and his sloganeering.

Brian Whitman, a talk-show host currently on KRLA in Los Angeles, does some pretty good voice imitations, especially Trump's voice.

This is all old hat.

The folks who might be fooled into thinking a Jordan Peterson imitation is the real deal wouldn't know who he is in the first place and wouldn't care.

Peterson might have an exaggerated respect for (or concern with) AI.

Now if you can demonstrate that the artificial version can be made to voice-print like the real thing, he might have a point, and then Begley's solution (lawsuits) might come into play.

Quaestor म्हणाले...

...for many people Deep Fake homework cuisine is a real serious problem.

I never lost my homework to a canine appetite, nor did I ever try that excuse. But my dog really did eat (chew and mangle, actually) my first pager given to me by my first full-time actual salary-paying employer. The boss wanted to be able to consult me on virtually anything that crossed his mind 24/7, thus the pager. That was on a Friday. By Saturday the boss was as flustered and panicky as a wet hen. On Friday night I set the pager to vibrate and placed it on my nightstand as I took a shower. On returning I found the device crushed on the floor with obvious tooth marks pressed into the plastic shell. Evidently, my boss had tried to summon me while I showered which caused the pager to make a noise like a bumblebee, so the dog, my loyal guardian attacked and killed the wretched thing before it could do me a mischief. The boss absolutely would not believe my explanation, even after I showed him the evidence.

narciso म्हणाले...

The prestige press is already fake, but it wants to maintain its monopoly



https://mobile.twitter.com/ZoomingIn_NTD/status/1165995250931965954?fbclid=IwAR0usEt2Q_hUaYcYUln-NGlCTSZrlyIe_JTkUfgxFsADADlEA-1NnwZQpoU

Yancey Ward म्हणाले...

Fecabook?

I am shamelessly stealing this, Fernandastein.

Quaestor म्हणाले...

...but the A.I. has a ways to go.

Artificial intelligence has had a long way to go for a very long time. Within a decade! Ten years! In our lifetimes! The goalposts have their running shoes on, it seems. I remember the first time I played Turing's imitation game. I wasn't impressed.

I'm still not impressed.

Yancey Ward म्हणाले...

Some of you are missing the point- sure, these videos are obvious fakes.......right now. However, I pretty much guarantee you that in 10 years you won't be able to detect the fakeness in audio or video with your own eyes and ears. In 25 years, you won't be able to detect it by any means whatsoever.

Mike (MJB Wolf) म्हणाले...

This will lead to plausible deniability of even “real” video of “public” people. Unless there are sworn witnesses every unhelpful video can be plausibly denied by the elite. Those of us without a public footprint in the internet era can still be slimed by DF.

Quaestor म्हणाले...

However, I pretty much guarantee you that in 10 years...

See my comment above.

Rogue One had two mostly synthetic actors — a CGI Peter Cushing and a CGI Carrie Fisher. Both had genuine voice actors doing their lines, but the visuals were combinations of body doubles with CGI heads and faces. Both were obvious fakes, particularly the computer-generated Princess Leia. It looked like an off-her-diet Carrie Fisher wearing about a pound of pancake with a few appliqué freckles stuck on for character. 4K HD works against these effects, and the artists creating the wireframes and textures don't appear to understand what asymmetry contributes to human appearance.

DavidD म्हणाले...

The Moon is a Harsh Mistress

Jaq म्हणाले...
This comment has been removed by the author.
Yancey Ward म्हणाले...

Quaestor,

It won't be AI producing the fakes that are convincing- it will be ordinary everyday programmers using increased computing power and graphics.

Tyrone Slothrop म्हणाले...

I have no idea about the technology for this, but there has to be a "real" category for images etc. where the file is registered as "real" and contains a digital chain of custody, with dates, edits, etc. logged and unchangeable. When published, there would be a distinction between "real" and "unreal".
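A minimal sketch of that kind of tamper-evident chain of custody, assuming each log entry records the file's hash plus the hash of the previous entry, so altering any earlier record breaks every entry after it (the field names and toy records are illustrative):

import hashlib, json, time

def append_entry(log, action, file_hash):
    """Append an edit record whose hash covers the previous record."""
    prev = log[-1]["entry_hash"] if log else "genesis"
    entry = {"action": action, "file_sha256": file_hash,
             "timestamp": int(time.time()), "prev_hash": prev}
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)
    return log

def chain_is_intact(log):
    """Recompute every hash; any edited or reordered record fails the check."""
    prev = "genesis"
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if body["prev_hash"] != prev or recomputed != entry["entry_hash"]:
            return False
        prev = entry["entry_hash"]
    return True

log = append_entry([], "captured", "abc123")            # placeholder hashes
log = append_entry(log, "edited: color grade", "def456")
print(chain_is_intact(log))                             # True until a record is altered

On its own this only proves internal consistency; the "registered as real" part still needs the log anchored somewhere the publisher can't quietly rewrite, which is where people reach for timestamping services or a blockchain.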

Quaestor म्हणाले...

However, I pretty much guarantee you that in 10 years...

We are all too blithe to make predictions and prognostications, particularly in regard to high technology. I recall reading an article about Stanley Kubrick's preproduction work on 2001: A Space Odyssey. In late 1966 Kubrick paid a visit to an IBM research facility to learn about what computer scientists and engineers thought about computation devices in the 21st century. Many were confident that within 35 years computers would use visual images and natural language for I/O; this was at a time when input was almost entirely by EBCDIC entered from a terminal keyboard, Hollerith card, or MM, and output was by line printer or CRT. Consequently, Kubrick wrote his screenplay to include a machine as a character with motivation and autonomy — HAL 9000. (Several companies Kubrick consulted on the subject of future technologies happily agreed to "product placement" in the film, notably Bell Telephone and Pan American Airways. IBM wisely did not consent, thus IBM became HAL by backing up one position in the alphabet for each letter of the International Business Machines logo.)

I think it's interesting to think of Kubrick's speculative computer technology, particularly AI in light of what was predicted, and what we had installed in 2001. Windows 98SE. Before we assign to ourselves the roles of techno-oracles it might also be instructive to consider the 2001 destinies of high-tech companies like PanAm and Bell Telephone.

Yancey Ward म्हणाले...

Quaestor,

I am just basing this on the quality of the graphics today vs what they were 10 years ago. We are already at the point where one can't tell a computer generated face from a real one- a static image. It is only a matter of computer power before we can put such static images into motion. I don't think it will even be 10 years, to be honest. I am less certain about the assertion that you won't even be able to prove a video with audio is a fake in 25 years using the technology available, but I am pretty damned certain my own senses are reaching the point of unreliability, and rapidly at that.

Quaestor म्हणाले...

It won't be AI producing the fakes that are convincing-

Many professional-grade image rendering applications are already using AI. For example, in applying shading or lighting effects to dynamic textures. Adobe's After Effects has many AI features. For things like flames, flares, and ground shadows it works well enough for YouTube and low-budget productions. For faces? It's not so good.

Yancey Ward म्हणाले...

And again, this has nothing to do with advances in AI- it is just the power of processing and graphics chips advancing.

Quaestor म्हणाले...

We are already at the point where one can't tell a computer generated face from a real one- a static image.

Are we?

Jaq म्हणाले...

Artificial Intelligence is more accurately described as artificial sensory processing. There is no question that if they ever develop genuine “intelligence” they are going to need this stuff to offload the processing, just as our brain does, but I have the feeling that they are going to need the always ten years away nuclear fusion to power the sucker.

Jaq म्हणाले...

Some of the static faces are pretty convincing, at least in a jpeg on my computer screen. Not sure about in 4K.

Yancey Ward म्हणाले...

Those aren't true AI, Quaestor. The term is being polluted here. The "AI" in those cases isn't really creating anything; it's simply a tool being used by the humans to make their creations more realistic. True attention to detail takes a vast amount of computing power, or lots of time. Ms. Althouse just a week or so ago posted photographs of paintings that artists had created by hand that fooled my visual cortex completely. Now, had I been in the room with the painting, it wouldn't have fooled me, but in two dimensional representation, I couldn't detect the fake.

Quaestor म्हणाले...

And again, this has nothing to do with advances in AI- it is just the power of processing and graphics chips advancing.

GPUs have AI built into their microcode. Custom shaders and texels are AI subroutines.

Yancey Ward म्हणाले...

For Quaestor: Which is real?

Yancey Ward म्हणाले...

For the record, I just did 10 of them again, and only scored 6 correct.

Yancey Ward म्हणाले...

Then is AI always ten years away or not?

Quaestor म्हणाले...

AI is why those synthetic faces you find so convincing are real looking. No human has enough life to manually apply a bounded yet unordered texture to a megapixel image, particularly one rendering in three dimensions. What happens is the artist chooses a texture and a shading condition and the application AI applies it. It's not just a rote use of a static texture or shading — that's what graphics engines used in games developed twenty years ago did, and the result is obviously artificial.

SeanF म्हणाले...

Yancey Ward: For the record, I just did 10 of them again, and only scored 6 correct.

I lost count of how many I did, but I only got two or three wrong.

But, then, I pretty early on stopped looking at the faces. The AI doesn't make convincing backgrounds, so if you see a clear, distinct background (wall, leaves, legible text, etc), it's real. Also, if the person has an earpiece in, it's real. (At least once I identified the real person because I didn't believe the AI would be able to correctly put a reflection of the cameraman in the subject's glasses!)

If those clues aren't there, look for little distortions around the ears and neckline.

:D

Otherwise, the faces themselves are very convincing.

Jaq म्हणाले...

"Now, had I been in the room with the painting, it wouldn't have fooled me,”

There’s a painting in the Louvre that is pretty amazing. I don’t recall the name, but the image will always be with me. It almost looks like a face in lucite hung on the wall.

Quaestor म्हणाले...

I got eight out of nine.

AI that can consistently pass the imitation game may always be ten years away. Once you get accustomed to the rendering method you unconsciously learn how to recognize what's real from what's unreal.

I recall when 1920x1080 HD monitors first came out I was stunned by the realism of a scene filmed on a coral reef — wonderfully detailed Moorish Idol fish swimming amidst bright green soft corals. Now that stuff pales compared to 4K, at least to my eyes. The quality of 4K video vs HD is an example of your brute force "computing power" at work — when the novelty wears off, the differences, however subtle, become apparent.

Jaq म्हणाले...

Artificial Intelligence is a name given to a class of techniques where the computer programs the computer. It’s not the same thing as what we would consider intelligence. Maybe never will be. It’s a programming multiplier.

Jaq म्हणाले...

I got two out of the first three wrong; after that the only one I got wrong was an African American whose hairstyle and skin tone made it hard to see how fake the hair looked. I soon got bored.

But I tell you what, that second girl that was made up by AI? You can send her by my place anytime. She can wear a wig.

Quaestor म्हणाले...

It’s a programming multiplier.

Precisely. That's the point I've been trying to make to Yancey Ward. That's why AI is intimately involved in CGI.

JaimeRoberto म्हणाले...

The current panic about deep fakes is because some powerful people are worried that Epstein's blackmail tapes will surface. Surface like a lobster.

Jaq म्हणाले...

Imagine if you are the guy who is supposed to release all of Epstein’s blackmail material on his death, and you suddenly realize that you are sitting on a billion dollar goldmine.

Ingachuck'stoothlessARM म्हणाले...

There’s a painting in the Louvre that is pretty amazing. I don’t recall the name, but the image will always be with me. It almost looks like a face in lucite hung on the wall.

https://www.sciencealert.com/samsung-s-ai-can-now-generate-talking-heads-from-a-single-image

Our deepfake problem is about to get worse: Samsung engineers have now developed realistic talking heads that can be generated from a single image, so AI can even put words in the mouth of the Mona Lisa.

Ingachuck'stoothlessARM म्हणाले...

epstein tapes/cd's etc (mostly) are before the deepfake tech, fwiw.

n.n म्हणाले...

Water Closet continues to overflow and soil the social landscape. Another job for Deep Plunger, who did an amazing job clearing the muck from the first, second, third, and fourth estates.

iqvoice म्हणाले...

"Deep fakes" will actually be a social good, if it results in the full distrust of the media. The nytimes is ALREADY deep fake! They gave us the Russian collusion hoax, and now they are busy peddling the Big Lie that Trump is a racist.

SeanF म्हणाले...

Ingachuck'stoothlessARM: epstein tapes/cd's etc (mostly) are before the deepfake tech, fwiw.

Caption on an image of Abe Lincoln taking a selfie: "Don't try to tell me this is photoshopped - photoshop didn't exist in Lincoln's day".

Ingachuck'stoothlessARM म्हणाले...

@SeanF

chelsea's mom is a deep state deep fake
https://www.newsbusters.org/blogs/nb/tom-blumer/2017/03/26/chelsea-clinton-hopes-maga-hat-pic-lincoln-photoshopped

Char Char Binks, Esq. म्हणाले...

"Imagine if you are the guy who is supposed to release all of the Epstein’s blackmail material on his death, and you suddenly realize that you are sitting on a billion dollar goldmine."

He'd better hurry before AI makes it all worthless.

SeanF म्हणाले...

Ingachuck'stoothlessARM: chelsea's mom is a deep state deep fake
https://www.newsbusters.org/blogs/nb/tom-blumer/2017/03/26/chelsea-clinton-hopes-maga-hat-pic-lincoln-photoshopped


I'd like to give Chelsea the benefit of the doubt and believe she meant she was hoping the Republicans weren't actually using that on their program cover, but I find myself unable to.

At any rate, my point was that deepfakes of "epstein evidence" could easily be produced, despite the date of the actual events.

There are a handful of (quite frankly amazing) deepfakes on Youtube with Jim Carrey's face replacing Jack Nicholson's in "The Shining". Despite my knowledge that deepfake didn't exist when Kubrick filmed "The Shining," I remain convinced that Carrey didn't actually play the role. :)

mockturtle म्हणाले...

He'd better hurry before AI makes it all worthless.

Probably already too late, Char Char. Will anyone ever again believe anything they hear/see/read in the media [unless, of course, it impugns our President].

Joe म्हणाले...

Deep fakes are typically not very deep, but mostly work through confirmation bias.

Rocketeer म्हणाले...

"Bzzzzt! Thanks for playing, but you are wrong. With reasoning like this I bet you didn't get a perfect score on your SAT."

Right back atcha! You are correct; I didn't get a perfect score on the SAT, but I came pretty darn close on the verbal section back in the day. I can promise you I didn't get marked down because I couldn't differentiate between garner, gather, collect, glean and get.

JaimeRoberto म्हणाले...

It's not that someone could make a deepfake Epstein video, it's that deepfakes become a convenient explanation when the deepreal video comes out.

narciso म्हणाले...

I imagine the resolution will be much less than that available for deep state productions.

stlcdr म्हणाले...

One of the Star Wars movies (recent ones) has Carrie Fisher and Peter Cushing as they were back in '78. Pretty convincing.

People are already trying to make sweeping political changes based on a lie: imagine if you can make live video of someone making statements or doing things which are not real, but saying that it is real. Maybe a video of Trump shooting someone in broad daylight.

War of the Worlds - the original radio show - comes to mind, also.

The self-professed fact checkers offer no service in this regard.

PM म्हणाले...

rcocean: "I can see Hollywood using this in their TV shows. One actor, two characters. Think of the $$ saved!"

Actors have been ™ing their image because fakes.

narciso म्हणाले...

I thought they looked artificial, well more than expected.

Char Char Binks, Esq. म्हणाले...

"Probably already too late, Char Char. Will anyone ever again believe anything they hear/see/read in the media [unless, of course, it impugns our President]."

AI and deep fakes make for very frightening prospects, considering the possibility of fraud, political skullduggery, blackmail, etc., but the saving grace is that now we can probably do a lot of horrible, embarrassing, or criminal things and then claim that whatever evidence there is against us is just part of the fake out.

mockturtle म्हणाले...

Peter Cushing was the quintessential Dr. Van Helsing, I recall. And not a bad Sherlock Holmes.

Yancey Ward म्हणाले...

Quaestor wrote:

I got eight out of nine

Which proves my point- ten years ago you would have gotten nine out of nine, and wouldn't have even been in any doubt on any selection- in fact, you would have laughed me out of the comments section for suggesting that you could even make such a single mistake. Ten years from now you will get 50%. Ten years from now, they won't just be static images, they will be walking and talking computer generated images of unreal people that you won't be sure are fake.

Yancey Ward म्हणाले...

Quaestor also wrote:

"Once you get accustomed to the rendering method you unconsciously learn how to recognize what's real from what's unreal."

No, you won't consciously or unconsciously always be able to do this. Eventually, it will get so good you won't be able to tell the difference. Your eyes and ears aren't improving; the fakes are improving at an astonishing rate. I hope that we won't lose the ability to detect fakes by using electronic analysis of the videos, but I am not convinced this will continue to be possible either since most videos made these days are digital data.

Ingachuck'stoothlessARM म्हणाले...

the image of the Beast
---
btw-- Trump-bashin' Joe Walsh could make a passable Jeffery Epstain.

Oops! did we say "EpSTAIN" ?? Our bad.

Aussie Pundit म्हणाले...

Peterson has more reason than most to be paranoid about this stuff.