March 23, 2024

"As I see it, Google and other search engines are recklessly directing traffic to porn sites with nonconsensual deepfakes."

Writes Nicholas Kristof, in "The Online Degradation of Women and Girls That We Meet With a Shrug" (NYT).

That's a free access link so you can read the description of how easy it is to "nudify" real individuals and make stills and video. Ugh! I realize that giving out that information will cause more people to produce that material, but how else are those of us who loathe this sort of thing going to get outraged enough to have any effect?

44 comments:

RideSpaceMountain said...

Last I checked, Google and other search engines never needed nonconsensual deepfakes to recklessly direct traffic to porn sites.

Dave Begley said...

You used to be able to buy some eyeglasses that would nudify women. They were advertised in magazines. Remember magazines?

narciso said...

Have you been doing research, Nick?

Wince said...

Get Smart movie sequel: “The Nude Bomb”

While enjoying retirement, secret agent Maxwell Smart (Don Adams) soon finds the world needs his services again. With a KAOS terrorist plot underway, the chief (Dana Elcar) of the PITS agency reenlists Smart to keep KAOS from detonating bombs that leave their helpless victims in the buff. With the help of fellow agents (Sylvia Kristel, Pamela Hensley, Andrea Howard), Smart must stop the terrorists before the world is faced with indecent exposure of monumental proportions.

Ann Althouse said...

"Last I checked, Google and other search engines never needed nonconsensual deepfakes to recklessly direct traffic to porn sites."

Kristof isn't saying they didn't; he's just saying there is a special need to protect against directing traffic to deepfake pornography.

Quayle said...

I'm sure that if and when Google felt there was a special need to direct people away from the Great Barrington Declaration, the algorithm and/or data weighting was changed the very same day.

hombre said...

My comment to the NYT: Mr. Kristof has evidently not been paying attention. The red carpet at the Met Gala is all about women exposing their bodies, fashionably of course. Who ever heard of Kim Kardashian or Paris Hilton until their sex tapes appeared on the Net? And then there were the "50 Shades" movies. What major movie actress has not "simulated" sex on the big screen? The difference is only a matter of degree, isn't it?

What about the school library debate? Isn't it okay to have children of all ages exposed to books depicting anal and oral sex between men for the sake of what, diversity?

If the damage, according to Kristof, is "deepfakery," isn't that just a symptom of what otherwise passes for entertainment? Australia notwithstanding, how is the search industry supposed to police this? It is just another, albeit unfortunate, extension of the decline in sexual morality. Perhaps Congress can pass some useless laws instead of dealing with problems it might actually fix.

Balfegor said...

I don't see this as categorically different from celebrity porn photoshops or lookalikes, which have been around for decades. The skill required has dropped thanks to AI assist tools, so maybe there's more out there than in the past, but I don't think this should be Google's problem. If anything, mucking around with search results is one reason Google search is so much worse than it used to be. Go after the hosting sites and content creators.

Sebastian said...

What exactly does "directing" mean here? In what sense is such directing "reckless"?

mezzrow said...

Technology facilitates the monstrous as well as the miraculous, often at the same time.

With the cat out of the bag, what expectation can we have for a regime that will treat this with the opprobrium that it deserves? It's not like this will be harder to do over time. What degree of sacrifice will society make of its free speech and privacy rights to protect the individual privacy of all from this, whether male, female or transitional? I can make no prediction about how the society of this time will react. I am a twentieth century man.

What constructs (and/or contracts) will arise to license rights to use for this kind of material?

What humans desire, they will create and hide - particularly when it is not pretty.

rhhardin said...

It seems like a clickbait article for women, for a trend I've never heard of.

The traditional defense against photoshopped fake nudes was to flood the market with nudes of the same fake with different tit sizes. It makes them all seem obviously wrong.

MikeD said...

I'm curious as to what people are searching for that this algorithm can "send" you to a "porn" site?

Narr said...

You mean those pix of Laura Linney and Mary Louise Parker may not be real?

Howard said...

Keep your insane readers dialed to 11 with every salacious triggering story you can dig up version 47,673.

Enigma said...

There are few accidents in technology and business.

Google Image search used to show Michelle Obama merged with a monkey near the top of its results. Not so much now, as Google's "Black Nazi" and "Black Viking" AI engineers apparently cleansed that result. Google Image search used to put, and still puts, stuff comparing George W. Bush to a monkey high in its results.

Eco-saint Bill Gates's Microsoft Bing image search is a porn tool even with neutral terms such as "making love." It also has explicit celebrity "deepfake" results (Bing's image censoring is 1,000,000x looser than Google's).

None of the above were accidents. I'm guessing there's $$$$$$$$$$$ in celebrity deepfakes. Corporate policies often provide cover against antitrust lawsuits (Democrats rarely attack their own donors, except for Harvey Weinstein after they spent 20 years taking his cash). So-called green and woke firms (and certainly the dirty liars offering "ESG" investments) make lots of money with sleaze under the radar.

Jaq said...

"Keep your insane readers dialed to 11"

There are other blogs, or do you feel some responsibility to make sure that we are all kept up to date on what you see on MSNBC?

Rafe said...

“Keep your insane readers dialed to 11 with every salacious triggering story you can dig up version 47,673.”

Howard answers the MoonBat Signal.

- Rafe

Ann Althouse said...

"I'm curious as to what people are searching for that this algorithm can "send" you to a "porn" site?"

They are searching for a named person with "naked" or something like that. The person harmed is the named person whose image is used without consent. I know you're trying to be funny, but your premise is wrong. It's not about protecting the person doing the searching.

Ann Althouse said...

As I've noted before, many years ago The Daily Show put out a book that showed "photos" of all the members of the Supreme Court naked.

Randomizer said...

"I realize that giving out that information will cause more people to produce that material, but how else are those of us who loathe this sort of thing going to get outraged enough to have any effect?"

Does the math work out?

Giving out the information will inform people that they can make fake nudes, and how to get started. More fake nudes will be produced.

On the other hand, how many of us who get outraged will prioritize that outrage above other outrages and have the ability to have an effect? And will that effect help?

An article like this will generate more fake nudes and more people looking for fake nudes.

It reminds me of those articles about disturbing college party antics. The last one I recall is partiers taking alcohol shots through the eyeball. Sounds painful and unlikely, but a UK paper inflates an exceedingly rare activity amongst American college kids. American papers pick it up, American college kids read about it, and a few try it out. It trends for a minute and is gone.

Chris N said...

He might start out as a supposed liberal idealist, an influence-seeker, a man writing for money and promoting the latest moral cause for consumption, but it always ends somewhere in the 7th level of modern Marxist moral cause-mongering hell.

You can do better, Althouse, than a mildly reformed hippie chick linking to has-been, stale idealists.

Jupiter said...

"how else are those of us who loathe this sort of thing going to get outraged enough to have any effect?"

No doubt. But I have a technical question. If you "nudify" Dylan Mulvaney, does the software get the details right?

The real puzzle here is how it is remotely possible that the World does not have all the pornography it could possibly want. Why do we keep producing more of it? It's 100% recyclable. Well, maybe more like 90%, but close enough. And it isn't like it's improving.

rhhardin said...

"They are searching for a named person with "naked" or something like that."

Which reminds me (I have the audio somewhere): Google VP for User Experience Marissa Mayer came on Armstrong and Getty to explain the many new Google search capabilities on sports, music, and zip codes (Nov. 4, 2009), and they finally mentioned to her that they had just searched for "Marissa Mayer nude" and it came up with no results.

This resulted in a stern, instant phone reprimand from the Google PR department right after the interview.

Google wanted a positive, if boring, lady ad out of it, and Armstrong and Getty wanted an interesting show, more in line with Google's come-on for the interview.

It was a joke about how people use search engines, not a joke about her. She couldn't see that.

0_0 said...

Kristof hides his government-control desires here, saying the FCC controls "old media" content and wanting government control over "new media." He is incorrect; the FCC controls the use of the public airwaves. Cable, the internet, and such are not in its purview, for a reason.

FullMoon said...

Dave Begley said...

"You used to be able to buy some eyeglasses that would nudify women. They were advertised in magazines."


Don't remember that, but back in the 35mm days, I told the 13-year-old boy next door that I had a special Minolta lens that would do that.

He begged me to show him, but the rule was, you had to be eighteen years old.

Jupiter said...

Cited without comment.

Jupiter said...

"They are searching for a named person with "naked" or something like that. The person harmed is the named person whose image is used without consent."

Well. A "search engine" is supposed to search. And return results. If it doesn't do that, then it doesn't work. The search engine is not using the named person's image.

Jupiter said...

"... they had just searched for Marissa Mayer nude and it came up with no results."

Indeed. Did she respond, "Are you sure you're spelling it right?"

Deevs said...

"Keep your insane readers dialed to 11 with every salacious triggering story you can dig up version 47,673."

Sorry, Howard, but so far the majority of readers seem to be only scratching a 2 on this article. Better luck next time, champ.

Achilles said...

Ann Althouse said...

"I'm curious as to what people are searching for that this algorithm can "send" you to a "porn" site?"

"They are searching for a named person with "naked" or something like that. The person harmed is the named person whose image is used without consent. I know you're trying to be funny, but your premise is wrong. It's not about protecting the person doing the searching."

They are only worried about protecting the privileged.

Laws against this will not protect little people. Nobody is going to help the high-school girl who is deepfaked by her classmates. They will selectively enforce this law.

And obviously all of the deepfakes involving Trump that are already out there will be openly supported by the regime that passes this law. They will probably be the authors of many of them as well.

They lied openly during the "bloodbath" fiasco. The extinctionists will be the most avid users of deepfake tech. The media have repeatedly selectively edited videos of their enemies and blatantly lied so many times that you can easily predict what is coming next.

These laws will only be used as a tool to suppress enemies of Google and Facebook and the Democrats.

Howard said...

You're right, Deevs. I completely missed the mark. No trans angle to this story, and you dudes can't get a Viagra chubby from just nudity these days.

JK Brown said...

Deepfake AI images can have a beneficial effect. All but the most ignorant should just assume any nude image is fake unless certified by the girl or woman. Then the problem goes away. It is only the salaciousness and "shock" that give these images value.

When is the last time you had "shocked eyes" from a fleeting glimpse of shin-bone or kneecap on a woman? Or from a woman going sleeveless in the evening, much less in the light of day?

"The flappers wore thin dresses, short-sleeved and occasionally (in the evening) sleeveless; some of the wilder young things rolled their stockings below their knees, revealing to the shocked eyes of virtue a fleeting glance of shin-bones and knee-cap; and many of them were visibly using cosmetics."

Rabel said...

This would seem to put the free speech absolutists in a bit of a quandary.

Christopher B said...

Jupiter said...

"No doubt. But I have a technical question. If you "nudify" Dylan Mulvaney, does the software get the details right?"


Obligatory disclaimer that this is mostly a SWAG based on very little research, but my understanding is that, unlike Begley's glasses (which I have enough seniority to remember), the usual practice for fake pr0n is to photoshop a celebrity face onto a "body double." Sometimes it's possible to find someone who is reasonably close in appearance and then add a signature hairstyle and/or fashion accessories. I think this was done to create Sarah Palin pr0n by copying her somewhat distinctive shaggy bob hairstyle and big glasses.

What AI can do is something similar to the "green suit" skinning sometimes used to create movie monsters: "wrapping" the actually photographed actor with a "skin" created from pictures of a celebrity.
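
For the curious, the crude face-onto-body-double compositing described above can be sketched with stock OpenCV calls. This is a minimal illustration only, under assumptions: it uses OpenCV's bundled Haar-cascade face detector, the file names are hypothetical, and there is no error handling (it assumes one face is found in each image). It shows the old manual paste-and-blend, not the AI "skinning":

    # Minimal sketch of naive face-onto-body-double compositing.
    # Assumes opencv-python and numpy; all file names are hypothetical.
    import cv2
    import numpy as np

    # Stock Haar-cascade face detector that ships with OpenCV.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    face_img = cv2.imread("source_face.jpg")   # hypothetical input
    body_img = cv2.imread("body_double.jpg")   # hypothetical input

    # Take the first detected face rectangle in each image
    # (no error handling: assumes a face is found in both).
    fx, fy, fw, fh = detector.detectMultiScale(
        cv2.cvtColor(face_img, cv2.COLOR_BGR2GRAY))[0]
    bx, by, bw, bh = detector.detectMultiScale(
        cv2.cvtColor(body_img, cv2.COLOR_BGR2GRAY))[0]

    # Resize the source face to fit the target face region and
    # blend it in with Poisson (seamless) cloning.
    face_crop = cv2.resize(face_img[fy:fy + fh, fx:fx + fw], (bw, bh))
    mask = np.full(face_crop.shape, 255, dtype=face_crop.dtype)
    center = (bx + bw // 2, by + bh // 2)
    composite = cv2.seamlessClone(face_crop, body_img, mask,
                                  center, cv2.NORMAL_CLONE)
    cv2.imwrite("composite.jpg", composite)

Generative models replace this crude paste-and-blend with a learned rendering of the whole image, which is why, as Balfegor notes above, the skill required has dropped so sharply.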

Tina Trent said...

We will get outraged enough when someone makes deepfake porn featuring Nicholas Kristof. Perhaps this will unite us as a nation.

Ampersand said...

The deepfakes phenomenon, grotesque though it is, is being used to propose vague, overbroad laws that will try to expand the right of publicity from a restriction on commercial use to a restriction on communicative use generally. Be careful how you legislate to deal with this. A lot of state legislators lack legal sophistication.

There is already a civil action available for false light invasion of privacy. It prohibits publication of words or images that show the plaintiff in a way that would be highly offensive to a reasonable person, where defendant knew or acted in reckless disregard as to the falsity of the portrayal and the false light in which plaintiff would be placed.

Jupiter said...

Oh, this is Karpetbag Kristoff, back at his desk in the Lying Weasel Kingdom of New York. Well, that's different. Kristoff is doing his white-knight schtick. So the operative word here is "nonconsensual." The problem is that the deepfakes are nonconsensual. I won't go into the technicalities of it all; I leave that to trained journalists like Kristoff. But to boil it down, in an admittedly oversimplified fashion: when the deepfake is nonconsensual, we call it a "nonconsensual deepfake."

Now, this requires some special thinking skills, because the deepfake is a purely digital object. It exists as a bit-pattern, stored on a binary storage medium, or transmitted serially over a telecommunications channel. And neither the bits, nor the bytes, nor the storage media, nor the telecommunications channel, are normally considered as having the ability to "consent" to their employment. They are tools, machines. They are used as their makers would use them.

The consent in question, therefore, involves a third, or perhaps a fourth or fifth party, one who is not present anywhere in the business, and most likely is not even aware that it is taking place. But that person is a woman, which, of course, brings into play the well-known principle that a woman has a right to have it both ways. Even Ketanji Brown Jackson, who does not know whether she is a woman, is, in fact, a woman, and therefore entitled to have it both ways. As, indeed, I am sure she does. Routinely.

So fellas, don't try to figure it out. Don't try to get to the bottom of the thing. You want to know how it works? It works both ways. Have some gal womansplain it to you.

effinayright said...

JK Brown said:

"The flappers wore thin dresses, short-sleeved and occasionally (in the evening) sleeveless; some of the wilder young things rolled their stockings below their knees, revealing to the shocked eyes of virtue a fleeting glance of shin-bones and knee-cap; and many of them were visibly using cosmetics."
************

In one of his novels set in the 1920s, Sinclair Lewis commented that a certain daring woman exposed her "nice ankles."

Back when their women all wore kimonos, Japanese men used to get excited getting a peek at the nape of a lady's neck.

It's a long way from there to bukkake. (if you need to look that up, don't!)

Aggie said...

"Writes Nicholas Kristof, in "The Online Degradation of Women and Girls That We Meet With a Shrug" (NYT)."

Is the problem that we're not getting up in arms about Google directing traffic to sordid porn and fake nudity sites and asking the government to intervene (yet again), or is the problem that we're giving a shrug and saying 'who gives a sh*t? I don't go there, I don't do that, so it's not a problem.' It would seem that, either way, Kristof is upset with people's reactions, not the original condition.

Smilin' Jack said...

“I realize that giving out that information will cause more people to produce that material, but how else are those of us who loathe this sort of thing going to get outraged enough to have any effect?”

You’re not. What with global warming, systemic racism, and the very existence of Donald Trump, most people aren’t going to have enough outrage left to even muster a shrug for deepfakery.

PM said...

The techs get rich appealing to our baseness.

The Vault Dweller said...

If we are talking about tech-company involvement in addressing this, I don't want search engines hiding information or programs. I certainly don't want governments banning programs like that, or the fake but convincingly real-looking images they generate. I wouldn't mind, and would in fact like it, if companies like Twitter, Meta, Instagram, Snapchat, etc. just banned all pornography on their sites. It certainly wouldn't hurt their advertising potential.

When it comes to school kids doing this, it should be treated as a serious offense and punished accordingly by the schools. But that is specifically in relation to children. I don't want the government involved in regulating it for others, even if it is disgusting and offensive.

In the 2016 election I recall an artist creating a nude and unflattering sculpture of Trump and posing it in public. Lots of people laughed at it. A little bit later, presumably a different artist created a nude and unflattering sculpture of Hillary Clinton. Some people were not very happy with that. One woman even screamed in rage in public and attempted to destroy the sculpture.

This isn't a problem that government can or should try to fix. This is a cultural problem. We would be better off if our ideas regarding sex and sexuality returned to a more circumspect and modest understanding. These kinds of things aren't like a sound engineer's board, full of tiny knobs and dials where we can make individual, discrete, and isolated changes. Life usually comes as a package deal. It isn't unusual that many wouldn't feel it is beyond the pale to create a faked nude image of another person, given much of society's detachment from the older ideas regarding sex and sexual modesty.

Tina Trent said...

One way to stop crime is to stop defunding police.

Former Illinois resident said...

My spam folder was once a cache of solicitations from work-related businesses. Now suddenly, after 20+ years, it is filling up with porn emails. I've never been on a porn site, and I'm the wrong demographic. I too blame Google.