January 26, 2024

"One image shared by a user on X, formerly Twitter, was viewed 47 million times before the account was suspended on Thursday."

"X suspended several accounts that posted the faked images of Ms. Swift, but the images were shared on other social media platforms and continued to spread despite those companies’ efforts to remove them.... Researchers now fear that deepfakes are becoming a powerful disinformation force, enabling everyday internet users to create nonconsensual nude images or embarrassing portrayals of political candidates."

From "Explicit Deepfake Images of Taylor Swift Elude Safeguards and Swamp Social Media/Fans of the star and lawmakers condemned the images, probably generated by artificial intelligence, after they were shared with millions of social media users" (NYT).

Combining a photo of the head of a famous person with a photo of someone else's body is an old trick. I remember when Jon Stewart did it to the Supreme Court Justices in his book "America (The Book)." From 2004:
Wal-Mart executives deemed photographs of nine naked and aged bodies inappropriate for its shelves this week when the company decided not to offer shoppers copies of the best-selling book “America (The Book),” written by Jon Stewart and “The Daily Show” writers....

“We were not aware of the image that was in the book (when Wal-Mart ordered it) and we felt the majority of our customers would not be comfortable with it,” said Wal-Mart Stores Inc. spokeswoman Karen Burk...

The book — a mock school textbook — parodies the American government in typical “Daily Show” fashion. The page facing the nude photos has cutouts of the justices’ robes, complete with a caption asking readers to “restore their dignity by matching each justice with his or her respective robe.”...

“It’s not gratuitous and it’s very much in tune with the rest of the book,” [said the publisher]. “It’s funny, yet to the point. When you undress the Supreme Court justices, they’re just men and women and you have to judge them on who they are and what they do. It makes you look and think and laugh.... You don’t look at that page to get your rocks off, you do it to laugh”...
Is it more acceptable to laugh at a celebrity's nakedness than to find it beautiful and titillating? Well, that was the debate back then. Jon Stewart was only embarrassing them, not using them to cause sexual excitement.

I must add the news that Jon Stewart is returning to "The Daily Show." But I'll do a separate post about that.

And I have to repeat the Bob Dylan line: "But even the president of the United States sometimes must have to stand naked."

And that requires me to call attention to Biden's penchant for swimming naked... and Theodore Roosevelt's.

End of post.

32 comments:

RideSpaceMountain said...

Sorry, Swifties. You can't stop the signal. Nobody can stop it. It's out. It's in the wild. I love it and I'm laughing my butt off, and if you're not, well, joke's on you.

If you don't, you can take it up with the techbro-coasties who came up with it, 'cause it's gonna get worse... or better... depends on who you are.

Kate said...

You think people are posting deepfakes of Swift because it's beautiful and titillating? Like with Jon Stewart, it's a manipulation. Make the powerful uncomfortable and demean them. Bring them down a peg.

Stewart's detached irony has damaged honest reporting. The Millennials wanted to be cool and just, like he seemed to them. The state of today's press can be traced back to his show.

Lloyd W. Robertson said...

Jonathan Winters: have you ever let your dog watch you when you're naked? With a bird or cat it wouldn't matter, but a dog looks at you ...

Howard said...

I worked at an engineering company in Pasadena in the early 1990s where the drafting department was still doing a lot of old school illustrations. The senior draftsman, Sandro, would take candid photos of people around the office and then put them together in sexually suggestive montages. In one example he had taken a photograph of one of the senior managers taking a big bite out of a hot dog. He was able to pair him up with his boss, the regional manager sitting back in a chair with his eyes closed.

typingtalker said...

But even the president of the United States
Sometimes must have
To stand naked.

Bob Dylan

Readering said...

Photoshopped celebrity porn and nudes have circulated on the internet for years, and in light of the hacking of celebrity accounts containing intimate selfies, some realistic photoshopped images have been more plausible than others. I bet there have been many of Taylor Swift, a celebrity for decades, even as a teen. It will get worse with AI, and at some point, I assume, entire AI porn films will be generated and become a fact of life, even for non-celebrity victims.

I view parodies differently. Didn't Hustler's late owner get into extended litigation with the head of Moral Majority over outrageous images?

re Pete said...

"I hot-footed it . . . bare-naked . . ."

rhhardin said...

For women just compare with other fakes of her.

richlb said...

In the words of Swift herself, "look what you made me do."

Ann Althouse said...

The effort to suppress it is causing more people to look for it, and it will be findable.

A big Streisand effect.

Sorry to add to it, but I can't believe I'm making things any worse.

Larry J said...

AI-generated deepfakes are almost certain to be a prominent feature of this year's election season.

Tom T. said...

Maybe some people will find it titillating, but the intent clearly seems to be bullying. It's meant to send a message to Swift's fans that no matter how successful a woman might be, to some people she's still nothing more than a f*cktoy. I don't think this remotely compares to the Supreme Court justices or a president.

I don't understand the reference to the Streisand effect. I haven't heard Swift or her management say anything about the fakes.

Howard said...

It's impossible to make a nothingburger worse, Althouse.

Leon said...

I'm shocked, shocked I say. Deepfake nudes? Where are they posting these disgusting pictures? That's terrible, they're posting them where anyone can see them. I mean, like, where exactly?


Yes, a bit of the Streisand effect, I'd say. Tons of articles, none of which.....that I can see.... posted even blurred-out shots of what they were talking about. No doubt they'll linger forever in corners of the Internet, but I'll probably never see them.
How much reputational damage they'll ever do is questionable, since we all know it's possible with technology, so skepticism is built in.

AlbertAnonymous said...

If any of you find photos online of me in sexual situations, know that they are fakes.

Lucien said...

When will people get around to saying things were posted on X, rather than "X, formerly Twitter"?
I wouldn't be surprised if Taylor Swift were savvy enough not to have sent any naked photos of herself to anyone; but have none of her romantic partners ever taken any surreptitiously?

CJinPA said...

This one's on us.

We've known deepfakes were coming. They're here. Big Social and Big Government can't protect us from ourselves. Anyone who uses AI to mislead will lose credibility (and hopefully their accounts). But that's punishment after the fact. We'll just have to endure the coming wave. After that, I think it will lose its ability to attract eyeballs.

ALSO: The person who invents an app to quickly ID AI-manipulated images will become rich.

Yancey Ward said...

It won't be long before there are deepfakes of Swift being fucked by a horse that you won't be able to tell are fake. It is a brave new world we are entering.

Yancey Ward said...

Lucien asked:

"When will people get around to saying things were posted on X, rather than "X, formerly Twitter"?

It will be a while, but at some point it will be the majority of such instances. I've seen it more and more over the last few months, with "formerly Twitter" being omitted.

RideSpaceMountain said...

"The person who invents an app to quickly ID AI-manipulated images will become rich."

This already exists. I can't remember its name, but it's an Artificial Intelligence Identifier App (AI/IA). It's not bad, either. I've seen it work.

But it also won't matter.

Jupiter said...

Eh. She's not my type.

Sheridan said...

Howard, was that Parsons or Jacobs? I was at Braun (very old school), and actions like what your Sandro did would have resulted in permanent blacklisting in the industry. He's lucky he wasn't working directly for some of our Middle Eastern clients.

Louise B said...

How can it be bad if Jon Stewart did it? Yes, Wal-Mart protested but only deplorables shop there. All the right people love Jon Stewart and probably supported him, so what's wrong with this stunt again? Just in case anyone is confused, this is sarcasm.

Rabel said...

As I recall, one of Kipling's copybook headings was:

"If it exists, there is porn of it."

Y'all are so innocent.

loudogblog said...

The problem is that the internet is too big and too fast for anything to keep up with it (even AI).

Trying to stop the spread of malicious content is like trying to stop the spread of the common cold. You can mitigate how many people get it, but millions will still get it.

Kevin said...

I honestly don't see what the issue is. The whole point of naked pics or videos of someone is that they depict the person. If you are the viewer, that makes it hotter if you think the person is hot. If you are the person, and you don't consent to the release of the pic or video, then that is a violation of your personal privacy. If you do consent, then no harm no foul.

But if it is a fake naked pic or video, ai or otherwise, then it is not hot to the viewer, because it is fake. 100% of what makes the pic or video hot is that it is real, that you are really seeing what the person looks like naked. Similarly, it is not a violation of the person depicted, because it is not what the person looks like naked in any way. Just as 100% of what makes the pic or video hot is that it is real, 100% of what makes the pic or video a violation is that it is real.

The sole remaining purpose of the pic or video is parody/satire, which is exactly what the Taylor Swift AI pics are. Parody/satire is protected by the 1st Amendment.

chuck said...

There were fakes of Sarah Palin back in the day.

n.n said...

Parameterized libertinism... it's funny, now it's not funny. GenO-uroboros

Howard said...

Sheridan: neither, but a civil engineering firm of national stature. Perhaps in those days people were better at keeping secrets. Lots of office affairs, drinking at lunch, mad scrambling to FedEx at closing time. Those were the days.

Joe Smith said...

What if they were real and the "deepfake" stuff is a ruse?

How do we know?

n.n said...

Inferential defamation.

Bunkypotatohead said...

Eventually there will be faked videos of her singing. The music and lyrics will all be AI. She can live on forever for her fans.
She might even pay to make it happen.