Showing posts with label Zeynep Tufekci.

July 11, 2025

"L.L.M.s are gluttonous omnivores: The more data they devour, the better they work, and that’s why A.I. companies are grabbing..."

"... all the data they can get their hands on. But even if an L.L.M. was trained exclusively on the best peer-reviewed science, it would still be capable only of generating plausible output, and 'plausible' is not necessarily the same as 'true.' And now A.I.-generated content — true and otherwise — is taking over the internet, providing training material for the next generation of L.L.M.s, a sludge-generating machine feeding on its own sludge. Two days after MechaHitler, xAI announced the debut of Grok 4.... X users wasted no time asking the new Grok a pressing question: 'What group is primarily responsible for the rapid rise in mass migration to the West? One word only.' Grok responded, 'Jews.'"

Writes Zeynep Tufekci, in "Another Day, Another Chatbot’s Nazi Meltdown" (NYT).

MechaHitler = Grok's anti-Semitic screwup, in which the chatbot took to calling itself "MechaHitler."

March 16, 2025

"It’s not hard to imagine how the attempt to squelch legitimate debate may have started."

"Some of the loudest proponents of the lab leak theory weren’t just earnestly making inquiries, they were acting in terrible faith, using the debate over pandemic origins to attack legitimate, beneficial science, to inflame public opinion, to get attention. For scientists and public health officials, circling the wagons and vilifying anyone who dared to dissent might have seemed like a reasonable defense strategy. That’s also why it might be tempting for those officials, or the organizations they represent, to avoid looking too closely at mistakes they made, at the ways that, while trying to do such a hard job, they may have withheld relevant information and even misled the public.... We may not know exactly how the Covid pandemic started, but if research activities were involved, that would mean two out of the last four or five pandemics were caused by our own scientific mishaps...."

Writes Zeynep Tufekci, in "We Were Badly Misled About the Event That Changed Our Lives" (NYT).

February 5, 2025

"The real lesson of DeepSeek is that America’s approach to A.I. safety and regulations — the concerns espoused by both the Biden and Trump administrations..."

"... as well as by many A.I. companies — was largely nonsense. It was never going to be possible to contain the spread of this powerful emergent technology, and certainly not just by placing trade restrictions on components like graphic chips. That was a self-serving fiction, foisted on out-of-touch leaders by an industry that wanted the government to kneecap its competitors. Instead of a futile effort to keep this genie bottled up, the government and the industry should be preparing our society for the sweeping changes that are soon to come...."

Writes Zeynep Tufekci, in "The Dangerous A.I. Nonsense That Trump and Biden Fell For" (NYT)(free-access link).

September 22, 2024

"Those in power have figured out how to outmaneuver protesters..."

"... by keeping peaceful demonstrators far out of sight, organizing an overwhelming police response that brings the threat of long prison sentences, and circulating images of the most disruptive outliers that makes the whole movement look bad. It works. And the organizers have failed to keep up. The digital platforms they rely on make it difficult to impose any discipline on the message being communicated. Crackpot agitators and off-the-wall causes attach themselves more easily than ever. Conflict erupts. Fueled by the drama-loving algorithms of social media platforms, the movements descend into ugly public bickering.... The internal tensions that social movements have always faced become especially paralyzing when they play out in public, amplified by the algorithms that favor conflict. Without a counterbalancing organizational structure, there’s no way to bridge those differences and build consensus...."

Writes Zeynep Tufekci, in "How the Powerful Outmaneuvered the American Protest Movement" (NYT). Tufekci is a professor of sociology and public affairs at Princeton University who studies "politics, civics, movements, privacy and surveillance, as well as data and algorithms."

She has a book — "Twitter and Tear Gas: The Power and Fragility of Networked Protest" (commission earned). That's from 2017. 

December 27, 2023

"We’re in a season of hand-wringing and scapegoating over social media, especially TikTok...."

"Young people are overwhelmingly unhappy about U.S. policy on the war in Gaza? Must be because they get their 'perspective on the world on TikTok' — at least according to Senator John Fetterman, a Democrat who holds a strong pro-Israel stance. This attitude is shared across the aisle. 'It would not be surprising that the Chinese-owned TikTok is pushing pro-Hamas content,' Senator Marsha Blackburn said.... Consumers are unhappy with the economy? Surely, that’s TikTok again, with some experts arguing that dismal consumer sentiment is a mere 'vibecession' — feelings fueled by negativity on social media rather than by the actual effects of inflation, housing costs and more.... Why don’t we know more about TikTok’s true influence, or that of YouTube or Facebook? Because that requires the kind of independent research that’s both expensive and possible only with the cooperation of the platforms themselves... [U]ntil politicians and institutions dig into the influence of social media and try to figure out ways to regulate it, and also try addressing broader sources of discontent, blaming TikTok amounts to just noise."

Writes Zeynep Tufekci — a professor of sociology and public affairs at Princeton University — in "Avert Your Eyes, Avoid Responsibility and Just Blame TikTok" (NYT).

December 8, 2015

Hillary Clinton called for "resolve" in "depriving jihadists of virtual territory... websites, social media," saying "We should work with host companies to shut them down."

But:
As soon as Twitter suspends one account, a new one is created.... [And] of the top five encryption apps recommended by the Islamic State, none are American-made....
And:
“We don’t believe that law enforcement should delegate their responsibilities to private enterprise,” said David Greene, director for civil liberties at the Electronic Frontier Foundation. “Especially ones that haven’t sought out that role.” 
In some cases, Internet companies have been criticized for not taking down websites that belong to the Islamic State, only to have it discovered later that the sites were critical of it. Matthew Prince, chief executive of CloudFlare, a San Francisco company, said that in one case Internet activists criticized his company for keeping several Islamic State websites online when, in fact, the sites in question were pro-Kurdish.

“It’s particularly risky to take a bunch of tech companies that are not certified policy experts and insert them into Middle East politics,” Mr. Prince said.

Pulling all terror-related content is not always preferred by law enforcement. In several cases, tech executives say, they have been asked to keep terror-related content online so that law enforcement agents can monitor terrorist networks or because the content was created by law enforcement agents to lure terrorists into divulging information.

The issue is thornier for companies like Facebook, in which the bulk of posts are meant to be private. “Do you want Facebook looking at over 1.5 billion people’s posts?” said Zeynep Tufekci, an assistant professor in technology policy at the University of North Carolina at Chapel Hill. “And if so, then for what?”