Facebook does have terms of service — ones that every Facebook user has agreed to — that specify users’ data may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” The researchers behind this psychology experiment argue that it falls under these terms of use because “no text was seen by the researchers.” Rather, a computer program scanned for words that were considered either “positive” or “negative.”
“As such,” the researchers write, “it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”
June 28, 2014
"Facebook Conducted Psychological Experiments On Unknowing Users."
"The study... found that... manipulating the algorithm to show more 'positive' posts in your news feed will actually inspire you to write more 'positive' posts yourself."
17 comments:
You can unknow things.
I at least partly agreed with three of the choices. (Your chances of a job with Pew are slim and none.)
On 1, I see social media sites as pretty abnormal customer service businesses, and the research is only desirable if it is truly aimed at providing better customer service. On its face, this research seems to be aimed at making Facebook more enjoyable.
On 2, I don't know if sneaky and creepy are the right words. It's a different use of our information than a retailer who keeps track of our purchases since we give that information directly to the retailer. I might go with sneaky, but not creepy.
On 3, anyone who thinks anything on Facebook is private doesn't deserve much respect.
I voted for 4, because I absolutely agree with everything after the "but."
What part of a published paper qualifies as "internal operations"? Research was listed as an example of the kind of internal operations that are allowed, not as an independent thing.
So, did their algorithm consider the word "impeachment" positive or negative? This type of research seems to be more a means of social engineering.
Bob R wrote: "On its face, this research seems to be aimed at making Facebook more enjoyable."
I doubt that. It's much more likely that the research was aimed at finding out ways of increasing the number of member posts/shares. Increasing the number of posts and shares increases the value of Facebook as an ad platform. I would bet my last dollar that they collected all sorts of information during this particular experiment, and they decided that publishing the most "feel good" part of the research would be good public relations.
Advertisers have been conducting psychological experiments on "unknowing users" for many, many years. In the old days, when print ads and catalogs were dominant, it was quite common to place an order over the phone and be asked to supply a code number from the ad or the catalog. That was to help the advertiser learn which colors, word choices, etc. made the most effective ads.
The advent of online media has made this even more common. Advertisers routinely set up multiple versions of online campaigns to gain insights into prospect behavior, and they use the results to build future campaigns.
Never forget that the purpose of ads is, in essence, to modify/influence the viewer's behavior. The difference between ad testing and traditional psychology research may be as arbitrary as "one requires informed consent, while the other doesn't."
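The mechanism described above is, in essence, a split test: assign each visitor to one ad variant, record responses, and compare rates. A minimal sketch, with made-up variant names and response probabilities that do not reflect any real campaign:

    # Illustrative split-test sketch (variants and rates are placeholders).
    import random

    VARIANTS = ["ad_A", "ad_B"]
    results = {v: {"shown": 0, "responded": 0} for v in VARIANTS}

    def assign_variant(user_id):
        # Deterministic assignment so a returning visitor sees the same ad.
        return VARIANTS[user_id % len(VARIANTS)]

    def record_impression(user_id, responded):
        v = assign_variant(user_id)
        results[v]["shown"] += 1
        if responded:
            results[v]["responded"] += 1

    # Simulated traffic with invented response rates.
    for uid in range(10_000):
        rate = 0.04 if assign_variant(uid) == "ad_A" else 0.05
        record_impression(uid, random.random() < rate)

    for v, r in results.items():
        print(v, r["responded"] / r["shown"])

The old catalog code numbers served the same purpose as the assignment step here: they told the advertiser which version of the ad produced the order.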
If you aren't the paying customer, you're the product.
So who cranks out all those prefab Deep Thoughts, Snappy Sayings and Clever Retort (to my Invisible Enemies) graphics that people are so fond of sharing? Are those some kind of FB mindfuck?
I wish they'd at least use good grammar and punctuation.
Social activists are conducting psychological experiments on unknowing users. Progress.
Carol said..."So who cranks out all those prefab Deep Thoughts, Snappy Sayings and Clever Retort (to my Invisible Enemies) graphics that people are so fond of sharing?"
Behind every one of those graphics is an agenda, and it often is more than simply trying to attract clicks for advertisers.
For example, Upworthy.com supplies a fair amount of political material that pretends to be apolitical and that gets shared on Facebook. Its carefully worded "About" page talks about being neither Republican nor Democrat, and only interested in the Truth, but its causes are overwhelmingly those of the Left, and its key staff are avowedly from the Left, including a former Executive Director of MoveOn.org. Yay for truthiness!
Anyone still on Facebook? Everyone has moved to Instagram and other more hip social networks.
The most hip site is ______.
(Name omitted for exclusivity.)
Facebook users aren't the customers. They are the product.
This is at the bottom of the ThinkProgress story:
By clicking and submitting a comment I acknowledge the ThinkProgress Privacy Policy and agree to the ThinkProgress Terms of Use. I understand that my comments are also being governed by Facebook, Yahoo, AOL, or Hotmail’s Terms of Use and Privacy Policies as applicable, which can be found here.
So when you leave a comment, they can continue to do research. w00t!
This is what happens when you're a lefty, and you draw your opinions from the Correct Ideology, as represented by the Honored Group Consensus. It's a fucking hall of mirrors, man!
See my post in the comments at the link to this story on instapundit.com
"What Facebook did was…"
Why I don't use Facebook….
I'm a big fan of the idea of "sousveillance", i.e. using social platforms and other means to turn the tables on the snoops.
Facebook is easy to manipulate with nonsense posts and spam (don't say or post anything about what you really think or believe), and of course anyone with an IQ higher than a single digit would avoid Facebook on principle anyway.
Let them do research using garbage input, and watch the Progressive agenda spin even farther away from reality!