
Facebook's psychological experiments

 
 
Reply Thu 3 Jul, 2014 05:55 am
Facebook's psychological experiments and influencing elections
Shamelessly copied from PDiddie's blog

Perhaps you heard? About that Facebook mind control thingie? If it were clever satire, it would make a great Hollywood script. Except it's not.

Facebook’s disclosure last week that it had tinkered with about 700,000 users’ news feeds as part of a psychology experiment conducted in 2012 inadvertently laid bare what too few tech firms acknowledge: that they possess vast powers to closely monitor, test and even shape our behavior, often while we’re in the dark about their capabilities.

The publication of the study, which found that showing people slightly happier messages in their feeds caused them to post happier updates, and sadder messages prompted sadder updates, ignited a torrent of outrage from people who found it creepy that Facebook would play with unsuspecting users’ emotions. Because the study was conducted in partnership with academic researchers, it also appeared to violate long-held rules protecting people from becoming test subjects without providing informed consent. Several European privacy agencies have begun examining whether the study violated local privacy laws.

It's cool, though. The NYT tech blogger says there's nothing to worry about and that we should welcome our new overlords. Except for this part.

In another experiment, Facebook randomly divided 61 million American users into three camps on Election Day in 2010, and showed each group a different, nonpartisan get-out-the-vote message (or no message). The results showed that certain messages significantly increased the tendency of people to vote — not just of people who used Facebook, but even their friends who didn’t.

Zeynep Tufekci, an assistant professor at the School of Information and Library Science at the University of North Carolina, points out that many of these studies serve to highlight Facebook’s awesome power over our lives.

“I read that and I said, ‘Wait, Facebook controls elections,’ ” she said. “If they can nudge all of us to vote, they could nudge some of us individually, and we know they can model whether you’re a Republican or a Democrat — and elections are decided by a couple of hundred thousand voters in a handful of states. So the kind of nudging power they have is real power.”

Okay then. I feel calmer already.

How much do you think Facebook might charge... say, a well-heeled politico like Greg Abbott to "promote posts" that could swing an election his way? A few million bucks? More than that?

Would it be money better spent than advertising on Fox News? I would have to think so, since that's a captive (and already well-manipulated) audience. Not much fresh ore to be mined there.

Sort of gives pause to the traditional 'grassroots organizing' effort, doesn't it?

Oh well, I'll think about that after I level up in Candy Crush. After all, my desire to be well-informed is currently being overwhelmed by my desire to remain sane.
Posted by PDiddie at Thursday, July 03, 2014

 
edgarblythe
Reply Thu 3 Jul, 2014 08:13 am
Also in PDiddie's blog

In another experiment, Facebook randomly divided 61 million American users into three camps on Election Day in 2010, and showed each group a different, nonpartisan get-out-the-vote message (or no message). The results showed that certain messages significantly increased the tendency of people to vote — not just of people who used Facebook, but even their friends who didn’t.

Zeynep Tufekci, an assistant professor at the School of Information and Library Science at the University of North Carolina, points out that many of these studies serve to highlight Facebook’s awesome power over our lives.

“I read that and I said, ‘Wait, Facebook controls elections,’ ” she said. “If they can nudge all of us to vote, they could nudge some of us individually, and we know they can model whether you’re a Republican or a Democrat — and elections are decided by a couple of hundred thousand voters in a handful of states. So the kind of nudging power they have is real power.” http://www.nytimes.com/.../the-bright-side-of-facebooks...

maxdancona
Reply Thu 3 Jul, 2014 08:43 am
@edgarblythe,
I think this story is being a little over-hyped.

There is an issue here, but words like "mind-control" and claims of election rigging are a little exaggerated (the election study found only a small effect). The claim that certain messages "significantly" increased people's tendency to vote is a gross exaggeration.

There is an interesting line here. Every good web company does research. A very common form of testing is "A-B" testing, where different information is displayed to different groups of customers to see which version leads users to behave differently.
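To make that concrete, here's a minimal sketch in Python of how a stable A-B split is often assigned (the hash scheme and names are my own illustration, not any particular site's code):

```python
import hashlib

def ab_bucket(user_id: str, experiment: str = "feed_layout_test") -> str:
    """Deterministically assign a user to variant 'A' or 'B'.

    Hashing (experiment, user_id) together means the same user always
    lands in the same bucket, and different experiments split users
    independently of one another.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same visitor gets the same variant on every request:
print(ab_bucket("user-12345"))
```

Because the assignment is a pure function of the user and the experiment, no per-user state has to be stored to keep the split stable.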

This is a little different in that Facebook is very big and holds particularly personal information. A discussion about ethics is a very good thing.

But this isn't mind control, nor is it a threat to modern democracy... at least no more than previous forms of mass media, such as television.


boomerang
Reply Thu 3 Jul, 2014 12:17 pm
@edgarblythe,
I think it's creepy as hell but apparently when you agree to Facebook's terms of service, you agree to participate in such experiments.

People don't care. The ability to update others on what they like and had for lunch trumps all ethical considerations they might have about Facebook.

maxdancona
Reply Thu 3 Jul, 2014 01:01 pm
@boomerang,
How would you feel if a site like Able2Know did A-B testing?

This would mean that they would change some feature for half of us but leave the other half with the existing page, to see whether the feature affects how we post.

This is a very standard type of testing that many (if not most) websites do, and it can lead to better websites. This is not exactly what Facebook did, but the lines get blurry awfully quickly.

boomerang
Reply Thu 3 Jul, 2014 04:06 pm
@maxdancona,
I suppose how I'd feel about it would depend on whether I were informed about the nature of the experiment and whether I could opt out. I would also like to know the results of any experiment I participated in.

maxdancona
Reply Thu 3 Jul, 2014 05:18 pm
@boomerang,
In an A-B test, people are rarely consulted. You simply present one version of the website to one group and another version to another, and see which one performs better (whatever your definition of "perform" is).

boomerang
Reply Thu 3 Jul, 2014 10:42 pm
@maxdancona,
Do companies that do these types of tests consider that more than one user of the same computer might visit their sites? If they don't, their information is pretty meaningless.

For example -- me, Mo and Mr. B all use the same Amazon account. There isn't any way they could present us all with different versions of their site so I'm thinking we'd really muck up their psychographics.

In such instances, if only I could find what I wanted, they would lose sales to the rest of my family, who would be stuck with a version of the site limited to my tastes.

But perhaps I'm misunderstanding why a company would limit its site for any user.

I'm really curious and would love for you to clarify.

Facebook is so user specific that the creepiness factor of this kind of testing goes up, in my opinion.

I admit I'm kind of creeped out by some of the things our machines know about us. I posted kind of a jokey thread the other day about how my phone, unprompted, told me how much time it would take me to drive to the dog park. Obviously my phone recognized that I drove there every morning even though I'd never asked it to calculate the time it would take to get there.

Finn dAbuzz
Reply Thu 3 Jul, 2014 11:42 pm
@edgarblythe,
I don't think I'm all that different from other people when it comes to reading terms of service and software licenses, which is to say I don't.

I know they could contain a number of provisions with which I would not be happy, but I assume they don't contain anything egregious. I base this assumption on the belief that enough people do read these documents that, if there were something egregious, it would be made known. As a matter of fact, every once in a while I get a Facebook message from a "friend" informing me of some Facebook practice that's perceived as objectionable, with instructions on how to deal with it. These sorts of posts don't usually originate with my "friend," and it's clear they get widely circulated through "sharing." How effective this method of protecting myself from hidden terms and conditions is, I don't know, but it's obviously enough to quiet any fears I have, because I'm using Facebook, this forum and a number of software programs without ever having actually read the TOS or license.

I admit that I don't welcome the news of these experiments, and it does violate my (obviously unfounded) belief that my homepage (or whatever it's called) is accessible only to those to whom I've given permission, not to Facebook employees or psychologists they've paid or who paid them. However, in the final analysis it's not enough of an invasion of privacy for me to stop using Facebook.

It's certainly not mind-control, and I doubt they can do anything more than nudge you into doing something you were already inclined to do. Wendy Davis isn't going to get me to vote for her no matter what games might be played on Facebook, and I doubt she advertises on MSNBC, not because its viewers are not easily manipulated but because they are so few.

If they try something really dastardly, it probably won't work and it will eventually be made public, although Facebook is probably better at destroying e-mail than the IRS.

roger
Reply Fri 4 Jul, 2014 12:41 am
It's often been said that if you don't pay for an internet service and they profit from your information, you are not the customer. You are the product.

Finn dAbuzz
Reply Fri 4 Jul, 2014 12:44 am
@roger,
What has often been said is true, and particularly so with Facebook.

edgarblythe
Reply Fri 4 Jul, 2014 07:47 am
@roger,
They - Facebook - ply one with so many commercials that they have to be getting rich without such subversive actions.

maxdancona
Reply Fri 4 Jul, 2014 08:38 am
@boomerang,
Yes, companies do consider that more than one user of the same computer might visit their sites. It doesn't matter to them. In the case of these tests, companies don't care about you as an individual, they care about very large groups in aggregate.

I did exactly this type of testing (as a programmer) for a travel site. This travel site understands its business. Almost all people coming to the site are already planning to take a trip, and they already know where and what they want. People are looking for bargains, but the truth is that all of these travel sites use the same database of flights and they all offer the same trips for the same price. Which site customers use to buy their trips doesn't really matter to the customer... although of course it matters a great deal to the travel site.

One test I worked on involved moving the "Buy Now" button from the bottom of the screen to the top. All we cared about was the percentage of the large number of people who came to the page who clicked that button. If we can increase the percentage of people who click that button by 3 or 4%, it is worth literally millions of dollars to the company.

The experiment we did is obvious. When someone comes to the site we assign them to group A or group B, and we leave a cookie on their computer so we can be consistent if they come back. Group A gets the button on the bottom. Group B gets the button on the top. Over the period of the test, about a million people come to the page. If Group B generates more clicks, then we move the button. If Group A generates more clicks, then we keep it where it is.
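A rough sketch of that assignment logic in Python (using Flask purely for illustration; the cookie name, markup and 30-day lifetime are my inventions, not the travel site's actual code):

```python
import random
from flask import Flask, request, make_response

app = Flask(__name__)

def render_page(group: str) -> str:
    # Hypothetical markup: group B gets the "Buy Now" button on top.
    button = "<button>Buy Now</button>"
    results = "<div>...flight results...</div>"
    return button + results if group == "B" else results + button

@app.route("/")
def landing():
    # Reuse the group stored in the visitor's cookie so returning
    # visitors see a consistent page; otherwise assign one at random.
    group = request.cookies.get("ab_group")
    if group not in ("A", "B"):
        group = random.choice(["A", "B"])
    resp = make_response(render_page(group))
    resp.set_cookie("ab_group", group, max_age=30 * 24 * 3600)  # ~30 days
    return resp
```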

This experiment is valid because it is asking the question in aggregate. We are looking for a percentage improvement among hundreds of thousands of people... but this is exactly what we want to measure, because this percentage of the aggregate translates directly into money.
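To put rough numbers on that (these figures are made up for illustration): with about half a million visitors in each group, even a 3% relative lift in click-through rate is decisively significant by a standard two-proportion z-test.

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """z-statistic for the difference between two click-through rates."""
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)   # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (clicks_b / n_b - clicks_a / n_a) / se

# Hypothetical counts: 4.0% vs 4.12% click-through, 500,000 visitors each.
z = two_proportion_z(clicks_a=20_000, n_a=500_000,
                     clicks_b=20_600, n_b=500_000)
print(f"z = {z:.2f}")   # ~3.0; |z| > 1.96 means significant at the 5% level
```

That is the sense in which a tiny per-user effect can still be a "significant" result: the sample is enormous.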

Consider Finn's election example. The people trying to get Wendy Davis elected are certainly relying on this type of experiment on aggregates of people. Like the travel site, they aren't looking at Finn as an individual. The fact that Finn wouldn't vote for Davis to save his own mother doesn't matter to them. They care about "Texas voters" as a group. What they need is 50.01% of however many of Texas's roughly 13 million voters actually turn out.

And so anything they can do to get a few more percentage points of votes for Davis is worth doing. The experiments they rely on are targeted exactly for that purpose.

(Yes, I do realize I have just undercut my own argument in a previous post).

edgarblythe
Reply Sun 6 Jul, 2014 10:20 am
Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real.

In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook's antifraud measures. In the end, no users lost access permanently.

The experiment was the work of Facebook's Data Science team, a group of about three dozen researchers with unique access to one of the world's richest data troves: the movements, musings and emotions of Facebook's 1.3 billion users.

The little-known group was thrust into the spotlight this week by reports about a 2012 experiment in which the news feeds of nearly 700,000 Facebook users were manipulated to show more positive or negative posts. The study found that users who saw more positive content were more likely to write positive posts, and vice versa.

Facebook Chief Operating Officer Sheryl Sandberg said Wednesday during a trip to India that the study was "part of ongoing research companies do to test different products" and was "poorly communicated."

The company said that after the feedback on the study, "We are taking a very hard look at this process to make more improvements."

Until recently, the Data Science group operated with few boundaries, according to a former member of the team and outside researchers. At a university, researchers likely would have been required to obtain consent from participants in such a study. But Facebook relied on users' agreement to its Terms of Service, which at the time said data could be used to improve Facebook's products. Those terms now say that user data may be used for research.

"There's no review process, per se," said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. "Anyone on that team could run a test," Mr. Ledvina said. "They're always trying to alter peoples' behavior."


He recalled a minor experiment in which he and a product manager ran a test without telling anyone else at the company. Tests were run so often, he said, that some data scientists worried that the same users, who were anonymous, might be used in more than one experiment, tainting the results.

Facebook said that since the study on emotions, it has implemented stricter guidelines on Data Science team research. Since at least the beginning of this year, research beyond routine product testing is reviewed by a panel drawn from a group of 50 internal experts in fields such as privacy and data security. Facebook declined to name them.

Company research intended to be published in academic journals receives additional review from in-house experts on academic research. Some of those experts are also on the Data Science team, Facebook said, declining to name the members of that panel.

A spokesman said Facebook is considering additional changes.

Since its creation in 2007, Facebook's Data Science group has run hundreds of tests. One published study deconstructed how families communicate; another delved into the causes of loneliness. One test looked at how social behaviors spread through networks. In 2010, the group measured how "political mobilization messages" sent to 61 million people caused people in social networks to vote in the 2010 congressional elections.


Many of Facebook's data scientists hold doctoral degrees from major universities in fields including computer science, artificial intelligence and computational biology. Some worked in academic research before joining Facebook.

Adam Kramer, the lead author of the study about emotions, said in a 2012 interview on Facebook's website that he joined the company partly because it is "the largest field study in the history of the world." Mr. Kramer, who has a doctorate in social psychology from the University of Oregon, said that in academia he would have had to get papers published and then hope that someone noticed. At Facebook, "I just message someone on the right team and my research has an impact within weeks, if not days."

Much of Facebook's research is less controversial than the emotions study, testing features that will prompt users to spend more time on the network and click on more ads. Other Internet companies, including Yahoo Inc., Microsoft Corp., Twitter Inc. and Google Inc., conduct research on their users and their data.

The recent ruckus is "a glimpse into a wide-ranging practice," said Kate Crawford, a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media and a principal researcher at Microsoft Research. Companies "really do see users as a willing experimental test bed" to be used at the companies' discretion.


Facebook's team has drawn particular interest because it occasionally publishes work that touches on users' personal lives in academic journals, including the study about positive and negative posts.

"Facebook deserves a lot of credit for pushing as much research into the public domain as they do," said Clifford Lampe, an associate professor at the University of Michigan's School of Information who has worked on about 10 studies with Facebook researchers. If Facebook stopped publishing studies, he said, "It would be a real loss for science."

Dr. Lampe said he has been in touch with members of the Data Science team since the controversy erupted. "They've been listening to the arguments and they take them very seriously," he said.

Mr. Ledvina, the former Facebook data scientist, said some researchers debated the merits of a study similar to the one that accused users of being robots, but there was no formal review, and none of the users in the study were notified that it was an experiment.

"I'm sure some people got very angry somewhere," he said. "Internally, you get a little desensitized to it."

Finn dAbuzz
Reply Sun 6 Jul, 2014 11:10 am
@edgarblythe,
It's easy to forget that Facebook's product is data: the personal data of its users.

I don't know the history of the thing (never saw the movie), but if it began as a means for people to communicate with one another, that's not what made Zuckerberg a very wealthy nerd.

They are forever asking me to update my profile to "help me find new friends" or whatever the BS explanation is. I guess the pretense isn't completely without foundation, but it's clear that their motivation is to gather more product to sell. I'm fine with this, but I think Facebook could be a little more transparent about how they execute their business model.

Whether or not they should be is another question. Caveat emptor applies here, especially since users are not actually buying anything, at least not in the sense of paying money for goods or services. The "coin" of the Facebook transaction is personal data, though, and it shouldn't be spent with blind faith in the seller.

They're pretty clever with how they go about their business. How indicating your favorite books, movies or television shows actually "enhances" your Facebook experience is beyond me, but they realize that for some reason most people can't resist creating these lists. They also realize that for some reason people can't resist expressing what they like by using the "Like" feature. I'm sure this feature alone provides them a digital ton of valuable information. I'm also sure they have had very serious policy discussions about providing a "Dislike" feature.

Since data is their product, it only makes good sense that they've hired "Data Scientists," and it shouldn't be surprising that "Data Scientists" in conjunction with "Data Marketers" are going to come up with new and creative ways to collect, collate and sell data. There's been a lot of superficial talk about the "Information Age" replacing the "Industrial Age," but Facebook is a perfect, albeit somewhat shallow, example of the commercial manifestation of this transition. There are products or means with much greater depth, such as 3D printers, which are having and will have a much greater impact on the way we live.

I would prefer to do business (which is, after all, what you are doing when you use Facebook) with a more ethical company, but their ethical shortcomings aren't a deal breaker for me, or for millions like me. This, of course, is the commercial downside to their little experiments and how they are communicated: piss off or scare enough of the producers of your product and your source dries up. I don't see this happening, though. They have a bit of a PR problem that they will likely get over.
