Last July, there was a media firestorm when Facebook was caught manipulating user emotions with news feed experiments. The data science team wanted to understand how positive and negative stories affected the way users interacted with content and with Facebook itself. Since the uproar, a Facebook spokesman confirmed to VentureBeat that the company has, in fact, stopped some experiments, including a massive study of whether the social network could encourage voting among its young audience.
In 2010, one of the nation’s leading political scientists, Professor James Fowler of the University of California, San Diego, conducted a massive experiment to see if placing an “I voted” counter at the top of the newsfeed had a measurable impact on election turnout. Since the “I voted” counter was randomized and the team had access to actual voter records, they could reliably see how exposure to certain messages influenced voting.
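The design is a textbook randomized experiment: randomly assign users to see the counter or not, then compare turnout between the two groups. A minimal sketch of that logic in Python, using made-up numbers (the 37 percent baseline turnout here is an illustrative assumption, not a figure from the study):

```python
import random

random.seed(42)

def simulate_turnout(n_users, base_rate, lift):
    """Simulate a randomized 'I voted' banner experiment.

    base_rate: assumed turnout without the banner (hypothetical).
    lift: assumed treatment effect of seeing the banner (hypothetical).
    """
    results = {"treatment": [], "control": []}
    for _ in range(n_users):
        # Random assignment is what makes the comparison causal:
        # the two groups differ only in whether they saw the banner.
        group = random.choice(["treatment", "control"])
        rate = base_rate + (lift if group == "treatment" else 0.0)
        results[group].append(1 if random.random() < rate else 0)
    return results

def turnout_rate(votes):
    return sum(votes) / len(votes)

results = simulate_turnout(n_users=100_000, base_rate=0.37, lift=0.022)
effect = (turnout_rate(results["treatment"])
          - turnout_rate(results["control"]))
print(f"Estimated effect: {effect:.3f}")
```

With enough users, the difference in turnout between the arms recovers the assumed lift; matching exposure logs against actual voter records is what let the real study do this comparison at scale.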
As a former political scientist, I would argue that this is the most important academic study to date on the value of social media in democracy. Facebook managed to boost turnout by 2.2 percent, which is very large by traditional election standards; previous research found that this is nearly the entire effect of the 2008 Obama campaign. If that figure seems small, it's because most campaigns acknowledge that elections are won at the margins; they spend millions just trying to get a few thousand more voters to the polls.
Facebook continued to conduct similar experiments through the 2012 elections. But for the upcoming 2016 election, it has stopped: everyone will see the exact same button (or no button at all). This means Facebook won't know how much it has encouraged people to vote, or whether some messages are more effective than others. In other words, it can no longer innovate.
Facebook’s answer is pretty non-responsive. In a statement, Facebook wrote:
“Voting is a core value of democracy and we believe that encouraging civic participation is an important contribution we can make to the community. We have learned over the past few years that people are more likely to vote when they are reminded on Facebook and they see that their friends have voted. We’re proud of this.”
They continue (shortened for brevity):
“We first offered an ‘I’m a Voter’ button in 2008 to increase awareness about elections in the U.S. We did similar work in 2010 and 2012 and publicly released a paper in the journal Nature about the 2010 results. We plan on publishing an additional paper on the 2012 results. In these elections we have worked to improve people’s experience like we do with every Facebook product.”
The only real explanation I can imagine is that they're scared of the press. It's not that difficult to randomly assign users certain messages; Facebook could replicate its previous work without much effort. Unfortunately, Facebook was front-page news for a week when its more controversial news feed experiment was discovered last July. Since then, it evidently doesn't want to anger critics any further, even with experiments that would probably be quite popular (the 2010 election experiment received moderate fanfare and little criticism).
Facebook’s decision is disappointing. A mission-driven company like Facebook shouldn’t let its loudest critics dictate its decisions.
You can read more about the original report on Facebook’s decision here at Mother Jones.