[Update] More information is coming out about the study, its methods, and the organizations involved that call into question not just the ethics, but the legality of the study. You can read the new information in this story by Victoria McNally.
Facebook has been caught in a scandal after it was discovered that in 2012 the company turned 689,003 users into unknowing participants in a psychological experiment. It's sneaky and arguably unethical, but surprisingly, I think I'm actually on Facebook's side on this one.
For the experiment, Facebook tweaked some users' news feeds to show more negative posts, and others' to show more positive posts, to see how that affected what those users posted themselves. Users who saw fewer negative posts were more positive in their own posts, and vice versa.
The study, titled “Experimental evidence of massive-scale emotional contagion through social networks,” was published in The Proceedings of the National Academy of Sciences of the United States back in March of this year.
Facebook and other large tech companies are dealing with a bit of a creep factor as people become more aware of the scope of the data being collected on us, and secret psychology experiments certainly aren't going to help with that. That said, unlike most of the reactions I've seen to this story, I think I'm fine with Facebook doing this sort of thing. After all, we've all agreed to it in the incredibly long, incomprehensible Terms of Service we all lied about having read.
It's a trade-off. Facebook is a free service that lets us stay in touch with people from almost literally anywhere in the world (and they're working on making it literally the whole world). In exchange, we let them collect data on how we use the service. We've all come to accept that the primary use of that data is to target ads to us, but it's not the only thing it can be used for.
Facebook has more than a billion users. That’s a huge potential N for any scientific study. Why not take advantage of it? Particularly given that this specific experiment was to better understand how the use of social media affects people.
With great power comes great responsibility, and it would be very easy for Facebook to cross a real ethical line with this type of thing. But as for this specific experiment, I think it might actually be one of the best possible uses of Facebook.
(via Business Insider, image via Facebook)
- Facebook wants to use drones to give the Internet to everyone
- Minecraft developer killed Oculus Rift integration because Facebook bought it
- Facebook’s automatic tagging software has near-human levels of facial recognition