
Facebook sort-of apologizes for treating users like lab rats

Facebook says there are a few things about its experiment on users' emotional states that it "should have done differently."
Like maybe obtain informed consent from people before you modulate their newsfeeds to show them sadder/madder/gladder content in an effort to determine whether emotional states are contagious?
Well, no, not exactly.
In a blog post on Thursday, Facebook Chief Technology Officer Mike Schroepfer said that the company was "unprepared" for the ruckus stirred up at the end of June over its emotional contagion research, that Facebook has taken the comments and criticism to heart, and that aspects of the research could, and should, have been handled differently:
For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.
For example, instead of controlling users' expression of emotion by keeping positive or negative items out of their newsfeeds, as Facebook did in the 2012 experiment that upset users and ethicists, it could have used the methodology presented in another recent research article that Schroepfer linked to.
In that article, the researchers describe a hands-off approach to studying emotional synchrony as it related to gloomy weather:
Instead of changing the user's emotion directly with an experimental treatment, we let rainfall do the work for us by measuring how much the rain-induced change in a user's expression predicts changes in the user's friends' expression.
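To make that quoted approach concrete, here is a minimal sketch of the idea on synthetic data: rainfall acts as an external nudge to a user's mood, and the rain-driven portion of the user's expression is then used to predict a friend's expression. The code, column names, and numbers below are illustrative assumptions, not anything taken from the actual study.

```python
# A rough sketch of the "let rainfall do the work" idea: rain is an external
# nudge to a user's mood, so the rain-driven part of the user's expression can
# be used to predict a friend's expression without manipulating anyone's feed.
# All data, names, and coefficients here are made up for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Synthetic data: whether it rained where the user lives, the share of the
# user's posts scored as negative, and the same share for a friend living
# somewhere it did not rain.
rain = rng.binomial(1, 0.3, n)
user_neg = 0.20 + 0.05 * rain + rng.normal(0, 0.05, n)        # rain nudges the user
friend_neg = 0.20 + 0.30 * user_neg + rng.normal(0, 0.05, n)  # "contagion" effect to recover

df = pd.DataFrame({"rain": rain, "user_neg": user_neg, "friend_neg": friend_neg})

# Stage 1: how much does rain shift the user's own expression?
stage1 = sm.OLS(df["user_neg"], sm.add_constant(df["rain"])).fit()
df["rain_induced_neg"] = stage1.predict(sm.add_constant(df["rain"]))

# Stage 2: does the rain-induced part of the user's expression predict the
# friend's expression? A positive slope is the contagion signal.
stage2 = sm.OLS(df["friend_neg"], sm.add_constant(df["rain_induced_neg"])).fit()
print(stage2.params)
```

In effect this is a two-stage, natural-experiment setup, with rain standing in for the treatment that Facebook instead applied directly to people's newsfeeds.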
Following the brouhaha, Facebook said it plans to change its research methodology in the following ways:
  • Guidelines: Researchers are to be given clearer guidelines, and future studies dealing with "deeply personal" topics, such as looking at particular groups or people of a certain age, will go through an "enhanced review process" before research can begin.
  • Review: Facebook's created yet another cross-functional panel of people to weigh in on research, including senior researchers, engineers, lawyers, and the company's privacy and policy teams.
  • Training: Facebook has added research education to its six-week training bootcamp for new engineers, as well as adding it to the annual privacy and security training all Facebook staffers go through.
  • Research website: Facebook's academic research is now available in one location on its research website.
Was Facebook's "we'll do things differently" post an apology?
I hesitate to call it that, given that it didn't address the aspect of the research that ethicists found most problematic: namely, that Facebook's experiment on nearly 700,000 users' newsfeeds was done without asking those people whether they wanted to be part of the study.
But it did acknowledge one thing that was sorely lacking in the 2012 study: transparency.
...we failed to communicate clearly why and how we did it.
However, telling people why and how you did the research (after the fact) is not the same thing as asking for their informed consent beforehand.
What do you think: does Facebook's apology work for you?
