In a move that's baffling at best and appalling at worst, Facebook has been busted asking users whether they think it's alright for adults to solicit "sexual pictures" from minors on its platform. As ridiculous as this may sound on the surface (and it is), it actually happened.
On Sunday, the social media behemoth sent surveys out to a group of its users with questions on the issue of child grooming, the process of adults befriending children for the purposes of sexual abuse or other nefarious ends like trafficking and prostitution.
“There are a wide range of topics and behaviours that appear on Facebook,” began one of the questions. “In thinking about an ideal world where you could set Facebook’s policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures.”
Respondents' answer options ranged from "this content should be allowed on Facebook, and I would not mind seeing it" to "this content should not be allowed on Facebook, and no one should be able to see it." Survey takers could also indicate that they had "no preference" on the subject.
In a follow-up question, the tech company asked users who the arbiter of such content and behavior should be. Answer options ranged from “Facebook decides the rules on its own” to “Facebook users decide the rules by voting and tell Facebook.” Others involved getting input from outside experts.
Strangely, neither question gave survey takers the option of suggesting that law enforcement be alerted to the situation.
It didn’t take long for the media to catch on. The digital editor for the Guardian, Jonathan Haynes, flagged the issue on Twitter. He got a response from Facebook’s VP of Product, Guy Rosen, who called the inclusion of such questions a “mistake” that shouldn’t have happened:
“We run surveys to understand how the community thinks about how we set policies. But this kind of activity is and will always be completely unacceptable on FB. We regularly work with authorities if identified. It shouldn’t have been part of this survey. That was a mistake.”
A statement from Facebook shared with the media struck a similarly apologetic tone but also contained some defensiveness:
“We sometimes ask for feedback from people about our community standards and the types of content they would find most concerning on Facebook. We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey. We have prohibited child grooming on Facebook since our earliest days; we have no intention of changing this and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice.”
Speaking to the Guardian, British Member of Parliament Yvette Cooper, chair of the Home Affairs Select Committee, roundly condemned Facebook's move:
“This is a stupid and irresponsible survey. Adult men asking 14-year-olds to send sexual images is not only against the law, it is completely wrong and an appalling abuse and exploitation of children. I cannot imagine that Facebook executives ever want it on their platform but they also should not send out surveys that suggest they might tolerate it or suggest to Facebook users that this might ever be acceptable.”
Andy Burrows, associate head of child safety for the National Society for the Prevention of Cruelty to Children, told Newsweek that “Facebook’s decision to crowdsource views on how to deal with a criminal offence is hugely concerning.”
The move and the ensuing backlash come as social media companies face increased pressure to moderate the content on their platforms. Given that context, TechCrunch notes that it's "hard to fathom" what Facebook was thinking with such a survey.
Further, the outlet highlights, the incident suggests that the company would much rather place the responsibility for content moderation on its users:
“The approach also reinforces the notion that Facebook is much more comfortable trying to engineer a moral compass (via crowdsourcing views and thus offloading responsibility for potentially controversial positions onto its users) than operating with any innate sense of ethics and/or civic mission of its own.”
I really look forward to the day that decentralized platforms such as steemit achieve mass adoption and these giants that control what viewers of their platforms see (and think) go extinct.
Why would they even ask this question if it's unacceptable? And why are the answers multiple choice, with their own programmed answers? Was this a deliberate ploy to have a larger and more open presence of fake news organisations and law enforcement on the platform?
I wonder if the questions came from their "community standards review" teams in Morocco and Egypt. What we consider pedophilia and terrorism are entrenched in their religion. Read the Quran sometime www.Quran.com
Nice stretch...Let me guess