Facebook has a Big Tobacco Problem


Facebook’s problems are more than a temporary bad PR issue. Its behavior contributes to a growing negative view of the entire tech industry.

In 1996, it took the tremendous courage of one whistleblower to expose the wrongdoings of the Big Tobacco company Brown & Williamson, which artificially maintained the public’s addiction to cigarettes. It also helped to have robust media support, despite Big T’s intimidation. (Read Marie Brenner’s journalistic masterpiece, The Man Who Knew Too Much.)

Today, things unfold much more quickly, with a cohort of people publicly denouncing the effects of tech on our children (their own kids are safely shielded from addictive devices, which is not the case for those growing up in trailer parks).

Nearly every week, we see Silicon Valley execs or funders voicing their concerns about the toxicity of our tech-dominated society — especially our addiction to social media, and Facebook in particular.

A few months ago, Chamath Palihapitiya, former VP for user growth at Facebook, said he felt “tremendous guilt” about his past work (watch on YouTube here):

“I think we have created tools that are ripping apart the social fabric of how society works,” he said, even suggesting we take a “hard break” from social media. “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works. (…) No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem — this is not about Russian ads. This is a global problem.”
Even Sean Parker, who played a major role in the creation of Facebook, had his epiphany.

“[Facebook] literally changes your relationship with society, with each other … It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains. (…) The thought process that went into building these applications, Facebook being the first of them, … was all about: ‘How do we consume as much of your time and conscious attention as possible?’ And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you … more likes and comments.” “It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.” “The inventors, creators — it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people — understood this consciously. And we did it anyway.”

Later, Roger McNamee, who presents himself as one of Zuck’s mentors and is a significant shareholder in the company, said adamantly, “Your users are in peril” (read his open letter here).

The threat, according to McNamee, actually involves the entire tech world, and he referred to an open letter to Apple from the investment firm Jana Partners and the California State Teachers’ Retirement System (CalSTRS) saying Apple must do more to help children fight addiction to its devices.

One of the latest initiatives involves a bunch of Facebook and Google alums who joined the Center for Humane Technology.

Facebook is justifiably concerned by this wave. True to the company’s hyper-centralized culture, its top management hired a full-time pollster to assess the damage inflicted on the image of Mark Zuckerberg himself, and on the company’s other, more human face, COO Sheryl Sandberg.

Facebook has a “Big Tobacco problem”.

The comparison may seem exaggerated, but parallels do exist. Facebook’s management has a long track record of sheer cynicism. Behind the usual vanilla-coated mottos, “bringing people closer together” and “building community”, lies an implacable machine, built from day one to be addictive, thanks to millions of cleverly arranged filter bubbles.

Facebook never sought to be a vector of in-depth knowledge for its users, or to open their minds to a holistic view of the world. Quite the opposite. It encouraged everyone (news publishers, for instance) to produce and distribute the shallowest possible content, loaded with cheap emotion, to stimulate sharing. It fostered the development of cognitive Petri dishes in which people are shielded from any adverse opinion or viewpoint, locking users in an endless feedback loop that has become harmful to democracy. Facebook knew precisely what it was building: a novel social system based on raw impulse, designed to feed an advertising monster that even took advantage of racism and social exclusion (read ProPublica’s investigation on the matter).

Part of this blind frenzy seemed justified by a fierce belief in the absolute superiority of engineering. Most tech companies are convinced that no problem in the world can resist a great team of engineers, and they all compete to hire the best in every discipline. This naive view of the world led to an unabashed superiority complex, compounded by greed and sheer cynicism (it also bred a Lords vs. Serfs culture, but that is another subject).

After the election of Donald Trump, Facebook’s initial attitude was to bluntly deny any involvement in the torrent of misinformation that contributed to the Trump victory. It took a series of solid journalistic investigations to prove the contrary. It is now clear that Facebook, for the sake of short-term profit, turned a blind eye to what was unfolding. Take a look at this video from the BBC, a candid interview with Theresa Hong, who worked on the Trump campaign’s social media operation. She describes, in explicit terms, how Facebook knowingly helped the Trump campaign win.

Theresa Hong, a Trump operative, talking candidly about Facebook’s boost to the campaign (BBC)

This is no accident. We’re talking about a sales team deliberately helping a big customer win a national election, unbeknownst to top management. Supposedly. Was it ignorance, cynicism, or incompetence?

Another example: in the 1990s, when Big Tobacco felt its home market dwindling, the companies decided to stimulate smoking in the Third World. Facebook’s tactics are reminiscent of that. Today, it subsidizes connectivity in the developing world, offering attractive deals to telecom operators in Asia and Africa in exchange for making FB the main gateway to the internet. In India, Facebook went a bit too far with Free Basics, an ill-fated attempt to corner the internet by providing a free or nearly free data plan. Having some experience with Western colonialism, the Indian government rejected the deal (read this superb investigation by the Guardian).

Mark Zuckerberg is not giving up on capturing the global internet experience inside Facebook’s walled garden. Far from it. The Internet.org initiative embodies Zuck’s dream of granting global access to the internet, extolling its benefits for local economies, as recounted by The Guardian:

[Mark Zuckerberg talking:] “There was this Deloitte study that came out the other day, that said if you could connect everyone in emerging markets, you could create more than 100 million jobs and bring a lot of people out of poverty.” The Deloitte study, which did indeed say this, was commissioned by Facebook, based on data provided by Facebook, and was about Facebook.

This digital colonialism is a pattern. Last year, when Facebook decided to test a new feature called Explore, it picked Bolivia, Cambodia, Guatemala, Serbia, Slovakia, and Sri Lanka, a bunch of markets that look like FB’s soft version of Trump’s “shithole countries.”

Facebook is in deep trouble. The kind of trouble that threatens its very existence. Despite its two billion users (half of the global internet population), it is far from having consolidated its service as WeChat did in China. Zuckerberg might not have the time to achieve that goal.

Facebook is not as morally flawed as Big Tobacco was. But it should forgo using similar tactics, even if the so-called “values of Silicon Valley” Facebook wants to embody amount to roughly the relationship a Miami plastic surgeon has with the Hippocratic Oath.
