Facebook Mobs and Facebook Profits

A Facebook team had a blunt message for senior executives. The company’s algorithms weren’t bringing people together. They were driving people apart. “Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”

That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior? The answer it found, in some cases, was yes. Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms…

But in the end, Facebook’s interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products…

An idea [proposed by those who wanted to reduce polarization at Facebook] was to tweak recommendation algorithms to suggest a wider range of Facebook groups than people would ordinarily encounter.  Building these features and combating polarization could have come, though, at the cost of lower engagement and it was “antigrowth” [meaning less profits for Facebook].

Excerpt from Jeff Horwitz and Deepa Seetharaman, "Facebook Executives Shut Down Efforts to Make the Site Less Divisive," The Wall Street Journal, May 26, 2020
