Give it up, Facebook
by Jelani Drew-Davi
In the lead-up to the 2020 election, Facebook has spent more energy trying to mend their broken public image than fixing the mayhem happening on their platform.
To take the most recent example, Facebook is “upset” about how it was portrayed in the Netflix documentary The Social Dilemma. The documentary isn’t perfect: it’s overwhelmingly white and creates an easy redemption arc for the “tech bros” who make the problematic platforms we use.
But The Social Dilemma is not wrong when it shows that Facebook puts users in harm’s way by favoring conspiracies over facts, ad dollars over human dignity.
Instead of addressing these serious and legitimate criticisms, Facebook decided to simply refute The Social Dilemma, claiming their platform isn’t so bad. Their wildest claim is that we are not the product. Facebook says it is “funded by advertising so that it remains free for people.” But if we’re not the product, then what exactly is Facebook selling to advertisers? Why else would they pay for space on the platform?
Our data and behavior do not have to be directly sold for us to become a product. Facebook has collected our information and behaviors to build their entire ads infrastructure, something that companies, organizations, and political campaigns alike deem necessary to their success. And it pays off: 99% of Facebook’s revenue comes from advertising. Whether or not Facebook calls us “products,” the reality is that we make or break Facebook’s business model.
Facebook says its algorithm “keeps the platform relevant and useful,” and compares it to the ones used by Netflix. The difference is that Netflix recommends the new season of Grey’s Anatomy; it doesn’t decide the fate of U.S. elections or suggest that I join a conspiracy theory group. The algorithm that Facebook built makes inexplicable decisions like censoring onions (yes, onions!) because they’re deemed “overly sexualized.”

The Verge, in its latest Facebook exposé, outlines how compensation at Facebook is tied to employees’ ability to boost engagement in Facebook products. Sensational and fringe content, especially from the right wing, ends up getting far better engagement. MIT professor and author Sinan Aral calls this the “hype machine.” “They give you little dopamine hits, and … get you riled up,” he says. “That’s why I call it the hype machine. When posts provoke emotional responses, we’re more likely to engage or share them, even if our emotions are anger or fear.”

This matters: according to CNBC, U.S. political campaigns spent $264 million on Facebook ads in the third quarter of this year alone.
Which brings us to the problems that Facebook’s algorithms actively exacerbate: polarization and the spread of misinformation. We already know that Facebook’s algorithms are built to favor extreme content; the company even fired an employee who was collecting evidence of that bias. The more people Facebook pushes toward fringe topics and conspiracy groups, the deeper polarization will grow on the platform, spilling over into our elections and our lives offline.
“We fight fake news, misinformation, and harmful content, using a global network of fact-checking partners,” Facebook says. The truth is that it has a mis- and disinformation problem. Facebook’s COO Sheryl Sandberg told NPR that the company wants to “make sure people don’t get bad information.” Yet its response to lies about the validity of mail-in ballots is nothing more than a small link to its own Voting Information Center. If election mis- and disinformation is a roar, Facebook’s labeling solution is a squeak.
Facebook’s business model is a threat not only to interpersonal relationships but to our very democracy. With only days left until the election, Facebook should stop spending precious time on PR and make some real changes.