
Facebook Defends Algorithm-driven Reality

As regulators circle tech giants, Facebook explains how the social media platform works and argues users now have more control.

CANBERRA, Australia — Accused of being a Doomsday machine that fuels hate and manipulation, Facebook has posted a defense that it is a global force for democracy.

People are not powerless victims or playthings of technology, Nick Clegg, Facebook’s vice president of global affairs, told readers in a post on Medium, a platform with a claimed audience of 170 million.

“This is the magic of social media, the thing that differentiates it from older forms of media,” Clegg said. “Machines have not taken over, but they are here to stay.”

There is no editor dictating the front-page headline that millions of people might read on Facebook.

Instead, there’s a “rich feedback loop” and billions of front pages, each personalized to individual tastes and preferences and each reflecting a unique network of friends, pages, and groups.

“This is a dramatic and historic democratization of speech. Political and cultural elites are confronting a raucous online conversation that they can’t control, and many are understandably anxious about it,” he said.

But with regulators in Australia and elsewhere turning up the heat, Clegg conceded that ground rules are needed.

He acknowledged that social media companies must also come clean about how their algorithms work.

“Tech companies need to know the parameters within which society is comfortable for them to operate so that they have permission to continue to innovate.”

Facebook’s recent decision to stop recommending civic and political groups to users in the United States is now being expanded globally.

Facebook is also considering how to reduce the amount of political content in newsfeeds in response to “strong feedback” from users that they want to see less of it.

“People, not machines, make the algorithms that shape our online reality,” said an official from Facebook.

Facebook claims it is mindful of potential bias and harm.

“The reality is, it’s not in Facebook’s interest – financially or reputationally – to continually turn up the temperature and push users towards ever more extreme content,” Clegg said.

Last year’s #StopHateForProfit boycott, organized by civil rights groups, saw more than 1,000 companies stop paying for ads on Facebook.

It was an expensive lesson. “The vast majority of Facebook’s revenue is from advertising, and advertisers don’t want their brands and products displayed next to extreme or hateful content,” Clegg said.

According to officials, Facebook was forced to curb its live-streaming feature in response to global criticism after footage of the Christchurch mosque attacks in New Zealand was freely shared.

“It would be better if these decisions were made according to frameworks agreed by democratically accountable lawmakers. But in the absence of such laws, there are decisions that need to be made in real-time,” said Clegg.

“The internet needs new rules for the road that can command broad public consent.”

(Edited by Amrita Das and Pallavi Mehra)
