CANBERRA, Australia — Facebook has conceded that the internet needs new road rules but wants to keep its billions of users in the driver’s seat.
Whether governments or private companies tighten the rules of online debate, they shouldn't assume people can't be trusted, the social media giant's top spokesman has warned.
Nick Clegg, vice-president of global affairs, said in an essay published overnight that critics were wrong to claim minds were being controlled by algorithms or the sinister intentions of Big Tech masters.
No editor is dictating the front-page headline that millions of people might read on Facebook.
Instead, there’s a “rich feedback loop” and billions of front pages, each personalized to individual tastes and preferences and each reflecting a unique network of friends, pages, and groups.
The social media platform has 2.80 billion monthly active users and 1.84 billion who use it every day.
“This is a dramatic and historic democratization of speech,” Clegg said.
“Political and cultural elites are confronting a raucous online conversation that they can’t control, and many are understandably anxious about it.”
But, feeling the heat from regulators in Australia and elsewhere, Clegg concedes ground rules are needed.
He knows social media companies must also come clean about how algorithms work.
“And tech companies need to know the parameters within which society is comfortable for them to operate so that they have permission to continue to innovate,” Clegg said.
“The reality is, it’s not in Facebook’s interest — financially or reputationally — to continually turn up the temperature and push users towards ever more extreme content.”
The vast majority of Facebook’s revenue is from advertising, and advertisers don’t want their brands and products displayed next to extreme or hateful content.
Last year’s #StopHateForProfit boycott by civil rights groups saw more than 1,000 companies stop paying for ads on Facebook.
Users forced Facebook to curb its streaming feature in response to global criticism after material from the Christchurch mosque attacks in New Zealand was freely shared.
At the time, world leaders said the fact that Facebook had to remove 1.5 million copies of footage of a gunman killing 51 people was a stark reminder that the platform needed to do more to stop such material spreading.
Facebook’s recent decision to stop recommending civic and political groups to users in the United States is now being expanded globally.
The decision was made in September 2020 when Mark Zuckerberg wrote a Facebook post saying, “The US elections are just two months away, and with Covid-19 affecting communities across the country, I’m concerned about the challenges people could face when voting.”
“I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country.
This election is not going to be business as usual. We all have a responsibility to protect our democracy. That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”
Facebook is also considering reducing the amount of political content in users' news feeds in response to "strong feedback" from users who want to see less of it.
(Edited by Amrita Das and Ojaswin Kathuria)