Facebook is launching new tools to allow users better control over what they see in their newsfeeds as the company pushes back on claims it is a polarizing platform.
Overnight, the social network operator, which has around 2.7 billion users globally, detailed significant changes that will roll out first to Android users, then iOS and web users.
The changes will let users choose how their feed of followed pages and other users is ordered, currently an algorithm-driven process. That means posts aren’t presented in chronological order but ranked by a series of criteria that can see posts with high engagement – likes, shares and comments – elevated to the top of the newsfeed.
Facebook has weathered significant criticism that this approach, and the lack of transparency into the algorithm settings and how they are regularly tweaked, elevates the content that sparks the most heated responses.
Facebook’s new favourites settings allow you to prioritise what content you see in your newsfeed.
New newsfeed control settings will see a prominent filter feed tab appear, which allows you to toggle between chronological and algorithm-driven ranking of posts. You’ll also be able to select favourites – groups, friends and pages whose posts you want to see first – to prioritise in your feed.
Another change will give greater control over who interacts with your content. Essentially, you’ll be able to decide who can comment on your posts, either allowing everyone who can see a post to comment or restricting commenting to only those who have been tagged by the profile or page in the post. That mirrors a similar move by Twitter, which recently allowed users to control who can reply to their tweets.
That should go a long way to stopping abusive comments appearing on people’s posts and causing them distress.
In a lengthy Medium essay reflecting on the changes, Facebook’s Vice-President for Global Affairs and Communications, Nick Clegg, accepted that the inner workings of the Facebook newsfeed were a mystery to most people and that this needed to change.
“People should be able to better understand how the ranking algorithms work and why they make particular decisions, and they should have more control over the content that is shown to them,” Clegg wrote.
“You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes – to alter your personal algorithm in the cold light of day, through breathing spaces built into the design of the platform.”
Clegg argues that the personalisation aspect of social media, where algorithms deliver content tailored to your interests, is what has made it so much more compelling than traditional media, where an editor decided what was prioritised on the front page of newspapers and websites.
But he adds that the black box of Facebook’s algorithmic settings has created a gap in understanding, one that has led to “assumptions, half-truths, and misrepresentations” about how Facebook works.
Clegg, the former Deputy Prime Minister of the United Kingdom, nevertheless rejected claims that Facebook has made the world more polarised.
“What evidence there is simply does not support the idea that social media, or the filter bubbles it supposedly creates, are the unambiguous driver of polarization that many assert,” he wrote on Medium.