Facebook alters fact-checking controls for US users

Facebook is giving users more control over their feed. Photo: Tobias SCHWARZ / AFP
Source: AFP

Meta-owned Facebook has handed US users control over how fact-checked content appears in their feeds, a potentially significant move that the platform says gives them more power over its algorithm but that some analysts warn could benefit purveyors of misinformation.

For years, Facebook's algorithm automatically moved posts lower in the feed if they were flagged by one of the platform's third-party fact-checking partners, including AFP, reducing the visibility of false or misleading content.

Under a new "content reduced by fact-checking" option that now appears in Facebook's settings, users have flexibility to make debunked posts appear higher or lower in the feed or maintain the status quo.

Fact-checked posts can be made less visible with an option called "reduce more." That, according to the platform's settings, means the posts "may be moved even lower in feed so you may not see them at all."


Another option, labeled "don't reduce," has the opposite effect, moving debunked content higher in the feed and making it more likely to be seen.

"We're giving people on Facebook even more power to control the algorithm that ranks posts in their feed," a Meta spokesman told AFP.


"We're doing this in response to users telling us that they want a greater ability to decide what they see on our apps."

Meta rolled out the fact-checking option in May, leaving many users to discover it for themselves in the settings.

It comes amid a hyperpolarized political climate in the United States that has made content moderation on social media platforms a hot-button issue.

Conservative US advocates allege that the government has pressured or colluded with platforms such as Facebook and Twitter to censor or suppress right-leaning content under the guise of fact-checking.


On Tuesday, a federal court in Louisiana restricted some top officials and agencies of President Joe Biden's administration from meeting and communicating with social media companies to moderate their content.

Separately, misinformation researchers from prominent institutions such as the Stanford Internet Observatory face a Republican-led congressional inquiry as well as lawsuits from conservative activists who accuse them of promoting censorship -- a charge they deny.

'Exposure to misinformation'

The changes on Facebook come ahead of the 2024 presidential vote, when many researchers fear political falsehoods could explode across social media platforms. The move has also prompted concern from some analysts that it could be a boon for misinformation peddlers.

"Downranking content that fact-checkers rate as problematic is a central part of Facebook's anti-misinformation program," David Rand, a professor at the Massachusetts Institute of Technology, told AFP.

"Allowing people to simply opt out seems to really knee-cap the program."


Meta downplayed the concerns, saying it will still attach labels to content found to be misleading or false, making clear that it has been rated by one of its third-party fact-checkers. The company said it was exploring whether to expand the control to other countries.

"This builds on work that we've been doing for a long time in this area and will help to make user controls on Facebook more consistent with the ones that already exist on Instagram," Meta's spokesman said.

Aside from this control, Facebook is also allowing users to decide the degree to which they want to view "low quality content," such as clickbait and spam, and "sensitive content," including violent or graphic posts, on the platform.

The impact of the changes, analysts say, is only likely to be known over time when more users -- especially those who distrust professional fact-checkers -- start tweaking their settings.


Fact-checkers, who cannot review every post on the mammoth platform, routinely face an avalanche of online abuse from people who dispute their ratings -- sometimes even from those peddling blatantly false or misleading information.

"Someone who dislikes or distrusts the role of fact-checkers could use it to try to avoid seeing fact-checks," Emma Llanso, from the Center for Democracy & Technology, told AFP.

Facebook, she said, should research and test whether the control increases or decreases users' "exposure to misinformation" before rolling it out more widely around the world.

"Ideally they should share the results of that kind of research in an announcement about the new feature," Llanso added.
