According to tech site The Verge, misleading or otherwise problematic content was recently mistakenly prioritized in users’ Facebook feeds, thanks to a software bug that took six months to fix.
Facebook disputed the report, which was released Thursday, saying it “vastly overstated what this bug was because ultimately it had no meaningful, long-term impact on problematic content,” according to Joe Osborne, a spokesman for parent company Meta.
But the episode was considered serious enough internally that a group of Facebook employees drafted an internal report citing a “massive ranking failure” of content, The Verge reported.
In October, employees noticed that some content which had been flagged as questionable by outside media, members of Facebook’s third-party fact-checking program, was nonetheless being favored by the algorithm for wide distribution in users’ news feeds.
“Unable to find the root cause, the engineers watched the surge subside a few weeks later and then flare up repeatedly until the ranking issue was fixed on March 11,” The Verge reported.
But according to Osborne, the bug affected only “a very small number of views” of content.
That is because “the overwhelming majority of posts in the feed are not eligible to be down-ranked in the first place,” Osborne explained, adding that other mechanisms designed to limit views of “harmful” content remained in place, “including other demotions, fact-checking labels and violating content removals.”
AFP currently works with Facebook’s fact-checking program in more than 80 countries and 24 languages. Under the program, which began in December 2016, Facebook pays to use fact checks from around 80 organizations, including media outlets and specialized fact checkers, on its platform as well as on WhatsApp and Instagram.
Content rated “false” is downgraded in news feeds so that fewer people see it. If someone tries to share such a post, they are presented with an article explaining why it is misleading.
Those who still choose to share the post receive a notification with a link to the article. No posts are taken down. Fact checkers are free to choose how and what they wish to investigate.