Yesterday, The Verge reported that over the past six months there was a "massive ranking failure" in Facebook's algorithm, which caused potential "integrity risks" for half of all News Feed views.
The information comes from a group of engineers who discovered the failure last October. They said that when a wave of misinformation flowed through the News Feed, the system promoted those posts instead of suppressing their spread; as a result, misinformation appearing in the News Feed got 30% more views. Strangely, Facebook could not find the root cause at first and could only monitor how the affected posts behaved. The surge faded after a few weeks at the top, and Facebook engineers were finally able to fix the bug on March 11.
Facebook's internal investigation showed that the algorithm could not keep posts from being displayed even when they contained nudity, violence, and the like. Internally, the bug was assigned a SEV level; SEV stands for site event. This is not the worst thing that can happen to the system, though: while this was a level-one bug, there is also a level-zero SEV reserved for the most dramatic emergencies.
Facebook has already officially admitted the bug. Meta spokesperson Joe Osborne said that the company "detected inconsistencies in downranking on five separate occasions, which correlated with small, temporary increases to internal metrics."
According to the internal documents, this technical issue was first introduced in 2019, but it did not have a noticeable impact until October 2021. "We traced the root cause to a software bug and applied needed fixes," said Osborne. "The bug has not had any meaningful, long-term impact on our metrics."
Downranking As The Best Way To Show You The Content You Want
For the time being, Facebook hasn't explained what kind of impact the bug had on the content displayed in the News Feed. What we do know is that downranking is the mechanism Facebook's algorithm uses to improve the quality of the News Feed, and the company has been actively developing this approach, making it the primary way content is ranked and displayed.
In some sense, this is a logical approach: posts about wars and controversial political stories tend to rank highly, yet in most cases they don't comply with Facebook's terms and could be banned. In other words, as Mark Zuckerberg explained back in 2018, downranking fights the instinct to gravitate toward "more sensationalist and provocative" content: "Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average – even when they tell us afterwards they don't like the content."
Downranking "works" for so-called "borderline" content too. That is, Facebook demotes not only content that comes close to violating its rules but also content its AI system flags as likely violating. Facebook's AI system checks all posts; later, the relevant employees review the flagged "violating" content themselves.
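Facebook has not published its actual scoring logic, but conceptually downranking just reduces a post's ranking score when a classifier flags it as borderline or likely violating. Below is a minimal, purely illustrative sketch in Python; the names (base_score, violation_probability) and the threshold and demotion values are assumptions for the example, not Facebook's real parameters.

```python
from dataclasses import dataclass

# Illustrative values -- not Facebook's real thresholds.
BORDERLINE_THRESHOLD = 0.7   # classifier score above which a post counts as "borderline"
DEMOTION_FACTOR = 0.2        # fraction of its original score a demoted post keeps


@dataclass
class Post:
    post_id: str
    base_score: float             # engagement-based ranking score
    violation_probability: float  # hypothetical classifier output in [0, 1]


def ranked_feed(posts: list[Post]) -> list[Post]:
    """Sort posts by score, demoting those the classifier flags as borderline."""
    def effective_score(post: Post) -> float:
        if post.violation_probability >= BORDERLINE_THRESHOLD:
            # Downranking: the post stays visible but sinks far down the feed.
            return post.base_score * DEMOTION_FACTOR
        return post.base_score

    return sorted(posts, key=effective_score, reverse=True)


# Example: the borderline post drops below an ordinary one despite higher engagement.
feed = ranked_feed([
    Post("benign", base_score=1.0, violation_probability=0.1),
    Post("borderline", base_score=3.0, violation_probability=0.9),
])
print([p.post_id for p in feed])  # ['benign', 'borderline']
```

In these terms, the bug The Verge describes would be roughly equivalent to the demotion branch boosting a flagged post's score instead of reducing it, which is why suspected misinformation ended up getting more views rather than fewer.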
Last September, Facebook shared a list of the kinds of content it downranks, but it didn't provide any information on how demotion affects the distribution of that content. We hope that in the future Facebook will be more transparent about downranked content in the News Feed.
Does Your News Feed Show Content You Like?
At the same time, Facebook doesn't miss an opportunity to show off how its AI systems work, arguing that the algorithm gets better each year and successfully fights hate speech and the like. Facebook is convinced that technology is the best way to moderate content at this scale. For instance, in 2021 it officially announced that it would downrank all political content in the News Feed.
Many would agree that there was no malicious intent behind this recent ranking failure. But the incident shows that any web-based platform, and the algorithms it uses, should be as transparent as possible.
"In a large complex system like this, bugs are inevitable and understandable," said Sahar Massachi, a former member of Facebook's Civic Integrity team. "But what happens when a powerful social platform has one of these accidental faults? How would we even know? We need real transparency to build a sustainable system of accountability, so we can help them catch these problems quickly."