Meta is taking all the wrong lessons from X


“Meta has always been a home for Russian, Chinese, and Iranian disinformation,” says Gordon Crovitz, co-CEO of NewsGuard, a company that provides tools for assessing the reliability of online information. Now, Meta has apparently decided to open the floodgates entirely.

To be clear, fact-checking itself is not perfect. NewsGuard has tracked plenty of “false narratives” on Meta’s platforms, Crovitz says. And the Community Notes model Meta is adopting to replace its fact-checking ranks can be somewhat effective. But research by Mahavadan and others has shown that crowdsourced solutions miss a great deal of misinformation. And until Meta commits to full transparency about how its version is implemented and used, it is impossible to know whether the system works at all.

It’s also unlikely that a switch to Community Notes will solve the “bias” problem Meta’s leadership is ostensibly worried about, given that there is little evidence it exists in the first place.

“The driving force behind all of this, both Meta’s policy shift and Musk’s takeover of Twitter, is the claim that social media companies are biased against conservatives,” says David Rand, a behavioral scientist at MIT. “There’s just no good evidence of that.”

In a recent paper published in Nature, Rand and colleagues found that while Twitter users who used pro-Trump hashtags in 2020 were more than four times as likely to be suspended as those who used pro-Biden hashtags, they were also far more likely to have shared “low-quality” or misleading news.

“Just because there’s a difference in who gets sanctioned doesn’t mean there’s bias,” Rand says. “Crowd ratings can do a pretty good job of reproducing fact-checker ratings. . . . You’d still see more conservatives sanctioned than liberals.”

And while X gets outsized attention, in part because of Musk, remember that it is an order of magnitude smaller than Facebook, which counts some 3 billion monthly active users. Rolling out a Community Notes-style system at that scale presents its own challenges. “There’s a reason there’s only one Wikipedia in the world,” Mantzarlis says. “It’s very hard to get something like that off the ground at mass scale.”

As for Meta’s retreat from policing hateful conduct, that is itself an inherently political choice. The platform still allows some things and disallows others. Shifting those boundaries to accommodate bigotry doesn’t mean the boundaries don’t exist; it just means Meta is comfortable with more of it than it was the day before.

Much depends on exactly how Meta’s system works in practice. But between the moderation changes and the revised community guidelines, Facebook, Instagram, and Threads are staring down a world where anyone can say that gay and trans people have a “mental illness,” where AI slop rises ever higher, where outrageous claims spread unchecked, where truth itself is malleable.

You know: just like X.
