Meta Wants to Fix Wikipedia’s Biggest Problem Using AI


Despite the efforts of over 30 million editors, Wikipedia sure ain’t perfect. Some information on Wikipedia lacks a genuine source or citation—as we learned with the Pringle Man hoax, this can have a wide-ranging impact on culture or “facts.” But Meta, formerly Facebook, hopes to solve Wikipedia’s big problem with AI.

Note: To be clear, this is an independent project by researchers at Meta AI, a division of the Meta corporation. The Wikimedia Foundation is not involved, and Wikipedia isn’t using SIDE to automatically update articles.

As detailed in a blog post and research paper, the Meta AI team built a dataset of over 134 million web pages to train a citation-checking AI called SIDE. Using natural language processing, SIDE can analyze a Wikipedia citation and determine whether it actually supports the claim it’s attached to. It can also find new sources for information already published on Wikipedia.

[Image: An example of how SIDE can fact-check and suggest new citations on Wikipedia. Credit: Meta AI]

Meta AI highlights the Blackfoot Confederacy Wikipedia article as an example of how SIDE can improve citations. If you scroll to the bottom of this article, you’ll learn that Joe Hipp was the first Native American to compete for the WBA World Heavyweight Title—a cool fact that is 100% true. But here’s the problem: whoever added this fact cited a source that has nothing to do with Joe Hipp or the Blackfeet Tribe.

In this case, Wikipedia editors failed to check the veracity of a citation (the problem has since been fixed). But if the editors had SIDE, they could have caught the bad citation early. And they wouldn’t need to look for a new citation, as SIDE would automatically suggest one.

At least, this is the hypothesis put forth by Meta AI researchers. While SIDE is certainly an interesting tool, we still can’t trust AI to understand language, context, or the veracity of anything published online. (To be fair, Meta AI’s research paper describes SIDE as more of a “demonstration” than a working tool.)

Wikipedia editors can now test SIDE and assess its usefulness. The project is also available on GitHub. For what it’s worth, SIDE looks like a super-powered version of the tools that Wikipedia editors already use to improve their workflow. It’s easy to see how such a tool could flag citations for humans to review, at the very least.

Source: Meta AI

Andrew Heinzman
Andrew is the News Editor for Review Geek, where he covers breaking stories and manages the news team. He joined Life Savvy Media as a freelance writer in 2018 and has experience in a number of topics, including mobile hardware, audio, and IoT.