
Citizen Scientists Fighting Misinformation

A new project called Public Editor invites citizen scientists to work together to fight misinformation and vet the credibility of news. 

In an era of social media bots, deepfakes and “alternative facts,” reliable news is more important than ever. Now, a citizen science project called Public Editor is asking volunteers to help suss out credible news through online analysis. With guidance and support, Public Editor volunteers evaluate sections of news articles for “reasoning errors” and other issues.

The project takes its name from the public editor role at news publications, a position usually responsible for supervising an outlet’s journalistic ethics and, at times, acting as a liaison to the public. Public Editor has adopted aspects of this traditional post and digitized them with the help of citizen science and artificial intelligence.

“The most rewarding aspect of this work is hearing from volunteers who so appreciate being empowered to confront this massive and growing problem of misinformation,” says Public Editor’s co-director, Nick Adams. “The harmful effects seem to be compounding as bogus thinking accelerates both the spread of COVID and the erosion of our common bonds as a national community.”

Where Did Public Editor Come From?

Public Editor, an independent nonprofit, began as a collaboration between members of the nonprofit Goodly Labs and the Institute for Data Science at the University of California, Berkeley. Goodly Labs’ goal is to offer collaborative tech and experiences to benefit society. Adams is also the founder and chief scientist of Goodly Labs and CEO of Thusly, which provides the TagWorks artificial intelligence technology that Public Editor uses for “collaborative annotation.” Public Editor’s other co-director is physicist Saul Perlmutter, who won the 2011 Nobel Prize in Physics for his team’s discovery of the universe’s accelerated expansion.

The pair began work on the concept for Public Editor back in 2015. Along the way, they enlisted a design team that includes cognitive scientists, journalists, science educators, sociologists, librarians, psychologists, software engineers and other experts.

How Public Editor Works

Public Editor’s underlying TagWorks technology was designed to facilitate collaborative annotation and crowdsourced analysis. Public Editor uses that tech to combine volunteers’ responses about bias and misinformation, then uses the prevailing judgments of the group to create a final credibility score for an article.
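
For a concrete, if simplified, picture of that aggregation step, here is a minimal sketch in Python. The function names, label names and scoring rule are illustrative assumptions made for this article, not Public Editor’s actual pipeline.

```python
# A minimal sketch of weighted crowd consensus; all names and the scoring
# rule are illustrative assumptions, not Public Editor's actual pipeline.
from collections import defaultdict

def consensus_labels(annotations):
    """annotations: iterable of (volunteer_weight, passage_id, label) tuples."""
    votes = defaultdict(lambda: defaultdict(float))
    for weight, passage_id, label in annotations:
        votes[passage_id][label] += weight  # trusted volunteers count for more
    # Keep the prevailing (highest total weight) label for each passage.
    return {pid: max(tally, key=tally.get) for pid, tally in votes.items()}

def credibility_score(labels, problematic=("unsupported_claim", "loaded_language")):
    """Toy score: the share of passages whose consensus label is unproblematic."""
    if not labels:
        return 1.0
    ok = sum(1 for label in labels.values() if label not in problematic)
    return ok / len(labels)

annotations = [
    (0.9, "p1", "sound_reasoning"),
    (0.2, "p1", "unsupported_claim"),
    (0.8, "p2", "loaded_language"),
]
print(credibility_score(consensus_labels(annotations)))  # -> 0.5
```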

In a sense, TagWorks takes the nonprofit team’s expertise and translates it into tools that anyone can use. In Public Editor’s words, experts use TagWorks “to set up an assembly line of simple annotation tasks that non-experts can complete.”

The citizen science project breaks contributors up into different roles. Some Public Editor volunteers serve as “triagers” to select article passages for review, and others serve as specialist editors. These specialist editors learn to label “praiseworthy or problematic” use of evidence and language, discussion of probabilities, and reasoning within a passage. Through a set of nine assessment tasks, volunteers can identify over 40 different types of reasoning errors.


Take Part: Join Public Editor and Help Fight the Spread of Misinformation


On the Public Editor site, new volunteers are invited to start out with what the project calls a Quoted Sources Specialist training video and set of activities, then move on to other specialties like Reasoning or Argument Relevance Specialist.

In early stages of Quoted Sources Specialist tasks, volunteers are asked to highlight areas of text “to indicate how the quotes/paraphrases were gathered by the reporter.” For example, a reporter might reference a person, website or book as the source of a quote. The volunteer would highlight the specific words containing that reference.
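
To make the task concrete, here is a hypothetical record of what one such highlight might capture. The field names and example sentence are assumptions for illustration, not Public Editor’s real data format.

```python
# A hypothetical shape for one highlighting task result; the schema and the
# example sentence are illustrative assumptions, not Public Editor's own.
from dataclasses import dataclass

@dataclass
class SourceHighlight:
    passage_id: str
    start: int        # character offset where the highlight begins
    end: int          # character offset where it ends (exclusive)
    source_kind: str  # e.g. "person", "website", "book"

text = "According to a 2021 report on ClimateWatch.org, emissions fell."
# The volunteer highlights the words naming the source of the paraphrase:
h = SourceHighlight("article-42:p3", text.find("a 2021 report"), text.find(","), "website")
print(text[h.start:h.end])  # -> a 2021 report on ClimateWatch.org
```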

After a task, the system asks the user how hard the task was, and how confident they are in their answers. “Do your best, but don’t stress,” the directions state. “Other people are also completing these tasks, and the [Public Editor] system will find the consensus among your best efforts.”

“[No] one can jump straight into Public Editor and immediately have an impact on an article’s labels or scores,” Adams says. “Training is required, and an individual user’s weight in the system rises over time with good performance.” As they progress, volunteers can earn virtual badges and laptop stickers. Public Editor answers volunteer questions and builds community in a Team Forum that also hosts online and in-person parties, and connects volunteers through location-based groups.
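
As a rough illustration of that idea, the sketch below nudges a volunteer’s weight toward 1 each time they agree with the group consensus and toward 0 when they don’t. The update rule and constants are assumptions, not the project’s actual formula.

```python
# An assumed moving-average reputation update, not Public Editor's real rule.
def updated_weight(weight, agreed_with_consensus, lr=0.05):
    """Nudge a volunteer's influence up on agreement with the consensus,
    down on disagreement, keeping it between 0 and 1."""
    target = 1.0 if agreed_with_consensus else 0.0
    return min(1.0, max(0.0, weight + lr * (target - weight)))

w = 0.1  # new volunteers start with little influence on labels and scores
for agreed in (True, True, True, False, True):
    w = updated_weight(w, agreed)  # weight drifts upward with good performance
```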

To see a Public Editor-annotated article complete with labels and scores, check out some of their samples.

Addressing Bias in Public Editor

Bias and inequity are long-standing problems in STEM fields, and artificial intelligence continues to struggle with issues of gender, race and other identities. These are enduring challenges for many tech organizations, as well as for citizen science projects that crowdsource information.

Public Editor has grappled with bias proactively, through intentionally designed systems and a team trained to detect and address user bias. For example, articles are presented without a news outlet or author name, but if a volunteer is still more critical of some news sources than others, Public Editor temporarily reduces the weight of their judgments and provides extra training.
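
A toy version of that outlet-bias check might look like the following. The detection rule, threshold and penalty are illustrative assumptions rather than Public Editor’s real protocol; note that outlets are hidden from volunteers but known to the system.

```python
# An assumed outlet-bias detector; the rule, threshold and penalty are
# invented for illustration and are not the project's actual protocol.
from statistics import mean

def outlet_bias(severity_by_outlet):
    """severity_by_outlet: {outlet: [one volunteer's severity ratings]}.
    Returns the largest gap between an outlet's mean rating and the
    volunteer's overall mean, a rough signal of systematic harshness."""
    means = {o: mean(r) for o, r in severity_by_outlet.items() if r}
    overall = mean(means.values())
    return max(abs(m - overall) for m in means.values())

def adjusted_weight(weight, severity_by_outlet, threshold=0.3, penalty=0.5):
    """Temporarily down-weight a volunteer who is reliably harsher on some
    outlets than others; the real project also assigns extra training."""
    return weight * penalty if outlet_bias(severity_by_outlet) > threshold else weight
```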

Adams says eliminating bias is impossible. “[But] especially once a person has completed many tasks, it is easy to see how they are systematically out of alignment with their teammates,” he says. “We even include some questions and answer choices in the system that are especially likely to reveal individuals’ ideological and cognitive biases.” Public Editor can then share those biases with users and encourage them to correct for them in the future.

The group has also tried to construct its training process to prevent trolls from infiltrating the system. Someone with unhelpful intentions would have to work for hours before their judgments would carry weight. At that point, changes in contributor behavior would be flagged, and the account might be suspended.
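
One way to picture that safeguard is to compare an established volunteer’s recent agreement with the group against their longer track record and flag a sudden collapse. The sketch below is an illustrative assumption, not the project’s actual detector.

```python
# An assumed behavior-change flag, not Public Editor's actual detector.
def behavior_shift(past_agreement, recent_agreement, drop=0.4):
    """Flag an established account whose agreement with the group suddenly
    collapses, e.g. a volunteer who built up weight and then turned troll."""
    return past_agreement - recent_agreement > drop

print(behavior_shift(past_agreement=0.9, recent_agreement=0.2))  # -> True
```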

But, if a majority of volunteers share a similar bias, how could the system find it?

There are overlapping protocols in place to counteract user bias. But Adams says the team would have to use alternate means if the Public Editor community were ever very homogeneous, such as more than 85 percent from a single intersectional group (like “young, conservative, middle-class women of European ancestry”). “[The system] would have no information indicating that its aggregate perspective was failing to represent perspectives of the broader population,” he says. “That is something our team would discover through other methods like surveying our teammates.”
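
In code, that kind of homogeneity check could be as simple as the sketch below; the group labels and the way the 85 percent figure is applied are illustrative assumptions.

```python
# An assumed homogeneity check inspired by the 85 percent example above.
from collections import Counter

def too_homogeneous(volunteer_groups, threshold=0.85):
    """volunteer_groups: one intersectional-group label per volunteer.
    True when a single group dominates past the threshold, so aggregate
    judgments may no longer represent the broader population."""
    counts = Counter(volunteer_groups)
    return max(counts.values()) / len(volunteer_groups) > threshold

print(too_homogeneous(["A"] * 90 + ["B"] * 10))  # -> True: one group is 90%
```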

Adams says the best way to prevent “harmful bias” is through recruitment.

“We are inviting a diversity of people into Public Editor: folks identifying with a variety of cultural, ideological, class, heritage and age groups,” he says. Diversity can help individual biases “average out” if or when Public Editor teammates don’t correct for their own biases.

Using Citizen Science to Fight Misinformation

Adams also says diversity makes the project’s consensus judgments more credible because, for example, when people from multiple perspectives identify the same reasoning flaw, “[We] feel much more confident that the flaw is real and problematic.” He says Public Editor seeks to be inclusive and is especially excited to work with people who’ve been historically underrepresented in public discourse. “Some of these communities are being targeted right now by misinformation campaigns seeking to discourage their political participation,” he says.

Along with its core founding institutions, Public Editor is working with a handful of partner organizations, and it also collaborates with external fact checkers. The nonprofit is open to working with individuals, and private and public groups. Public Editor plans to launch a Chrome browser extension so that readers can use its evaluations in real time. They suspect approval from Google could come soon.

Volunteers who join Public Editor today can immediately address an issue many are passionate about: the search for truth and the creation of a shared understanding of credible information.

“The good news is, every case of misinformation is a teachable moment: a learning experience for Public Editor’s volunteers and newsreaders,” Adams says. “So, we can get our arms around it, keep it in check, and use it to help everyone become more discerning and thoughtful.”

About the Author

Julia Travers writes about science, tech, art and creative responses to adversity. Her work can be found at NPR, APR and Earth Island Journal, among other publications. Find her on Twitter @traversjul.
