
Facebook undermines its own effort to fight fake news


The fact-checkers enlisted by Facebook to help clear the site of “fake news” say the social media giant’s refusal to share information is hurting their efforts.

In December, Facebook promised to address the spread of misinformation on its platform, in part by working with outside fact-checking groups. But because the company has declined to share any internal data from the project, the fact-checkers say they have no way of determining whether the “disputed” tags they’re affixing to “fake news” articles slow — or perhaps even accelerate — the stories’ spread. They also say they lack information that would allow them to prioritize the most important stories among the hundreds available to fact-check at any given moment.

Some fact-checkers are growing frustrated, saying the lack of information is undermining Facebook’s efforts to combat false news reports.

“I would say that the general lack of information — not only data — given by Facebook is a concern for a majority of publishers,” Adrien Sénécat, a journalist at Le Monde, one of the news organizations that has partnered with Facebook to fact-check stories, said in an emailed response.

Representatives from Facebook say that privacy concerns prevent them from sharing raw data with outsiders.

In the wake of November’s election, Facebook CEO Mark Zuckerberg downplayed the amount of fake news on his platform and called it “a pretty crazy idea” that it could have influenced the election. But a month later, under pressure, the company announced a slew of efforts designed to combat the problem, including the arrangement with fact-checkers. “We’re committed to doing our part,” Facebook’s vice president for News Feed, Adam Mosseri, wrote. “We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully.”

Mosseri has publicly characterized those efforts as effective. In April, he said in an address, “We’ve seen overall that false news has decreased on Facebook,” but the company did not provide proof of the claim. “It’s hard for us to measure,” Mosseri added, “because we can’t read everything that gets posted.”

Sara Su, a product manager on Facebook’s News Feed team, told POLITICO that she believes the fact-check program is working: “We have seen data that, when a story is flagged by a third-party fact-checker, it reduces the likelihood that somebody will share that story.” She declined, though, to provide any specific numbers.

Facebook does plan on eventually sharing more information with the fact-checking groups it works with, according to Su, though exactly how much and when remain undetermined. “I wish I could give you dates, but we are committed to working with our fact-checking partners to continue to refine the tools to be more efficient,” she said.

For now, many fact-checkers are taking Facebook’s claims of success with the proverbial grain of salt.

“This is going to sound super corny, but fact-checkers don’t really take anything at face value,” said Alexios Mantzarlis, the director of Poynter’s International Fact-Checking Network. “You need to support with evidence.”

In the United States, Facebook signed up PolitiFact, FactCheck.org, Snopes.com, the AP and ABC News to patrol news on the platform. Since March, users have been able to report stories that seem untrue and send them to a queue for the checkers. Facebook’s algorithms also search for stories that seem bogus, adding them to the queue. If two of the fact-checking groups label a story false, then Facebook slaps a “disputed” tag on it.
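In outline, the process the company describes reduces to a simple rule: stories enter a review queue from user reports or algorithmic detection, and a story is tagged once two independent fact-checking groups rate it false. Below is a minimal sketch of that workflow in Python; every name and structure here (`Story`, `record_rating`, the queue) is a hypothetical illustration, not Facebook’s actual code, which has not been made public.

```python
# Illustrative sketch of the flagging workflow described above. All names are
# hypothetical; Facebook's actual implementation has not been made public.
from dataclasses import dataclass, field

DISPUTE_THRESHOLD = 2  # two fact-checking groups must independently rate a story false


@dataclass
class Story:
    url: str
    false_ratings: set = field(default_factory=set)  # groups that rated it false
    disputed: bool = False


review_queue: list = []  # fed by both user reports and algorithmic detection


def enqueue(story: Story) -> None:
    """A user report or an algorithmic signal sends a story to the queue."""
    if story not in review_queue and not story.disputed:
        review_queue.append(story)


def record_rating(story: Story, fact_checker: str, rated_false: bool) -> None:
    """Tag the story 'disputed' once enough distinct groups rate it false."""
    if rated_false:
        story.false_ratings.add(fact_checker)
    if len(story.false_ratings) >= DISPUTE_THRESHOLD:
        story.disputed = True


# Example: two of the partner groups rate the same story false.
s = Story(url="http://example.com/bogus-story")
enqueue(s)
record_rating(s, "PolitiFact", rated_false=True)
record_rating(s, "Snopes.com", rated_false=True)
assert s.disputed  # the story now carries the "disputed" tag
```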

The issue, Mantzarlis said, is that fact-checkers don’t know how stories are affected by being flagged — whether reactions to them change, if sharing goes up or down, or, more broadly, if fact-checks are capable of changing the mind of someone inclined to believe a false story in the first place. There is also the question of whether a disputed tag from the fact-checking groups — all mainstream media — could be a “badge of honor” for a certain strain of story.

Though it occupies a central role in public discourse, Facebook, as a private company, is under no obligation to disclose internal data. Su said that doing so could come too close to revealing information about users. “I think it’s hard to strike the balance. We all have the same objective, to prevent false news from reaching people on our platform,” said Su. “We want to be as transparent as we can be while also respecting the privacy of people on our platform.”

On Wednesday, news emerged that Facebook acknowledged to congressional investigators that it had sold ads during the 2016 campaign to a Russian company trying to influence voters. The acknowledgment came as part of the probe into Russian efforts to influence the election, and underscored the role of Facebook as a political messaging tool.

Eugene Kiely, the director of FactCheck.org, and Aaron Sharockman, the executive director of PolitiFact (which is owned by Poynter), both report largely positive working relationships with Facebook. Both say, though, that having more access to data would be helpful.

Some uneasiness appears to have crept into the larger relationship between Facebook and the fact-checking community. Sharockman described the mood as “tense” in July when representatives from Facebook and Google addressed a crowd of about 150 at the fourth Global Fact conference in Madrid.

The crowd wasn’t hostile, Sharockman said, but a theme emerged in questions from both the audience and Mantzarlis, who organized the conference and hosted the panel: Would Facebook and Google agree to share data with them?

Mantzarlis asked Áine Kerr, Facebook’s manager of journalism partnerships: “When will Facebook share what happens to the reach and engagement of stories that got flagged by fact-checkers?” And he asked Google’s Philippe Colombet, one of the company’s top news partnership officials: “When will Google share data on how users react to finding fact-checks in their search?”

During the crowd Q&A, others, including Sénécat, the Le Monde journalist, followed up with similar questions. In both cases, Kerr and Colombet acknowledged the fact-checkers’ concerns and said that they would relay them to their companies, but that they weren’t in positions to make any changes.

Sénécat does not believe the meeting in Madrid was tense, per se, but he said in an email that it “sure was kind of frustrating” because the Facebook and Google officials were “avoiding tough questions.”

“I think more than ‘tension,’ there was some kind of disappointment,” he said.

Mantzarlis described the discussion as “frank.”

One important way data from Facebook could help his team, Sharockman said, is with prioritizing which stories to fact-check. Currently, Facebook tells the fact-checking groups which stories in their queue are most popular on the platform, but it doesn’t give any information beyond a ranking.

“There’s 1,200, 1,500 stories that we could look at today, and we’re going to look at two,” Sharockman said. “Our process is pretty time intensive.” A single fact-check can take five hours to complete, he said. That means that choosing the right stories to examine is crucial.

Mantzarlis said he would want to see data on how the length of time it takes to flag a false story affects its spread. It’s not hard to imagine other questions: Do certain types of stories require more immediate attention than others? Are there other types that may not be viral yet but that the data suggests soon will be? When an article is flagged, how often do copycat versions with altered headlines pop up to replace it? Even knowing what types of headlines work best for fact-checking posts would be valuable, Mantzarlis said.

Sénécat added that it would be helpful to have access to a list of posts that have been reviewed by other third-party fact-checkers — something he said is currently not available.

Su noted that Facebook recently started listing fact-checks in the “related articles” stack beside stories on similar subjects that may be false. She added that the fact-checkers are just one of many ways that Facebook is attempting to curb false news. For instance, every fact-check that is completed, she said, is entered into Facebook’s machine learning model to help its software identify similar stories in the future.
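Su’s description suggests a standard supervised-learning loop: each completed fact-check becomes a labeled example that helps a model score new, unreviewed stories. Here is a minimal sketch of that general idea using scikit-learn; the data, features and model choice are assumptions made purely for illustration and say nothing about how Facebook’s system actually works.

```python
# Minimal sketch of the feedback loop Su describes: completed fact-checks become
# labeled training examples for a model that scores new stories for review.
# The data and model choices here are illustrative assumptions only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: headlines paired with fact-check verdicts.
headlines = [
    "Pope endorses candidate in shock reversal",    # rated false
    "Senate passes appropriations bill, 54-46",     # rated true
    "Miracle cure doctors don't want you to know",  # rated false
    "City council approves new transit budget",     # rated true
]
labels = [1, 0, 1, 0]  # 1 = rated false by the fact-checking partners

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(headlines, labels)

# Score a new, unreviewed story; a high score would flag it as resembling
# stories the fact-checkers have already rated false.
score = model.predict_proba(["Shock miracle cure endorsed by doctors"])[0][1]
print(f"resemblance to known false stories: {score:.2f}")
```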

Facebook said it has also tried to reduce the financial incentives for producers of false news.

Brendan Nyhan, a Dartmouth College political science professor who has studied fact-checking extensively, said he understands the private companies’ desire to keep raw data in-house, but that he would like them to share more of the “results of the experiments they have presumably done evaluating the effectiveness of different approaches to countering misinformation.”

“This is an important issue,” Nyhan said. “Facebook was a key vector of misinformation during the 2016 campaign, so the effectiveness of their response is of great public concern.”

Sharockman and Kiely both acknowledge that they are relatively early in their program with Facebook. And Mantzarlis is optimistic that, eventually, the company will open up.

“I’m hopeful that we’ll get there before the end of the year,” Mantzarlis said. “I think this is important to people within Facebook, so I think they will share information.”