The Signpost

News from the WMF

What Wikipedia saw during election week in the U.S., and what we’re doing next

Sign for early voting in California, 2008
Ryan Merkley is Chief of Staff at the Wikimedia Foundation. He originally published this report on December 17 on Wikimedia Foundation News. It is licensed CC BY-SA 3.0.

Election Day in the United States was a critical moment for the country, with impacts that will extend well beyond one election cycle. For many Americans, it was an anxiety-inducing event. While voters waited – and waited – for the results to come in, Wikipedia editors across the globe stood ready.

As one of the world's most trusted resources for knowledge, it's essential that Wikipedia provide its users with reliable information. In 2020, a high-stakes election and a deadly pandemic were just two of the many reasons that mission mattered more than ever.

That's why the Wikimedia Foundation took significant steps to protect Wikipedia from election-related disinformation. For the first time, a disinformation task force worked closely with Wikipedia's volunteer editors to identify potential information attacks targeting the integrity of the election before they could spread.

Wikipedia's biggest worry wasn't vandalism – insults or pranks directed at candidates, or biased campaign editing – as those kinds of changes are typically caught and reverted quickly. We were more concerned about activity that could disrupt the election itself: voter-suppression tactics targeting information about polling station locations, or edits to other topics that could undermine confidence in the facts.

In the end, Wikipedia dealt with only a small number of events relating to election influence activities; neither the Foundation's task force members nor Wikipedia's admins saw evidence of large-scale, state-sponsored disinformation.

  • Overall, Wikipedia protected about 2,000 election-related pages. Restrictions were put in place so that many of the most important election-related pages, such as the main article on the 2020 U.S. presidential election, could be edited only by the most trusted and experienced Wikipedia editors.
  • More than 56,000 volunteer editors monitored the protected pages via real-time feeds of pages they "watch" for new edits. Those editors were distributed across the globe. Someone was always vigilant, no matter the hour.
  • The Wikimedia Foundation's disinformation task force recorded and evaluated 18 events. As always, it worked closely with volunteers, who lead the process of editing and evaluation. All of the associated edits were quickly reverted by Wikipedia's community.
  • Nearly 800 edits to election-related Wikipedia pages were reverted by the community between November 3 and November 7.
  • The main U.S. Election article saw just 33 reversions during the same time frame — a testament to the community's preparedness and the defenses Wikipedia editors put in place.

Wikipedia's editorial standards played a major role in keeping the platform free of disinformation during the U.S. elections. Editors draw from accurate and verifiable sources – not the latest breaking news, or statements on social media. And they collaborate so that information on Wikipedia reflects multiple editors' areas of expertise.

For instance, the community kept a close eye on the Wikipedia entry for Benford's law, a statistical principle that was used to drive false allegations of voter fraud. Wikipedia's community of mathematicians coordinated with editors of political topics to make sure the Benford's law article wasn't misused to spread disinformation that would have undermined confidence in the election results.

This sort of interdisciplinary collaboration is possible because of Wikipedia's uniquely collective nature. Every reader sees the same latest version of an article, and anyone can investigate how a page has changed over time. That transparency and consistency make Wikipedia special – there are no personalized timelines or feeds here, and ads and algorithms don't influence what users see.

The U.S. elections may be over, but the work doesn't end here. In the coming weeks, our task force will conduct a deeper analysis with community editors to learn more about what worked well and what didn't, to inform practices for similar events in the future.

The solutions are not simple – they'll require an approach that considers the entire ecosystem of knowledge, from education to journalistic practice to platform response. We're committed to doing our part to protect the integrity of information on the Wikimedia projects, and to supporting communities everywhere who want to share in the sum of all knowledge.

To help meet this goal, we hope to invest in resources that we can share with Wikipedia communities around the world to help mitigate future disinformation risks on the sites. We're also looking to bring together administrators from different language Wikipedias for a global forum on disinformation. Together, we aim to build more tools to support our volunteer editors and to combat disinformation.

As always, convening and supporting the global Wikimedia movement will be at the heart of how we work. Together with editing communities, we'll be looking to develop and refine data-driven tools that support the identification of, and response to, disinformation.



Discuss this story

Can any of the editors who interacted with the WMF team give their views on what the process was like? WMF teams like this are much closer to content, which always sets antennae wiggling – this article is well written and not problematic, but it's always good to get more viewpoints. Nosebagbear (talk) 11:50, 28 December 2020 (UTC)[reply]

Similarly, does anyone have a rabbithole of links they can drop where such activities were coordinated? I'm finding the lack of links in the WMF statement a bit odd. — Bilorv (talk) 11:59, 28 December 2020 (UTC)[reply]
Ditto, I'm not aware of any such activities, but that's what this makes it sound like. I'd like to see statements like "Overall, Wikipedia protected about 2,000 election-related pages. Restrictions were put in place so that many of the most important election-related pages ..." made clearer. I know what it means (or at least I believe I do): at some point during the election process, editors requested protection / administrators protected 2,000 pages (or protected pages 2,000 times?) due to active disruption on those pages. But it could be read to mean that some sort of Powers That Be pre-emptively protected pages in service of WP's nefarious plans for world domination. —valereee (talk) 20:10, 28 December 2020 (UTC)[reply]
Sorry, are we not planning world domination then? I thought we were the liberal elite, poisoning people's minds with access to um, facts and statistics. And that American conservatives hated us because it was our propaganda that made Trump lose, ahem, win the 2020 election. — Bilorv (talk) 21:48, 28 December 2020 (UTC)[reply]
SHHH OMG BILORV... —valereee (talk) 13:36, 29 December 2020 (UTC)[reply]
This is the first I had heard of any such effort. And from the other comments here, I'd opine that this task force numbered far fewer than 56,000 volunteers. -- llywrch (talk) 23:24, 29 December 2020 (UTC)[reply]
I've been meaning to comment here for a while, but have been enjoying the conversation also. @Nosebagbear, Bilorv, and Valereee: I think it's important to realize that if we're protecting against, say, Russian, Chinese, or even Vatican City (just theoretical examples) intelligence forces trying to disrupt Wikipedia – then there are things that can't be completely transparent. That said, I do think that the WMF has been pretty open about what they are doing and trying to do. A couple of examples from *before* the election involve talking to (non-Signpost) reporters for these two stories (which we reported on in "In the media" last month).

Vox/Recode and CNET. From those, I concluded that the WMF worked with ArbCom and likely checkusers and other bureaucrats, though it didn't spell that out in any detail. Presumably WMF staffers, including the "security team", and perhaps some "outsiders" were also part of the team. How the other participants were chosen – I have no idea. And let me emphasize that this is my reading of off-wiki articles.

Somebody – who will not be mentioned – emailed me asking whether we factcheck our articles. Yes, we do, but there are different levels of factchecking. All articles – even opinion articles by people reporting on their own opinions – are read and have to pass the smell test. Statements presented as facts are checked if they don't look right, but not every statement of fact is tracked down in detail. If you see an article with footnotes or extensive wiki-links, we do check those (often we ask for this documentation on specific facts). Emails are saved so that we can send them to ArbCom if somebody says "I never said that" and takes us there. All in all, we're not The New Yorker, but I'm comfortable with our level of factchecking. If I'm not comfortable, we'll kill the story or just wait until I am.

Factchecking the WMF is a bit different, however. Much of what they state – e.g. the number of unique visitors in a month – they are the ultimate authority on. Like other publications, we'll accept their word as fact on these types of things. Other statements we can check out to some extent, and I believe we can do that better, as experienced Wikipedians, than even the large mainstream newspapers can. Other statements about the inner workings of the WMF we report on the basis that, while it might be their opinion, that will be obvious in context, and nobody else is likely to have better facts available. So like I said – there are different levels of factchecking.

Finally, to @Llywrch:'s question on the 56,000 volunteer editors monitoring the 2,000 election-related pages. That did catch my eye on first reading – it seems way too high – somebody would have reported something to us if 56,000 active Wikipedians had been asked to do this. My interpretation, after thinking a bit, is that they probably meant the total number of page watchers across those 2,000 articles, which works out to an average of 28 people per article having it on their watchlist. That's very possible, even a bit low. But it's not 56,000 individual editors actively watching the articles. I should have asked Ryan for clarification. I'll ping @RMerkley (WMF): to see if he has anything to add. Smallbones(smalltalk) 02:07, 30 December 2020 (UTC)[reply]

@Smallbones: it was less that I would expect the Signpost to be able to provide more details, and more that I would hope @RMerkley (WMF): can provide them – if the WMF believes that disinformation attempts are being made by state actors, that is important to know. Additionally, while we are happy enough to trust the WMF on, say, viewer numbers, on these content-related areas we either need to be able to verify for ourselves the extent of WMF involvement or know which groups, such as the en-wiki CUs, they are utilising. Hopefully it is not giving too much away to find out what form the changes are coming in – Ryan, is the team notifying purely of sock possibilities, etc., or is it things like "could you have a look at x paragraph, could be dubious/BLP", or is it "We've checked that, see source y, please remove it"? Nosebagbear (talk) 10:49, 30 December 2020 (UTC)[reply]
Regarding 56,000 editors, only around 40,000 editors make five or more edits to the English Wikipedia per month. [1] To get to 56,000 you would either have to include editors of other projects or include people who make between 1 and 4 edits per month. Clayoquot (talk | contribs) 01:28, 31 December 2020 (UTC)[reply]
If user X is watching US-politics-related article A, then they are probably also watching B, C, etc. So 28 watchers per page is only a lower bound. The number didn’t jump out at me until reading the comments above, but 56k unique watchers seems surprising given Clayoquot's note, even if you include inactive accounts. A better metric would be "editors who have the pages on their watchlist and who logged in during the election week".
Also, just because I have pages on my watchlist doesn't mean I regularly check my watchlist. I suspect many other editors are in the same boat. Short of doing some serious pageview logging (how many of the watchers logged in and viewed their watchlist), I can't think of a way to factor in that effect, though.
Pelagicmessages ) – (21:01 Fri 01, AEDT) 10:01, 1 January 2021 (UTC)[reply]
Add:— If it's across all languages, then the watchers-to-pages ratio is more feasible. There won't be many polyglots who watch US election pages in all of English, French, Russian, etc. The overlapping-interest effect would be partly countered by user language fragmentation. But then, could things written in languages other than English and Spanish influence voting in the USA? If the US election page is protected in Igbo, Telugu, and so on, should those be counted in the 2,000 protections total? Were they? Pelagicmessages ) – (21:24 Fri 01, AEDT) 10:24, 1 January 2021 (UTC)[reply]