
Good information, Andreas. This trend will continue because there is no limit to how many articles can be created, but there is a limit to the human resources available to manage those articles. Bots etc. can solve a lot. The hardest problem is the correct-looking but not actually correct fact, intentionally inserted. I think people know this, if only intuitively, which contributes to WP's reputation as unreliable. Also, many people who disparage Wikipedia are disgruntled ex-editors who may have been reverted and treated unkindly by those 3,000 core, overworked and surly editors - in part our problem is systemically self-inflicted. -- GreenC 15:32, 16 May 2015 (UTC)
I agree that the superficially plausible fact is the hardest hoax to identify. I do not agree that the number of articles is a problem: the number of hoaxes depends on the number and activity of hoaxers, which at the size Wikipedia has been for some years is more or less independent of further growth. Therefore the size of the problem is the size of the edit stream. Deeper inspection of edits (or more draconian restrictions on editing) is required to decrease hoaxes. All the best: Rich Farmbrough 12:40, 21 May 2015 (UTC).
Surely it's both the number of articles and the number of hoaxers, Rich. Imagine three people busy hiding Easter eggs in a field: it's your job to find them before a visitor accidentally steps on one. If your field is the size of your living room, with 10 visitors an hour, you can stay on top of things. But if it is the size of a football field, with bushes and hedges blocking your view, and 50 visitors an hour, you'll find you can't be everywhere. And of course the superficially plausible lie is not always a hoax: sometimes it is just an error, a misunderstanding or an unsuccessful paraphrase. Andreas JN466 01:28, 22 May 2015 (UTC)
Indeed, for pragmatic purposes the distinction between edits based on motivation is irrelevant.
But the analogy is largely false: we don't need to be everywhere, just in the recent changes. Furthermore, the growth in the number of articles in the mature project does not correspond to a growth in the number of bad edits, at least not in a linear way. Someone could perhaps run some stats? All the best: Rich Farmbrough 12:53, 22 May 2015 (UTC).
There is no need for a growth in bad edits, and none was stipulated: it's enough for bad edits to survive longer. You're right in theory: staying on top of recent changes would be enough. But that's all academic. As things are, even gross vandalism sometimes gets through recent changes. [1] Moreover, recent changes checking has never approximated anything resembling a rigorous check, including verification of sourcing, suitability of added content in article context, etc. It doesn't even approximate that in projects that have pending changes installed (though I believe pending changes cuts down on hoaxes, removing the instant gratification a hoaxer gets from seeing their change go live immediately; if installed on en:WP, it would free up time currently spent by RC patrollers on competing with ClueBot). With such holes in the first line of defence, the fact that hundreds of thousands of articles are not on any active contributor's watchlist (or literally not on anyone's watchlist) comes into play. Andreas JN466 18:10, 22 May 2015 (UTC)
Meaning, because Wikipedia is such a highly ranked and familiar website, there is an enormously varying level of experience with and knowledge of it among individuals, and an appreciation of how it has changed over time is probably apparent only to a small sliver of editors and readers. Liz Read! Talk! 20:17, 17 May 2015 (UTC)