The best-known bot that fights vandalism is ClueBot NG. The bot was created by Wikipedia users Christopher Breneman and Naomi Amethyst in 2010 (succeeding the original ClueBot, created in 2007; NG stands for Next Generation) [9] and uses machine learning and Bayesian statistics to determine whether an edit is vandalism.
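To illustrate the Bayesian idea in the simplest form, here is a toy naive-Bayes classifier over the text of an edit. This is a minimal sketch for illustration only; it is not ClueBot NG's actual implementation (which uses a neural network over many edit features), and all class labels and training strings below are made up.

```python
from collections import Counter
import math

def tokenize(text):
    return text.lower().split()

class NaiveBayesEditClassifier:
    """Toy naive-Bayes text classifier (illustrative, not ClueBot NG's code)."""

    def __init__(self):
        self.word_counts = {"vandalism": Counter(), "good": Counter()}
        self.class_counts = Counter()

    def train(self, text, label):
        self.class_counts[label] += 1
        self.word_counts[label].update(tokenize(text))

    def predict(self, text):
        # Vocabulary size, shared across classes, for Laplace smoothing.
        vocab = len(set(self.word_counts["vandalism"]) | set(self.word_counts["good"]))
        total_edits = sum(self.class_counts.values())
        scores = {}
        for label in ("vandalism", "good"):
            score = math.log(self.class_counts[label] / total_edits)  # log prior
            total = sum(self.word_counts[label].values())
            for w in tokenize(text):
                # +1 smoothing so unseen words don't zero out the probability.
                score += math.log((self.word_counts[label][w] + 1) / (total + vocab))
            scores[label] = score
        return max(scores, key=scores.get)

clf = NaiveBayesEditClassifier()
clf.train("lol u r stupid", "vandalism")
clf.train("poop poop poop lol", "vandalism")
clf.train("added reference to journal article", "good")
clf.train("fixed typo in infobox", "good")
print(clf.predict("this is stupid lol"))  # vandalism
```

A real anti-vandalism classifier would also score non-textual features (account age, revert history, size of the change), but the same Bayesian scoring of features against known good and bad edits applies.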
John Seigenthaler, an American journalist, was the subject of a defamatory Wikipedia hoax article in May 2005. The hoax raised questions about the reliability of Wikipedia and other websites with user-generated content. Since its launch in 2001, Wikipedia has faced several controversies. Its open-editing model, which allows any user to edit its encyclopedic pages, has led to ...
The positive outcome of more restrictive counter-vandalism measures is clear: less vandalism. The negative consequences can include a more hostile reception for new users who are experimenting or unfamiliar with how Wikipedia works, and discouragement for editors who find that their good-faith attempt at improving a protected page is ...
Making a bad edit followed by a good one: since only the most recent edit to an article shows up on some watchlists, the bad edit may go unnoticed. Making an edit after vandalism with "remove vandalism" in the edit summary, so others will believe the vandalism has been reverted. Using multiple accounts to hide vandalism.
Despite the community's attempts to deny them personal attention, Wikipedia's anti-vandalism process starts to act as a positive reinforcer, providing them with something to react against to keep their battle going. The vandal begins to see their vandalism activity as part of themselves, something precious, to be defended by any means possible.
The best-known bot, ClueBot NG, instead applies machine-learning algorithms to a dataset of known good and bad edits, but even these catch only about 40% of all vandalism (that we know about). And since these tools only patrol new edits, they cannot find vandalism that already exists.
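That 40% figure is a catch rate, i.e. recall: the fraction of known vandalism a tool actually flags. A minimal sketch of how such a rate can be estimated from a hand-labeled sample of edits (the keyword detector and the sample edits below are stand-ins, not any real tool's rule set or data):

```python
def catch_rate(detector, labeled_edits):
    """Fraction of known-vandalism edits the detector flags (recall)."""
    vandal_edits = [text for text, is_vandalism in labeled_edits if is_vandalism]
    caught = sum(1 for text in vandal_edits if detector(text))
    return caught / len(vandal_edits)

# Hypothetical hand-labeled sample: (edit text, is it vandalism?)
labeled_edits = [
    ("lol lol lol", True),
    ("JOHN IS DUMB", True),
    ("subtle date change from 1914 to 1915", True),   # sneaky vandalism
    ("added citation to journal article", False),
]

# Crude detector: profanity-free placeholder rules (keyword or all-caps).
detector = lambda text: "lol" in text.lower() or text.isupper()

print(catch_rate(detector, labeled_edits))  # catches 2 of 3 vandal edits
```

The sneaky date change slips past the crude rules, which is exactly why subtle vandalism dominates what automated tools miss.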
What happens to vandalism levels when edits do not show up in the current version of the article? A trial of something like stable versions, where the vandal cannot vandalize the actual article readers see, or something functionally similar, is needed. Perhaps a small subset (e.g. all articles in a certain category) could be tested.
Vandalism on Wikipedia is a widespread problem. A number of tools have been created to confront it, including editors who review recent changes to Wikipedia, automated programs (bots) that undo very obvious vandalism, an edit filter that flags and sometimes prohibits problem edits, a noticeboard for blocking vandalizing editors, and a multitude of software programs and scripts that ...