
Wikipedia bots can get into editing conflicts with each other.
Wikipedia may be the world’s most comprehensive online encyclopedia, featuring millions of articles that are difficult for humans to maintain alone. Bots were therefore developed to help with the task, but a new study shows that they occasionally start editing wars among themselves which can last for years.
Wikipedia bots are mostly used to undo vandalism, enforce bans, check spelling, create links, and import content. Beyond editing, however, the website also employs other types of bots, such as those that mine data and identify copyright infringements.
A new study published in the journal PLOS ONE by researchers from the Alan Turing Institute in the UK and the University of Oxford analyzed how Wikipedia bots interacted over a ten-year period, from 2001 to 2010, across 13 different language editions of the website.
The researchers concluded that while the bots were not specifically designed to interact with each other, they occasionally did, which led to unpredictable consequences. The study shows that bots resemble humans in how they behave across culturally different online environments. The researchers stressed that their findings should serve as a warning for tech companies developing AI for various tasks without considering the bots’ different cultural contexts and their diverse social lives.
Bots have become quite prevalent in the online ecosystem as a way to handle complex and time-consuming tasks. However, we still do not know enough about how they interact with each other. The researchers found that the Wikipedia bots on the German language edition had the fewest conflicts, re-editing each other’s posts around 24 times on average over ten years.
Wikipedia bots on the Portuguese edition conflicted around 185 times on average, while on the English version the bots undid each other’s work around 105 times, three times the rate of human editors, according to the researchers.
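These figures describe pairwise reverts: how often one bot undid another bot’s edit on the same article. As a rough illustration only, and not the study’s actual code, the following Python sketch counts such bot-on-bot reverts from a simplified edit log; the record fields and bot names are hypothetical assumptions.

```python
from collections import Counter

# Hypothetical simplified revision records. A revision "reverts" another
# if it restores the article to the state before that revision was made.
revisions = [
    {"id": 1, "article": "Pluto", "editor": "BotA", "reverts": None},
    {"id": 2, "article": "Pluto", "editor": "BotB", "reverts": 1},
    {"id": 3, "article": "Pluto", "editor": "BotA", "reverts": 2},
]

bots = {"BotA", "BotB"}  # accounts flagged as bots


def count_bot_reverts(revisions, bots):
    """Count how often one bot reverted another bot's revision."""
    by_id = {r["id"]: r for r in revisions}
    pair_counts = Counter()
    for r in revisions:
        target_id = r.get("reverts")
        if target_id is None:
            continue
        target = by_id.get(target_id)
        if target is None:
            continue
        # Only count reverts where both accounts are bots and they differ.
        if r["editor"] in bots and target["editor"] in bots and r["editor"] != target["editor"]:
            pair_counts[(r["editor"], target["editor"])] += 1
    return pair_counts


print(count_bot_reverts(revisions, bots))
# Counter({('BotB', 'BotA'): 1, ('BotA', 'BotB'): 1})
```

Aggregating such pairwise counts per language edition would give averages comparable to the ones reported above, though the study itself used Wikipedia’s full revision histories rather than a toy log like this.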
Wikipedia bots represent just 0.1 percent of the website’s editors, yet they are responsible for a large proportion of its edits, and their conflicts account for only a small share of their overall work. Still, those conflicts help scientists better understand the bots’ behavior, complexity, and unpredictability.
Image source: Wikipedia