
In Search of Spies on Wikipedia



Wikipedia is a natural target for governments that want to exert influence. Social networks are a public arena where many countries try to push their agendas. But Wikipedia stands a few degrees above them in the trust the global community has placed in it: it rests on a complex system of checks and balances, and dedicated groups of editors monitor and protect every change. Coordinated state interventions, if they occur, are complex undertakings.

These battles are not fought in a dark corner of the internet, but in one of its most visited corners. The more than 10,000 edits made by hundreds of editors to the English-language page "2022 Russian invasion of Ukraine" over a few months are the most of any article this year. That page is the prime example, but others related to the war, such as those on Vladimir Putin, Ukraine or Volodymyr Zelensky, have seen similar, if smaller, struggles. To put the English-language Wikipedia in context, in September alone its pages were visited 7.5 billion times, and the Spanish-language pages nearly a billion.

A new report on information warfare on Wikipedia, from the ISD (Institute for Strategic Dialogue) and CASM (Centre for the Analysis of Social Media), analyzes how state actors can infiltrate and modify the language of important pages. "Our work does not say empirically that Wikipedia is being targeted to any measurable degree," says Carl Miller, one of the authors and research director at CASM. "It is trying to unravel what we know about the threat. The document's broader point is more modest: that disinformation researchers and journalists are ignoring Wikipedia as a potential place where information operations are carried out," he adds.

Wikipedia's influence reaches beyond those who visit it. Its information feeds, for example, the responses of Siri or Google assistants. In September 2019, a battle between Chinese and Taiwanese editors led to different answers to the question "What is Taiwan?" At one point the assistant said: "A country in East Asia." At another: "A province of the People's Republic of China." In that case, a BBC investigation revealed 1,600 biased edits to 22 sensitive articles about China. "We cannot verify who made these modifications, why, and whether they reflect broader practice, but there are indications that they are not necessarily organic or random," it added. In that case, legitimate editors were harassed or pressured to leave.

Long battles

Thus, the main disinformation threat is not vandalism, nor the usual battles between committed editors with reasonable but divergent views. In Spanish, for example, there is a long-running dispute over the place names of some municipalities in Catalonia, Valencia and the Balearic Islands, explains Àlex Hinojo, an editor of the Catalan Wikipedia. "It was decided to keep the Francoist toponyms, which in some places have more tradition or connotations than in others: 'San Quirico' for 'Sant Quirze'. But this is something the community decided. It is the RAE and the INE that establish which geographical names are valid, and Spanish-speaking Wikipedians use them. It's a frequent argument, but I don't think there is a hidden hand behind it," he adds.

But these chronic battles have nothing to do with what worries the group of researchers Carl Miller is part of. "Our hunch is that the biggest threat is entryism: the long-term infiltration of the community by state-backed actors who build a reputation within Wikipedia and can then take advantage of the core policies and governance processes that protect it," Miller says.

The English-language Wikipedia is the target of the most sophisticated attacks, but that does not mean other languages are unaffected, says Santiago de Viana, an editor of the Spanish-language Wikipedia: "I am aware of suspicions and accusations of coordinated government efforts to modify content in Spanish. But it is quite another thing to be able to demonstrate it with reliable evidence or to impose sanctions for that reason," he says, adding: "During election periods, for example, it is common to see an increase in both vandalism and promotional edits about politicians, but we tend not to know who is behind these efforts."

This entryism is most sensitive when it comes to presenting, for example, the Kremlin's views on the invasion of Ukraine. Miller identifies four features: "First, the use of subtler language changes within the rules, such as adding the Kremlin's framing; second, coordinating votes so that fellow travelers become administrators; third, using administrator powers to resolve conflicts; and fourth, changing the actual ground rules that govern, for example, sources."

This is not a hypothetical or imagined example. Wikimedia, the umbrella organization of Wikipedia, has blocked and removed administrator rights from a group of editors in China for what it described as "infiltration of Wikimedia's systems, including positions with access to personally identifiable information and elected bodies of influence." In other words, the alleged activists or officials had worked their way directly into privileged positions within the community.

What happened with the invasion of Ukraine?

The specific case study in the report looks at 86 banned editor accounts that had edited the English-language "Russian invasion of Ukraine" page. The figures show how hard it is to detect the alleged coordinated activity of these accounts: over the years they collectively made 794,771 revisions across 332,990 pages, on prevailing topics ranging from Judaism, Poland, aviation and airports to Iraq, Libya and Syria.
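As a rough illustration of the kind of raw data such an analysis starts from (this is not the report's own code), the sketch below pulls one account's contribution history from the public MediaWiki API and tallies which pages it edited most. The username is a placeholder, not one of the 86 accounts studied.

```python
import requests
from collections import Counter

API = "https://en.wikipedia.org/w/api.php"

def fetch_contributions(username, max_edits=2000):
    """Yield contribution records for a user from the public MediaWiki API."""
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": username,
        "uclimit": 500,  # maximum page size for anonymous requests
        "ucprop": "title|timestamp|comment",
        "format": "json",
    }
    fetched = 0
    while fetched < max_edits:
        data = requests.get(API, params=params, timeout=30).json()
        for contrib in data["query"]["usercontribs"]:
            yield contrib
            fetched += 1
        if "continue" not in data:
            break
        params.update(data["continue"])  # follow the continuation token

# "ExampleAccount" is a placeholder username, used purely for illustration.
pages = Counter(c["title"] for c in fetch_contributions("ExampleAccount"))
print(pages.most_common(10))  # most-edited pages: a crude topical fingerprint
```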

Here is where the problems begin. What kinds of edits are likely to be biased? The report focuses on edits that use openly biased media as sources. "The team manually evaluated the edits containing these links and found that 16 of them were controversial and contained narratives consistent with Kremlin-sponsored information warfare," it explains.

But when they looked beyond the invasion page, they saw that this pattern of adding biased sources was more widespread: they found 2,421 examples across 667 pages, from every conceivable Russian conflict to the Formula 1 World Championship and the floods in Pakistan.
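A minimal sketch of what scanning for added links might look like, assuming a hypothetical watch list of domains (the report's actual pipeline and domain list are not reproduced here): it diffs consecutive revisions of a page and flags newly added external links whose domain is on the list.

```python
import re
import requests

API = "https://en.wikipedia.org/w/api.php"
# Hypothetical watch list of source domains; purely for illustration.
WATCHED_DOMAINS = {"example-state-media.ru", "another-outlet.example"}

LINK_RE = re.compile(r"https?://([^/\s\]|}]+)")

def fetch_revisions(title, limit=50):
    """Return the latest revisions of a page, oldest first, with their wikitext."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|user|timestamp|content",
        "rvslots": "main",
        "rvlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=30).json()
    page = next(iter(data["query"]["pages"].values()))
    return list(reversed(page.get("revisions", [])))  # API returns newest first

def flag_added_links(title):
    """Yield (revision id, user, domain) for revisions that add a watched domain."""
    revs = fetch_revisions(title)
    for older, newer in zip(revs, revs[1:]):
        old_links = set(LINK_RE.findall(older["slots"]["main"]["*"]))
        new_links = set(LINK_RE.findall(newer["slots"]["main"]["*"]))
        for domain in new_links - old_links:
            if any(domain.endswith(w) for w in WATCHED_DOMAINS):
                yield newer["revid"], newer["user"], domain

for revid, user, domain in flag_added_links("2022 Russian invasion of Ukraine"):
    print(f"revision {revid} by {user} added a link to {domain}")
```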

"This does not necessarily imply coordination or strategic intent, but it does highlight several areas of Wikipedia that could be investigated more closely," says the report, whose conclusion Miller echoes: Wikipedia is being neglected. "In a world where information warfare is more pervasive and complex, this worries me precisely because Wikipedia is so valuable."

Wikipedia has safeguards such as warnings and article protections that restrict which kinds of users can make edits, as well as blocks on both IP addresses and registered accounts. "The different language versions of Wikipedia have noticeboards where the community can report disruptive or suspicious behavior," says de Viana. "It is enough to copy the links to the changes made by the editor, explain why they violate the rules and notify the administrators ('librarians'), who will then make a decision."

This is where state operations' interest in controlling administrator positions comes in. Even so, it is not easy: "It is very difficult for an administrator to take a controversial action unnoticed," says Francesc Forte, an editor of the Catalan Wikipedia. "If you ban a random account, someone will complain. It's complicated."

You can follow EL PAÍS Tecnología on Facebook and Twitter, or sign up here to receive our weekly newsletter.


