Wikipedia vs. Scientologists 2.0
The Wikipedia/Scientology fiasco has finally gone mainstream: several articles in the mainstream press (and even a post on Gawker) attest to the growing public interest in the case. In the last few days, I’ve done a few media interviews on the subject and have had a time to reflect on Wikipedia’s decision since my first post (which, surprisingly, attracted quite a few comments). Here are some more observations on the matter:
First, in our digital age, when more and more PR companies are exploiting new media for the benefit of their clients, banning the computers of an institution such as the Church of Scientology is not going to be very effective. Given how much money the Scientology machine can spend on its image-shaping activities, I bet it can afford several new media companies that would do all the dirty work from a variety of IP addresses. Now that Wikipedia has raised the bar — i.e., everyone is going to watch the Scientology page with increased interest — there are additional benefits for the Scientologists in finding a way to establish control over that page. Their incentives have quadrupled. This, by itself, is very interesting: a decision to ban an offender generates enough media attention to make the banned activity more appealing to those banned. We should watch out for innovation in this space; Wikipedia is not immune to manipulation — and its administrators know this better than anybody.
Second, there are hundreds, if not thousands, of cases similar to Scientology. Wikipedia’s editing wars have been raging since the very beginning — and you can see an extensive list of the lamest of these cases here (yes, people have debated the gender of God, the ethnic roots of Jennifer Aniston, and even the occupation of Joe the Plumber). To me, the ban on Scientology (and on some of its detractors, who, I should point out, got banned as well) never made sense for one reason: it is only a temporary solution, which doesn’t address the broader issue of how to ensure that controversial articles are updated in a normal (i.e., timely and cost-effective) fashion even when they attract too many vandals. Look at that list of the lamest Wikipedia edit wars (which, I should also point out, is not at all comprehensive): it looks as though Scientology-type pages are the easiest to deal with. After all, it’s easy to trace the IP addresses from which the edits were made back to their office space and so forth.
However, I’d challenge Wikipedians to do the same with, say, people who think that Pluto should be listed as a planet (this has been the subject of an editing war as well). Short of pro-Pluto lobbying associations (do they even exist?), it would be impossible to track this loose group — and Wikipedians would need to go after the IP addresses of all the pro-Pluto contributors.
In my reading, this reveals the need for better tools to distinguish between legitimate and illegitimate edits; this has to be done — at least, on some very basic level — automatically, either based on the previous user edit history (say, by giving each user a score for how well they have fared in the past and then using this score to determine the quality of their edits) or by using some other proxy.
One such tool already exists. In fact, it has been around for almost two years, even though it’s not well known yet. This tool — available as an extension to the MediaWiki software that runs Wikipedia — is called WikiTrust, and it has been developed by researchers at the University of California, Santa Cruz.
According to the WikiTrust Web site, it computes the reputation of each author by analyzing the author’s contributions. Whenever an author makes a contribution that is preserved in subsequent edits, the author gains reputation; when an author makes a contribution that is quickly undone or reverted, the author loses reputation. The text of Wikipedia articles is then color-coded according to the “trust” score of its author. Anyone who reads the article thus gets automatic cues on whether what they are reading is trustworthy (you can check how it works here with a rather controversial subject – Reaganomics).
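The scoring idea behind WikiTrust can be sketched in a few lines of Python. To be clear, this is my own toy simplification, not the actual UCSC algorithm (which also tracks the provenance of individual words and weights reputation more subtly); the function names, thresholds, and color labels below are all illustrative assumptions:

```python
# Toy sketch of a WikiTrust-style reputation scheme (a deliberate
# simplification of the real algorithm): an author gains reputation when
# an edit survives subsequent revisions and loses it when the edit is
# quickly reverted. Articles can then be shaded by the author's score.

def update_reputation(reputation, author, edit_survived, weight=1.0):
    """Return a new reputation dict after observing the fate of one edit."""
    new_rep = dict(reputation)
    current = new_rep.get(author, 0.0)
    if edit_survived:
        new_rep[author] = current + weight            # preserved edit: gain
    else:
        new_rep[author] = max(0.0, current - weight)  # reverted edit: loss, floored at zero
    return new_rep

def trust_color(score):
    """Map a reputation score to a display shade (thresholds are invented)."""
    if score >= 5:
        return "white"         # well-established contributor
    elif score >= 2:
        return "light-orange"  # somewhat trusted
    return "orange"            # unproven text, flagged for the reader

# Example: one author's edits keep surviving, another's keep getting reverted.
rep = {}
for survived in [True, True, True, True, True, True]:
    rep = update_reputation(rep, "veteran", survived)
for survived in [False, False]:
    rep = update_reputation(rep, "vandal", survived)
```

The attraction of this design is that verification becomes a background signal rather than a gatekeeping step: nothing is blocked from publication, but low-reputation text is visibly marked until it has survived scrutiny.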
I think that this is a very elegant solution, which passes down the costs of verification to the reader. Of course, without ANY editorial standards, most Wikipedia entries would become chaotic; this is not what I am advocating. Instead, we should think of new ways in which these tools can be incorporated into Wikipedia and similar projects, for it would not only unburden the editors, it would also empower the readers (not to mention that it can help increase new media literacy).
This, by the way, is not that different from Wikipedia’s own proposal to roll out “flagged revisions,” whereby content contributed by newbies would first need to be approved by senior editors before unregistered users could see it on the site. “Flagged revisions” is not a bad idea in principle, but it may not scale: imagine how much new content any senior editor might need to greenlight on a daily basis. Given that all of them are unpaid, it’s hard to expect them to do a decent job of it.
Personally, I’d rather have some combination of color-coding and “flagged revisions” help Wikipedia automate its revision/editing process than continue observing outright bans on groups like the Scientologists. My reasoning here is quite simple: Wikipedia is too important a project to have its editors waste their time on silly wars; there are so many other things that need to be fixed about Wikipedia and its content (making it easier for “normal” people to edit it, for example, or resolving the unproductive split between deletionists and inclusionists). As Wikipedia is still a voluntary project and its top contributors and admins work for free, I’d rather have them be a little bit more focused on how they prioritize their time and resources.
Thus, I’d rather have them spend this time editing and approving new articles than getting bogged down in juvenile arguments over the proper way to refer to Pluto (which is an important issue, but NOT so important that it deserves hours of debate or a trawl through the past arguments to figure out who’s right). Having these people waste their time battling Scientologists and their enemies (and thousands of other disgruntled contributors with their own agendas) does not seem like a very good idea to me.
Unfortunately, with this decision to ban the Scientologists, Wikipedia’s admins will only make their lives more difficult (read: they will waste more time going through very obscure arguments). I am extremely skeptical that the Scientologists are ever going to give up — after all, Wikipedia is dealing with a rather fanatical cult.
Photo by mrmanck/Flickr