Last Wednesday, the Syracuse Post-Standard published an article lambasting the authority of Wikipedia because it is user-edited and anyone can make a change to its content (Librarian: Don’t use Wikipedia as source).
Techdirt took the article to task for misunderstanding how Wikipedia works (Misunderstanding Wikipedia).
There’s just something that seems to freak people out about Wikipedia, when they can’t fathom the idea that “the masses” could produce something of value by simply being able to correct each other, allowing them to build something much more beneficial and much more useful than an expensive encyclopedia edited by just a few people. The columnist ends his piece by stating: “you need to be careful about trusting what you read,” while taking this email from a random librarian completely at face value.
Techdirt then contacted the author of the offending newspaper article with more information about how projects like Wikipedia work and why they can be authoritative. However, that exercise apparently collapsed into sheer invective on the part of the newspaper writer. This seems odd since the author of the original piece, Al Fasoldt, is a long-time tech reporter. In any case, see Techdirt’s version of the exchange (Who Do You Trust, The Wiki Or The Reporter?).
Joi Ito has chimed in with one explanation of Wikipedia’s resilience (Wikipedia attacked by ignorant reporter):
The fact that anyone can edit the pages appears to be why people like Mr. Fasoldt question its authority, but that is the exact reason that it has authority. Any comments that are extreme or not true just do not survive on Wikipedia. In fact, on very heated topics, you can see the back and forth negotiation of wordings by people with different views on a topic until, in many cases, a neutral and mutually agreeable wording is put in place and all parties are satisfied. Traditional authority is gained through a combination of talent, hard work and politics. Wikipedia and many open source projects gain their authority through the collective scrutiny of thousands of people. Although it depends a bit on the field, the question is whether something is more likely to be true coming from a source whose resume sounds authoritative or a source that has been viewed by hundreds of thousands of people (with the ability to comment) and has survived.
Speaking of which, Techdirt challenged Fasoldt to insert some factual errors into Wikipedia and see how long the untruths would survive. Fasoldt has not taken the challenge, but Alex Halavais, an Assistant Professor of Communication and the Director of the Masters in Informatics program within the School of Informatics at the University at Buffalo, has (The Isuzu Experiment):
No matter which side of the debate you find yourself on, this sounds like an interesting experiment. So, I have made not one, but 13 changes to the wikipedia site. I will leave them there for a bit (probably two weeks) to see how quickly they get cleaned up. I’ll report the results here, and repair any damage I’ve done after the period is complete. My hypothesis is that most of the errors will remain intact.
Nope. According to Halavais, “all [the changes] were identified and removed within a couple of hours. I could have been a bit trickier in how I made changes; nonetheless, I am impressed.” There are also some great comments on the ethics of the experiment, along with suggestions for future experiments.
One place this debate has been discussed with great insight is Corante’s own Many 2 Many, which also provides a wealth of linkage (Wikipedia Reputation and the Wemedia Project). In addition to the insight, there is an announcement of a cool new project for journalism schools and media centers:
Which brings me to a lingering thought — that explicitly codifying reputation introduces a cost which can constrain commons-based peer production. Wikipedia was never supposed to work, yet somehow does because of good club theory and transaction costs, and has gained a reputation as a resource. Introducing reputation for contributors or articles is the greatest risk to the Wikipedia community. Getting a base study on factual accuracy can help inform this decision as well as educate the public on how to use and participate with this commons resource.
I’ve been quietly forming a group of journalism schools, media centers and experts to engage in the Wemedia Project, which begins with a formal Wikipedia Article fact checking exercise and publishing findings. The USC Annenberg Center has already announced their support and next month we will begin the collaborative research process within a Socialtext Workspace. Without getting into defining truth, you can separate issues of fact, value or policy. The approach is to apply a formal fact checking process to a sample of articles to gain a baseline measure of factual accuracy and explore issues of reputation. [links in original]
Read the whole thing.
Teleread also has some interesting thoughts on the issue, though reputational changes are going to be tough ones to figure out (Wikipedia vs. bashers).
One aspect of this that is interesting to me is the distinction between the authority of a relatively anonymous collective in contrast to the authority of named bloggers. For example, Dana Blankenhorn argues that transparency is a key element to the authority of bloggers (Transparency Makes Blogs Believable):
This transparent relationship is at the heart of blogging credibility. J.D. Lasica tried to explain this to the “media industry” in a recent OJR piece…
- Transparency of motives
- Transparency of process
- Transparency of expertise
- Transparency on mistakes

are all keys to success, he writes.
Absolutely. Transparency is also critical in Wikipedia, but the emphasis is different. Process and mistakes (I would call it “corrections”) are emphasized, rather than motives or expertise.
Finally, Mary Hodder, who is now working like a demon for Technorati, has an intriguing post that unintentionally ties these two concepts (blog authority and Wikipedia authority) together (Digital Ethics II.. and the New Commodity In Online Media). Her post takes up a debate about digital ethics, which is worth following as well.
Fascinating reading, all of it.