There was another point of danah’s I wanted to respond to, but yesterday’s post had gotten quite long enough, and in any case, I had a slightly different take in mind for this, including a LazyWeb request at the end, relating to this image:
danah says “Wikipedia appears to be a legitimate authority on a vast array of topics for which only one individual has contributed material. This is not the utopian collection of mass intelligence that Clay values.” This misconstrues a dynamic system as a static one. The appropriate phrase is “…for which only one individual has contributed material so far.”
Wikipedia is not a product, it is a system. The collection of mass intelligence that I value unfolds over time, necessarily. Like democracy, it is messier than planned systems at any given point in time, but it is not just self-healing, it is self-improving. Any given version of Britannica gets worse over time, as it gets stale. The Wikipedia, by contrast, whose version is always the Wiki Now, gets better over time as it gets refreshed. This improvement is not monotonic, but it is steady.
Picking up on yesterday’s theme of authority, the authority of, say, Coleridge’s encyclopedia was the original one: authority derived from the identity of the author. This is like trusting Mom’s Diner, or the neighborhood tailor — personal reputation is worth preserving, and helps assure quality.
The authority of Britannica, by contrast, is the authority of a commercial brand. Their sales are intimately tied into their reputation for quality, so we trust them to maintain those standards, in order to preserve an income stream. This is like trusting Levis or McDonald’s — you don’t know the individuals who made your jeans or your french fries, but the commercial incentive the company has in preserving its brand makes the level of quality predictable and stable.
So, is Wikipedia authoritative? No, or at least not yet, because it has neither the authority of the individual merchant nor that of the commercial brand. However, it does have something that neither mechanism offers, which is a kind of market, where the investment is time and effort rather than dollars and cents. This is like the eBay model, where people you don’t know (no Mom’s Diner effect) can sell unbranded things like art or one-of-a-kind clothes (no Levis effect). eBay can do this because the syndication of user attention and the possibility of recourse for bad behavior keep people generally honest.
Now when eBay launched, people were skeptical, because the site wasn’t trustworthy. The curious thing about trust, though, is that it is a social fact, a fact that is only true when people think it is true. Social facts are real facts, and have considerable weight in the world. The fact that someone is a judge, for example, is a social fact — the authority that attaches to judgeship is attached by everyone agreeing that a certain person has the right to make certain statements — “Court is adjourned”, “I sentence you to 5 years in prison” — that have real force in the world. Those statements are not magic; their force comes from the social apparatus backing them up.
eBay has become trustworthy over time because the social fact of its trustworthiness grew with the number of successful transactions and with its ability to find and deal with bad actors. Indeed, the roughest periods in eBay’s short life have been when it has seemed in danger of being a platform for fraud.
Like trustworthiness, authority is a social fact, though authorities often want to obscure this. Someone with a PhD is an authority figure because we all agree that the work that goes into getting a doctorate (itself a social fact) is a legitimate source of authority.
So, under what conditions might the Wikipedia become a kind of authority, based on something other than authorship or brand? And the answer to that question, I think, is when enough people regard it as trustworthy, where the trust is derived from the fact that many eyes have viewed a particular article.
And here danah points to something interesting — she believes, and I believe with her, that a Wikipedia page created by a single user isn’t as valuable as a page that has been edited by many users. Seeing this, I wrote a little script that fetches a Wikipedia page and grabs 4 relevant facts from that page’s history: the number of edits, the number of editors, and the first and most recent edit dates. It then puts that info just below the page title, like so.
It’s damn slow, and should in any case be either a bookmarklet (O LazyWeb, hear my plea…) or a setting on the Wikipedia itself, but here are some others: Vinyl [Edited 25 times by 18 users, between 20 May 2001 and 3 Jan 2005], England [Edited more than 500 times, the most recent 500 of them by 209 users, between 2 Jun 2004 and 6 Jan 2005], Orca [Edited 319 times by 155 users, between 17 Oct 2001 and 6 Jan 2005], and so on.
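For anyone who wants to tinker, here is a rough sketch of the kind of script involved. It is not the one I ran; it pulls the history through the MediaWiki query API rather than scraping the history page, and the history_summary function, the 500-revision limit, and the output formatting are illustrative choices rather than anything Wikipedia prescribes.

```python
# A minimal sketch (not the original script): summarize a Wikipedia page's
# edit history in the "[Edited N times by M users, between X and Y]" style,
# using the MediaWiki query API rather than scraping the history page.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"
LIMIT = 500  # the API returns at most 500 revisions per request for ordinary clients

def history_summary(title):
    """Return (edit count, editor count, first and last edit timestamps) for a page."""
    params = urllib.parse.urlencode({
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user",
        "rvlimit": LIMIT,
        "format": "json",
    })
    with urllib.request.urlopen(f"{API}?{params}") as resp:
        data = json.load(resp)

    page = next(iter(data["query"]["pages"].values()))
    revs = page.get("revisions", [])
    editors = {r.get("user", "(hidden)") for r in revs}   # some usernames are hidden
    stamps = sorted(r["timestamp"] for r in revs)
    return len(revs), len(editors), stamps[0], stamps[-1]

if __name__ == "__main__":
    edits, editors, first, last = history_summary("Orca")
    count = f"more than {edits}" if edits >= LIMIT else str(edits)
    print(f"[Edited {count} times by {editors} users, between {first} and {last}]")
```

Pointed at a heavily edited page like England, the per-request cap is what produces a summary like “more than 500 times”; an exact count would mean paging through the full history, which is left out here for brevity.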
You can always get to any given page’s history on the Wikipedia, of course, but it is a click away from the content page, and is so detail-filled that it is essentially inside baseball — it is mostly for other editors, not for readers. But the facts danah wants are there, under the surface: how many people have worked on this page? 1? 50? 200? So are the facts I’m interested in: how many times has it been edited? (More is better, and a low ratio of contributors to edits suggests a serious conversation went into the page’s creation.) How long has it been around for review? (Older is better.) How recently was it updated? (How live is the work on the page?)
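To make those three signals concrete, here is one way they might be computed from the numbers the sketch above returns; the field names, the ratio, and the date handling are illustrative, not a fixed formula.

```python
# Turn the raw history facts into the three signals discussed above.
from datetime import datetime, timezone

def trust_signals(edits, editors, first, last):
    fmt = "%Y-%m-%dT%H:%M:%SZ"   # MediaWiki timestamps look like 2005-01-06T12:34:56Z
    first_dt = datetime.strptime(first, fmt).replace(tzinfo=timezone.utc)
    last_dt = datetime.strptime(last, fmt).replace(tzinfo=timezone.utc)
    now = datetime.now(timezone.utc)
    return {
        "editors_per_edit": editors / edits,           # lower suggests sustained back-and-forth
        "days_under_review": (now - first_dt).days,    # older is better
        "days_since_last_edit": (now - last_dt).days,  # how live is the work on the page?
    }
```

Exactly how to weigh these is an open question; the point is only that the raw material is already sitting in the page history.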
I am guessing that this kind of dashboard info, embedded in any given content page, would have two good effects — first, it would give users a trust profile per article, and second, it would serve notice to readers about how often Wikipedia articles are edited — the article on England was edited over 500 times in the last 6 months alone. This would help viewers see that they are, at any given moment, seeing a static snapshot of a dynamic system, rather than a finished product.
And the more macro point is that Wikipedia is still in the early days of experimenting with models of governance, editing, or, as here, presentation to the users. That’s one of the remarkable things about the fluidity of web technologies — new things can be tried, modified, and retained or thrown out easily and continuously (it took me two dozen lines of Python to add the header I wanted, and I am a lousy programmer), giving Wikipedia a lot more headroom for earning the users’ trust.
And once that social fact is established, authority, albeit of a more diffuse and systems-oriented sort, won’t be far behind.