But how do you know if it's accurate?
Alright, this is the most awesome thing I have seen in a very long time.
People have been talking for years (at least, I have) about color-coding words in Wikipedia articles based on how long they've survived unchanged. You could see at a glance what had lasted hundreds of revisions, and what had just been added.
Other people have been talking for years about some sort of reputation system -- you could vote on whether an article was reliable, or whether a user was. There are all sorts of problems with this kind of thinking, but I won't get into them because they've just been made moot:

*We present a content-driven reputation system for Wikipedia authors. In our system, authors gain reputation when the edits they perform to Wikipedia articles are preserved by subsequent authors, and they lose reputation when their edits are rolled back or undone in short order. Thus, author reputation is computed solely on the basis of content evolution; user-to-user comments or ratings are not used. The author reputation we compute could be used to flag new contributions from low-reputation authors, or it could be used to allow only authors with high reputation to contribute to controversial or critical pages. A reputation system for the Wikipedia could also provide an incentive for high-quality contributions.*

*We have implemented the proposed system, and we have used it to analyze the entire Italian and French Wikipedias, consisting of a total of 691,551 pages and 5,587,523 revisions. Our results show that our notion of reputation has good predictive value: changes performed by low-reputation authors have a significantly larger than average probability of having poor quality, as judged by human observers, and of being later undone, as measured by our algorithms.*
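The core idea is simple enough to sketch. Here's a toy version (my own drastic simplification, not the paper's actual algorithm -- the function name, the `horizon` window, and the gain/penalty values are all made up for illustration): an author's reputation goes up when words they added survive the next few revisions, and down when their edit is reverted in short order.

```python
from collections import defaultdict

def update_reputations(revisions, horizon=3, gain=1.0, penalty=2.0):
    """Toy content-driven reputation.

    revisions: list of (author, text) pairs, oldest first.
    Returns {author: reputation}. Parameters are hypothetical.
    """
    rep = defaultdict(float)
    for i, (author, text) in enumerate(revisions):
        if i == 0:
            continue  # nothing to compare the first revision against
        added = set(text.split()) - set(revisions[i - 1][1].split())
        future = revisions[i + 1 : i + 1 + horizon]
        if not future:
            continue  # too recent to judge
        if added and all(added <= set(t.split()) for _, t in future):
            rep[author] += gain      # added text preserved: reputation up
        elif any(t == revisions[i - 1][1] for _, t in future):
            rep[author] -= penalty   # edit rolled back: reputation down
    return dict(rep)
```

The point of doing it this way is exactly what the abstract stresses: no one votes on anyone. Reputation falls out of what the article's text itself does over time, which is much harder to game with sock puppets than a user-rating system.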
And I haven't even gotten to the good part.
The same people developed a color-coding system based on their new trust metric. Text contributed by authors with a high content-driven reputation looks normal (black on white); text contributed by authors with a low reputation has an orange background (of varying shades).
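You could render that scheme with something like the following (a hypothetical mapping of my own -- the actual shades they use may differ): a trust score of 1.0 gives a white background, and lower scores fade toward full orange.

```python
def trust_color(trust):
    """Map a trust score in [0, 1] to a CSS background color:
    1.0 -> white, 0.0 -> orange (#ffa500). Hypothetical interpolation."""
    t = max(0.0, min(1.0, trust))        # clamp to [0, 1]
    g = int(165 + (255 - 165) * t)       # green channel: 165 -> 255
    b = int(255 * t)                     # blue channel: 0 -> 255
    return f"#ff{g:02x}{b:02x}"
```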
Be sure to click "random page" a few times and page through the article histories.
(Damn. Now I want to go to Wikimania. Anyone want to buy me a ticket to Taipei?)
A note to the UCSC Wiki Team, who created this: even if your proposal doesn't get implemented on the Wikipedia servers (because of community opposition, or lack of resources), you can still implement it yourself (on Wikipedia content, yes) via Greasemonkey, or a Firefox extension, or a web-based mashup, or whatever.
* Update: Well, not completely moot. Any automated reputation system can be gamed, and here are a few cantankerous side effects this one might have. That's one reason it may be a better idea to roll it out as an add-on than on the Wikipedia servers themselves.