Wikipedia is a free, multilingual, open-content encyclopedia project operated by the non-profit Wikimedia Foundation. Its name is a blend of the words wiki (a technology for creating collaborative websites) and encyclopedia. Launched in 2001 by Jimmy Wales and Larry Sanger, it is the largest, fastest-growing, and most popular general reference work currently available on the Internet. It is by far the largest encyclopedia in history, twenty times larger in its volume of words than its closest counterpart, Encyclopædia Britannica.
As of April 2008, Wikipedia attracts 683 million visitors annually, reading over 10 million articles in 253 languages and comprising a combined total of over 1.74 billion words across all Wikipedias. The English Wikipedia edition passed the 2,000,000-article mark on September 9, 2007, and as of May 28, 2008, it had over 2,390,000 articles consisting of over 1,034,000,000 words. The remarkable success of Wikipedia and its implications for the future of knowledge exchange have prompted many to critique and question its model, its reliability, and its future.
The debate regarding Wikipedia’s reliability and social utility centers on the following questions. Is Wikipedia reliable as a reference work? Is it as reliable as Britannica? Is it better than Britannica? Is Wikipedia’s openness to editing by anybody around the world a source of strength, or a vulnerability to non-experts, misinformation, and vandalism? Is its overall model of consensus over credentials valuable, or susceptible to a kind of tyranny of the majority? Do these policies cause Wikipedia to suffer from a systemic bias that gives too much priority and attention to popular but insignificant topics? Is Wikipedia’s policy of Neutral Point of View (NPOV) a strength or a problem?
“For its study, Nature chose articles from both sites in a wide range of topics and sent them to what it called “relevant” field experts for peer review. The experts then compared the competing articles–one from each site on a given topic–side by side, but were not told which article came from which site. Nature got back 42 usable reviews from its field of experts.
In the end, the journal found just eight serious errors, such as general misunderstandings of vital concepts, in the articles. Of those, four came from each site. They did, however, discover a series of factual errors, omissions or misleading statements. All told, Wikipedia had 162 such problems, while Britannica had 123.”
Traditional encyclopedias are based on the reputation of certain authors. These authors, though small in number, are highly interested and ostensibly qualified to find good sources for their information, and are therefore expected to produce good quality articles – however, they are not immune to human error.
Articles are not strictly limited in size as they are in paper encyclopedias. Also, the fact that Wikipedia does not use paper is environmentally friendly, as compared to Britannica’s substantial reliance on paper.
The only recourse on Britannica is to write to an editor, and the errors may be corrected in print in a few years, as opposed to minutes in Wikipedia. Because Wikipedia runs on the internet, any changes are visible to readers almost immediately. Also, using the versioning system, users can determine when and what changes were made to a specific article.
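The versioning idea can be illustrated with a minimal, hypothetical sketch (this is not MediaWiki's actual implementation; the `Page` class and its methods are invented for illustration): every edit appends a revision, and a diff between any two revisions shows exactly what changed and when.

```python
import difflib
from datetime import datetime, timezone

class Page:
    """Hypothetical wiki page that keeps every revision, so readers can
    see when an article changed, who changed it, and exactly what changed."""

    def __init__(self, title):
        self.title = title
        self.revisions = []  # one (timestamp, editor, text) tuple per edit

    def edit(self, editor, text):
        # Every edit is stored, never overwritten in place.
        self.revisions.append((datetime.now(timezone.utc), editor, text))

    def diff(self, old, new):
        """Unified diff of the text between two revision indices,
        with the two file-header lines stripped off."""
        a = self.revisions[old][2].splitlines()
        b = self.revisions[new][2].splitlines()
        return list(difflib.unified_diff(a, b, lineterm=""))[2:]

page = Page("Encyclopedia")
page.edit("alice", "An encyclopedia is a reference work.")
page.edit("bob", "An encyclopedia is a comprehensive reference work.")
changes = page.diff(0, 1)
# changes[-2:] are the removed and added lines of the second edit
```

Wikipedia's real history pages work on the same principle at scale: any reader can compare two revisions and attribute each change to a specific editor and timestamp.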
What’s more, Wikipedia has no ads!
It’s important to realize that many of the same criticisms leveled against Wikipedia can be leveled against Britannica.
Cory Doctorow– “if you want to really navigate the truth via Wikipedia, you have to dig into those “history” and “discuss” pages hanging off of every entry. That’s where the real action is, the tidily organized palimpsest of the flamewar that lurks beneath any definition of “truth.”…The Britannica tells you what dead white men agreed upon, Wikipedia tells you what live Internet users are fighting over.”
While traditional encyclopedias might be revised annually, current affairs articles, as well as older articles being edited, are updated thousands of times an hour. That’s a big deal if your interest is in current affairs, recent science, pop culture, or any other field that changes rapidly.
“…Nature sent only misleading fragments of some Britannica articles to the reviewers, sent extracts of the children’s version and Britannica’s ‘book of the year’ to others, and in one case, simply stitched together bits from different articles and inserted its own material, passing it off as a single Britannica entry.” Encyclopædia Britannica argued that the Nature study showed that while the error rate between the two encyclopedias was similar, a breakdown of the errors indicated that the mistakes in Wikipedia were more often the inclusion of incorrect facts, while the mistakes in Britannica were “errors of omission”.
Andrew Orlowski, “Wikipedia science 31% more cronky than Britannica’s”, The Register (16 Dec. 2005) – “Science journal Nature chose 42 science articles from both Encyclopedia Britanica [sic] and gave peer reviewers a blind test to find mistakes. That gave the free-for-all web site a fighting chance — as it excluded the rambling garbage and self-indulgence that constitute much of the wannabe “encyclopedia” social science and culture entries. […] Britannica turned up 123 “errors”, and Wikipedia 162. In other words, the quality of information coming from Wikipedia was 31 per cent worse. […] Who could possibly hail this as good news? Two camps, we think. People with a real chip on their shoulder about authority, as we saw earlier this week. People with a contempt for learning, many of you say. But more broadly, only someone more obsessed by process than by the end result can regard this as any kind of victory — something all the popular press missed in their anxiety to give us an upbeat, good news story from Planet Wikipedia yesterday.”
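As a quick sanity check of the Register's arithmetic, the quoted error counts can be compared directly (a trivial sketch; the "31 per cent" figure is Wikipedia's excess errors relative to Britannica's total):

```python
# Error counts from the Nature study as quoted above:
wikipedia_errors = 162
britannica_errors = 123

# Wikipedia's excess errors as a fraction of Britannica's total.
excess_ratio = (wikipedia_errors - britannica_errors) / britannica_errors
percent_worse = excess_ratio * 100  # ≈ 31.7 per cent more errors
```

The counts give roughly 31.7 per cent more errors, consistent with the Register's rounded "31 per cent" figure.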
Stupid articles. Wikipedia has a large number of articles which could be considered rather irrelevant for something billing itself as an encyclopedia, such as teh (a misspelling of the word the), gas mask fetishism (just one of many of Wikipedia’s articles on obscure sexual fetishes), list of films that most frequently use the word “fuck”, goatse (an Internet shock site), Toilets in Japan and The Flowers of Romance (band) (a band that never played live or recorded any material).
The public perception surrounding Wikipedia is unlikely to change. Any resource that enables anyone to edit cannot be fully trusted.
Among Wikipedia’s nearly 3 million articles, such examples are bound to exist given the sheer size of the collection. On the other hand, the few comparative studies done so far have found the average factual accuracy of Wikipedia to be similar to, or sometimes even higher than, that of traditional encyclopedias.
Disclaimers are not unique to Wikipedia: Britannica, Encarta and Bartleby carry them too (see Wikipedia:Non-Wikipedia disclaimers, which also includes examples from reputed news organisations). Sometimes the staff of those encyclopedias seem to forget about the disclaimers.
An amateurish article to be improved later is better than nothing.
It is a myth that other encyclopedias’ articles are written by experts in the field of the article. Rather, articles are written by professional writers and journalists who are non-specialists in the fields about which they write. Typically, one of these writers is given the task of writing a couple dozen articles on topics in which they have no expert knowledge.
Human beings are not static creatures that can be labeled, for their lifetime, experts or amateurs. Amateurs who enter the Wikipedia project often develop expertise in a particular area of interest and become, over time, expert researchers and editors. Any complaint about an amateur on Wikipedia today may prove invalid over time.
Tim Berners-Lee, the inventor of the World Wide Web, repeatedly mentions in his book “Weaving the Web” that the web has grown into a medium that is much easier to read than to edit. He envisaged the web as a much more collaborative medium than it currently is, and thought that the browser should also function as an editor. Wiki-based sites are closer to his vision. In fact, the first web browser was also a web editor.
Wikipedia is open to anyone (with the exception of repeat vandals) with some time on their hands. Moreover, there are some experts at work here. Over time, the huge amount of solid work done by hobbyists and dilettantes can (and no doubt will) be hugely improved upon by experts. This both makes Wikipedia a pleasant intellectual community (or so it seems to some) and gives us some confidence that the quality of Wikipedia articles will, in time, if not yet, be high.
Because the highly intelligent editors on Wikipedia come from all over the world, Wikipedia can give the reader a genuine “world view”.
Many critics of Wikipedia insist articles are written by amateurs and are not reliable, but in fact many contributors on specific matters are professionals or have firsthand knowledge on the subjects they write about. Wikipedia’s mathematics section, for example, benefits greatly from the dedication of several mathematicians who are very active on Wikipedia.
Where else can you get lovely articles on such-and-such town or so-and-so bizarre hobby written by actual residents/practitioners?
Is restricting who writes about what the best way to reach and maintain high standards? Perhaps a more open way is better. Wikipedia is a good test of that proposition.
Anyone, anywhere, at any time can edit Wikipedia. Because of this, not all information on wiki resources is reliable. Not all information is accurate, and “weasel editors” deliberately ruin pages and change information to make it false. While much wiki information is reliable, there will always be false and unreliable information. Millions of people around the world edit Wikipedia and other wiki resources, and not all of them can or will give accurate, reliable information.
Wikipedia has a very community-based ethic, which gives little credit to “experts”; in fact, Wikipedians often despise them. This discourages experts and scholars from editing the site, which undermines both the quality of articles and the credibility the public affords the site.
Good quality requires peer review and expertise. Why should we care about articles written by an arbitrary group of people whose knowledge and ability could range from expertise to hopeless ignorance? Ignorance mixed with knowledge does not benefit knowledge.
In a related problem, large articles constructed via numerous (individually reasonable) edits to a small article can look okay “close up”, but are often horribly unstructured, bloated, excessively “factoid”, uncohesive and self-indulgent when read through completely. In short, adding a sentence at a time doesn’t encourage quality on a larger scale; at some stage, the article must be restructured. This happens nowhere near often enough. Users who try to do this inevitably encounter hostility or resistance, until they figure out that they should do it with a throwaway pseudonym, not a real name.
Wikipedia gives cranks an outlet, rather than their having to become misanthropes, terrorists or political researchers. Some people will take great pleasure in demonstrating the idiotic futility of such rubbish. This seems like a positive quality of Wikipedia, until one realizes that any sufficiently toxic or stupid view will quickly acquire more adherents, and that defenders of a particular view will tend to create factions that might soon exist offline. And any group perceiving itself as beleaguered or disadvantaged will band together more readily, and achieve common cause more readily. Is Wikipedia the breeding ground for this century’s cults?
Jaron Lanier, Digital Maoism – “The hive mind is for the most part stupid and boring. Why pay attention to it? The problem is in the way the Wikipedia has come to be regarded and used; how it’s been elevated to such importance so quickly. And that is part of the larger pattern of the appeal of a new online collectivism that is nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force. This is different from representative democracy, or meritocracy. This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it’s now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn’t make it any less dangerous.”
It is unfair for critics of Wikipedia to dismiss this online encyclopedia as “unreliable” or a poor reference guide. It must be noted that it is generally frowned upon in the academic community to reference encyclopedias of any kind.
Many believe that Wikipedia can be used only as long as it is not considered a “serious reference”. But what does serious mean? Serious can mean: timely and up to date, requiring citations and footnotes, open to change at all times, with no unalterable dogma, and immune to political or economic pressure. Wikipedia fits all of these criteria. Only a selective definition of “seriousness” could prevent Wikipedia from being considered serious. In any case, its loyal writers and readers almost all consider it a very serious, and historic, resource. It seems that those who are not involved in the project, or who don’t consistently use the resource, are the only ones who don’t consider it a “serious” encyclopedia.
Robert McHenry said that Wikipedia errs in billing itself as an encyclopedia, because that word implies a level of authority and accountability that they believe cannot be possessed by an openly editable reference. He said, “to the ordinary user, the turmoil and uncertainty that may lurk beneath the surface of a Wikipedia article are invisible. He or she arrives at a Wikipedia article via Google, perhaps, and sees that it is part of what claims to be an “encyclopedia”. This is a word that carries a powerful connotation of reliability. The typical user doesn’t know how conventional encyclopedias achieve reliability, only that they do.”[1]
John Willinsky reported a preference for online sources over print sources – “Only four out of the 100 entries relied exclusively on print sources (and they were single–source entries), while print sources turned up in a dozen entries in total…Online sources were clearly favored among contributors, as the greater interconnectivity which the Internet represents, compared to print culture, also forms part of Wikipedia’s quality as an instrument of knowledge and learning.”
Wikipedia brings hundreds of thousands of minds together to share their collective knowledge, compared with traditional encyclopedias, which draw on far fewer minds. More minds hold more collective knowledge, and have clearly produced a greater encyclopedia of knowledge on Wikipedia.
“Wikipedia seeks not truth but consensus, and like an interminable political meeting the end result will be dominated by the loudest and most persistent voices.”
It makes a travesty of the revert rule when one individual can simply send an e-mail alert to a friend requesting a timely “revert favour” once he has reached the limit of his daily reverts. This may apply to deletion debates as well, where a group of editors may be organised so as to always vote en masse in favour of keeping articles written by one of the gang, or related to the gang’s main field of interest.
Many great advances in the social and natural sciences have come by challenging the status quo and, because of that, their contributions were ignored or belittled by their peers. For example, George Akerlof, Nobel Laureate in Economics in 2001, had his classic paper (for which he won the Nobel Prize) entitled “The Market for Lemons: Quality Uncertainty and the Market Mechanism” rejected by the American Economic Review for being trivial and by the Journal of Political Economy because it conflicted with economic theory. Only after submitting it to a third journal, the Quarterly Journal of Economics, did the breakthrough article become published. Wikipedia allows for discourse where other venues would not.
While Wikipedia itself does not have an official peer-review process, it does encourage the use of reliable published sources that have a strong peer-review process. It therefore does rely on formal peer review, just not its own.
Wikipedia has achieved what success it has so far precisely by being as open as it has been. So, again, we don’t want to kill the goose that lays the golden eggs.
At the moment it is entirely up to the individual whether he thinks a modification he intends is an improvement, so there comes a point when a modification is as likely to damage the resource as to improve it. If some system of review could be installed, it would protect against crank attacks as well as misjudgment, and ensure a continually improving resource.
Openness and transparency are the greatest of democratic principles. Indeed, the hypothesis that openness benefits quality has already been tested, and the results support it: articles that have been worked on by many different people in the context of Wikipedia are now comparable to articles found in some excellent encyclopedias. If, however, you insist on considering the hypothesis a priori, please ask yourself: which is more likely to be correct? 1. A widely circulated article, subject to scrutiny, correction, and potentially constant improvement over a period of months or years, by vast numbers of experts and enthusiasts, possibly updated mere minutes before you read it. Or, 2. An article written by a nonspecialist professional writer or scholar (as many encyclopedia articles are), mostly shielded from public review and improvement, likely over a year ago.
The majority of people will want to help grow wikis for the public good. There are only a few people that want to cause harm.
While Wikipedia is a useful resource, it is generally factually unreliable.
Vandals can delete portions of an article’s text, or even the entire article itself, ruining lots of work. This is referred to as “blanking” by those in the Wikipedia community, and is considered vandalism. Such “blanking” is typically fixed (by reverting to the previous version of the page, before the text was removed) within minutes. However, within those few minutes, or in the few cases where such blanking is first noticed by a viewer who is not aware of the history feature of Wikipedia pages, a page may seem to be severely lacking information, or be otherwise incomplete, due to this deletion.
Articles tend to accumulate trivia and digressions without limit, destroying readability and all sense of proportion. Attempts to redress this are often futile and occasionally result in warnings, due to the inherent bias in the Wikipedia community that bigger is somehow better.
Articles seem to be getting steadily more polished. Articles have a tendency to get gradually better and better, particularly if there is one person working on an article with reasonable regularity (in that case, others have a tendency to help). There are some articles we can all point to that started out life mediocre at best and are now at least somewhat better than mediocre. Now suppose this project lasts for many years and attracts many more people, as seems perfectly reasonable to assume. Then how could articles not be burnished to a scintillating luster? Also, how could the model itself not continue to evolve and improve?
Dross can proliferate, rather than become refined, as rhapsodic authors have their articles revised by ignorant editors. Anyone can add subtle nonsense or accidental misinformation to articles that can take weeks or even months to be detected and removed.
Wikipedia’s standards are the ones followed by each contributor, and in some cases these are very high standards indeed. (For example, we encourage all contributors to cite their sources.) As traffic increases, so will expert help, and as gaps are filled in, the only way remaining for Wikipedia to improve will be in quality and depth. This, in turn, is likely to attract more experts, who follow their own very high standards.
To make a claim about what standards Wikipedia follows is to make a claim about what standards present and future Wikipedia contributors follow; the current standard is always changing. To say that such people have no standards is baseless.
Reputable publishers enforce editorial standards for what they put into their publications, both online and offline. Wikipedia has no such standards, so it’s bound to be low quality.
Many of your replies seem to assume that quality will improve as the website grows, but quantity doesn’t always beget quality. Perhaps it will get worse as it gets bigger?