
Wikipedia:Replies to common objections

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by MartinHarper (talk | contribs) at 20:59, 4 August 2003 (rm 4). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.


Short name: Wikipedia:replies

Some people have very strong reactions to Wikipedia. Some are nearly instantly hooked, and they love the idea; others think the idea is so absurd as not to require any serious consideration. There are a number of very common criticisms of the Wikipedia project, which we try to answer here.

Many of the criticisms levelled at Wikipedia are not unique to it, but are due to the fact that Wikipedia is, at bottom, a wiki. Many of the same objections have been made to other wikis.

Letting arbitrary Internet users edit any article at will is absurd

My prose

I can't imagine having my golden prose edited by any passer-by. It's mine, so why would I let others touch it?

We (on Wikipedia) don't individually try to "own" the additions we make to Wikipedia. We are working together on statements of what is known (what constitutes human knowledge) about various subjects. Each of us individually benefits from this arrangement. It is difficult to singlehandedly write the perfect article, but becomes easier when working together. That in fact has been our repeated experience on Wikipedia. Consider the following example:
I thought I understood Gödel's incompleteness theorem pretty well, and since the then-existing article was short and incomplete, I decided to rewrite it. Since then, several people have chipped in, sometimes rewriting a paragraph, sometimes criticizing an omission, sometimes deleting parts. I didn't agree with all the changes, but with most of them. No material is ever lost, since Wikipedia stores all previous versions of all articles, so I reverted a few of the changes. Overall, the article is now much better than anything I could ever have written alone.
We assume that the world is full of reasonable people and that collectively they can arrive eventually at a reasonable conclusion, despite the worst efforts of a very few wreckers. It's called optimism.

Cranks

Cranks are posting ridiculous theories on the Internet all the time. They will come here and ruin everything.

So far, we have had relatively few cranks on Wikipedia, and it's pretty easy to just delete patent nonsense as soon as it appears on the Recent Changes page.
There are websites out there that say the moon landing was staged in a movie studio, or describe supposed perpetual motion machines. But you cannot correct those websites, no matter how wrong you think they are, because they are written by people who would never allow their work to be edited without their permission. They do not thrive on Wikipedia.
This does not mean that idiosyncratic points of view are silenced or deleted; rather, they are contextualized by attributing them to named advocates. The more idiosyncratic an entry, the more likely it is to be modified. Because there is no ownership of the information on Wikipedia, an individual is compelled to contribute information that is convincingly true. Thus, cranks who cannot accept critical editing of their writing find they have no platform and leave; those who are willing to present their interests in less-biased ways stop being cranks.

Some cranks are very persistent. Someone could write up a crankish page on the Holocaust, and keep reverting it back to their version.

Generally, partisans of all sorts are kept under the gun. Wikipedians feel pretty strongly about enforcing our nonbias policy, and we've managed to work our way to rough consensus on a number of different topics. People who stubbornly insist on an article reflecting their personal biases are rare, and when they do appear they generally receive a drubbing.
If opprobrium isn't enough to stop someone, we can ban them and use technical means to stop them from making further edits to Wikipedia.

Trolls and flamers

Wikipedia is going to end up like Usenet--just a bunch of flame wars.

This is a bit more of a problem, but it is dealt with fairly handily by the social mores of Wikipedia, aka Wikiquette. Arguments on article pages get moved either to a corresponding talk page (e.g. talk:theory of relativity) or to a new article page which presents the arguments within a neutral context (e.g., operating system advocacy).
The argument on the talk pages tends to be centered on how to improve the article, rather than on the merits of various competing views. We have an informal but widely-respected policy against using talk pages for partisan wrangling that has nothing to do with improving articles.
Usenet lacks at least two features that are absolutely essential to Wikipedia's success: (1) on Usenet, you can't edit other people's work, while we can here on Wikipedia, thereby encouraging creative and collegial collaboration; or more strongly, on Wikipedia, there's no such thing as "other people's work", because there's no ownership of information; (2) Usenet does not have the possibility of peer pressure and community-agreed and -enforced standards, which Wikipedia does have. Moreover, Usenet is a debate forum. Wikipedia is, very self-consciously, an encyclopedia project!

Amateurs

There are plenty of ignorant people who think they know stuff: your articles will end up riddled with errors and serious omissions.

In all honesty, Wikipedia has a fair bit of well-meaning but ill-informed and amateurish work. In fact, we welcome it: we'd rather have an amateurish article on a subject, which can later be improved, than nothing at all. In any case, when new hands (particularly experts on the subjects in question) arrive and go to work, the amateurish work is usually straightened out. Really egregious errors are fixed quickly by the scores of people who read Wikipedia every day.
Amateurs generally recognise when they're talking to an expert on a subject, and start contributing in a different way - by asking questions, saying which bits of an article are unclear, and doing some of the "grunt work" of research. Wikipedia benefits from having amateurs and experts, working together. This makes our math articles more understandable than the average math textbook.

Partisans

There are plenty of partisans who are all too eager to leave out information that is important to presenting a balanced view. They'll be all too eager to post to Wikipedia, and that's going to create huge gaps in your coverage, which will ruin the project.

Frequently the initial author omits crucial information, whether due to ignorance or malice. In many cases, but not all, this is fixed quickly by the scores of people reading Wikipedia every day. For example, Wikipedia has fairly decent, balanced articles about abortion, Scientology, and prostitution.
Bear in mind that Wikipedia is a work in progress, a draft, an "alpha release" if you will. It does have many important gaps, which we try to make explicit. This lack of coverage isn't due to ignorance, partisans, cranks, or anything else malicious--it's due simply to the finite amount of time that a finite number of people have been working on it.

Advertisers

But what about advertisers? Won't those with a product or service to hawk see the opportunity to hit a targeted market and write new articles for their product, or worse, edit the article that corresponds to their generic product class (e.g. computer) into an ad for their product?

This kind of thing has already happened. There are basically three forms - adding excessive external links to one's company, outright replacing legitimate articles with advertising, and writing glowing articles on one's own company. The first and second forms are treated as pure vandalism and the articles are reverted. Most Wikipedians loathe spam, and spammers are dealt with especially severely. The third form is normally dealt with by editing the article for a neutral point of view, often by adding any relevant "dirt" available on the company.
Corporate advertisers would likely not find Wikipedia to be an attractive advertising medium. In traditional web-based advertising, such as banner ads, popup ads, and email advertising, the response rate can be directly measured, either through web bugs or server logs. If a company used Wikipedia to peddle its goods, the response rate could not be measured.
Not being able to measure results may not stop individuals who want to advertise their new multi-level marketing scheme, but unless they're using a bot (see next section), it takes more time and energy to keep reverting the page back to the advertisement than it is worth: the would-be spammer would get their message viewed (in an uneditable form!) more often and more reliably by using a traditional advertising medium.
Ironically, advertising spam can actually be beneficial to Wikipedia. Suppose an advertiser for penis enlargement products edited that article into an ad for its product. A reader who happens by and sees the spam could copy the advertisement, revert the page to its previous state, and then add information discussing the advertiser's specific methods or claims to the wealth of knowledge on the subject. In effect, advertisers' claims, when tempered and weighed against other knowledge associated with the subject, can yield a more robust article than before.

Bots

You still haven't addressed the real bane of Usenet: massive automated spamming. It would be trivial to write a script to post Viagra ads to all Wikipedia pages, and once spammers or vandals start to use wikibots, you're sitting ducks.

Sooner or later, yes, someone probably will write a script to deface wikis with how to MAKE MONEY FAST!!!, but there are several things that will keep this from being too much of a problem. It's easy to revert spam, and anyone can do so. We can already block IP addresses. There are other technical solutions available, such as posting limits, or some form of spam filtering.
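The "posting limits" idea mentioned here can be sketched as a simple sliding-window rate limiter per address. This is only an illustrative sketch under invented assumptions (the class name, limits, and window are made up for the example), not a description of Wikipedia's actual mechanism:

```python
from collections import defaultdict, deque
import time

class EditRateLimiter:
    """Reject an edit when an address has made too many edits
    within a sliding time window. Limits are illustrative only."""

    def __init__(self, max_edits=5, window_seconds=60.0):
        self.max_edits = max_edits
        self.window = window_seconds
        self.history = defaultdict(deque)  # address -> recent edit timestamps

    def allow_edit(self, address, now=None):
        now = time.monotonic() if now is None else now
        times = self.history[address]
        # Drop timestamps that have fallen out of the sliding window.
        while times and now - times[0] > self.window:
            times.popleft()
        if len(times) >= self.max_edits:
            return False  # over the limit: likely a bot or flooder
        times.append(now)
        return True

limiter = EditRateLimiter(max_edits=3, window_seconds=60.0)
print([limiter.allow_edit("198.51.100.7", now=t) for t in (0, 1, 2, 3)])
# → [True, True, True, False]
```

A scheme like this stops naive flooding scripts cheaply while leaving ordinary editors, who rarely save more than a few times a minute, unaffected.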
But speaking in non-technical terms, Wikipedia simply should not be the target of spammers--or at least not of many. Spammers must consider their audience (to a degree, anyway). Usenet and email are general modes of communication used by a great many people, not all of whom are technically inclined. By contrast, Wikipedia is an encyclopedia used by far fewer people, many of whom are technically inclined--making it a far less attractive target.
Wikipedia is also an unattractive spam target for well-established legal reasons. Most countries do not have laws against Usenet or email spam; most have laws against website defacement. Few people are silly enough to risk imprisonment for hawking penis-enlargement scams.

What do you do if people start running scripts to repost their own bit of vandalism or spam, and from different locations so you can't just block their IP address?

This would be similar to a distributed denial of service attack, of the kind major websites occasionally fall victim to. It hasn't happened here yet, and to the best of our knowledge it hasn't yet happened on any wiki. If someone did mount an extensive attack, all offending IP addresses could be blocked from further editing by the admins. In an emergency, we can restore yesterday's version from a backup we make of the server itself.
However, this can become a somewhat serious issue; no one wants an escalating arms-race of the sort that can be found between the Slashdot admins and trolls, where both sides use ever more-complicated filters and scripts; or destruction like that which has come to the IRC network, defenseless against true malice.

Systematic bias

Wikipedia coverage is heavily biased by the sorts of people who want to contribute to it.

This seems to be a perfectly legitimate concern. Certainly, Wikipedia coverage is patchy. It's easy to find examples of a really long article on one subject, where another, equally important subject, has only a stub. Sometimes this is just the result of a single enthusiastic contributor (e.g., Atlas Shrugged). Other times it is down to systematic bias.
As of July 2003, we think our largest bias is that we are biased in favour of Western topics, and particularly topics relevant to English-speaking nations such as the United States. Also, many of our contributors are "geeks" of various descriptions: hackers, scientists/academics, and so on.
Earlier on, we had a systematic bias towards libertarian issues. However, as Wikipedia has grown, and become more mainstream, the libertarian contingent has declined as a proportion of Wikipedia in general. Perhaps our other biases will be partially neutralised in the same way?
Our hope is that, when Wikipedia really hits the big time, while the percentages of people working on unpopular topics might remain the same, the sheer numbers of those people will be higher than they are now. The idea is that we'll be getting more content in those areas then. Besides, it's not as though we have a time limit. Even if the computer and mathematics areas, for example, fill up faster than the dance and literature areas, it doesn't follow that the latter areas will always be weak.
Another thing that we can do is target the weak areas and try to get contributors for those areas in various ways. See meta:systematic bias for further discussion.

Many Wikipedia articles are of poor quality, and there isn't a peer-review process; no self-respecting intellectual would be associated with it

It seems like there should be a giant "under construction" sign on every page of the website. It seems worthless as a reference. I don't see what the point is.

Wikipedia is both a product and a process. As a product, right now, it may not seem all that exciting or even respectable. As a process, however, it is quite remarkable. Seeing it as a process, Wikipedia can be judged not by its state at any given moment but by how well it is growing, how well it is becoming what it will become.
One way of understanding the process is by imagining a perfect article, one that ranks as a 100 on a scale of 100. An article that does not yet exist would be a 0, and a stub article would be perhaps a 1. The Wikipedia process works by constantly improving the quality of any given article, such that any significant edit moves the article 10% closer to perfection.
A first edit might move an article from 0 to 10 on a scale of 100. Viewed as a product, an article with a 10 score must seem pretty lame. But as the process continues, the article constantly improves. The next edit moves it 10% closer, to 19, the next to 27.1, etc. As further edits accumulate, the quality of the article moves asymptotically towards perfection, and likewise the quality of the encyclopedia as a whole.
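The "10% closer to perfection" model above can be written out directly: each significant edit closes 10% of the remaining gap to a perfect score of 100, so quality rises fast at first and then approaches 100 asymptotically. The function name and constants below are just the essay's own illustration, not a real quality metric:

```python
def quality_after(edits, rate=0.1, perfect=100.0):
    """Quality score after a given number of edits, starting from 0."""
    quality = 0.0
    for _ in range(edits):
        quality += rate * (perfect - quality)  # close 10% of the remaining gap
    return quality

for n in (1, 2, 3, 10, 50):
    print(n, round(quality_after(n), 1))
# The first three values are 10.0, 19.0, and 27.1, matching the
# figures in the text; after 50 edits the score is 99.5.
```

Equivalently, the score after n edits is 100 × (1 − 0.9ⁿ): the remaining distance to perfection shrinks geometrically, which is why no single edit needs to be perfect for the article to converge.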
The people at work here think the Wikipedia process is remarkable, and they believe it's going to result in a fantastic product.


Surely it's not possible that very many upstanding intellectuals will want to participate in Wikipedia. After all, wiki software must be the most promiscuous form of publishing there is--Wikipedia will take anything from anybody!

But it is possible, because plenty of upstanding intellectuals do participate in Wikipedia. It's fun, first of all. But it can be fun for intellectually serious people only if we know that we're creating something of quality. And how do we know that? The basic outlines of the answer ought to be fairly obvious to anyone who has read Eric S. Raymond's famous essay on the open source movement, "The Cathedral and the Bazaar." Remember, if we can edit any page, then we can edit each other's work. To quote Linus Torvalds: "Given enough eyeballs, all bugs are shallow". We catch each other's mistakes and enjoy correcting them.
So, we're constantly monitoring the Recent Changes page. When a crank shows up and vandalizes a page, it's fixed nearly instantly. (We save back copies of all pages, and these are very easily accessible.) We (that is, we participants) work on a lot of different pages, and many of us feel some collective responsibility for how the whole thing looks. We're constantly cleaning up after each other and new people.
In the process, a camaraderie--a politeness and congeniality not found on many online discussion forums--has developed. We've got to respect each other, because we are each other's editors, and we all have more or less the same goal: to create a huge, high-quality free encyclopedia.


That's nice, but why should highly-qualified people get involved with Wikipedia? It's not peer-reviewed. So, isn't it lightweight? Why should any serious researcher care about it? Why should anyone rely on it? Wikipedia has a nice community, but it doesn't have much breadth, depth, or reliability; so if you want serious information, go to Britannica.

If Wikipedians believed that, we'd bag the whole thing. We think we are--gradually, and sometimes from very rough first drafts--developing a reliable resource. So what answer can we offer to the above concerns?
Part of the answer has already been given: Wikipedia's self-correction process (Wikipedia co-founder Jimmy Wales calls it "self-healing") is very robust. This open, public, and continually ongoing review process creates considerable value--value that is very easy to underestimate for those who have not experienced it adequately.
Another part of the answer is that, of course, we've only been around since January 2001. (Britannica's had a few centuries' head start.) Significantly, Wikipedia's rate of growth has been steadily increasing--in terms of article numbers and quality, traffic to the website, and attracting more highly-qualified contributors. So it seems very reasonable to think that within a few years the project will surpass Britannica in both breadth and depth. In early 2003 we passed the milestone of 100,000 articles, and the rate of growth shows no sign of slowing. What's more, the community of Wikipedia contributors is growing just as fast, and the quality of existing articles is improving all the time.
Another part of the answer is that Wikipedia provides free, unlimited server space and well-designed page construction tools for anyone who needs to do something that fits within the Wikipedia mission and doesn't care about owning the information: a description that matches the prototypical academic researcher.
Admittedly, Wikipedia is rather far from being the reliable resource that Britannica is. But it's growing beyond anyone's expectations. The rate of growth continues to increase. In short order Wikipedia will--many of us think--be able to boast a breadth, depth, and reliability to compare to any general encyclopedia you please.
Then we'll try to reach the depth and reliability of a whole reference library full of specialized encyclopedias--something no general encyclopedia has ever done.


Grand, but I looked at an area that I know something about, and I found all sorts of errors and omissions. I was surprised and amused. I obviously don't want to be associated with something of this low quality.

We certainly do not hide the fact that a lot of articles need a lot of work. We only began in January 2001, but over the months since then the Wikipedia article creation process has proven robust and effective.
We too deplore bad work. But then we often just go ahead and fix the problems we see. It would be great if you would help us increase the quality of the project by doing the same. Yes, there is a lot of mediocre stuff that has been added. But Wikipedia is not finished; think of it as one giant rough draft of an encyclopedia, written by a bunch of different people. And despite all that, much of what you'll find here, if you explore around a bit, isn't half-bad considering its youth, and some of it is quite good.
At first, some areas of Wikipedia might indeed look like a deep morass: article by article, bit by bit, we try to establish dry ground, even if the task seems overwhelming. It is fun, though; and if initially you don't want to be associated with it, you can always choose an anonymous handle. The whole concept of authorship is not germane to wikis anyway. Bad articles cannot be credited to you, because Wikipedia articles aren't credited to anyone!
We certainly don't expect everyone to want to jump on board. But if the main thing that's stopping you at this point is that some articles in one area of Wikipedia are of substandard quality, we'd ask you to come back next year, or the year after, or even correct those mistakes yourself. By that time, it seems pretty likely that the mistakes in those articles will be corrected, and a lot more details will have been supplied. In short, time alone will--it's reasonable to think--render the project something with which you'll want to be associated.


Maybe it will improve, and maybe it won't, but currently it's pretty lame. I looked up a topic I know something about and found just a few words, just a stub. That's ridiculous!

There are indeed a lot of "stub" entries, and we share your opinion of their ridiculousness. But it doesn't make a lot of sense to judge the entire project, either as a product or as a process, on the basis of the lameness of the stubs. There are a lot of excellent articles on Wikipedia, too. The majority, at present, are just OK--not spectacular by anyone's measure. But that's all changing because we're constantly working on them. People are expanding articles all the time.
It's reasonable to think there will come a time when we have covered most of the common subjects and have at least "stubs" about almost everything. Then we will have no choice but to get into things more deeply. We can look forward to that. All in good time!


It seems Britannica has extremely high standards for what they put into their publications, both online and offline. Wikipedia has no such standards. It's bound to be of shoddy quality.

It's simply false to say that Wikipedia has no standards--the standards we follow are those followed by each of its contributors, and in some cases these are very high standards indeed. As we gain more traffic, we will continue to gain more expert help, and as gaps are filled in, the only way remaining for Wikipedia to improve will be in quality and depth. This, in turn, is likely to attract more experts, who follow their own very high standards.
To make a claim about what standards Wikipedia follows is to make a claim about what present and future Wikipedia contributors follow; to say that such people have no standards is baseless.



Good quality requires peer review and expertise, which Wikipedia doesn't have.

Britannica is good not only because it is big. If that were the case, there would be no reason not to be satisfied with World Book or something of that sort. When it is good, Britannica is so partly because it is authoritative, and it got that way by being selective. Wikipedia isn't selective; hence it will never be authoritative.

It's perfectly correct to say that Britannica is good not only because it's big; the high quality of its articles is very important, and certainly it got that way by having high standards. We can concede that, but what reason is there to believe that being selective (presumably, choosing who is going to write about what) is the only way to support and achieve high standards? Maybe there's another, more open way. Wikipedia is a good test of that proposition. We have, after all, managed to produce some really excellent articles--and, by the way, not all of these were written by the many Ph.D.s and other highly credentialed people we have working on this project.


Your experiment will probably not go well. Good quality requires peer review and expertise. Why should we care about the products of an arbitrary group of people whose knowledge and ability could range from expertise to hopeless ignorance? Ignorance mixed with knowledge does not benefit knowledge.

First of all, the hypothesis that openness benefits quality has already been tested, and it has fared well: articles that have been worked on by many different people in the context of Wikipedia are now comparable to articles that can be found in some excellent encyclopedias. If, however, you insist on considering the hypothesis a priori, we hope you will ask yourself: which is more likely to be correct?
  1. A widely circulated article, subject to scrutiny, correction, and potentially constant improvement over a period of months or years, by vast numbers of experts and enthusiasts.
  2. An article written by a nonspecialist professional writer or a fair-to-middling scholar (as so many encyclopedia articles are), and not subject to public review and improvement.

Look, all this speculation and "experimentation" is fine and well, but if there's one thing I've learned in my studies, it's that you can't really evaluate the validity of a piece of nonfiction writing unless you know something about the author and his/her qualifications to speak on the topic--or at least you are provided with the appropriate references to support his/her claims.

That certainly does seem to be a reasonable thing to say, but there are a few different points to bear in mind. First, an increasing number of Wikipedia articles do have references, and this is something we broadly encourage.
Second, the greater the number of participants, the greater the sheer number of experts who are involved in bringing our weaker articles up to par--so, while you might not know which experts have been at work on an article, if you know that an article has been around for many months and that we have some experts in the general area at work here, it's fairly likely that it's been given a going-over by those experts. In other words, knowledge of the process, and of the fact that it includes participants who are expert in a wide variety of subjects, is potentially a substitute for knowledge that some particular (alleged) expert has written some particular article. Perhaps the relevant question to ask is, "How expert is the community of people who have created Wikipedia?" The answer is, "We've got experts in several different fields, and new highly-qualified people are arriving all the time." We don't require that most, or even very many, of the experts on this and that join us, or think well of us; we require only a few, who have been steadily "raising the bar" since the beginning of the project.
Third, if we find it advantageous, we will install some sort of approval mechanism. Alternately, because this is free content, somebody else might start a project that "approves" Wikipedia content itself.

Indeed, then, I should like to see some means of peer review before edits are accepted on articles which have already been approved by some similar process of peer review. At the moment it is entirely in the hands of an individual whether he thinks a modification he intends is an improvement, so there comes a point when a modification is as likely to damage the resource as to improve it. If some system could be installed, then you would protect against crank attacks as well as misjudgement, and ensure a continually improving resource.

As a community, almost all of us are opposed to what has been called the policy of completely "freezing" particular pages--so that they can be edited only by a select group of people (e.g., only the author and an "editor"). We feel that our own collective monitoring of Recent Changes is an adequate safeguard against cranks--see above. Moreover, it is quite obvious that Wikipedia has achieved what success it has so far precisely by being as open as it has been. So--again--we don't want to kill the goose that lays the golden eggs.
That said, perhaps someone who makes the above suggestion will be pleased by the approval system discussed at Wikipedia approval mechanism. Such a system would identify a body of experts that would put its official stamp of approval on some articles. Those articles could still be revised just as easily as before, but there would also be a version presented as the "approved" version. This way we can "freeze" high-quality content without freezing the process.



Wikipedia's extrapolation to continued growth is dubious

Many of your replies seem to assume that quality will improve as the website grows. But quantity doesn't always beget quality. There is simply no reason to suppose that more articles is automatically better.

Actually, there is, at least in Wikipedia's case. There are at least three reasons to think that increasing numbers of articles and participants will lead to higher quality.
First, the more people are participating, the sooner we hammer out basically-OK articles on all the easy topics, making the project more interesting to specialists who would otherwise be put off by the obvious omission of basic, reliable information.
Second, the more eyes see our articles, the more transparent the errors will be (over the long haul). While we might have one or two philosophers on board during one month, a year later we might have ten or twenty--and then mistakes in their work will be caught much more quickly.
Third, statistically, the more people who are participating, the greater the sheer numbers of experts; that seems to be our experience so far. Moreover, as a matter of fact, people usually tend not to touch articles they know nothing about, particularly when the article is well-developed or when they know that some resident expert will pounce on their mistakes. (There are exceptions, of course.) So, the greater the number of participating experts, the higher the overall quality of the content produced under their general guidance. It is not mere hype to say that Wikipedia caters to the highest common denominator--it's actually an observation we've made!


It seems pretty foolish to make a simple extrapolation from past growth to future growth. It's easy to grow at a 20% growth rate for a few months, or even for a few years--but, of course, not indefinitely. More generally, it's surely fallacious to suppose that the growth rate in the past is any very good indication of what will happen in the future.

It seems our critic here believes we have the following simpleminded argument: "The number of Wikipedia articles has been growing at rate R for the past nine months; therefore, it will continue to grow at the rate of R for the indefinite future." If that's all there were to it, that indeed would be foolish to say; but that's not all there is to it.
To be clear, we agree that it's very risky to make any specific predictions about growth rates. But it does seem reasonable to suppose Wikipedia will continue to grow at a rapid rate.
Now, what makes it reasonable to think that Wikipedia will continue growing at a rapid clip is not simple extrapolation, but observation of the factors that have made it grow at the rapid clip so far. Google has been sending us lots of traffic (thousands of visitors a day from Google alone; it used to be just in the hundreds). The more traffic Google sends us, the more people get on board and create content; and then Google sends us even more traffic. Moreover, more and more people are linking to Wikipedia. This raises Wikipedia's Google rankings. (Thus more traffic, thus more content.) Already, plenty of Wikipedia pages are listed on the first few pages of Google results (See Wikipedia:Top 10 Google hits for an incomplete listing).
Now, that's only part of the argument. The other part is that, while there is attrition (some old contributors don't write so much anymore), there's an overall increase in the active population. There are a lot more active Wikipedians now than there were, say, three months ago.
Another part of the argument is that the overall quality of Wikipedia has been increasing, and our experience so far indicates that it will, probably, continue to increase. This makes it more likely that people will take notice of the project, link to it, use its contents (properly sourcing Wikipedia), etc.
In short, "the rich get richer." Please note, this is not mere speculation: it's an explanation of how Wikipedia's growth has occurred in the last nine months.
Of course, we will run out of topics sooner or later--the number of encyclopedia topics is not infinite. But it is really huge. A lot bigger than 100,000, and a heck of a lot bigger than the number of topics contained in Britannica. Even if we reach a point at which we cannot grow significantly in breadth, we will still be able to grow significantly in depth.

You say Wikipedia is growing rapidly. Suppose it gets really big. Then you'll start to attract the attention of more malicious elements. All the noise will eventually be larger than any group of editors can handle.

Wikipedia is the largest wiki there is, and it's an open question whether it will scale if it becomes even larger. Indeed, many folks believe that online communities may not scale, whether wiki-based or not.
Many of us believe that Wikipedia will scale indefinitely. The more people there are to abuse it, the more people there are to ward off the abuse. As traffic increases, so does the number of people who work on and care about the project. We've been Slashdotted before, and had articles featured on TV, and had huge bursts of traffic; while a few "malicious elements" showed up, they soon found out that it just wasn't worth their while. Bear in mind that people have been telling us that Wikipedia won't scale since back in 2001, and so far, so good.
On the other hand, some of us agree with you, and think that Wikipedia won't scale indefinitely. At some point in the future, we may look back and see that while Wikipedia is a good encyclopedia, it was even better a month ago. Well, at that point we can start to take the project forward using a different approach, perhaps involving a more rigorous peer review system, or we can hand on the baton to someone else. But at the moment, Wikipedia is scaling nicely, and long may it continue to do so.



5. Miscellaneous concerns

Redundancy

Why is there a need for an encyclopedia at all, in the age of the omniscient Internet? Why not just go to your favorite search engine and search for the topic you would otherwise look up in an encyclopedia? You're likely to find more information, including more interesting and more current information.

Here's a glib answer: isn't it interesting that, in fact, thousands of people per day arrive at Wikipedia via Google?
Here's a longer answer. The point made here cannot be denied: the Internet, armed with good search engines, functions not unlike a giant, and exceedingly useful, encyclopedia. But does it follow that there is no need for an open content, community-built encyclopedia? No.
There is such a thing as an intellectual division of labor--not only among people, but among types of books and among types of reference materials. One does not typically consult a dictionary when one is looking for an encyclopedia article, even though, sometimes, the required information is included in the dictionary. Similarly, sometimes an encyclopedia article is precisely what is required. Encyclopedia articles can be generally expected to include certain kinds of information, and only that information. Among other things, they are generally expected to be written from a neutral point of view.
Moreover, Wikipedia is free and open content--which is valuable. This means that anyone will be able to use the content for any purpose, particularly for educational purposes. The prospects of the use of a really huge, free encyclopedia for educational purposes--including the development of specialized educational materials--is very exciting.
Additionally, it's important to note that both personal and organizational pages on the Internet are subject to bit rot. When page authors or owners ignore or forget about a piece of information, it loses its timeliness and relevance. Errors of fact can remain in place for years -- with the only feedback mechanism being increasingly rare "mailto:" tags. With Wikipedia, readers are editors -- interested parties can keep articles up-to-date and current long after the original author has moved on to greener pastures.
Finally, it is possible that in the fullness of time Wikipedia will contain more relevant, reliable information on any given topic than can be easily found via a search engine search. That's certainly our plan for it.

Markup and Display

Wikipedia software is inadequate to the task of collaboratively writing an encyclopedia. It is hard to collaboratively edit images, there is no WYSIWYG editing, and anything complex requires reams of HTML.

There are many ways in which Wikipedia is less than ideal in these respects, but we are working to improve them. For example, the largest concerns were in the mathematical articles on the site; as of January 2003, Wikipedia supports Wikipedia:TeX markup, so this is no longer a problem. Similarly, we now support image uploads.
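As a small illustration of the TeX support mentioned above (the particular formula is just an example, not from any article), mathematical notation in an article is written in TeX between <math> tags, and the software renders it for the reader:

```latex
% Wiki source in an article -- the TeX between the tags is rendered as a formula:
<math>\int_0^\infty e^{-x^2}\,dx = \frac{\sqrt{\pi}}{2}</math>
```

This means editors can collaborate on formulas in plain text, just as they do on prose, instead of uploading hand-made formula images.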
There are other proposals for simplifying image and table markup - see meta:Wiki markup tables, meta:image pages, and meta:WikiShouldOfferSimplifiedUseOfTables. The Wikipedia software is open source, so if you'd like to work on these problems then join Wikitech-L and offer your services.
In the meantime, while we can agree that the current software is non-ideal, it's certainly not inadequate, and everything we do now can be carried over as we slowly improve the software.

See also: Wikipedia:Disruption, Wikipedia:Why Wikipedia is so great, Wikipedia:Why Wikipedia is not so great