Best, worst and average case

In computer science, the best and worst cases of a given algorithm express the least and the most time the algorithm can take, respectively. The worst case is usually of greatest concern, since it is important to know how much time is needed to guarantee that the algorithm will finish.

For example, a simple linear search through a table has an average running time of n/2 steps, but a worst-case performance of n steps, which occurs when the item to be found is the last item in the table (or is not present at all).
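A minimal sketch in Python may help make this concrete (the function name and interface are illustrative, not part of the original article):

 def linear_search(items, target):
     """Return the index of target in items, or -1 if it is absent.

     Best case:    1 comparison (target is the first element).
     Worst case:   n comparisons (target is last, or not present).
     Average case: about n/2 comparisons, assuming target is equally
                   likely to be at any position in the table.
     """
     for i, item in enumerate(items):
         if item == target:
             return i
     return -1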

A classic example is Quicksort, which is, in the average case, a very fast algorithm. But if not used with great care, its worst-case performance can degrade to O(n²) (see Big O notation), ironically when the target list is already sorted.
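As a sketch of how this degradation arises, the following quicksort always picks the first element as its pivot (a deliberately naive choice, used here only for illustration; careful implementations choose pivots differently):

 def quicksort(items):
     """Quicksort with a naive pivot: the first element.

     On typical input this averages O(n log n) time, but on an
     already sorted list the 'smaller' partition is always empty,
     so the recursion is n levels deep and the total running time
     degrades to O(n^2).
     """
     if len(items) <= 1:
         return items
     pivot = items[0]  # naive pivot choice; a random element or
                       # median-of-three would avoid the sorted-input trap
     rest = items[1:]
     smaller = [x for x in rest if x < pivot]
     larger = [x for x in rest if x >= pivot]
     return quicksort(smaller) + [pivot] + quicksort(larger)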

Worst-case performance analysis is often easier to carry out than average-case analysis. For many programs, determining what the "average input" is, is itself difficult, and often that "average input" has characteristics which make it hard to characterise mathematically (consider, for instance, algorithms that are designed to operate on strings of text). Similarly, even when a sensible description of a particular "average case" is possible (one which will probably apply only to some uses of the algorithm), it tends to result in equations that are more difficult to analyse.

See: sort algorithm - an area where a great deal of performance analysis of various algorithms has been carried out.


For many algorithms, it is important to analyse worst-case performance as well as average performance, because the worst-case bound is the strongest guarantee that can be given: an algorithm is certain to finish within its worst-case time on any input.