The Folklore of Sorting Algorithms



IJCSI International Journal of Computer Science Issues, Vol. 4, No. 2, 2009



stated that computer chips should double in power
approximately every 18 months. The speed of most of
today's computers is remarkable; even the "slower" ones
are quite fast, and the actual processing takes little time.
Delays, usually encountered in I/O to or from disk, the
network, or peripheral devices, have been greatly reduced
by innovative interfacing standards, thereby boosting
speed. An 800 MHz processor sounds like something made
in the early to mid 1990s. Most computers coming out
today have a dual-core processor, which is in effect two
processors on one chip, each core clocking in at around
1.85 GHz, which alone is more than twice as fast as that
800 MHz processor; multiplying by two for the two cores,
today's computers work roughly four to five times faster
than those earlier ones. Such increases in computer speed
accelerate sorting algorithms in practice. Thus the
running time a researcher establishes for a sorting
algorithm cannot be considered absolute, as it may rise or
fall with the hardware. Such a piece of knowledge can
therefore be thought of as folklore knowledge.

The discussion above is a potential source of folklore:
many people doing the same work in different environments
arrive at different folklore. Some examples of folklore
knowledge are: 1) Quicksort is an effective, but more
complex, sorting algorithm; 2) Shell sort is much easier to
code than Quicksort, and it is nearly as fast as Quicksort.
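The second example of folklore can be made concrete: a complete Shell sort fits in a few lines. Below is a minimal Python sketch; the simple n/2 gap sequence used here is one common choice among several, not the only one.

```python
def shell_sort(a):
    """In-place Shell sort using the simple n/2 gap sequence."""
    n = len(a)
    gap = n // 2
    while gap > 0:
        # Gapped insertion sort: each pass leaves the list gap-sorted.
        for i in range(gap, n):
            item = a[i]
            j = i
            while j >= gap and a[j - gap] > item:
                a[j] = a[j - gap]
                j -= gap
            a[j] = item
        gap //= 2
    return a
```

Whether this is "nearly as fast as Quicksort" is exactly the kind of claim that depends on input and environment.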

In the next part of the paper, we try to express sorting
algorithms' behavior using this folklore and then try to
prove it by formulating theorems.

3. Background Knowledge

Sorting is always carried out systematically. In computer
science and mathematics, we can formulate a procedure for
sorting an unordered array or file. Such a procedure is
always governed by an algorithm, called a sorting
algorithm. We must first converge on a definition of the
sorting concept. Basically, a sorting problem is to arrange
a sequence of objects so that the values of their "key"
fields are in non-decreasing order.

That is, given objects

O1, O2, ..., On

with key values

K1, K2, ..., Kn respectively,

arrange the objects in an order

Oi1, Oi2, ..., Oin,

such that

Ki1 <= Ki2 <= ... <= Kin.      (1)
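Condition (1) can be checked mechanically. A small Python sketch, where the record tuples are invented purely for illustration:

```python
# Records (Oi) paired with key values (Ki); the data is illustrative.
records = [("banana", 3), ("apple", 7), ("cherry", 1)]

# Arrange objects so that key values form a non-decreasing sequence,
# as required by (1).
by_key = sorted(records, key=lambda obj: obj[1])

keys = [k for _, k in by_key]
assert all(keys[i] <= keys[i + 1] for i in range(len(keys) - 1))
```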

Generally, the computational complexity (worst, average,
and best behavior) of element comparisons, in terms of the
size of the unsorted list, is considered when analyzing the
efficiency of a sorting algorithm. The complexity notation
is covered in [2]. If the size of the unsorted list is n,
then for typical sorting algorithms good behavior is
O(n log n) and bad behavior is Ω(n^2). The ideal behavior
is O(n). Sorting algorithms that use only an abstract key
comparison operation always need Ω(n log n) comparisons in
the worst case.

An important criterion used to rate sorting algorithms is
their running time. Running time is measured by the number
of computational steps the sorting algorithm takes to
terminate when sorting n records. We say that an algorithm
is O(n^2) ("of order n-squared") if the number of
computational steps needed to terminate grows in proportion
to n^2 as n tends to infinity. The literature reviews
carried out in [11, 5] document long-standing efforts to
improve the running time of sorting algorithms with respect
to the core concepts above.
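Counting computational steps directly makes the O(n^2) notion tangible. The following hedged Python sketch instruments insertion sort, chosen here only as a representative quadratic algorithm; on reverse-sorted input it performs exactly n(n-1)/2 key comparisons, so doubling n roughly quadruples the count:

```python
def insertion_sort_comparisons(a):
    """Sort a copy of a and return the number of key comparisons made."""
    a = list(a)
    count = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            count += 1            # one key comparison
            if a[j] <= key:
                break
            a[j + 1] = a[j]       # shift larger element right
            j -= 1
        a[j + 1] = key
    return count

# Worst case (reverse-sorted input): exactly n(n-1)/2 comparisons.
for n in (100, 200):
    assert insertion_sort_comparisons(range(n, 0, -1)) == n * (n - 1) // 2
```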

Research on efficiency analysis [3, 2] uses big-O, omega,
and theta notation to give asymptotic upper, lower, and
tight bounds on the time and space complexity of sorting
algorithms. To determine the time complexity of sorting
algorithms, such analyses take a list of functions and
order them according to asymptotic growth.
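Ordering functions by asymptotic growth can be approximated numerically. The sketch below, with an invented set of common complexity functions, sorts them by their value at one large n; this is only a rough proxy for true asymptotic ordering, not a proof:

```python
import math

# Common complexity functions, deliberately listed out of order.
functions = {
    "n^2":     lambda n: n * n,
    "log n":   lambda n: math.log(n),
    "n log n": lambda n: n * math.log(n),
    "n":       lambda n: n,
}

# Order them by value at a single large n as a proxy for growth rate.
N = 10**6
ordered = sorted(functions, key=lambda name: functions[name](N))
print(ordered)   # slowest-growing first
```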

4. Some Folklore and Convergence into Theorems

4.1 Folklore 1: The running time of a sorting
algorithm grows if the length of the input sequence
grows.

Proof 1: The proof can be obtained from the following
reformulation of the sorting problem. For n records there
are n! possible linear arrangements, only one of which is
the correct, sorted order (n! = 1 x 2 x 3 x ... x [n - 2]
x [n - 1] x n). One may imagine these n! arrangements as
the leaves of a binary decision tree. The process of
sorting then involves starting at the root of the tree (the
unordered initial list) and traveling down the nodes,
choosing either the left or the right branch based on the
results of comparisons. (In this terminology, the "root" of
a tree is at the top, and all the branches expand
downward.) The maximum number of steps needed corresponds
to the height of the tree, which is log2(n!). By Stirling's
approximation of log n!, for large n this process is
O(n log n).
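The use of Stirling's approximation in Proof 1 can be checked numerically: the ratio log2(n!) / (n log2 n) climbs toward 1 as n grows. A small Python sketch:

```python
import math

def log2_factorial(n):
    """log2(n!) computed exactly as a sum, avoiding huge intermediates."""
    return sum(math.log2(k) for k in range(2, n + 1))

# The decision tree for sorting n records has n! leaves, so its height
# is at least log2(n!); Stirling gives log2(n!) ~ n log2 n for large n.
for n in (10, 100, 1000):
    ratio = log2_factorial(n) / (n * math.log2(n))
    print(n, round(ratio, 3))   # ratio rises toward 1 as n grows
```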
