These are part of my notes for the Coursera course: Algorithms, Pt I (Princeton).
## Insertion Sort
#### Concept
- iterate through items
- with each iteration, a new item is 'discovered'
- compare discovered item with item on its left
- if discovered item is smaller, swap the two
- continue to swap until a smaller item is found on the left or until the discovered item is at the beginning
#### Implementation
- iteration is done through the elements (`for` loop) - index `i` keeps track of the global iteration
- when a new item is discovered (on every iteration), it must be compared to the previously sorted items (second `for` loop) - index `j` keeps track of the iteration through the previously sorted items
- when an item smaller than the discovered one is found, the discovered item is inserted above it and the `j`-loop is broken
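The two loops above can be sketched as follows - a minimal version on an `int` array (the course's actual version sorts generic `Comparable` items via `less()` and `exch()` helpers, omitted here):

```java
import java.util.Arrays;

public class Insertion {
    // Sort a[] in ascending order using insertion sort.
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {    // i discovers a new item
            for (int j = i; j > 0; j--) {       // j walks it back to the left
                if (a[j] < a[j - 1]) {
                    int tmp = a[j]; a[j] = a[j - 1]; a[j - 1] = tmp;
                } else {
                    break;                      // smaller item found on the left
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 1, 3};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5]
    }
}
```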
#### Analysis
Best Case - Already Sorted
If the array is already sorted, each item is compared once but never exchanged:
N-1 compares, 0 exchanges
Much better than selection sort.
Worst Case - Reverse Sorted
If the array is sorted in reverse order, there will be:
~1/2 * N^2 compares + ~1/2 * N^2 exchanges
So worse than selection sort, due to the number of exchanges.
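Both counts can be confirmed with a small instrumented sketch (hypothetical `InsertionCounts` class, not from the course); for N = 5, the sorted input takes N - 1 = 4 compares, and the reverse-sorted input takes N(N-1)/2 = 10 compares and 10 exchanges:

```java
public class InsertionCounts {
    static int compares, exchanges;

    // Insertion sort that tallies every compare and exchange.
    static void sort(int[] a) {
        compares = 0; exchanges = 0;
        for (int i = 1; i < a.length; i++) {
            for (int j = i; j > 0; j--) {
                compares++;
                if (a[j] < a[j - 1]) {
                    int tmp = a[j]; a[j] = a[j - 1]; a[j - 1] = tmp;
                    exchanges++;
                } else {
                    break;
                }
            }
        }
    }

    public static void main(String[] args) {
        sort(new int[]{1, 2, 3, 4, 5});   // best case: already sorted
        System.out.println(compares + " compares, " + exchanges + " exchanges"); // 4 compares, 0 exchanges
        sort(new int[]{5, 4, 3, 2, 1});   // worst case: reverse sorted
        System.out.println(compares + " compares, " + exchanges + " exchanges"); // 10 compares, 10 exchanges
    }
}
```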
Intermediate Case - Partially Sorted
An array is partially sorted if the number of inversions (pairs of keys that are out of order) is smaller than c * N.
The number of exchanges is the same as the number of inversions:
N-1 compares + c * N exchanges
= (c + 1) * N - 1
~ (c + 1) * N
So partially sorted arrays get sorted in linear time. This is the main case for which Insertion Sorts are attractive.
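The exchanges-equal-inversions fact can be checked directly with a small sketch (hypothetical helper names, not course code) that counts both on a nearly sorted array:

```java
public class Inversions {
    // Count pairs (i, j) with i < j and a[i] > a[j], by brute force.
    static int inversions(int[] a) {
        int count = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] > a[j]) count++;
        return count;
    }

    // Insertion sort that returns the number of exchanges performed.
    static int sortCountingExchanges(int[] a) {
        int exchanges = 0;
        for (int i = 1; i < a.length; i++)
            for (int j = i; j > 0 && a[j] < a[j - 1]; j--) {
                int tmp = a[j]; a[j] = a[j - 1]; a[j - 1] = tmp;
                exchanges++;
            }
        return exchanges;
    }

    public static void main(String[] args) {
        int[] a = {2, 1, 3, 5, 4, 6};   // nearly sorted: inversions (2,1) and (5,4)
        int inv = inversions(a);        // count inversions before sorting mutates a
        int exch = sortCountingExchanges(a);
        System.out.println(inv + " inversions, " + exch + " exchanges"); // 2 inversions, 2 exchanges
    }
}
```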
Average Case - Random Array
Analysis is relatively complicated due to the randomness.
Essentially, random arrays get sorted in time:
~ 1/4 * N^2
So about twice as fast as selection sort.
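For small N this average can be checked exhaustively: averaging exchanges over all permutations of N = 4 items gives exactly N(N-1)/4 = 3, consistent with the ~1/4 * N^2 estimate (a sketch with assumed helper names, enumerating permutations by recursive swapping):

```java
import java.util.ArrayList;
import java.util.List;

public class AverageCase {
    // Insertion sort that returns the number of exchanges performed.
    static int sortCountingExchanges(int[] a) {
        int exchanges = 0;
        for (int i = 1; i < a.length; i++)
            for (int j = i; j > 0 && a[j] < a[j - 1]; j--) {
                int tmp = a[j]; a[j] = a[j - 1]; a[j - 1] = tmp;
                exchanges++;
            }
        return exchanges;
    }

    // Collect all permutations of a[] by recursive swapping.
    static void permute(int[] a, int k, List<int[]> out) {
        if (k == a.length) { out.add(a.clone()); return; }
        for (int i = k; i < a.length; i++) {
            int t = a[k]; a[k] = a[i]; a[i] = t;
            permute(a, k + 1, out);
            t = a[k]; a[k] = a[i]; a[i] = t;   // undo the swap
        }
    }

    public static void main(String[] args) {
        List<int[]> perms = new ArrayList<>();
        permute(new int[]{0, 1, 2, 3}, 0, perms);   // all 24 permutations
        int total = 0;
        for (int[] p : perms) total += sortCountingExchanges(p);
        System.out.println((double) total / perms.size()); // 3.0 = 4 * 3 / 4
    }
}
```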