# COUNTING SORT
The Ω(n log n) lower bound applies only to sorting algorithms that compare array elements; it does not apply to algorithms that use some other information about the elements. An example of such an algorithm is counting sort, which sorts an array in O(n) time, assuming that every element in the array is an integer between 0 and c, where c = O(n).

The algorithm creates a bookkeeping array whose indices are the elements of the original array. It iterates through the original array and counts how many times each element appears.
## Logic
For example, the array

```
1 3 6 9 9 3 5 9
```

corresponds to the following bookkeeping array (counts shown above their indices):

```
count: 1 0 2 0 1 1 0 0 3
index: 1 2 3 4 5 6 7 8 9
```

For example, the value at position 3 in the bookkeeping array is 2, because the element 3 appears 2 times in the original array.
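
A minimal sketch of this counting step in Python (illustrative only; the repository's own implementation may differ), reproducing the bookkeeping array for the example above:

```python
original = [1, 3, 6, 9, 9, 3, 5, 9]

# Bookkeeping array indexed 0..max(original); index 0 is simply unused here.
counts = [0] * (max(original) + 1)
for value in original:
    counts[value] += 1

print(counts[1:])  # [1, 0, 2, 0, 1, 1, 0, 0, 3]
print(counts[3])   # 2 -> the element 3 appears twice
```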
Construction of the bookkeeping array takes O(n) time. After this, the sorted array can be created in O(n) time, because the number of occurrences of each element can be retrieved from the bookkeeping array. Thus, the total time complexity of counting sort is O(n).

Counting sort is a very efficient algorithm, but it can only be used when the constant c is small enough that the array elements can be used as indices in the bookkeeping array.
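
For illustration, here is a minimal complete counting sort in Python, assuming non-negative integer inputs with a small maximum value c; the function name `counting_sort` and its details are a sketch, not necessarily how this repository implements it:

```python
def counting_sort(arr):
    """Sort a list of non-negative integers in O(n + c) time,
    where c is the largest value in the list."""
    if not arr:
        return []
    c = max(arr)
    # Bookkeeping array: counts[v] = number of times v appears in arr.
    counts = [0] * (c + 1)
    for value in arr:
        counts[value] += 1
    # Rebuild the sorted array from the counts.
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)
    return result


print(counting_sort([1, 3, 6, 9, 9, 3, 5, 9]))  # [1, 3, 3, 5, 6, 9, 9, 9]
```

This simple variant only works for plain integers; sorting records by an integer key requires the stable variant that turns the counts into prefix sums before placing elements.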

## Time Complexity

### Best Case:
The best case occurs when all elements lie in the same small range, that is, when k is equal to 1. In this case, counting the occurrence of each element in the input range takes constant time, and finding the correct index of each element in the sorted output array takes O(n) time, so the total time complexity reduces to O(1 + n), i.e. O(n), which is linear.

- n is the number of elements
- k is the range of the elements (k = largest element - smallest element)

### Worst Case:
The worst case occurs when the data is skewed, that is, when the largest element is significantly larger than the other elements. This increases the range k.

Since the time complexity of the algorithm is O(n + k), if k is of the order O(n^2) the time complexity becomes O(n + n^2), which reduces to O(n^2); if k is of the order O(n^3), it becomes O(n + n^3), which reduces to O(n^3). Hence, for such large values of k, the running time is dominated by the O(k) term, and it keeps getting worse as k grows.

Thus, the worst case for counting sort occurs when the range k of the elements is significantly larger than the number of elements n.
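
To make this concrete, here is a toy example in Python (the values are chosen arbitrarily) where a single large element blows up the bookkeeping array:

```python
# Only three elements, but the largest value forces a bookkeeping array
# with about 10 million entries, so the O(n + k) cost is dominated by k.
skewed = [1, 2, 10_000_000]
counts = [0] * (max(skewed) + 1)
print(len(skewed), len(counts))  # 3 10000001
```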

## Space Complexity

In the above algorithm we use an auxiliary array C of size k, where k is the maximum element of the given array. Therefore the space complexity of the counting sort algorithm is O(k).

Space Complexity: O(k)

The larger the range of elements in the given array, the larger the space complexity; counting sort is therefore a poor choice when the range of integers is very large, because an auxiliary array of that size has to be allocated.
