# Binary Search
Binary search is a technique for efficiently locating an element in a sorted list. Searching for an element naively by checking every element takes **O(n)** time, but binary search speeds this up to **O(log n)**. Binary search is a great tool to keep in mind for array problems.
Algorithm
------------------
In binary search, you are given a sorted list of numbers and a key. The desired output is the index of the key in the sorted list if the key is present, or `None` otherwise.
Binary search is a recursive algorithm. At a high level, we examine the middle element of the list, whose value determines whether to terminate the algorithm (middle element equals the key), recursively search the left half of the list (middle element greater than the key), or recursively search the right half of the list (middle element less than the key).
```
def binary_search(nums, key):
    if nums is empty:
        return None
    if middle element is equal to key:
        return middle index
    if middle element is greater than key:
        binary search left half of nums
    if middle element is less than key:
        binary search right half of nums
```

There are two canonical ways of implementing binary search: recursive and iterative. Both solutions utilize two pointers that keep track of the portion of the list we are searching.
### Recursive Binary Search
The recursive approach uses a helper function to keep track of pointers to the section of the list we are currently examining. The search terminates either when we find the key or when the two pointers meet.
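
As a minimal sketch of this approach (the helper name and the half-open `[left_idx, right_idx)` convention here are illustrative choices):

```python
def binary_search_helper(nums, key, left_idx, right_idx):
    # Search the half-open range [left_idx, right_idx).
    if left_idx >= right_idx:
        return None
    middle_idx = (left_idx + right_idx) // 2
    if nums[middle_idx] == key:
        return middle_idx
    if nums[middle_idx] > key:
        # The key, if present, must be in the left half.
        return binary_search_helper(nums, key, left_idx, middle_idx)
    # Otherwise it must be in the right half.
    return binary_search_helper(nums, key, middle_idx + 1, right_idx)

def binary_search(nums, key):
    return binary_search_helper(nums, key, 0, len(nums))
```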
### Iterative Binary Search
The iterative approach manually keeps track of the section of the list we are examining using the two-pointer technique. The search terminates either when we find the key or when the two pointers meet.

```python
def binary_search(nums, key):
    left_idx, right_idx = 0, len(nums)
    while left_idx < right_idx:
        middle_idx = (left_idx + right_idx) // 2
        if nums[middle_idx] == key:
            return middle_idx
        if nums[middle_idx] > key:
            right_idx = middle_idx
        else:
            left_idx = middle_idx + 1
    return None
```
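
For example, with the implementation above, `binary_search([1, 3, 5, 7], 5)` returns `2`, while `binary_search([1, 3, 5, 7], 4)` returns `None`.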
## Runtime and Space Complexity
Binary search has **O(log n)** time complexity because each iteration decreases the size of the list by a factor of 2. Its space complexity is constant because we only need to maintain two pointers. Even the recursive solution has constant space with [tail call optimization](https://en.wikipedia.org/wiki/Tail_call).
# Sorting

Sorting is a fundamental tool for tackling problems and is often used to simplify them.
There are many different sorting algorithms, each with its own tradeoffs. In this guide, we will cover several well-known sorting algorithms along with when they are useful.
We will describe merge sort and quick sort in detail, and the remaining featured sorting algorithms at a high level.
## Terminology
Two commonly used terms in sorting are:
1. **in-place sort**: sorts the list using only O(1) extra space
2. **stable sort**: retains the order of duplicate elements after the sort ([3, <u>2</u>, 4, **2**] -> [<u>2</u>, **2**, 3, 4])
## Merge sort
**Merge sort** is perhaps the simplest sort to implement and has very consistent behavior. It adopts a divide-and-conquer strategy: recursively sort each half of the list, and then perform an O(n) merging operation to create a fully sorted list.
### Implementation
The key operation in merge sort is `merge`, a function that takes two sorted lists and returns a single sorted list containing the elements of both.

```python
def merge(list1, list2):
    if len(list1) == 0:
        return list2
    if len(list2) == 0:
        return list1
    # Take from list1 on ties to keep the merge stable.
    if list1[0] <= list2[0]:
        return [list1[0]] + merge(list1[1:], list2)
    return [list2[0]] + merge(list1, list2[1:])
```

This is a recursive implementation of `merge`, but an iterative implementation would work just as well.
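
For example, `merge([1, 4], [2, 3])` returns `[1, 2, 3, 4]`.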
Given this `merge` operation, writing merge sort is quite simple.
```python
def merge_sort(nums):
    if len(nums) <= 1:
        return nums
    middle_idx = len(nums) // 2
    left_half = merge_sort(nums[:middle_idx])
    right_half = merge_sort(nums[middle_idx:])
    return merge(left_half, right_half)
```
### Runtime
Merge sort is a recursive, divide-and-conquer algorithm. There are O(log n) levels of recursive splitting, and each level requires O(n) of merging work, so we have a final runtime of O(n log n) for merge sort. Its behavior is consistent regardless of the input list (its worst case and best case take the same amount of time).
**Summary**

| Worst case | Best case | Stable | In-place |
|:----------:|:---------:|:------:|:--------:|
| O(n log n) | O(n log n) | ✅ | ❌ |
## Quick sort
**Quick sort** is also a divide and conquer strategy, but uses a two-pointer swapping technique instead of `merge`. The core idea of quick sort is selecting a "pivot" element in the list (typically the middle element), and swapping elements in the list such that everything left of the pivot is less than it, and everything right of the pivot is greater. We call this operation `partition`. Quick sort is notable for its ability to sort efficiently in-place.
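
The `partition` operation can be written in several ways; here is a minimal sketch of one variant (a Lomuto-style pass that first parks the middle-element pivot at the end; the names and this particular variant are illustrative assumptions):

```python
def partition(nums, left_idx, right_idx):
    # Choose the middle element as the pivot and move it out of the way.
    middle_idx = (left_idx + right_idx) // 2
    nums[middle_idx], nums[right_idx] = nums[right_idx], nums[middle_idx]
    pivot = nums[right_idx]
    store_idx = left_idx
    # Sweep once, swapping anything smaller than the pivot to the front.
    for i in range(left_idx, right_idx):
        if nums[i] < pivot:
            nums[i], nums[store_idx] = nums[store_idx], nums[i]
            store_idx += 1
    # Place the pivot between the two partitions and return its index.
    nums[store_idx], nums[right_idx] = nums[right_idx], nums[store_idx]
    return store_idx
```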
The partition function modifies `nums` in-place and requires no extra memory. It also takes O(n) time in the worst case to fully partition a list.

```python
def quick_sort_helper(nums, left_idx, right_idx):
    if left_idx >= right_idx:
        return
    # Partition around a pivot, then sort each side.
    pivot_idx = partition(nums, left_idx, right_idx)
    quick_sort_helper(nums, left_idx, pivot_idx - 1)
    quick_sort_helper(nums, pivot_idx + 1, right_idx)

def quick_sort(nums):
    quick_sort_helper(nums, 0, len(nums) - 1)
```
### Runtime
The best case performance of quick sort is O(n log n), but depending on the structure of the list, quick sort's performance can vary.
If the pivot happens to be the median of the list, then the list will be divided in half after the partition.
In the worst case, however, the list will be divided into a list of length n - 1 and an empty list. Thus, in the worst possible case, quick sort has O(n<sup>2</sup>) performance, since we'll have to recursively quick sort lists of length n - 1, n - 2, and so on. However, on average and in practice, quick sort is still very fast, due to how fast swapping array elements is.
The space complexity of this version of quick sort is O(log n), due to the depth of the recursive call stack, but an iterative version can bring the space complexity down to O(1).
**Summary**

| Worst case | Best case | Stable | In-place |
|:----------:|:---------:|:------:|:--------:|
| O(n<sup>2</sup>) | O(n log n) | ❌ | ✅ |
## Insertion sort
In **insertion sort**, we incrementally build a sorted list from the unsorted list. We take elements from the unsorted list and insert them into the sorted list, making sure to maintain the order.
This algorithm takes O(n<sup>2</sup>) time in the worst case, because looping through the unsorted list takes O(n) and finding the proper place to insert can also take O(n). However, if the list is already sorted, insertion sort takes O(n) time, since each insertion will be O(1). Insertion sort can be done in-place, so it takes O(1) space.
Insertion sort is easier on linked lists, which have O(1) insertion, whereas arrays have O(n) insertion because inserting an element into an array requires shifting all the elements after it.
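
A minimal in-place sketch of the array version (the names are illustrative):

```python
def insertion_sort(nums):
    for i in range(1, len(nums)):
        current = nums[i]
        j = i - 1
        # Shift larger elements one slot right to make room for current.
        while j >= 0 and nums[j] > current:
            nums[j + 1] = nums[j]
            j -= 1
        nums[j + 1] = current
    return nums
```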
**Summary**

| Worst case | Best case | Stable | In-place |
|:----------:|:---------:|:------:|:--------:|
| O(n<sup>2</sup>) | O(n) | ✅ | ✅ |
## Selection sort
**Selection sort** incrementally builds a sorted list by finding the minimum value in the rest of the list, and swapping it to be in the front.
It takes O(n<sup>2</sup>) time in general, because we have to loop through the unsorted list, which is O(n), and in each iteration we search the rest of the list, which always takes O(n). Selection sort can be done in-place, so it takes O(1) space.
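
A minimal in-place sketch (the names are illustrative):

```python
def selection_sort(nums):
    for i in range(len(nums)):
        # Find the minimum of the unsorted remainder...
        min_idx = i
        for j in range(i + 1, len(nums)):
            if nums[j] < nums[min_idx]:
                min_idx = j
        # ...and swap it to the front of the unsorted section.
        nums[i], nums[min_idx] = nums[min_idx], nums[i]
    return nums
```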
**Summary**

| Worst case | Best case | Stable | In-place |
|:----------:|:---------:|:------:|:--------:|
| O(n<sup>2</sup>) | O(n<sup>2</sup>) | ❌ | ✅ |
## Radix sort
**Radix sort** is a situational sorting algorithm that applies when you know the numbers you are sorting are bounded in some way. It operates by grouping the numbers in the list by digit, looping through the digit places in some order.
For example, if we had the list `[100, 10, 1]`, radix sort would put 100 in the group that has 1 in the 100s digit place, and would put 10 and 1 in the group that has 0 in the 100s digit place. It would then sort by the 10s digit place, and finally the 1s digit place.
Radix sort thus needs one pass for each digit place it is sorting and takes O(kn) time, where k is the number of passes necessary to cover all digit places.
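
As a minimal sketch, here is a least-significant-digit variant for non-negative integers (both the variant and the non-negativity restriction are assumptions made for illustration):

```python
def radix_sort(nums):
    # Sketch assumes non-negative integers.
    if not nums:
        return nums
    place = 1
    while place <= max(nums):
        # One stable bucketing pass per digit place, least significant first.
        buckets = [[] for _ in range(10)]
        for num in nums:
            buckets[(num // place) % 10].append(num)
        nums = [num for bucket in buckets for num in bucket]
        place *= 10
    return nums
```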

**Summary**

| Worst case | Best case | Stable | In-place |
|:----------:|:---------:|:------:|:--------:|
| O(kn) | O(kn) | ✅ (if going through digits from right to left) | ❌ |
## Summary

| Sort | Worst case | Best case | Stable | In-place |
|:----:|:----------:|:---------:|:------:|:--------:|
| Merge sort | O(n log n) | O(n log n) | ✅ | ❌ |
| Quick sort | O(n<sup>2</sup>) | O(n log n) | ❌ | ✅ |
| Insertion sort | O(n<sup>2</sup>) | O(n) | ✅ | ✅ |
| Selection sort | O(n<sup>2</sup>) | O(n<sup>2</sup>) | ❌ | ✅ |
| Radix sort | O(kn) | O(kn) | ✅ | ❌ |
# Sorting Colors
The problem: given a list containing only the values 0, 1, and 2 (each value representing a color), sort the list.
## Approach #1: Merge or quick sort
### Approach
The problem is asking us to sort a list of integers, so we can use an algorithm like merge sort or quick sort.
### Time and space complexity
Merge sort sorts the list in O(n log n) time in the worst case; quick sort averages O(n log n) but degrades to O(n<sup>2</sup>) in the worst case. Sorting in-place with quick sort uses O(log n) extra space for recursion, while merge sort typically requires O(n) auxiliary space.
## Approach #2: Counting sort
### Approach
We know that the numbers we are sorting are 0, 1, or 2. This means we can sort more efficiently by simply counting the number of times each of the three values occurs and modifying the list in-place to match the counts in sorted order.
### Implementation

```python
from collections import defaultdict

def sort_colors(colors):
    # Count occurrences of each color.
    counts = defaultdict(int)
    for color in colors:
        counts[color] += 1
    # Overwrite the list in sorted order: all 0s, then 1s, then 2s.
    idx = 0
    for color in range(3):
        for _ in range(counts[color]):
            colors[idx] = color
            idx += 1
```
### Time and space complexity
This solution has complexity O(n), since we loop through the list once to count the values and then loop through the constant-size counts dictionary to rewrite the list. It takes O(1) space, since everything is done in-place and the dictionary has a constant size.
## Approach #3: Three-way partition
This approach uses multiple pointers. Reading the [two pointer guide](https://guides.codepath.com/compsci/Two-pointer) may be helpful.
### Approach
Although we cannot asymptotically do better than O(n), since we need to pass through the list at least once, we can limit our code to only making one pass. This will be slightly faster than approach #2.
We can accomplish this by recognizing that sorting an array with three distinct values is equivalent to a partition operation. Recall that in quick sort, we partition an array so that all elements less than the pivot end up on its left and all elements greater than the pivot end up on its right. Since we only have three potential values in our list, partitioning using the middle value as the pivot will effectively sort the list.
This particular partition is a bit tricky, though, because we're partitioning on the middle value (the 1s) of our list. It's called a three-way partition, since we also group the elements equal to the pivot (the 1s) together in the middle.
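
A minimal sketch of the single pass (this is the classic Dutch national flag partition; the pointer names are illustrative):

```python
def sort_colors(colors):
    # low: next slot for a 0; high: next slot for a 2; mid: current element.
    low, mid, high = 0, 0, len(colors) - 1
    while mid <= high:
        if colors[mid] == 0:
            colors[low], colors[mid] = colors[mid], colors[low]
            low += 1
            mid += 1
        elif colors[mid] == 1:
            mid += 1
        else:  # colors[mid] == 2
            colors[mid], colors[high] = colors[high], colors[mid]
            high -= 1
```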