Commit 32ce4bc

Make small edits to strings and arrays guides

1 parent 0cc2007
5 files changed: +100 -235 lines changed

Diff for: strings_arrays/binary_search.md (+8 -8)

````diff
@@ -1,10 +1,10 @@
-Binary search is a method for locating an element in a sorted list efficiently. Searching for an element can done naively in **O(N)** time, but binary search speeds it up to **O(log N)**. Binary search is a great tool to keep in mind for array problems.
+Binary search is a technique for efficiently locating an element in a sorted list. Searching for an element can be done naively in **O(n)** time by checking every element in the list, but binary search speeds it up to **O(log n)**. Binary search is a great tool to keep in mind for array problems.
 
 Algorithm
 ------------------
-In binary search, you are provided a list of sorted numbers and a key. The desired output is the index of the key, if it exists and None if it doesn't.
+In binary search, you are provided a sorted list of numbers and a key. The desired output of a binary search is the index of the key in the sorted list, if the key is in the list, or `None` otherwise.
 
-Binary search is a recursive algorithm. The high level approach is that we examine the middle element of the list. The value of the middle element determines whether to terminate the algorithm (found the key), recursively search the left half of the list, or recursively search the right half of the list.
+Binary search is a recursive algorithm. From a high-level perspective, we examine the middle element of the list, which determines whether to terminate the algorithm (found the key), recursively search the left half of the list (middle element value > key), or recursively search the right half of the list (middle element value < key).
 ```
 def binary_search(nums, key):
     if nums is empty:
@@ -13,20 +13,20 @@ def binary_search(nums, key):
         return middle index
     if middle element is greater than key:
         binary search left half of nums
-    if middle element is less than
+    if middle element is less than key:
         binary search right half of nums
 ```
 
 There are two canonical ways of implementing binary search: recursive and iterative. Both solutions utilize two pointers that keep track of the portion of the list we are searching.
 
 ### Recursive Binary Search
 
-The recursive solution utilizes a helper function to keep track of pointers to the section of the list we are currently examining. The search either completes when we find the key, or the two pointers meet.
+The recursive approach utilizes a helper function to keep track of pointers to the section of the list we are currently examining. The search either terminates when we find the key or when the two pointers meet.
 
 ```python
 def binary_search(nums, key):
     return binary_search_helper(nums, key, 0, len(nums))
-
+
 def binary_search_helper(nums, key, start_idx, end_idx):
     middle_idx = (start_idx + end_idx) // 2
     if start_idx == end_idx:
@@ -41,7 +41,7 @@ def binary_search_helper(nums, key, start_idx, end_idx):
 
 ### Iterative Binary Search
 
-The iterative solution manually keeps track of the section of the list we are examining, using the two-pointer technique. The search either completes when we find the key, or the two pointers meet.
+The iterative approach manually keeps track of the section of the list we are examining using the two-pointer technique. The search either terminates when we find the key or when the two pointers meet.
 ```python
 def binary_search(nums, key):
     left_idx, right_idx = 0, len(nums)
@@ -58,7 +58,7 @@ def binary_search(nums, key):
 
 ## Runtime and Space Complexity
 
-Binary search completes in **O(log N)** time because each iteration decreases the size of the list by a factor of 2. Its space complexity is constant because we only need to maintain two pointers to locations in the list. Even the recursive solution has constant space with [tail call optimization](https://en.wikipedia.org/wiki/Tail_call).
+Binary search has **O(log n)** time complexity because each iteration decreases the size of the list by a factor of 2. Its space complexity is constant because we only need to maintain two pointers. Even the recursive solution has constant space with [tail call optimization](https://en.wikipedia.org/wiki/Tail_call).
 
 ## Example problems
 * [Search insert position](https://leetcode.com/problems/search-insert-position/description/)
````
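
For reference, here is a runnable version of the two implementations the diff shows only in fragments. This is a sketch based on the guide's pseudocode; the half-open `[start_idx, end_idx)` bounds and the `_recursive`/`_iterative` function names are assumptions for illustration, not the repo's exact code.

```python
# Sketch of the guide's two binary search variants (assumptions noted above).

def binary_search_recursive(nums, key):
    return binary_search_helper(nums, key, 0, len(nums))

def binary_search_helper(nums, key, start_idx, end_idx):
    if start_idx == end_idx:              # empty range: key is not present
        return None
    middle_idx = (start_idx + end_idx) // 2
    if nums[middle_idx] == key:
        return middle_idx
    elif nums[middle_idx] > key:          # key can only be in the left half
        return binary_search_helper(nums, key, start_idx, middle_idx)
    else:                                 # key can only be in the right half
        return binary_search_helper(nums, key, middle_idx + 1, end_idx)

def binary_search_iterative(nums, key):
    left_idx, right_idx = 0, len(nums)
    while left_idx < right_idx:
        middle_idx = (left_idx + right_idx) // 2
        if nums[middle_idx] == key:
            return middle_idx
        elif nums[middle_idx] > key:
            right_idx = middle_idx        # shrink to the left half
        else:
            left_idx = middle_idx + 1     # shrink to the right half
    return None

print(binary_search_recursive([1, 3, 5, 7, 9], 7))  # 3
print(binary_search_iterative([1, 3, 5, 7, 9], 4))  # None
```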

Diff for: strings_arrays/sorting.md (+41 -37)

````diff
@@ -1,8 +1,8 @@
 Sorting is a fundamental tool for tackling problems, and is often utilized to help simplify them.
 
-There are several different sorting algorithms, each with different tradeoffs. In this guide, we will cover several well-known sorting algorithms along with when they are useful. 
+There are several different sorting algorithms, each with different tradeoffs. In this guide, we will cover several well-known sorting algorithms along with when they are useful.
 
-We will go into detail for merge sort and quick sort, but will describe the rest at a high level.
+We will describe merge sort and quick sort in detail and the remainder of the featured sorting algorithms at a high level.
 
 ## Terminology
 Two commonly used terms in sorting are:
@@ -11,11 +11,11 @@ Two commonly used terms in sorting are:
 2. **stable sort**: retains the order of duplicate elements after the sort ([3, <u>2</u>, 4, **2**] -> [<u>2</u>, **2**, 3, 4])
 
 ## Merge sort
-**Merge sort** is perhaps the simplest sort to implement and has very consistent behavior. It adopts a divide-and-conquer strategy: recursively sort each half of the list, and then perform an O(N) merging operation to create a fully sorted list.
+**Merge sort** is perhaps the simplest sort to implement and has very consistent behavior. It adopts a divide-and-conquer strategy: recursively sort each half of the list, and then perform an O(n) merging operation to create a fully sorted list.
 
 ### Implementation
 
-The key operation in merge sort is `merge`, which is a function that takes two sorted lists and returns a single list which is sorted.
+The key operation in merge sort is `merge`, which takes two sorted lists and returns a single sorted list composed of the elements of both.
 ```python
 def merge(list1, list2):
     if len(list1) == 0:
@@ -31,7 +31,6 @@ This is a recursive implementation of `merge`, but an iterative implementation w
 
 Given this `merge` operation, writing merge sort is quite simple.
 
-
 ```python
 def merge_sort(nums):
     if len(nums) <= 1:
@@ -43,18 +42,17 @@ def merge_sort(nums):
 ```
 
 ### Runtime
-
-Merge sort is a recursive, divide and conquer algorithm. It takes O(log N) recursive merge sorts and each merge is O(N) time, so we have a final runtime of O(N log N) for merge sort. Its behavior is consistent regardless of the input list (its worst case and best case take the same amount of time).
+Merge sort is a recursive, divide-and-conquer algorithm. It performs O(log n) levels of recursive merge sorts, and the merging at each level takes O(n) time, so we have a final runtime of O(n log n). Its behavior is consistent regardless of the input list (its worst case and best case take the same amount of time).
 
 **Summary**
-* Worst case: O(N log N)
-* Best case: O(N log N)
-* Stable: yes
-* In-place: no
+
+| Worst case | Best case | Stable | In-place |
+|:----------:|:---------:|:------:|:--------:|
+| O(n log n) | O(n log n) | ✅ | ❌ |
 
 ## Quick sort
 
-**Quick sort** is also a divide and conquer strategy, but uses a two-pointer swapping technique instead of `merge`. The core idea of quick sort is to select a "pivot" element in the list (typically the middle element), and swap elements in the list such that everything left of the pivot is less than it, and everything right of the pivot is greater. We call this operation `partition`. Quick sort is notable in its ability to sort efficiently in-place.
+**Quick sort** is also a divide-and-conquer strategy, but uses a two-pointer swapping technique instead of `merge`. The core idea of quick sort is to select a "pivot" element in the list (typically the middle element) and swap elements in the list such that everything left of the pivot is less than it, and everything right of the pivot is greater. We call this operation `partition`. Quick sort is notable for its ability to sort efficiently in-place.
 
 ```python
 def partition(nums, left_idx, right_idx):
@@ -70,7 +68,7 @@ def partition(nums, left_idx, right_idx):
             left_idx += 1
             right_idx -= 1
 ```
-The partition function modifies `nums` inplace and takes up no extra memory. It also takes O(N) time in the worst case to fully partition a list.
+The partition function modifies `nums` in-place and requires no extra memory. It also takes O(n) time in the worst case to fully partition a list.
 
 ```python
 def quick_sort_helper(nums, left_idx, right_idx):
@@ -88,56 +86,62 @@ def quick_sort(nums):
 
 ### Runtime
 
-The best case performance of quick sort is O(N log N), but depending on the structure of the list, quick sort's performance can vary.
+The best case performance of quick sort is O(n log n), but depending on the structure of the list, quick sort's performance can vary.
 
-If the pivot happens to be the median of the list, then the list will be divided in half after the partition. 
+If the pivot happens to be the median of the list, then the list will be divided in half after the partition.
 
 In the worst case, however, the list will be divided into an N - 1 length list and an empty list. Thus, in the worst possible case, quick sort has O(N<sup>2</sup>) performance, since we'll have to recursively quicksort (N - 1), (N - 2), ... many lists. However, on average and in practice, quick sort is still very fast due to how fast swapping array elements is.
 
 The space complexity for this version of quick sort is O(log N), due to the number of call stacks created during recursion, but an iterative version can make space complexity O(1).
 
 **Summary**
-* Worst case: O(N<sup>2</sup>)
-* Best case: O(N log N)
-* Stable: no
-* In-place: yes
+
+| Worst case | Best case | Stable | In-place |
+|:----------:|:---------:|:------:|:--------:|
+| O(n<sup>2</sup>) | O(n log n) | ❌ | ✅ |
 
 ## Insertion sort
 
 In **insertion sort**, we incrementally build a sorted list from the unsorted list. We take elements from the unsorted list and insert them into the sorted list, making sure to maintain the order.
 
-This algorithm takes O(N<sup>2</sup>) worst time, because looping through the unsorted list takes O(N) and finding the proper place to insert can take O(N) time in the worst case. However, if the list is already sorted, insertion sort takes O(N) time, since insertion time will be O(1). Insertion sort can be done in-place, so it takes up O(1) space.
+This algorithm takes O(n<sup>2</sup>) time in the worst case, because looping through the unsorted list takes O(n) and finding the proper place to insert can take O(n) time. However, if the list is already sorted, insertion sort takes O(n) time, since each insertion will be O(1). Insertion sort can be done in-place, so it takes up O(1) space.
 
-Insertion sort is easier on linked lists, which have O(1) insertion whereas arrays have O(N) insertion because in an array, inserting an element requires shifting all the elements behind that element.
+Insertion sort is easier on linked lists, which have O(1) insertion, whereas arrays have O(n) insertion because inserting an element into an array requires shifting all the elements behind it.
 
 **Summary**
-* Worst case: O(N^2^)
-* Best case: O(N)
-* Stable: yes
-* In-place: yes
+
+| Worst case | Best case | Stable | In-place |
+|:----------:|:---------:|:------:|:--------:|
+| O(n<sup>2</sup>) | O(n) | ✅ | ✅ |
 
 ## Selection sort
 
 **Selection sort** incrementally builds a sorted list by finding the minimum value in the rest of the list, and swapping it to the front.
 
-It takes O(N<sup>2</sup>) time in general, because we have to loop through the unsorted list which is O(N) and in each iteration, we search the rest of the list which always takes O(N). Selection sort can be done in-place, so it takes up O(1) space.
+It takes O(n<sup>2</sup>) time in general, because we have to loop through the unsorted list, which is O(n), and in each iteration we search the rest of the list, which always takes O(n). Selection sort can be done in-place, so it takes up O(1) space.
 
-**Summary**
-* Worst case: O(N<sup>2</sup>)
-* Best case: O(N<sup>2</sup>)
-* Stable: no
-* In-place: yes
+| Worst case | Best case | Stable | In-place |
+|:----------:|:---------:|:------:|:--------:|
+| O(n<sup>2</sup>) | O(n<sup>2</sup>) | ❌ | ✅ |
 
 ## Radix sort
 
 **Radix sort** is a situational sorting algorithm for when you know that the numbers you are sorting are bounded in some way. It operates by grouping numbers in the list by digit, looping through the digits in some order.
 
-For example, if we had the list [100, 10, 1], radix sort would put 100 in the group which had 1 in the 100s digit place and would put (10, 1) in a group which had 0 in the 100s digit place. It would then sort by the 10s digit place, and finally the 1s digit place.
+For example, if we had the list `[100, 10, 1]`, radix sort would put 100 in the group which had 1 in the 100s digit place and would put (10, 1) in a group which had 0 in the 100s digit place. It would then sort by the 10s digit place, and finally the 1s digit place.
 
 Radix sort thus needs one pass for each digit place it is sorting and takes O(KN) time, where K is the number of passes necessary to cover all digits.
 
-**Summary**
-* Worst case: O(KN)
-* Best case: O(KN)
-* Stable: yes (if going through digits from right to left)
-* In-place: no
+| Worst case | Best case | Stable | In-place |
+|:----------:|:---------:|:------:|:--------:|
+| O(kn) | O(kn) | ✅ (if going through digits from right to left) | ❌ |
+
+## Summary
+
+| Sort | Worst case | Best case | Stable | In-place |
+|:----:|:----------:|:---------:|:------:|:--------:|
+| Merge sort | O(n log n) | O(n log n) | ✅ | ❌ |
+| Quick sort | O(n<sup>2</sup>) | O(n log n) | ❌ | ✅ |
+| Insertion sort | O(n<sup>2</sup>) | O(n) | ✅ | ✅ |
+| Selection sort | O(n<sup>2</sup>) | O(n<sup>2</sup>) | ❌ | ✅ |
+| Radix sort | O(kn) | O(kn) | ✅ (if going through digits from right to left) | ❌ |
````
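
Since the diff shows `merge`, `merge_sort`, and `partition` only in fragments, here is one runnable way the two sorts covered in detail could fit together. The iterative `merge` and the Hoare-style index handling in `partition` are assumptions for illustration, not necessarily the repo's code.

```python
# Sketch of merge sort and quick sort as described in the guide.

def merge(list1, list2):
    """Merge two sorted lists into one sorted list."""
    merged = []
    i = j = 0
    while i < len(list1) and j < len(list2):
        if list1[i] <= list2[j]:          # <= keeps equal elements stable
            merged.append(list1[i])
            i += 1
        else:
            merged.append(list2[j])
            j += 1
    merged.extend(list1[i:])              # at most one of these is non-empty
    merged.extend(list2[j:])
    return merged

def merge_sort(nums):
    if len(nums) <= 1:
        return nums
    mid = len(nums) // 2
    return merge(merge_sort(nums[:mid]), merge_sort(nums[mid:]))

def partition(nums, left_idx, right_idx):
    """Swap elements around a middle pivot in-place; return the split index."""
    pivot = nums[(left_idx + right_idx) // 2]
    while left_idx <= right_idx:
        while nums[left_idx] < pivot:
            left_idx += 1
        while nums[right_idx] > pivot:
            right_idx -= 1
        if left_idx <= right_idx:
            nums[left_idx], nums[right_idx] = nums[right_idx], nums[left_idx]
            left_idx += 1
            right_idx -= 1
    return left_idx

def quick_sort_helper(nums, left_idx, right_idx):
    if left_idx < right_idx:
        split_idx = partition(nums, left_idx, right_idx)
        quick_sort_helper(nums, left_idx, split_idx - 1)
        quick_sort_helper(nums, split_idx, right_idx)

def quick_sort(nums):
    quick_sort_helper(nums, 0, len(nums) - 1)

print(merge_sort([3, 2, 4, 2]))  # [2, 2, 3, 4]
nums = [5, 1, 4, 2, 3]
quick_sort(nums)
print(nums)                      # [1, 2, 3, 4, 5]
```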

Diff for: strings_arrays/sorting_colors.md (+17 -19)

````diff
@@ -12,18 +12,17 @@ Example:
 ```
 
 ## Approach #1: Merge or quick sort
+### Approach
+The problem is asking us to sort a list of integers, so we can use an algorithm like merge sort or quick sort.
 
-**Approach**
-The problem is asking us to sort a list of integers, so we could potentially use an algorithm like merge sort or quick sort.
-
-**Time and space complexity**
-With a sorting algorithm such as is O(N log N) in the worst case. The space complexity is O(1) since we sort in place.
+### Time and space complexity
+Merge sort runs in O(n log n) time even in the worst case, while quick sort averages O(n log n). Quick sort can also sort in-place for O(1) extra space, whereas merge sort needs O(n) auxiliary space.
 
 ## Approach #2: Counting sort
-**Approach**
-We know that the numbers we are sorting are 0, 1, or 2. This leads to an efficient counting sort implementation, since we can just count the numbers of each and modify the list in place to match the counts in sorted order.
+### Approach
+We know that the numbers we are sorting are 0, 1, or 2. This means we can sort more efficiently by simply counting the number of times each of the three values occurs and modifying the list in-place to match the counts in sorted order.
 
-**Implementation**
+### Implementation
 ```python
 from collections import defaultdict
 def sort_colors(colors):
@@ -42,41 +41,40 @@ def sort_colors(colors):
         idx += 1
 ```
 
-**Time and space complexity**
-This solution has complexity O(N), since we loop through the list once, then loop through the dictionary to modify our list, both of which take N time. This solution takes up O(1) space, since everything is done in place and the counts dictionary has a constant size.
+### Time and space complexity
+This solution has O(n) time complexity, since we loop through the list once, then loop through the dictionary to modify our list. It takes up O(1) space, since everything is done in-place and the dictionary has a constant size.
 
 ## Approach #3: Three-way partition
 This approach uses multiple pointers. Reading the [two pointer guide](https://guides.codepath.com/compsci/Two-pointer) may be helpful.
 
-**Approach**
-Although we cannot asymptotically do better than O(N) since we need to pass through the list at least once, we can limit our code to only making one pass. This will be slightly faster than approach #2.
+### Approach
+Although we cannot asymptotically do better than O(n), since we need to pass through the list at least once, we can limit our code to only making one pass. This will be slightly faster than approach #2.
 
-We can accomplish this by seeing that sorting an array with three distinct elements is equivalent to a `partition` operation. Recall that in quick sort, we partition an array to put all elements less than a pivot to the left and greater than to a right. Since we only have three potential values in our list, partitioning using the middle value as a pivot will effectively sort the list.
+We can accomplish this by recognizing that sorting an array with three distinct elements is equivalent to a partition operation. Recall that in quick sort, we partition an array to put all elements with values less than the pivot on the left and all elements with values greater than the pivot on the right. Since we only have three potential values in our list, partitioning using the middle value as a pivot will effectively sort the list.
 
 This particular type of partition is a bit tricky though because we're partitioning on the middle element (the 1's) of our list. It's called a three-way partition, since we are also grouping together elements that are equal in the middle (the 1's).
 
 
-**Implementation**
-
+### Implementation
 ```python
 def sort_colors(colors):
     left, middle, right = 0, 0, len(colors) - 1
     while middle <= right:
         if colors[middle] == 0:
-            colors[middle], colors[left] = colors[left], colors[middle]
+            colors[middle], colors[left] = colors[left], colors[middle]
            left += 1
             middle += 1
         elif colors[middle] == 1:
             middle += 1
         elif colors[middle] == 2:
-            colors[middle], colors[right] = colors[right], colors[middle]
+            colors[middle], colors[right] = colors[right], colors[middle]
             right -= 1
             middle += 1
 ```
 
 
-**Time and space complexity**
-This solution has also has complexity O(N), but only takes one pass since it uses two pointers that stop moving when one moves past the other.
+### Time and space complexity
+This solution also has O(n) time complexity, but only takes one pass since it uses two pointers that stop moving when one moves past the other.
 
 It is slightly faster than the counting sort and is O(1) space, since it is in-place.
````
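
For completeness, here is one runnable reading of the counting sort approach, whose body the diff truncates to the `defaultdict` import and the final `idx += 1`. The loop structure in between is an assumption about the intended implementation, not the repo's exact code.

```python
from collections import defaultdict

def sort_colors(colors):
    # First pass: count how many times each color (0, 1, 2) appears.
    counts = defaultdict(int)
    for color in colors:
        counts[color] += 1
    # Second pass: overwrite the list in-place in sorted order.
    idx = 0
    for color in (0, 1, 2):
        for _ in range(counts[color]):
            colors[idx] = color
            idx += 1

colors = [2, 0, 2, 1, 1, 0]
sort_colors(colors)
print(colors)  # [0, 0, 1, 1, 2, 2]
```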
