For each of the methods associated with each data structure, classify it based on its runtime complexity.
## Linked List
1. What is the runtime complexity of `addToTail`?

O(1) since the linked list has a pointer directly to the tail of the list.

a. What if our list implementation didn't have a reference to the tail of the list in its constructor? What would be the runtime of the `addToTail` method?

Without the tail pointer, `addToTail` would be an O(n) method, since we'd have to start at the head and traverse the entire list to reach the last node.
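As a rough illustration, both versions might look something like this minimal sketch, written with plain objects and illustrative names (not the course's exact implementation):

```js
// Minimal singly linked list shape: { head, tail }; nodes are { value, next }.

// O(1): with a tail reference we can append without traversing.
function addToTail(list, value) {
  const node = { value, next: null };
  if (list.tail) list.tail.next = node;
  else list.head = node; // the list was empty
  list.tail = node;
}

// O(n): without a tail reference we must walk from the head to the last node.
function addToTailWithoutTailPointer(list, value) {
  const node = { value, next: null };
  if (!list.head) {
    list.head = node;
    return;
  }
  let current = list.head;
  while (current.next) current = current.next;
  current.next = node;
}
```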
2. What is the runtime complexity of `removeHead`?

O(1) since the linked list has a pointer directly to the head of the list.

3. What is the runtime complexity of `contains`?

O(n) regardless of whether an iterative or recursive solution is used, since in the worst case every node in the list has to be checked.
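The iterative version might look something like this sketch (illustrative, not necessarily the course's code):

```js
// O(n): walk from the head until the value is found or the list runs out.
function contains(list, target) {
  let current = list.head; // nodes are { value, next }
  while (current) {
    if (current.value === target) return true;
    current = current.next;
  }
  return false;
}
```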
4. What is the runtime complexity of `getMax`?

O(n) since we need to walk the entire list and check every value to be sure we have the maximum value of the entire list.
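For example (a sketch assuming numeric values and the same plain node shape as above):

```js
// O(n): every node must be inspected, since an unsorted list offers no shortcuts.
function getMax(list) {
  if (!list.head) return null;
  let max = list.head.value;
  let current = list.head.next;
  while (current) {
    if (current.value > max) max = current.value;
    current = current.next;
  }
  return max;
}
```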
## Queue
1. What is the runtime complexity of `enqueue`?

O(1) since the underlying `addToTail` method runs in constant time.

2. What is the runtime complexity of `dequeue`?

O(1) since the underlying `removeHead` method runs in constant time.

3. What is the runtime complexity of `isEmpty`?

O(1) since checking for emptiness only requires inspecting the queue's size (or head reference), not traversing its elements.

4. What is the runtime complexity of `length`?

O(1), assuming the queue keeps a running count that is updated on each `enqueue` and `dequeue` rather than recomputing the length by traversal.
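A minimal sketch of how such a queue could delegate to the linked list and track its size (the `LinkedList` interface and the names here are assumptions, not the course's exact code):

```js
// Assumes a linked list with the O(1) addToTail/removeHead methods described above.
class Queue {
  constructor(list) {
    this.storage = list; // e.g. a LinkedList instance
    this.size = 0;
  }
  enqueue(value) {       // O(1): delegates to addToTail
    this.storage.addToTail(value);
    this.size++;
  }
  dequeue() {            // O(1): delegates to removeHead
    if (this.size === 0) return undefined;
    this.size--;
    return this.storage.removeHead();
  }
  isEmpty() {            // O(1): a single comparison
    return this.size === 0;
  }
  length() {             // O(1): the count is maintained incrementally
    return this.size;
  }
}
```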
## Doubly Linked List
1. What is the runtime complexity of `ListNode.insertAfter`?

O(1) since we're just adding a new node as this node's `next` node, which we can access in constant time.

2. What is the runtime complexity of `ListNode.insertBefore`?

O(1) since we're just adding a new node as this node's `prev` node, which we can access in constant time.

3. What is the runtime complexity of `ListNode.delete`?

O(1) since we're simply rearranging `prev` and `next` pointers. This operation only ever touches the calling node's previous and next nodes.
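One way this pointer surgery might look at the node level (a sketch with plain `{ value, prev, next }` objects, illustrative only):

```js
// Doubly linked nodes: { value, prev, next }. Each function touches only the
// node and its immediate neighbours, so each is O(1).
function insertAfter(node, value) {
  const fresh = { value, prev: node, next: node.next };
  if (node.next) node.next.prev = fresh;
  node.next = fresh;
}

function insertBefore(node, value) {
  const fresh = { value, prev: node.prev, next: node };
  if (node.prev) node.prev.next = fresh;
  node.prev = fresh;
}

function deleteNode(node) {
  if (node.prev) node.prev.next = node.next;
  if (node.next) node.next.prev = node.prev;
  node.prev = null;
  node.next = null;
}
```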
4. What is the runtime complexity of `DoublyLinkedList.addToHead`?

O(1) since we have a pointer to the head of the list.

5. What is the runtime complexity of `DoublyLinkedList.removeFromHead`?

O(1) since we have a pointer to the head of the list.

6. What is the runtime complexity of `DoublyLinkedList.addToTail`?

O(1) since we have a pointer to the tail of the list.

7. What is the runtime complexity of `DoublyLinkedList.removeFromTail`?

O(1) since we have a pointer to the tail of the list.

8. What is the runtime complexity of `DoublyLinkedList.moveToFront`?

O(1) since this method receives the node that we wish to move to the front of the list. We don't need to go searching through the list to find the node.

9. What is the runtime complexity of `DoublyLinkedList.moveToBack`?

O(1) since this method receives the node that we wish to move to the end of the list. We don't need to go searching through the list to find the node.
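To sketch why the moves stay constant time (assuming list-level helpers along the lines described above; the names are illustrative, not the course's exact API):

```js
// Both moves are a fixed number of O(1) pointer operations, because the
// caller already hands us the node -- no search is required.
function moveToFront(list, node) {
  list.delete(node);          // O(1): unlink via the node's prev/next pointers
  list.addToHead(node.value); // O(1): the list keeps a head pointer
}

function moveToBack(list, node) {
  list.delete(node);          // O(1)
  list.addToTail(node.value); // O(1): the list keeps a tail pointer
}
```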
10. What is the runtime complexity of `DoublyLinkedList.delete`?

O(1) since we simply call the node's `delete` method, which runs in constant time.

a. Compare the runtime of the doubly linked list's `delete` method with the worst-case runtime of the `Array.splice` method. Which method generally performs better?

The doubly linked list's `delete` generally performs better. `Array.splice` has a worst-case runtime of O(n): when an element is spliced out of an array, every element that comes after it has to be shifted up to fill the empty spot left behind. In the worst case, splicing out the very first element forces _every_ remaining element to move forward one spot, whereas the linked list's `delete` only rewires a couple of pointers.
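To make the shifting cost concrete, here is a hand-written sketch of the work `splice` does internally when removing one element (illustrative, not the engine's actual implementation):

```js
// Removing the element at `index` shifts every later element one slot to the
// left -- O(n) in the worst case (index 0 of an n-element array).
function spliceOut(arr, index) {
  for (let i = index; i < arr.length - 1; i++) {
    arr[i] = arr[i + 1]; // shift each later element forward
  }
  arr.length -= 1;       // drop the now-duplicated final slot
  return arr;
}

// A doubly linked list's delete, by contrast, rewires two pointers: O(1).
```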
## Binary Search Tree
1. What is the runtime complexity of your `checkBalanced` function?

Ideally, the `checkBalanced` function runs in O(n) time. Even though the function may return early by short-circuiting as soon as it finds two branches that aren't perfectly balanced, that only affects wall-clock time, not the theoretical Big O runtime. The upper bound can't be better than O(n), since in the case of a perfectly balanced tree the function needs to traverse every node in the tree.
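One possible O(n) approach (a sketch only; the exercise leaves the exact definition of 'balanced' and the implementation open) computes subtree heights bottom-up and short-circuits on the first mismatch:

```js
// Returns true if every node's left and right subtrees have equal height
// ("perfectly balanced"). Each node is visited at most once: O(n).
function checkBalanced(node) {
  // Helper returns the subtree height, or -1 to signal an imbalance and bail out early.
  function height(n) {
    if (!n) return 0;
    const left = height(n.left);
    if (left === -1) return -1;
    const right = height(n.right);
    if (right === -1) return -1;
    return left === right ? left + 1 : -1;
  }
  return height(node) !== -1;
}
```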
2. What is the runtime complexity of `insert`?

O(log n) since we're traversing the binary search tree in 'levels' instead of inspecting each element in turn. At each level we determine whether to continue down the left subtree or the right subtree, meaning we never need to look at the other half of the tree. (This assumes a reasonably balanced tree; a completely unbalanced tree degrades to O(n).)

3. What is the runtime complexity of `contains`?

O(log n), for the same reason as `insert`: each comparison lets us discard one of the two subtrees, so the work is bounded by the tree's height.

4. What is the runtime complexity of `getMax`?

O(log n) even though we simply walk the right subtree as far as it goes. The number of steps still depends on the size of the tree, since the tree's height determines how far right we have to keep going, and for a balanced tree that height is roughly log n.
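Sketches of all three walks over plain `{ value, left, right }` nodes (illustrative, not the course's exact methods); each loop descends one level per iteration, so the work is bounded by the tree's height:

```js
// Nodes: { value, left, right }.
function insert(root, value) {
  const node = { value, left: null, right: null };
  if (!root) return node;
  let current = root;
  while (true) {
    if (value < current.value) {
      if (!current.left) { current.left = node; break; }
      current = current.left;
    } else {
      if (!current.right) { current.right = node; break; }
      current = current.right;
    }
  }
  return root;
}

// Each comparison discards one subtree, so the descent follows a single path.
function contains(root, target) {
  let current = root;
  while (current) {
    if (target === current.value) return true;
    current = target < current.value ? current.left : current.right;
  }
  return false;
}

// The maximum is the right-most node, so we only walk the right spine.
function getMax(root) {
  if (!root) return null;
  let current = root;
  while (current.right) current = current.right;
  return current.value;
}
```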
5. What is the runtime complexity of `depthFirstForEach`?

O(n) since we're traversing _all_ of the elements in the tree. The fact that we're doing this in a depth-first fashion doesn't change that.

6. What is the runtime complexity of `breadthFirstForEach`?

O(n) since we're traversing _all_ of the elements in the tree. The fact that we're doing this in a breadth-first fashion doesn't change that.
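For instance (a sketch; the breadth-first version uses a plain array as its queue for brevity), both traversals invoke the callback exactly once per node:

```js
function depthFirstForEach(node, cb) {
  if (!node) return;
  cb(node.value);                 // visit, then recurse into both subtrees
  depthFirstForEach(node.left, cb);
  depthFirstForEach(node.right, cb);
}

function breadthFirstForEach(root, cb) {
  const queue = root ? [root] : [];
  while (queue.length > 0) {
    const node = queue.shift();   // a dedicated queue would avoid shift's own cost
    cb(node.value);
    if (node.left) queue.push(node.left);
    if (node.right) queue.push(node.right);
  }
}
```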
## Heap
1. What is the runtime complexity of your `heapsort` function?

Ideally, the `heapsort` function loops twice over the length of the input array: once to fill the heap with all of the data from the input array, and once again to remove the elements from the heap in order, populating another array that holds the sorted data. Each loop executes an O(log n) method (both `insert` and `delete` have O(log n) runtimes) n times, so the overall runtime complexity of `heapsort` is O(2n * log n), which simplifies to O(n log n).
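In outline, it might look like this (a sketch assuming a max-heap class along the lines of the `MaxHeap` sketched at the end of this section, whose `delete` removes and returns the current maximum):

```js
function heapsort(input) {
  const heap = new MaxHeap();                      // assumed O(log n) insert/delete
  for (const value of input) heap.insert(value);   // n inserts: O(n log n)

  // n deletes: O(n log n). The maximum comes out first, so fill the result
  // array from the back to end up with ascending order.
  const sorted = new Array(input.length);
  for (let i = input.length - 1; i >= 0; i--) sorted[i] = heap.delete();
  return sorted;
}
```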
2. What is the space complexity of the `heapsort` function? Recall that your implementation should return a new array with the sorted data. What would be the space complexity if your function instead altered the input array?

The space complexity of the `heapsort` function is O(n) since we're returning a sorted copy of the input data. The sorted array has the same length as the input array, hence the linear growth in our space requirements. If our `heapsort` function instead sorted the input array by mutating it directly, we would not incur that additional space, so the space complexity in that case would be O(1).

3. What is the runtime complexity of `bubbleUp`?

O(log n) since a heap has a binary tree structure. In the worst case, we only need to bubble an element up along a single 'path' in the heap; all the other elements in the heap are left untouched. In other words, when bubbling an element up to the top of the heap, each recursive call of the `bubbleUp` function moves the element up one level in the heap.

4. What is the runtime complexity of `siftDown`?

O(log n) following the same logic as the `bubbleUp` method, except in this case elements are swapped down the tree, with each recursive call to `siftDown` moving the element down a level in the heap.
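A sketch of both helpers on an array-backed max-heap (this assumes 1-indexed storage with a placeholder at index 0, a common convention and an assumption here):

```js
// storage[1] is the root; the children of index i live at 2i and 2i + 1,
// and the parent of i is Math.floor(i / 2). Each recursive call moves the
// element one level, so both helpers are O(log n) on a heap of n elements.
function bubbleUp(storage, index) {
  const parent = Math.floor(index / 2);
  if (index <= 1 || storage[index] <= storage[parent]) return;
  [storage[index], storage[parent]] = [storage[parent], storage[index]];
  bubbleUp(storage, parent);
}

function siftDown(storage, index) {
  const last = storage.length - 1;
  const left = 2 * index;
  const right = left + 1;
  let largest = index;
  if (left <= last && storage[left] > storage[largest]) largest = left;
  if (right <= last && storage[right] > storage[largest]) largest = right;
  if (largest === index) return;
  [storage[index], storage[largest]] = [storage[largest], storage[index]];
  siftDown(storage, largest);
}
```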
5. What is the runtime complexity of `insert`?

O(log n): `push`ing to the `storage` array is a constant-time operation, so the biggest contributor is the call to the `bubbleUp` method, which incurs O(log n) complexity.

6. What is the runtime complexity of `delete`?

O(log n) since the dominating operation is the call to the `siftDown` method.

7. What is the runtime complexity of `getMax`?

O(1) since the heap maintains the invariant that the maximum value is located at `this.storage[1]`.
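Putting the pieces together (a sketch that reuses the hypothetical `bubbleUp`/`siftDown` helpers above; the placeholder at index 0 keeps the parent/child index arithmetic simple):

```js
class MaxHeap {
  constructor() {
    this.storage = [null];   // index 0 is unused padding; the root sits at index 1
  }
  insert(value) {            // O(log n): push is O(1), bubbleUp dominates
    this.storage.push(value);
    bubbleUp(this.storage, this.storage.length - 1);
  }
  delete() {                 // O(log n): siftDown dominates
    if (this.storage.length < 2) return undefined;
    const max = this.storage[1];
    const last = this.storage.pop();
    if (this.storage.length > 1) {
      this.storage[1] = last;    // move the last leaf to the root...
      siftDown(this.storage, 1); // ...then restore the heap property
    }
    return max;
  }
  getMax() {                 // O(1): the maximum always lives at storage[1]
    return this.storage[1];
  }
}
```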