Commit c2fee0b - Add empty files
1 parent b8d4931

File tree: 4 files changed (+12, -202 lines)


algorithms/Algorithms_Answers.md

Lines changed: 1 addition & 74 deletions
@@ -1,74 +1 @@
-# Exercise I
-
-## Part a)
-O(n) / linear - the loop's bound is cubic, but the counter's rate of growth is
-quadratic, so the number of iterations is their ratio, which is linear.
-
-## Part b)
-O(log(n)) / logarithmic (assuming integers, not floats) - `i` is cut in half
-each iteration, and the comparison to `x` is irrelevant to the growth rate.
-
-## Part c)
-O(sqrt(n)) - a somewhat unusual but still distinct complexity, determined
-entirely by the outer loop. The two inner loops always run 8 times each, so
-they contribute only a constant factor and are dropped.
-
-## Part d)
-O(n*log(n)) / log-linear (or "linearithmic") - the outer loop is logarithmic,
-the inner is linear, and the total complexity is their product.
-
-## Part e)
-O(n^3) / cubic - the outer loop is linear, the next two nested loops are both
-linear (in the worst/general case), and the innermost is constant (9 iterations).
-
-## Part f)
-O(n) / linear - essentially equivalent to a simple for loop, just written
-recursively.
-
-## Part g)
-O(n) / linear - it can return early, but the worst/general case is linear.
-
-
-# Exercise II
-
-## Part a)
-The naive (quadratic) approach would be a nested for loop over `i` and `j`
-that compares all possible pairs and remembers the one with the biggest
-difference.
-
-The efficient (linear) approach is to loop over the array once, tracking the
-minimum value seen so far and the maximum difference in the same pass.
-
-Pseudocode:
-```
-minVal = a[0]
-maxDiff = 0
-for i in 1...n:
-    minVal = min(minVal, a[i])
-    maxDiff = max(maxDiff, a[i] - minVal)
-return maxDiff
-```
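The pseudocode above can be sketched in JavaScript - an illustrative implementation (the name `maxDifference` is assumed, not part of the repo):

```javascript
// Linear max-difference: one pass, tracking the running minimum.
const maxDifference = (a) => {
  if (a.length < 2) return 0;
  let minVal = a[0];
  let maxDiff = 0;
  for (let i = 1; i < a.length; i++) {
    minVal = Math.min(minVal, a[i]);
    maxDiff = Math.max(maxDiff, a[i] - minVal);
  }
  return maxDiff;
};
```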
-
-## Part b)
-This is a conceptual question and can be answered in a number of ways - the
-key insight is that you should essentially run a binary search on the floors.
-Start by dropping an egg from the middle floor: if it breaks, move to the
-midpoint between your current floor and the bottom of the remaining range; if
-it doesn't, move to the midpoint between your current floor and the top.
-Either way, keep halving the range until you've narrowed in on the specific
-lowest floor that causes the egg to break - that floor is `f`.
-
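The strategy above amounts to a lower-bound binary search. A sketch, assuming a hypothetical predicate `breaksAt(floor)` that reports whether an egg dropped from `floor` breaks:

```javascript
// Binary search for the lowest "breaking" floor f, given a hypothetical
// predicate breaksAt(floor) that is true for every floor >= f.
const lowestBreakingFloor = (numFloors, breaksAt) => {
  let lo = 1;
  let hi = numFloors;
  while (lo < hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (breaksAt(mid)) {
      hi = mid;     // f is at mid or below
    } else {
      lo = mid + 1; // f is strictly above mid
    }
  }
  return lo;
};
```

Each drop halves the remaining range, so roughly log2(numFloors) drops suffice.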
-
-# Exercise III
-
-## Part a)
-The runtime will be O(n^2) / quadratic - the pivot recursion will be a linear
-number of levels deep, and at each level the pivot is compared with (on
-average) a linear number of other elements.
-
-## Part b)
-The runtime will be O(n*log(n)) / log-linear - the recursion will only be a
-logarithmic number of levels deep, with (on average) a linear number of
-comparisons performed across each level.
-
-Note that practical implementations of Quicksort often use randomization -
-that is, pivots are chosen at random. The result (though it requires a bit of
-statistics to prove) is O(n*log(n)) expected performance - and the
-probability of hitting the quadratic worst case is vanishingly small.
+Add your answers to the Algorithms exercises here.
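The randomized-pivot idea can be sketched as follows - an illustrative, non-in-place version, not the repo's implementation:

```javascript
// Randomized quicksort sketch: choosing the pivot uniformly at random makes
// the O(n^2) worst case extremely unlikely, giving O(n log n) expected time.
const quicksort = (arr) => {
  if (arr.length <= 1) return arr;
  const pivotIndex = Math.floor(Math.random() * arr.length);
  const pivot = arr[pivotIndex];
  const rest = arr.filter((_, i) => i !== pivotIndex); // drop the pivot itself
  const left = rest.filter((x) => x < pivot);
  const right = rest.filter((x) => x >= pivot);
  return [...quicksort(left), pivot, ...quicksort(right)];
};
```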

data_structures/Data_Structures_Answers.md

Lines changed: 0 additions & 62 deletions
@@ -3,141 +3,79 @@ For each of the methods associated with each data structure, classify it based o
 ## Linked List
 
 1. What is the runtime complexity of `addToTail`?
 
-O(1) since the linked list has a pointer directly to the tail of the list.
 
 a. What if our list implementation didn't have a reference to the tail of the list in its constructor? What would be the runtime of the `addToTail` method?
 
-Without the tail pointer, `addToTail` would be an O(n) method, since we'd have to start at the head pointer and traverse the entire length of the list.
 
 2. What is the runtime complexity of `removeHead`?
 
-O(1) since the linked list has a pointer directly to the head of the list.
 
 3. What is the runtime complexity of `contains`?
 
-O(n) regardless of whether an iterative or recursive solution is used.
 
 4. What is the runtime complexity of `getMax`?
 
-O(n) since in the worst case we need to iterate through the entire list, checking every value, to ensure we have the max of the whole list.
 
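The effect of the tail pointer can be illustrated with a minimal sketch (the name `SimpleList` and its shape are assumed here, not the repo's class):

```javascript
// Minimal singly linked list. With a stored tail pointer, addToTail never
// traverses the list (O(1)); without it, we'd have to walk from head to the
// end on every append (O(n)). removeHead only moves the head pointer (O(1)).
class SimpleList {
  constructor() {
    this.head = null;
    this.tail = null;
  }
  addToTail(value) {
    const node = { value, next: null };
    if (!this.tail) {
      this.head = node;   // list was empty
    } else {
      this.tail.next = node;
    }
    this.tail = node;     // O(1): no traversal needed
  }
  removeHead() {
    if (!this.head) return undefined;
    const { value } = this.head;
    this.head = this.head.next;
    if (!this.head) this.tail = null;
    return value;         // O(1): only the head pointer moves
  }
}
```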
 ## Queue
 
 1. What is the runtime complexity of `enqueue`?
 
-O(1) since the underlying `addToTail` method runs in constant time.
 
 2. What is the runtime complexity of `dequeue`?
 
-O(1) since the underlying `removeHead` method runs in constant time.
 
 3. What is the runtime complexity of `isEmpty`?
 
-O(1) since checking for emptiness requires no traversal.
 
 4. What is the runtime complexity of `length`?
 
-O(1), assuming the queue tracks its size as items are added and removed rather than counting nodes on demand.
 
 ## Doubly Linked List
 
 1. What is the runtime complexity of `ListNode.insertAfter`?
 
-O(1) since we're just adding a new node as this node's `next` node, which we can access in constant time.
 
 2. What is the runtime complexity of `ListNode.insertBefore`?
 
-O(1) since we're just adding a new node as this node's `prev` node, which we can access in constant time.
 
 3. What is the runtime complexity of `ListNode.delete`?
 
-O(1) since we're simply rearranging `prev` and `next` pointers. This operation only ever touches the calling node's previous and next nodes.
 
 4. What is the runtime complexity of `DoublyLinkedList.addToHead`?
 
-O(1) since we have a pointer to the head of the list.
 
 5. What is the runtime complexity of `DoublyLinkedList.removeFromHead`?
 
-O(1) since we have a pointer to the head of the list.
 
 6. What is the runtime complexity of `DoublyLinkedList.addToTail`?
 
-O(1) since we have a pointer to the tail of the list.
 
 7. What is the runtime complexity of `DoublyLinkedList.removeFromTail`?
 
-O(1) since we have a pointer to the tail of the list.
 
 8. What is the runtime complexity of `DoublyLinkedList.moveToFront`?
 
-O(1) since this method receives the node that we wish to move to the front of the list. We don't need to search through the list to find the node.
 
 9. What is the runtime complexity of `DoublyLinkedList.moveToBack`?
 
-O(1) since this method receives the node that we wish to move to the end of the list. We don't need to search through the list to find the node.
 
 10. What is the runtime complexity of `DoublyLinkedList.delete`?
 
-O(1) since we simply call the node's `delete` method, which runs in constant time.
 
 a. Compare the runtime of the doubly linked list's `delete` method with the worst-case runtime of the `Array.splice` method. Which method generally performs better?
 
-The `Array.splice` method has a worst-case runtime of O(n): when splicing an element out of an array, all the elements that come after the deleted element must shift up to fill the empty spot it left behind. In the worst case - splicing out the very first element - _every_ remaining element shifts forward one slot. The doubly linked list's O(1) `delete` therefore generally performs better.
 
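The shifting behavior of `Array.splice` can be seen directly:

```javascript
// Array.splice removes in place; elements after the removal point are shifted
// up to fill the gap, which is O(n) when removing near the front.
const arr = [10, 20, 30, 40, 50];
const removed = arr.splice(0, 1); // remove the first element
console.log(removed); // [10]
console.log(arr);     // [20, 30, 40, 50] - every element shifted forward
```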
 ## Binary Search Tree
 
 1. What is the runtime complexity of your `checkBalanced` function?
 
-Ideally, the `checkBalanced` function runs in O(n) time. Even though the function may return early by short-circuiting upon finding two branches that aren't perfectly balanced, that only affects wall-clock time, not the theoretical Big O runtime. The upper bound can't be better than O(n), since in the case of a perfectly balanced tree the function needs to traverse every node.
 
 2. What is the runtime complexity of `insert`?
 
-O(log n), assuming a reasonably balanced tree - we traverse the tree in 'levels' instead of inspecting each element in turn. At each level we decide whether to continue down the left or the right subtree, so the other side is never visited. (In a degenerate, list-like tree this degrades to O(n).)
 
 3. What is the runtime complexity of `contains`?
 
-O(log n) for a balanced tree, by the same reasoning as `insert`: each comparison discards one subtree, so we only walk a single root-to-leaf path.
 
 4. What is the runtime complexity of `getMax`?
 
-O(log n) in a balanced tree, even though we simply walk the right subtree as far as it goes - the length of that rightmost path still depends on the number of elements in the tree.
 
 5. What is the runtime complexity of `depthFirstForEach`?
 
-O(n) since we're traversing _all_ of the elements in the tree. Doing so in a depth-first order doesn't change that.
 
 6. What is the runtime complexity of `breadthFirstForEach`?
 
-O(n) since we're traversing _all_ of the elements in the tree. Doing so in a breadth-first order doesn't change that.
 
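As an illustrative sketch (plain functions over node objects, not the repo's `BinarySearchTree` class), `insert` and `contains` each follow a single root-to-leaf path, so their cost is O(h), the tree height:

```javascript
// Insert returns the (possibly new) subtree root; duplicates go right.
const insert = (node, value) => {
  if (!node) return { value, left: null, right: null };
  if (value < node.value) node.left = insert(node.left, value);
  else node.right = insert(node.right, value);
  return node; // only one child branch was recursed into
};

// Each comparison discards an entire subtree, so only one path is walked.
const contains = (node, target) => {
  if (!node) return false;
  if (node.value === target) return true;
  return target < node.value
    ? contains(node.left, target)
    : contains(node.right, target);
};
```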
 ## Heap
 
 1. What is the runtime complexity of your `heapsort` function?
 
-The `heapsort` function loops twice over the length of the input array: once to fill the heap with all of the input data, and once more to remove the elements from the heap in order, populating a new array with the sorted data. Each loop executes an O(log n) method n times (`insert` and `delete` both run in O(log n)), so the overall runtime is O(2n * log n), which simplifies to O(n log n).
 
 2. What is the space complexity of the `heapsort` function? Recall that your implementation should return a new array with the sorted data. What would be the space complexity if your function instead altered the input array?
 
-The space complexity of `heapsort` is O(n), since we're returning a sorted copy of the input data. The sorted array has the same length as the input array, hence linear growth in space requirements. If `heapsort` instead sorted the input array by mutating it in place, we would not incur any additional space, so the space complexity would be O(1).
 
 3. What is the runtime complexity of `bubbleUp`?
 
-O(log n) since a heap has a binary tree structure. In the worst case, an element only needs to bubble up along a single 'path' in the heap; no other elements are touched. Each recursive call to `bubbleUp` moves the element up one level, and a binary tree has O(log n) levels.
 
 4. What is the runtime complexity of `siftDown`?
 
-O(log n), following the same logic as `bubbleUp`, except in this case elements are swapped down the tree, with each recursive call to `siftDown` moving the element down a level in the heap.
 
 5. What is the runtime complexity of `insert`?
 
-O(log n) since `push`ing onto the `storage` array is a constant-time operation, so the biggest contributor is the call to `bubbleUp`, which costs O(log n).
 
 6. What is the runtime complexity of `delete`?
 
-O(log n) since the dominating operation is the call to `siftDown`.
 
 7. What is the runtime complexity of `getMax`?
 
-O(1) since the heap maintains the invariant that the maximum value is located at `this.storage[1]`.
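The bubbling logic can be sketched on a 1-indexed max-heap array (illustrative; an iterative version rather than the recursive one described above, using the standard parent-at-floor(i/2) layout):

```javascript
// Restore the heap property for the element at `index` by swapping it with
// its parent until the parent is at least as large. Each swap climbs one
// level, and a binary heap has about log2(n) levels, hence O(log n).
const bubbleUp = (storage, index) => {
  while (index > 1) {
    const parent = Math.floor(index / 2);
    if (storage[parent] >= storage[index]) break; // heap property holds
    [storage[parent], storage[index]] = [storage[index], storage[parent]];
    index = parent;
  }
};
```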

data_structures/src/binary-search-tree.js

Lines changed: 6 additions & 51 deletions
@@ -1,3 +1,8 @@
+const checkBalanced = (rootNode) => {
+  /* Your code here */
+
+};
+
 class BinarySearchTree {
   constructor(value) {
     this.value = value;
@@ -66,57 +71,7 @@ class BinarySearchTree {
   }
 }
 
-/* Recursive Solution */
-const checkBalanced = (rootNode) => {
-  if (!rootNode) return true;
-
-  const minDepth = (node) => {
-    if (!node) return 0;
-    return 1 + Math.min(minDepth(node.left), minDepth(node.right));
-  };
-
-  const maxDepth = (node) => {
-    if (!node) return 0;
-    return 1 + Math.max(maxDepth(node.left), maxDepth(node.right));
-  };
-
-  return (maxDepth(rootNode) - minDepth(rootNode) === 0);
-};
-
-/* Iterative Solution */
-// const checkBalanced = (rootNode) => {
-//   if (!rootNode) return true;
-//   const depths = [];
-//   const nodes = [];
-
-//   nodes.push([rootNode, 0]);
-
-//   while (nodes.length) {
-//     const nodePair = nodes.pop();
-//     const node = nodePair[0];
-//     const depth = nodePair[1];
-
-//     if (!node.left && !node.right) {
-//       if (depths.indexOf(depth) < 0) {
-//         depths.push(depth);
-
-//         if ((depths.length > 2) || (depths.length === 2 && Math.abs(depths[0] - depths[1]) > 0)) {
-//           return false;
-//         }
-//       }
-//     } else {
-//       if (node.left) {
-//         nodes.push([node.left, depth + 1]);
-//       }
-//       if (node.right) {
-//         nodes.push([node.right, depth + 1]);
-//       }
-//     }
-//   }
-//   return true;
-// };
-
 module.exports = {
   BinarySearchTree,
   checkBalanced,
-}
+};

data_structures/src/heap.js

Lines changed: 5 additions & 15 deletions
@@ -1,3 +1,8 @@
+const heapsort = (arr) => {
+  /* Your code here */
+
+};
+
 class Heap {
   constructor() {
     this.storage = [null];
@@ -60,21 +65,6 @@ class Heap {
   }
 }
 
-const heapsort = (arr) => {
-  let heap = new Heap();
-  const sorted = new Array(arr.length);
-
-  for (let i = 0; i < arr.length; i++) {
-    heap.insert(arr[i]);
-  }
-
-  for (let i = arr.length - 1; i > -1; i--) {
-    sorted[i] = heap.delete();
-  }
-
-  return sorted;
-};
-
 module.exports = {
   Heap,
   heapsort,
