diff --git a/docs/data-structures/segment-tree.md b/docs/data-structures/segment-tree.md
index 458f92c..4e62f4f 100644
--- a/docs/data-structures/segment-tree.md
+++ b/docs/data-structures/segment-tree.md
@@ -103,6 +103,7 @@ Previously, the update function was called to update only a single value in the array. P
 ### Lazy Propagation Algorithm
 We need a structure that can perform the following operations on an array $[1,N]$.
+
 - Add inc to all elements in the given range $[l, r]$.
 - Return the sum of all elements in the given range $[l, r]$.
@@ -118,6 +119,7 @@ The trick is to be lazy, i.e., do work only when needed. Do the updates only when you
 Let's be lazy as promised: when we need to update an interval, we will update a node, mark its children as having a pending update, and apply that update only when needed. For this we need an array $lazy[]$ of the same size as the segment tree. Initially all elements of the $lazy[]$ array will be $0$, representing that there is no pending update. If there is a non-zero element $lazy[k]$, then node k of the segment tree must be updated with that pending value before any query operation touches it, and $lazy[2\cdot k]$ and $lazy[2 \cdot k + 1]$ must also be updated correspondingly. To update an interval we will keep 3 things in mind.
+
 - If the current segment tree node has any pending update, first apply that pending update to the current node and push it down to its children.
 - If the interval represented by the current node lies completely inside the interval to update, update the current node and set the $lazy[]$ entries for its children.
 - If the interval represented by the current node partially overlaps the interval to update, recurse into the children as in the earlier update function.
@@ -202,6 +204,7 @@ Notice that the only difference with the regular query function is pushing the l
 ## Binary Search on Segment Tree
 Assume we have an array A that contains elements between 1 and $M$. We have to perform 2 kinds of operations.
+
 - Change the value of the element at given index i to x.
 - Return the value of the kth element of the array when sorted.
@@ -240,8 +243,9 @@ This is, of course, slow. Let's use segment trees to improve it. First we wi
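The two operations above (point update and kth-element query) can be sketched with a frequency segment tree. The following is a minimal Python sketch of the same idea over values 1..M; all names are illustrative:

```python
# A frequency segment tree over values 1..M: update() changes one value's
# frequency, kth() returns the kth smallest element currently stored.
# 1-based tree indices; a node covers the value range [start, end].

M = 8
tree = [0] * (4 * M)  # tree[node] = count of values in that node's range

def update(node, start, end, val, delta):
    # Add delta (+1 or -1) to the frequency of `val`.
    if start == end:
        tree[node] += delta
        return
    mid = (start + end) // 2
    if val <= mid:
        update(2 * node, start, mid, val, delta)
    else:
        update(2 * node + 1, mid + 1, end, val, delta)
    tree[node] = tree[2 * node] + tree[2 * node + 1]

def kth(node, start, end, k):
    # Walk down: if the left child holds at least k values, the answer
    # is on the left; otherwise skip those values and go right.
    if start == end:
        return start
    mid = (start + end) // 2
    if tree[2 * node] >= k:
        return kth(2 * node, start, mid, k)
    return kth(2 * node + 1, mid + 1, end, k - tree[2 * node])

A = [5, 2, 7, 2]
for x in A:
    update(1, 1, M, x, +1)  # build the initial frequencies

kth(1, 1, M, 1)  # -> 2 (the smallest element)
kth(1, 1, M, 3)  # -> 5
```

Each call walks one root-to-leaf path, which is the same $\mathcal{O}(\log M)$ traversal the text describes.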
![segment tree updates](img/updated_segtree.png){ width="100%" }
Segment Tree After First Update
+
-```c++
+```cpp
 void update(int i, int x) {
     update(1, 1, M, A[i], --F[A[i]]); // Decrement frequency of old value
     A[i] = x;                         // Update A[i] to new value
@@ -263,7 +267,7 @@ int query(int k) {
 If you look at the code above you can notice that each update takes $\mathcal{O}(\log M)$ time and each query takes $\mathcal{O}(\log^{2} M)$ time, but we can do better.
-### How To Speed Up?
+### How To Speed Up?
 If you look at the segment tree solution in the preceding subsection you can see that queries are performed in $\mathcal{O}(\log^{2} M)$ time. We can make it faster; in fact, we can reduce the time complexity to $\mathcal{O}(\log M)$, the same as the time complexity for updates. We will do the binary search while traversing the segment tree. We start from the root and look at its left child's sum value: if this value is at least k, our answer is somewhere in the left child's subtree; otherwise it is somewhere in the right child's subtree (and we subtract the left child's sum from k before descending). We follow this rule until we reach a leaf, and that leaf is our answer. Since we traverse only $\mathcal{O}(\log M)$ nodes (one node at each level), the time complexity is $\mathcal{O}(\log M)$. Look at the code below for better understanding.
@@ -271,7 +275,7 @@ If you look at the segment tree solution on preceding subsection you can see tha
Solution of First Query
-```c++
+```cpp
 void update(int i, int x) {
     update(1, 1, M, A[i], --F[A[i]]); // Decrement frequency of old value
     A[i] = x;                         // Update A[i] to new value
@@ -289,4 +293,4 @@ int query(int node, int start, int end, int k) {
 int query(int k) {
     return query(1, 1, M, k); // Public interface for querying
 }
-```
\ No newline at end of file
+```
diff --git a/docs/graph/binary-search-tree.md b/docs/graph/binary-search-tree.md
new file mode 100644
index 0000000..866726a
--- /dev/null
+++ b/docs/graph/binary-search-tree.md
@@ -0,0 +1,156 @@
+---
+title: Binary Search Tree
+tags:
+  - Tree
+  - Binary Search
+  - BST
+---
+
+A binary tree is a tree data structure in which each node has at most two children, which are referred to as the left child and the right child.
+
+For a binary tree to be a binary search tree, the values of all the nodes in the left sub-tree of the root node should be smaller than the root node's value. Also, the values of all the nodes in the right sub-tree of the root node should be larger than the root node's value.
+
+
+![a simple binary search tree](img/binarytree.png) +
a simple binary search tree
+
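This ordering property can be stated directly in code. The sketch below is illustrative (Python, with nodes as `(value, left, right)` tuples), not part of the C implementation shown later:

```python
def is_bst(node, lo=float("-inf"), hi=float("inf")):
    # Every node's value must lie strictly between the bounds imposed by
    # its ancestors: left subtrees tighten `hi`, right subtrees tighten `lo`.
    if node is None:
        return True
    val, left, right = node
    if not (lo < val < hi):
        return False
    return is_bst(left, lo, val) and is_bst(right, val, hi)

# Nodes are (value, left, right) tuples; the valid tree:   8
#                                                         / \
#                                                        3   10
valid = (8, (3, None, None), (10, None, None))
broken = (8, (9, None, None), (10, None, None))  # 9 cannot sit left of 8

is_bst(valid)   # -> True
is_bst(broken)  # -> False
```

Note that checking each node only against its direct children is not enough; the bounds carry the constraint from all ancestors down the tree.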
+
+## Insertion Algorithm
+
+1. Compare the values of the root node and the element to be inserted.
+2. If the value of the root node is larger and a left child exists, repeat step 1 with root = current root's left child. Otherwise, insert the element as the left child of the current root.
+3. If the value of the root node is smaller and a right child exists, repeat step 1 with root = current root's right child. Otherwise, insert the element as the right child of the current root.
+
+## Deletion Algorithm
+
+- Deleting a node with no children: simply remove the node from the tree.
+- Deleting a node with one child: remove the node and replace it with its child.
+- Deleting a node with two children: find the inorder successor of the node, copy the contents of the inorder successor to the node, and delete the inorder successor.
+- Note that the inorder successor can be obtained by finding the minimum value in the right subtree of the node.
+
+## Sample Code
+
+```c
+// C program to demonstrate the delete operation in a binary search tree
+#include <stdio.h>
+#include <stdlib.h>
+
+struct node
+{
+    int key;
+    struct node *left, *right;
+};
+
+// A utility function to create a new BST node
+struct node *newNode(int item)
+{
+    struct node *temp = (struct node *)malloc(sizeof(struct node));
+    temp->key = item;
+    temp->left = temp->right = NULL;
+    return temp;
+}
+
+// A utility function to do inorder traversal of BST
+void inorder(struct node *root)
+{
+    if (root != NULL)
+    {
+        inorder(root->left);
+        printf("%d ", root->key);
+        inorder(root->right);
+    }
+}
+
+/* A utility function to insert a new node with given key in BST */
+struct node* insert(struct node* node, int key)
+{
+    /* If the tree is empty, return a new node */
+    if (node == NULL) return newNode(key);
+
+    /* Otherwise, recur down the tree */
+    if (key < node->key)
+        node->left = insert(node->left, key);
+    else
+        node->right = insert(node->right, key);
+
+    /* return the (unchanged) node pointer */
+    return node;
+}
+
+/* Given a non-empty binary search tree, return the node
+   with minimum key value found in that tree. Note that the entire tree does not
+   need to be searched. */
+struct node * minValueNode(struct node* node)
+{
+    struct node* current = node;
+
+    /* loop down to find the leftmost leaf */
+    while (current->left != NULL)
+        current = current->left;
+
+    return current;
+}
+
+/* Given a binary search tree and a key, this function deletes the key
+   and returns the new root */
+struct node* deleteNode(struct node* root, int key)
+{
+    // base case
+    if (root == NULL) return root;
+
+    // If the key to be deleted is smaller than the root's key,
+    // then it lies in the left subtree
+    if (key < root->key)
+        root->left = deleteNode(root->left, key);
+
+    // If the key to be deleted is greater than the root's key,
+    // then it lies in the right subtree
+    else if (key > root->key)
+        root->right = deleteNode(root->right, key);
+
+    // If the key is the same as the root's key, then this is the node
+    // to be deleted
+    else
+    {
+        // node with only one child or no child
+        if (root->left == NULL)
+        {
+            struct node *temp = root->right;
+            free(root);
+            return temp;
+        }
+        else if (root->right == NULL)
+        {
+            struct node *temp = root->left;
+            free(root);
+            return temp;
+        }
+
+        // node with two children: get the inorder successor (smallest
+        // in the right subtree)
+        struct node* temp = minValueNode(root->right);
+
+        // Copy the inorder successor's content to this node
+        root->key = temp->key;
+
+        // Delete the inorder successor
+        root->right = deleteNode(root->right, temp->key);
+    }
+    return root;
+}
+```
+
+## Time Complexity
+
+The worst-case time complexity of the search, insert, and delete operations is $\mathcal{O}(h)$, where h is the height of the binary search tree. In the worst case, we may have to travel from the root to the deepest leaf node. The height of a skewed tree may become $N$, so the time complexity of search and insert may become $\mathcal{O}(N)$.
So building an unbalanced tree of $N$ nodes may take $\mathcal{O}(N^2)$ time (for example, when the nodes are inserted in sorted order). But with random input the expected time complexity is $\mathcal{O}(N \log N)$.
+
+However, you can use a self-balancing binary search tree (which will be taught later) to avoid this worst case. Popular data structures implementing this type of tree include:
+
+- 2-3 tree
+- AA tree
+- AVL tree
+- B-tree
+- Red-black tree
+- Scapegoat tree
+- Splay tree
+- Treap
+- Weight-balanced tree
diff --git a/docs/graph/heap.md b/docs/graph/heap.md
new file mode 100644
index 0000000..2711201
--- /dev/null
+++ b/docs/graph/heap.md
@@ -0,0 +1,138 @@
+---
+title: Heap
+tags:
+  - Heap
+  - Priority Queue
+---
+
+
+![an example max-heap](img/360px-Max-Heap.png)
+
an example max-heap with 9 nodes
+
+
+A heap is a complete binary tree with N nodes in which, for every node, the values of all the nodes in its left and right sub-trees are smaller than that node's value (in a max-heap; in a min-heap they are larger).
+
+In a heap, the highest (or lowest) priority element is always stored at the root. A heap is not a sorted structure and can be regarded as partially ordered. As visible from the heap diagram, there is no particular relationship among nodes on any given level, even among siblings. Because a heap is a complete binary tree, it has the smallest possible height: a heap with $N$ nodes has $\log N$ height. A heap is a useful data structure when you need to repeatedly remove the object with the highest (or lowest) priority.
+
+## Implementation
+
+Heaps are usually implemented in an array (fixed-size or dynamic), and do not require pointers between elements. After an element is inserted into or deleted from a heap, the heap property may be violated, and the heap must be rebalanced by internal operations.
+
+The first element of the array contains the root. The next two elements contain its children. The next four contain the four children of those two nodes, etc. Thus the children of the node at position n are at positions $2*n$ and $2*n + 1$ in a one-based array. This allows moving up or down the tree by doing simple index computations. Balancing a heap is done by sift-up or sift-down operations (swapping elements which are out of order), so we can build a heap from an array without requiring extra memory.
+
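The index computations above can be sketched directly (a minimal Python sketch; the sample values are only illustrative, chosen to form a valid max-heap):

```python
# One-based array representation of a max-heap; index 0 is unused padding.
heap = [None, 100, 19, 36, 17, 3, 25, 1, 2, 7]

def parent(n):
    return n // 2      # parent of the node at position n

def left(n):
    return 2 * n       # left child

def right(n):
    return 2 * n + 1   # right child

# The max-heap property: every node is >= its children (when they exist).
size = len(heap) - 1
for n in range(2, size + 1):
    assert heap[parent(n)] >= heap[n]
```

No pointers are stored anywhere; moving between a node and its parent or children is pure index arithmetic.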
+
+![an example of a heap stored as an array](img/Heap-as-array.png)
+
an example of a heap stored as an array
+
+
+## Insertion
+
+Basically, add the new element at the end of the heap. Then compare it with its parent; depending on whether it is a max-heap or a min-heap (it is called a max-heap when parents are always greater), swap it with the parent if they are out of order. If a swap occurred, repeat the same operation for the parent.
+
+## Deletion
+
+If you are going to delete a node (whether it is the root or another node does not matter):
+
+1. Swap the node to be deleted with the last element of the heap to maintain the complete-tree structure.
+2. Delete the last element, which is now the node we wanted to delete in the first place.
+3. Now the swapped element may be in the wrong place. To find its correct place, check its left and right children; if one of them is greater than the current node, swap it with the greatest child (or the smallest, if it is a min-heap).
+4. The current node may still be in the wrong place, so apply step 3 until it is not smaller than its children (or not greater, in a min-heap).
+
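The deletion steps above can be sketched as follows (a minimal one-based max-heap sketch in Python; the function name is illustrative, and note that deleting a non-root node may in general also require a sift-up, which this sketch omits):

```python
def delete_at(heap, i):
    """Delete the element at position i of a one-based max-heap."""
    last = len(heap) - 1
    heap[i], heap[last] = heap[last], heap[i]  # step 1: swap with last element
    heap.pop()                                 # step 2: remove it
    size = len(heap) - 1
    # steps 3-4: sift the swapped element down until the heap property holds
    while 2 * i <= size:
        child = 2 * i
        if child + 1 <= size and heap[child + 1] > heap[child]:
            child += 1                         # pick the greater child
        if heap[i] >= heap[child]:
            break
        heap[i], heap[child] = heap[child], heap[i]
        i = child
    return heap

# Deleting the root (index 1) of an example max-heap:
h = [None, 100, 19, 36, 17, 3, 25, 1, 2, 7]
delete_at(h, 1)  # h becomes [None, 36, 19, 25, 17, 3, 7, 1, 2]
```

Each iteration of the loop descends one level, so the whole operation touches at most $\log N$ nodes.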
+
+![heap deletion, step 1](img/heap1.png)
+![heap deletion, step 2](img/heap2.png)
+
an example deletion on a heap structure
+
+
+```py
+class BinHeap:
+    def __init__(self):
+        self.heapList = [0]
+        self.currentSize = 0
+
+    def percUp(self, i):
+        while i // 2 > 0:
+            if self.heapList[i] < self.heapList[i // 2]:
+                tmp = self.heapList[i // 2]
+                self.heapList[i // 2] = self.heapList[i]
+                self.heapList[i] = tmp
+            i = i // 2
+
+    def insert(self, k):
+        self.heapList.append(k)
+        self.currentSize = self.currentSize + 1
+        self.percUp(self.currentSize)
+
+    def percDown(self, i):
+        while (i * 2) <= self.currentSize:
+            mc = self.minChild(i)
+            if self.heapList[i] > self.heapList[mc]:
+                tmp = self.heapList[i]
+                self.heapList[i] = self.heapList[mc]
+                self.heapList[mc] = tmp
+            i = mc
+
+    def minChild(self, i):
+        if i * 2 + 1 > self.currentSize:
+            return i * 2
+        else:
+            if self.heapList[i * 2] < self.heapList[i * 2 + 1]:
+                return i * 2
+            else:
+                return i * 2 + 1
+
+    def delMin(self):
+        retval = self.heapList[1]
+        self.heapList[1] = self.heapList[self.currentSize]
+        self.currentSize = self.currentSize - 1
+        self.heapList.pop()
+        self.percDown(1)
+        return retval
+
+    def buildHeap(self, alist):
+        i = len(alist) // 2
+        self.currentSize = len(alist)
+        self.heapList = [0] + alist[:]
+        while (i > 0):
+            self.percDown(i)
+            i = i - 1
+
+bh = BinHeap()
+bh.buildHeap([9, 5, 6, 2, 3])
+
+print(bh.delMin())
+print(bh.delMin())
+print(bh.delMin())
+print(bh.delMin())
+print(bh.delMin())
+```
+
+## Complexity
+
+Insertion is $\mathcal{O}(\log N)$, delete-min is $\mathcal{O}(\log N)$, and finding the minimum is $\mathcal{O}(1)$. These operations depend on the heap's height, and since heaps are always complete binary trees, the height is $\log N$ (where $N$ is the number of nodes).
+
+## Priority Queue
+
+Priority queues are a type of container adaptor, specifically designed so that their first element is always the greatest of the elements they contain, according to some strict weak ordering criterion.
+
+While priority queues are often implemented with heaps, they are conceptually distinct from heaps.
A priority queue is an abstract concept like "a list" or "a map"; just as a list can be implemented with a linked list or an array, a priority queue can be implemented with a heap or a variety of other methods such as an unordered array.
+
+```cpp
+#include <iostream> // std::cout
+#include <queue>    // std::priority_queue
+using namespace std;
+
+int main() {
+    priority_queue<int> mypq;
+
+    mypq.push(30);
+    mypq.push(100);
+    mypq.push(25);
+    mypq.push(40);
+
+    cout << "Popping out elements...";
+    while (!mypq.empty()) {
+        cout << ' ' << mypq.top();
+        mypq.pop();
+    }
+    return 0;
+}
+```
diff --git a/docs/graph/img/360px-Max-Heap.png b/docs/graph/img/360px-Max-Heap.png
new file mode 100644
index 0000000..2ee3822
Binary files /dev/null and b/docs/graph/img/360px-Max-Heap.png differ
diff --git a/docs/graph/img/Heap-as-array.png b/docs/graph/img/Heap-as-array.png
new file mode 100644
index 0000000..c0de22f
Binary files /dev/null and b/docs/graph/img/Heap-as-array.png differ
diff --git a/docs/graph/img/binary-tree.png b/docs/graph/img/binary-tree.png
new file mode 100644
index 0000000..6754630
Binary files /dev/null and b/docs/graph/img/binary-tree.png differ
diff --git a/docs/graph/img/binarytree.png b/docs/graph/img/binarytree.png
new file mode 100644
index 0000000..3b3303f
Binary files /dev/null and b/docs/graph/img/binarytree.png differ
diff --git a/docs/graph/img/heap1.png b/docs/graph/img/heap1.png
new file mode 100644
index 0000000..f609e43
Binary files /dev/null and b/docs/graph/img/heap1.png differ
diff --git a/docs/graph/img/heap2.png b/docs/graph/img/heap2.png
new file mode 100644
index 0000000..03323d1
Binary files /dev/null and b/docs/graph/img/heap2.png differ
diff --git a/docs/graph/index.md b/docs/graph/index.md
index 4a76d67..d8b8c8f 100644
--- a/docs/graph/index.md
+++ b/docs/graph/index.md
@@ -9,6 +9,9 @@ title: Graph
 ### [Introduction](introduction.md)
 ### [Definitions](definitions.md)
 ### [Representing Graphs](representing-graphs.md)
+### [Tree Traversals](tree-traversals.md)
+### [Binary Search Tree](binary-search-tree.md)
+### [Heap](heap.md)
 ### [Depth First Search](depth-first-search.md)
 ### [Breadth First Search](breadth-first-search.md)
 ### [Cycle Finding](cycle-finding.md)
diff --git a/docs/graph/tree-traversals.md b/docs/graph/tree-traversals.md
new file mode 100644
index 0000000..18570f1
--- /dev/null
+++ b/docs/graph/tree-traversals.md
@@ -0,0 +1,80 @@
+---
+title: Tree Traversals
+tags:
+  - Tree
+  - Preorder
+  - Postorder
+  - Inorder
+---
+
+Tree traversal is the process of visiting every node in a tree structure exactly once for some purpose (such as reading or updating information). In a binary tree there are several defined orders of traversal; these are specific to binary trees, but they may be generalized to other trees and even to graphs.
+
+
+![a binary tree](img/binary-tree.png) +
a binary tree
+
+
+## Preorder Traversal
+
+Preorder means that a root will be evaluated before its children. In other words, the order of evaluation is Root-Left-Right.
+
+```
+Preorder Traversal
+    Look at Data
+    Traverse the left node
+    Traverse the right node
+```
+
+Example: 50 – 7 – 3 – 2 – 8 – 16 – 5 – 12 – 17 – 54 – 9 – 13
+
+## Inorder Traversal
+
+Inorder means that the left child (and all of the left child's children) will be evaluated before the root and before the right child and its children: Left-Root-Right. (By the way, in a binary search tree, inorder traversal retrieves the data in sorted order.)
+
+```
+Inorder Traversal
+    Traverse the left node
+    Look at Data
+    Traverse the right node
+```
+
+Example: 2 – 3 – 7 – 16 – 8 – 50 – 12 – 54 – 17 – 5 – 9 – 13
+
+## Postorder Traversal
+
+Postorder is the opposite of preorder: all children are evaluated before their root, i.e., Left-Right-Root.
+
+```
+Postorder Traversal
+    Traverse the left node
+    Traverse the right node
+    Look at Data
+```
+
+Example: 2 – 3 – 16 – 8 – 7 – 54 – 17 – 12 – 13 – 9 – 5 – 50
+
+## Implementation
+
+```py
+class Node:
+    def __init__(self, key):
+        self.left = None
+        self.right = None
+        self.val = key
+
+def printInorder(root):
+    if root:
+        printInorder(root.left)
+        print(root.val)
+        printInorder(root.right)
+
+def printPostorder(root):
+    if root:
+        printPostorder(root.left)
+        printPostorder(root.right)
+        print(root.val)
+
+def printPreorder(root):
+    if root:
+        print(root.val)
+        printPreorder(root.left)
+        printPreorder(root.right)
+```
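The three orders can be checked on a small tree. The sketch below is a self-contained Python variant that collects values into lists instead of printing; the tree built here is illustrative, not the one from the figure:

```python
class Node:
    def __init__(self, key):
        self.left = None
        self.right = None
        self.val = key

def inorder(root, out=None):
    # Left-Root-Right
    if out is None:
        out = []
    if root:
        inorder(root.left, out)
        out.append(root.val)
        inorder(root.right, out)
    return out

def preorder(root, out=None):
    # Root-Left-Right
    if out is None:
        out = []
    if root:
        out.append(root.val)
        preorder(root.left, out)
        preorder(root.right, out)
    return out

def postorder(root, out=None):
    # Left-Right-Root
    if out is None:
        out = []
    if root:
        postorder(root.left, out)
        postorder(root.right, out)
        out.append(root.val)
    return out

# A small binary search tree:     4
#                               /   \
#                              2     6
#                             / \   / \
#                            1   3 5   7
root = Node(4)
root.left, root.right = Node(2), Node(6)
root.left.left, root.left.right = Node(1), Node(3)
root.right.left, root.right.right = Node(5), Node(7)

inorder(root)    # -> [1, 2, 3, 4, 5, 6, 7]  (sorted, since this is a BST)
preorder(root)   # -> [4, 2, 1, 3, 6, 5, 7]
postorder(root)  # -> [1, 3, 2, 5, 7, 6, 4]
```

Since the tree here is a binary search tree, the inorder result comes out sorted, as noted above.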