CPSC 223 : https://zoo.cs.yale.edu/classes/cs223/f2022/index.html
Data Structures and Programming Techniques : http://www.cs.yale.edu/homes/aspnes/classes/223/notes.html
Heap : http://cs.yale.edu/homes/aspnes/classes/223/notes.html#heaps
A heap is a binary tree in which each element has a key (or some priority) that is less than the keys of its children. Heaps are used to implement the priority queue abstract data type, which we'll talk about first.
In a standard queue, elements leave the queue in the same order as they arrive. In a priority queue, elements leave the queue in order of decreasing priority: the DEQUEUE operation becomes a DELETE-MIN operation (or DELETE-MAX, if higher numbers mean higher priority), which removes and returns the highest-priority element of the priority queue, regardless of when it was inserted. Priority queues are often used in operating system schedulers to determine which job to run next: a high-priority job (e.g., turn on the fire suppression system) runs before a low-priority job (floss the cat) even if the low-priority job has been waiting longer.
Implementing a priority queue using an array or linked list is likely to be expensive. If the array or list is unsorted, it takes O(n) time to find the minimum element; if it is sorted, it takes O(n) time (in the worst case) to add a new element. So such implementations are only useful when the numbers of INSERT and DELETE-MIN operations are very different. For example, if DELETE-MIN is called only rarely but INSERT is called often, it may actually be cheapest to implement a priority queue as an unsorted linked list with O(1) INSERTs and O(n) DELETE-MINs. But if we expect that every element that is inserted is eventually removed, we want something for which both INSERT and DELETE-MIN are cheap operations.
A heap is a binary tree in which each node has a smaller key than its children; this property is called the heap property or heap invariant.
To insert a node in the heap, we add it as a new leaf, which may violate the heap property if the new node has a lower key than its parent. But we can restore the heap property (at least between this node and its parent) by swapping the new node with its parent, moving the node with the smaller key up. This may still leave a violation of the heap property one level up in the tree, but by continuing to swap small nodes with their parents we eventually reach the top and have a heap again. The time to complete this operation is proportional to the depth of the heap, which will typically be O(log n).
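A minimal sketch of this INSERT, using the array representation discussed later in the notes (the `Heap` struct and `heap_insert` name are illustrative, not from the notes):

```c
#include <assert.h>

#define MAX_HEAP 128

typedef struct {
    int size;
    int a[MAX_HEAP];
} Heap;

/* Add key as a new leaf, then swap it with its parent
 * until the heap property is restored. */
void heap_insert(Heap *h, int key)
{
    int i = h->size++;              /* new leaf goes at the end */
    h->a[i] = key;
    while (i > 0 && h->a[(i - 1) / 2] > h->a[i]) {
        int tmp = h->a[i];          /* child is smaller: swap up */
        h->a[i] = h->a[(i - 1) / 2];
        h->a[(i - 1) / 2] = tmp;
        i = (i - 1) / 2;            /* continue one level up */
    }
}
```

Each iteration of the loop moves the new element one level closer to the root, so the total work is bounded by the depth of the heap.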
To implement DELETE-MIN, we can easily find the value to return at the top of the heap. Unfortunately, removing it leaves a vacuum that must be filled in by some other element. The easiest way to do this is to grab a leaf (which probably has a very high key), and then float it down to where it belongs by swapping it with its smaller child at each iteration. After time proportional to the depth (again O(log n) if we are doing things right), the heap invariant is restored.
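A sketch of DELETE-MIN on an array-based min-heap (the `Heap` struct and function name are illustrative, not from the notes):

```c
#include <assert.h>

#define MAX_HEAP 128

typedef struct {
    int size;
    int a[MAX_HEAP];
} Heap;

/* Remove and return the minimum (the root). The last leaf is
 * promoted to the root and floated down by repeatedly swapping
 * it with its smaller child. */
int heap_delete_min(Heap *h)
{
    int min = h->a[0];
    h->a[0] = h->a[--h->size];      /* promote the last leaf */
    int i = 0;
    for (;;) {
        int child = 2 * i + 1;      /* left child */
        if (child >= h->size)
            break;                  /* i is a leaf: done */
        if (child + 1 < h->size && h->a[child + 1] < h->a[child])
            child++;                /* pick the smaller child */
        if (h->a[i] <= h->a[child])
            break;                  /* invariant restored */
        int tmp = h->a[i];          /* float down one level */
        h->a[i] = h->a[child];
        h->a[child] = tmp;
        i = child;
    }
    return min;
}
```

Repeated calls return elements in increasing order, which is exactly the priority-queue behavior described above.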
Similar local swapping can be used to restore the heap invariant if the priority of some element in the middle changes; we will not discuss this in detail.
It is possible to build a heap using structs and pointers, where each element points to its parent and children. In practice, heaps are instead stored in arrays, with an implicit pointer structure determined by array indices. For zero-based arrays as in C, the rule is that a node at position i has children at positions 2i + 1 and 2i + 2; in the other direction, a node at position i has a parent at position (i - 1)/2 (which rounds down in C). This is equivalent to storing a heap in an array by reading through the tree in breadth-first search order.
      0
    /   \
   1     2
  / \   / \
 3   4 5   6
becomes
0 1 2 3 4 5 6
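The index arithmetic can be written as small helper functions (the names here are my own):

```c
#include <assert.h>

/* Parent/child positions for a zero-based array heap. */
static int left_child(int i)  { return 2 * i + 1; }
static int right_child(int i) { return 2 * i + 2; }

/* C integer division truncates, which rounds down
 * for the nonnegative values that occur here. */
static int parent(int i)      { return (i - 1) / 2; }
```

Note that both children of node i map back to the same parent: (2i + 1 - 1)/2 and (2i + 2 - 1)/2 both equal i after truncation.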
This approach works best if there are no gaps in the array. So to maximize efficiency we make this "no gaps" policy part of the invariant. We can do so because we don't care which leaf gets added when we do an INSERT, so we can make it be whichever leaf is at the end of the array. Similarly, in a DELETE-MIN operation, we can promote the element at the end of the array to the root before floating it down. Both these operations change the number of elements in the array, and INSERTs in particular might force us to reallocate eventually. So in the worst case INSERT can be an expensive operation, although as with growing hash tables, the amortized cost may still be small.
If we are presented with an unsorted array, we can turn it into a heap more quickly than the O(n log n) time required to do n INSERTs. The trick is to build the heap from the bottom up. This means starting with position n-1 and working back to position 0, so that when it comes time to fix the heap invariant at position i, we have already fixed it at all later positions (this is a form of dynamic programming). Unfortunately, it is not quite enough simply to swap a[i] with its smaller child when we get there, because we could find that a[0] was the largest element in the heap. But the cost of floating a[i] down to its proper place will be proportional to its own height rather than the height of the entire heap. Since most of the elements of the heap are close to the bottom, the total cost will turn out to be O(n).
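A sketch of this bottom-up construction (the function names are illustrative; positions past n/2 - 1 are leaves and need no work, so the loop starts at the last internal node):

```c
#include <assert.h>

/* Float a[i] down within a[0..n-1] until the heap property
 * holds at i (assumes it already holds everywhere below i). */
static void float_down(int a[], int n, int i)
{
    for (;;) {
        int child = 2 * i + 1;
        if (child >= n)
            break;                  /* i is a leaf */
        if (child + 1 < n && a[child + 1] < a[child])
            child++;                /* smaller child */
        if (a[i] <= a[child])
            break;                  /* invariant holds */
        int tmp = a[i];
        a[i] = a[child];
        a[child] = tmp;
        i = child;
    }
}

/* Bottom-up heap construction: fix the invariant from the
 * last internal node back to the root; O(n) total. */
void build_heap(int a[], int n)
{
    for (int i = n / 2 - 1; i >= 0; i--)
        float_down(a, n, i);
}
```

The cost argument: a node at height h pays at most h swaps, and roughly n/2^(h+1) nodes sit at height h, so the sum over all heights is O(n).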
Heap implemented in C
#define MAX_ELEMENT 200 // maximum number of elements stored in the heap

typedef struct {
    int key;
} element;

typedef struct {
    element heap[MAX_ELEMENT];
    int heap_size;
} HeapType;

// creating a heap
HeapType heap1;
Ref)
https://gmlwjd9405.github.io/2018/05/10/data-structure-heap.html