CS62 - Spring 2010 - Lecture 16
Exercise 13.2a
- What does the min-heap [0, 2, 1, 3, 7, 4, 6, 8] look like as a tree?
[0], [2, 1], [3, 7, 4, 6], [8]
heaps
- a heap is a binary tree where:
- the value of a parent is less than or equal to the value of its children
- common additional restriction: the tree is complete
- recall: a complete tree is a binary tree where every level is full except possibly the last, and the leaves in the last level are filled in from left to right
- draw a binary heap
- A few other observations about binary heaps...
- the smallest value in a heap is the root node
- like binary trees, all nodes in a heap are themselves heaps
- level does NOT indicate size: a value deeper in one subtree can be smaller than a value higher up in a different subtree
representing a heap
- we could store the heap using references as we have with other binary trees
- we can also store it using an array (or ArrayList) by leveraging the fact that it is a complete tree
left(i) = 2i + 1
right(i) = 2i + 2
parent(i) = floor((i-1)/2)
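In code, the index arithmetic for this 0-indexed array layout can be sketched as follows (the class and method names here are illustrative, not from the course code):

```java
// Index arithmetic for a binary heap stored in a 0-indexed array.
public class HeapIndex {
    public static int left(int i)   { return 2 * i + 1; }
    public static int right(int i)  { return 2 * i + 2; }
    public static int parent(int i) { return (i - 1) / 2; } // integer division floors for i >= 1

    public static void main(String[] args) {
        System.out.println(left(0));   // 1
        System.out.println(right(0));  // 2
        System.out.println(parent(8)); // 3
    }
}
```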
which of the following are valid heaps?
- [1, 2, 3, 4, 5, 6, 7, 8]
- [3, 8, 9, 14, 11, 7, 10, 20, 17]
- [1, 20, 2, 30, 25, 5, 6]
- in tree form: [8]
- in tree form: [1], [4, 8], [ , , 9, 13]
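One way to sanity-check the array examples above is to verify the parent/child ordering directly; a sketch (class and method names are mine, not the course code):

```java
// Check whether a 0-indexed array satisfies the min-heap property:
// every element must be >= its parent at index (i - 1) / 2.
public class HeapCheck {
    public static boolean isValidMinHeap(int[] a) {
        for (int i = 1; i < a.length; i++) {
            if (a[i] < a[(i - 1) / 2]) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isValidMinHeap(new int[]{1, 2, 3, 4, 5, 6, 7, 8}));         // true
        System.out.println(isValidMinHeap(new int[]{3, 8, 9, 14, 11, 7, 10, 20, 17})); // false: 7 < its parent 9
        System.out.println(isValidMinHeap(new int[]{1, 20, 2, 30, 25, 5, 6}));         // true
    }
}
```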
implementing a heap
- given a valid heap, how do we add data?
- could try and start at the top and then move our way down
- this gets tricky if we try and maintain the complete tree
- another idea:
- add it to the bottom of the tree (maintaining the complete tree property)
- propagate the value up the tree swapping with its parent until it's in the appropriate location
- look at add method in ArrayListPriorityQueue class in PriorityQueue code
- what is the worst-case run-time of this second approach?
- O(height) = O(log n) since it's a complete tree
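The add-at-the-bottom-and-propagate-up approach can be sketched like this (a simplified int-only version; the course's ArrayListPriorityQueue is generic, and the names here are mine):

```java
import java.util.ArrayList;

// Min-heap supporting add: append the new value as the last leaf
// (keeping the tree complete), then swap it with its parent until
// it is no longer smaller than its parent.
public class MinHeapAdd {
    private final ArrayList<Integer> heap = new ArrayList<>();

    public void add(int value) {
        heap.add(value);                        // new leaf at the end keeps the tree complete
        int i = heap.size() - 1;
        while (i > 0 && heap.get(i) < heap.get((i - 1) / 2)) {
            int parent = (i - 1) / 2;           // propagate up: swap with parent while smaller
            int tmp = heap.get(i);
            heap.set(i, heap.get(parent));
            heap.set(parent, tmp);
            i = parent;
        }
    }

    public ArrayList<Integer> contents() { return heap; }

    public static void main(String[] args) {
        MinHeapAdd h = new MinHeapAdd();
        for (int v : new int[]{7, 4, 3, 1, 2, 6, 8}) h.add(v);
        System.out.println(h.contents()); // [1, 2, 4, 7, 3, 6, 8] -- root is the minimum
    }
}
```

Each add does at most O(height) = O(log n) swaps along the path from the new leaf to the root.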
- given a valid heap, how do we extract the minimum value?
- we know that the minimum value is right at the root
- the challenge is that we need to remove it
- one idea: remove it and then percolate values up, repeatedly filling each hole with its smallest child
- as before, the challenge here is that it's easy to end up with something that isn't a complete tree
- another idea:
- similar to adding an element, we work with the element at the end of the heap: move the last element in the tree to the root position and then propagate that value down the tree until it's in the appropriate position
- how do we do this?
- starting at the root, compare the current node to the left and right child
- if one of the children is smaller, swap the current value with the smallest child (if neither is smaller, then we're done)
- then, repeat this on the child heap that we just swapped with
- look at extractMin method in ArrayListPriorityQueue class in PriorityQueue code
- uses a method called heapify, which takes a possibly invalid heap which has both children as valid heaps and turns it into a heap
- note the recursive definition closely matches how we described the algorithm
- as an aside, I believe this way of looking at the problem is much easier to follow than the implementation in the book
- what is the worst-case run-time?
- O(height) = O(log n) since it's a complete tree
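The extractMin/heapify pair can be sketched like this (again an int-only simplification with my own names, operating on an ArrayList that already holds a valid heap):

```java
import java.util.ArrayList;
import java.util.Arrays;

// extractMin: save the root, move the last element into the root slot
// (keeping the tree complete), then heapify to push it back down.
public class MinHeapExtract {
    private final ArrayList<Integer> heap;

    public MinHeapExtract(ArrayList<Integer> validHeap) { this.heap = validHeap; }

    public int extractMin() {
        int min = heap.get(0);
        heap.set(0, heap.get(heap.size() - 1)); // last element moves to the root
        heap.remove(heap.size() - 1);
        if (!heap.isEmpty()) heapify(0);
        return min;
    }

    // Assumes both children of i are valid heaps; makes the tree rooted
    // at i a valid heap by repeatedly swapping with the smallest child.
    private void heapify(int i) {
        int smallest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < heap.size() && heap.get(l) < heap.get(smallest)) smallest = l;
        if (r < heap.size() && heap.get(r) < heap.get(smallest)) smallest = r;
        if (smallest != i) {
            int tmp = heap.get(i);
            heap.set(i, heap.get(smallest));
            heap.set(smallest, tmp);
            heapify(smallest);                  // recurse into the child heap we swapped with
        }
    }

    public static void main(String[] args) {
        ArrayList<Integer> h = new ArrayList<>(Arrays.asList(1, 2, 4, 7, 3, 6, 8));
        MinHeapExtract q = new MinHeapExtract(h);
        System.out.println(q.extractMin()); // 1
        System.out.println(q.extractMin()); // 2
    }
}
```

Note how the recursive heapify matches the description above: fix the root, then repeat on the child heap we swapped with.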
other implementations
- we could also implement this using our binary tree structure
- the methods would only change slightly, but the underlying approaches would remain the same
- not as memory efficient, so most implementations use array (or ArrayList) based approach
max vs. min heaps
- what if we wanted to store and remove the largest rather than the smallest values?
- go through and change all the '<'s in the code to '>'
- or, an easier way, is to alter the compareTo method
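The compareTo/comparator trick is exactly how you get a max-heap out of the JDK's PriorityQueue; a small demo (the Comparator-only constructor is Java 8+):

```java
import java.util.Comparator;
import java.util.PriorityQueue;

public class MaxHeapDemo {
    public static void main(String[] args) {
        // Min-heap by natural ordering:
        PriorityQueue<Integer> min = new PriorityQueue<>();
        // Max-heap: same structure, reversed comparison:
        PriorityQueue<Integer> max = new PriorityQueue<>(Comparator.reverseOrder());
        for (int v : new int[]{5, 1, 9, 3}) { min.add(v); max.add(v); }
        System.out.println(min.poll()); // 1
        System.out.println(max.poll()); // 9
    }
}
```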
binary heaps in summary
- binary heaps allow us to insert and extract the minimum values in O(log n) time
- Java provides a built-in implementation: http://java.sun.com/j2se/1.5.0/docs/api/java/util/PriorityQueue.html
given n data items, how can we build a heap?
- the easy version:
- call add n times
- what is the run-time?
- n calls to add
- each call to add is O(log n)
- O(n log n)
- can we use our heapify method?
- heapify requires that the left and right children are valid heaps
- Any way to accomplish this easily?
- single element heaps are trivially valid heaps
- basic idea:
- start with n/2 single element heaps
- slowly build up bigger and bigger heaps
- for example:
- let's say we want to build a heap from seven values in some random order, say:
- 7, 4, 3, 1, 2, 6, 8
- let (7), (4), (3), and (1) be single element heaps
- slowly build up bigger heaps:
- set 2 as the parent of (7) and (4) and call heapify:
(2 (7 4)), (3), (1)
- set 6 as the parent of (3) and (1) and call heapify:
(1 (3 6)), (2 (7 4))
- finally, set 8 as the parent of (1 (3 6)) and (2 (7 4)) and call heapify:
- (1 (3 (8 6)) (2 (7 4))), or [1, 3, 2, 8, 6, 7, 4] in array form
- look at constructor in ArrayListPriorityQueue class in PriorityQueue code
- we let the last n/2 elements in the data be the single element heaps
- work our way from n/2 to the front of the data
- since we know that everything with an index greater than the current index is the root of a valid heap, each call to heapify will generate a valid heap
- run-time?
- n/2 calls to heapify
- easy answer is O(n log n), since each call to heapify is O(log n)
- however, a lot of the early calls to heapify are not O(log n); for example, the first n/4 calls (on the parents of the leaves) are just O(1) since at most one swap can happen
- with a little bit of math, you can show that it is actually O(n) to construct a heap using this approach (follow the line of reasoning above and you'll get \sum_{i=1}^{\log n} n \cdot i / 2^{i+1}, which works out to O(n))
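The bottom-up construction can be sketched directly on the array (names mine; this works on the array layout, so the intermediate pairings differ from the worked tree example above):

```java
import java.util.Arrays;

// Bottom-up heap construction: every leaf is already a one-element heap,
// so call heapify on each internal node, from the last internal node
// (index n/2 - 1) back to the root.  O(n) overall.
public class BuildHeap {
    public static void buildMinHeap(int[] a) {
        for (int i = a.length / 2 - 1; i >= 0; i--) {
            heapify(a, i);
        }
    }

    // Assumes both subtrees of i are valid heaps; pushes a[i] down by
    // repeatedly swapping with the smallest child.
    private static void heapify(int[] a, int i) {
        int smallest = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < a.length && a[l] < a[smallest]) smallest = l;
        if (r < a.length && a[r] < a[smallest]) smallest = r;
        if (smallest != i) {
            int tmp = a[i]; a[i] = a[smallest]; a[smallest] = tmp;
            heapify(a, smallest);
        }
    }

    public static void main(String[] args) {
        int[] a = {7, 4, 3, 1, 2, 6, 8};
        buildMinHeap(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 7, 6, 8]
    }
}
```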
write a method decreasePriority(int index, E val)
- decrease the priority of the item at index to val
- val must be less than or equal to the current priority of index
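One possible solution to this exercise, as a sketch (int-only rather than generic, and the class name is mine): decreasing a value can only move it up, so we reuse the propagate-up loop from add.

```java
import java.util.ArrayList;
import java.util.Arrays;

public class DecreasePriority {
    private final ArrayList<Integer> heap;

    public DecreasePriority(ArrayList<Integer> validHeap) { this.heap = validHeap; }

    // Lower the value at index to val (val must be <= the current value),
    // then propagate it up exactly like the tail end of add.
    public void decreasePriority(int index, int val) {
        if (val > heap.get(index)) {
            throw new IllegalArgumentException("val must not increase the priority");
        }
        heap.set(index, val);
        while (index > 0 && heap.get(index) < heap.get((index - 1) / 2)) {
            int parent = (index - 1) / 2;
            int tmp = heap.get(index);
            heap.set(index, heap.get(parent));
            heap.set(parent, tmp);
            index = parent;
        }
    }

    public static void main(String[] args) {
        ArrayList<Integer> h = new ArrayList<>(Arrays.asList(1, 3, 2, 8, 6, 7, 4));
        new DecreasePriority(h).decreasePriority(3, 0);
        System.out.println(h); // [0, 1, 2, 3, 6, 7, 4]
    }
}
```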
could we use a heap to sort data?
- build a heap with our data
- call extractMin n times
- what is the runtime?
- O(n) to build the heap
- n calls to extractMin = O(n log n)
- O(n log n) overall
- this is called heap-sort and is another O(n log n) time algorithm for sorting data
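The two-step recipe above (build a heap, then extract n times) fits in a few lines; here it is using the JDK's PriorityQueue as the heap for brevity (repeated add is O(n log n); a bottom-up build would make the first phase O(n) without changing the overall bound):

```java
import java.util.Arrays;
import java.util.PriorityQueue;

public class HeapSortDemo {
    public static int[] heapSort(int[] data) {
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (int v : data) heap.add(v);        // build the heap
        int[] sorted = new int[data.length];
        for (int i = 0; i < sorted.length; i++) {
            sorted[i] = heap.poll();           // n extractMins, O(log n) each
        }
        return sorted;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(heapSort(new int[]{7, 4, 3, 1, 2, 6, 8})));
        // [1, 2, 3, 4, 6, 7, 8]
    }
}
```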
can we do better than O(n log n) for sorting?
- look at comparison-based sorting slides