Chapter 20: Fibonacci Heaps
We saw how binomial heaps support in O(lg n)
time the mergeable-heap operations INSERT,
MINIMUM, EXTRACT-MIN, and UNION, in addition
to DECREASE-KEY and DELETE. Fibonacci heaps
support the same operations and improve the
amortized running time to O(1) for all except
EXTRACT-MIN and DELETE.
This is an advantage when there are few
EXTRACT-MIN and DELETE calls, for example in
graph algorithms that call DECREASE-KEY once
per edge in graphs with many edges. However,
the constant factors and programming
complexity make Fibonacci heaps less desirable
than ordinary binary heaps for most
applications, so they are mostly of
theoretical interest.
A Fibonacci heap, loosely based on a binomial
heap, is a collection of trees. Each tree is
an "unordered binomial tree" as long as
neither DELETE nor DECREASE-KEY is called.
Fibonacci heaps have a more relaxed structure
than binomial heaps, delaying the work of
maintaining the structure until it is
convenient; the improved asymptotic bounds are
established using the potential method.
Fibonacci heaps do not support the SEARCH
operation efficiently; they assume that the
application maintains handles pointing to heap
items, and that each heap item in turn points
to the corresponding application object.
20.1 Structure of Fibonacci heaps
A Fibonacci heap is a collection of
min-heap-ordered trees, which are not
constrained to be binomial trees, as shown in
Figure 20.1(a):
[Figure 20.1(a): min[H] points to the root
holding the minimum key, 3. The root list
contains the trees rooted at 23, 7, 3, 17, and
24; below them are the nodes 18, 52, 38, 30,
26, and 46, and below those 39, 41, and 35.
The marked nodes, 18, 26, and 39, are drawn
with double circles.]
The trees are rooted but unordered. As shown
in Figure 20.1(b) (page 478), each node x has
a pointer p[x] to its parent and child[x] to
any one of its children. The children of x
are linked together in a circular, doubly
linked list, called the child list of x. Each
child y has pointers left[y] and right[y]. If
y is an only child, left[y] = right[y] = y.
The order of the siblings in a child list is
arbitrary.
Circular, doubly linked lists have two
advantages for use in Fibonacci heaps:
1) We can remove a node in O(1) time, and
2) Two such lists can be concatenated (or
"spliced") into one such list in O(1) time.
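Both properties can be seen in a short Python sketch; the Node class and helper names below are ours, chosen only to illustrate the two O(1) list operations:

```python
class Node:
    """Minimal node of a circular, doubly linked list."""
    def __init__(self, key):
        self.key = key
        self.left = self.right = self  # a one-node circular list

def remove(x):
    """Unlink x from its list in O(1) time."""
    x.left.right = x.right
    x.right.left = x.left
    x.left = x.right = x

def splice(a, b):
    """Concatenate the circular lists containing a and b in O(1) time."""
    a_right, b_left = a.right, b.left
    a.right = b
    b.left = a
    a_right.left = b_left
    b_left.right = a_right

def to_list(a):
    """Walk the circular list starting at a (for inspection only)."""
    out, x = [a.key], a.right
    while x is not a:
        out.append(x.key)
        x = x.right
    return out
```

Note that neither operation examines more than a constant number of pointers, which is exactly why Fibonacci heaps can afford to keep root lists and child lists in this form.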
There are two other fields in each node x:
the number of children in the child list of x
is degree[x], and the boolean value in the
field mark[x] indicates whether x has lost a
child since the last time x was made the child
of another node. A new node is unmarked, and
a node becomes unmarked whenever it is made
the child of another node.
A Fibonacci heap H is accessed by a pointer
min[H] to the root of the tree containing a
minimum key; this node is called the minimum
node of H. If H is empty, min[H] = NIL.
The roots of all trees in a Fibonacci heap
are also linked using their left and right
pointers into a circular, doubly linked list,
called the root list, in arbitrary order.
We also maintain the attribute n[H], the
number of nodes currently in H.
Potential function
For a Fibonacci heap H, we let t(H) denote
the number of trees in the root list of H, and
m(H) the number of marked nodes in H. Then
the potential of H is defined by:
Phi(H) = t(H) + 2m(H) (20.1)
(We will see why this is a good choice for the
potential in Section 20.3.) For example, the
potential of the Fibonacci heap in Figure 20.1
is 5 + 2*3 = 11. The potential of a set of
Fibonacci heaps is the sum of the potentials
of its constituent heaps. We assume that one
unit of potential is large enough to cover the
cost of any of the specific constant-time
pieces of work that might be encountered.
We assume that a Fibonacci heap application
begins with no heaps, so that its potential is
0, and by equation (20.1) will be nonnegative
at all subsequent times. From equation (17.3)
an upper bound on the total amortized cost is
also an upper bound on the total actual cost
of a sequence of operations.
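Equation (20.1) is simple enough to evaluate directly; the helper below (ours, purely for illustration) checks the value claimed for the heap of Figure 20.1:

```python
def potential(t, m):
    """Phi(H) = t(H) + 2*m(H), equation (20.1)."""
    return t + 2 * m

# The heap of Figure 20.1 has t(H) = 5 trees and m(H) = 3
# marked nodes, so its potential is 5 + 2*3 = 11.
print(potential(5, 3))  # -> 11
```

Since t(H) >= 0 and m(H) >= 0, the potential of any heap is nonnegative, as the text requires.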
Maximum degree
In the analyses we will perform, we will
assume that we know an upper bound D(n) on the
maximum degree of any node in an n-node heap.
By Exercise 20.2-3, if we use only the
mergeable-heap operations, D(n) <= floor(lg n);
Section 20.4 later shows that when we have
DECREASE-KEY and DELETE as well,
D(n) = O(lg n).
20.2 Mergeable-heap operations
If we use only the mergeable-heap operations
MAKE-HEAP, INSERT, MINIMUM, EXTRACT-MIN, and
UNION, a Fibonacci heap is simply a collection
of unordered binomial trees. The unordered
binomial tree U_0 consists of a single node,
and U_k consists of two U_(k-1)'s, where one
is made _any_ child of the other.
Lemma 19.1 (page 457) also holds for unordered
binomial trees, with the following change to
property 4 (see Exercise 20.2-2, page 488):
4'. For the unordered binomial tree U_k, the
root has degree k, which is greater than
that of any other node. The children of
the root are roots of subtrees U_0, U_1,...
U_(k-1), in _some_ order.
Thus, if an n-node Fibonacci heap is made up
of unordered binomial trees, D(n) <= floor(lg n).
The key idea in the mergeable-heap operations
on Fibonacci heaps is to delay structural
maintenance work as long as possible. If t(H)
is small, then we can quickly find the new
minimum node in an EXTRACT-MIN operation. But
as Exercise 19.2-10 shows for binomial heaps,
we pay a price for ensuring that t(H) is
small: it can take up to Omega(lg n) time to
insert a node or unite two binomial heaps. We
do not try to consolidate the trees in a
Fibonacci heap when we insert a node or unite
two heaps; we do it in EXTRACT-MIN, when we
remove the minimum.
Creating a new Fibonacci heap
MAKE-FIB-HEAP() allocates and returns an
empty heap H, with n[H] = 0 and min[H] = NIL.
Since t(H) = 0 and m(H) = 0, the potential of
H is 0, so that the amortized cost is equal to
its O(1) actual cost.
Inserting a node
FIB-HEAP-INSERT(H,x) inserts node x into H,
assuming x has been allocated and that key[x]
has been filled in.
FIB-HEAP-INSERT(H,x)
 1 degree[x] <- 0
 2 p[x] <- NIL
 3 child[x] <- NIL
 4 left[x] <- x
 5 right[x] <- x
 6 mark[x] <- FALSE
 7 concatenate the node x with H's root list
 8 if min[H] = NIL or key[x] < key[min[H]]
 9    then min[H] <- x
10 n[H] <- n[H] + 1
Lines 1-6 initialize x, making it into a
one-node circular list, which line 7 splices
into H's root list in O(1) actual time.
Lines 8-9 update min[H] if needed, and line 10
increments n[H]. Unlike BINOMIAL-HEAP-INSERT,
it makes no attempt at consolidation.
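For concreteness, the node fields of Section 20.1 and the insertion just described can be sketched in Python. The class names are ours, and this is an illustration of the splice into the root list, not the book's code:

```python
NIL = None

class FibNode:
    """One heap item with the fields of Section 20.1."""
    def __init__(self, key):
        self.key = key
        self.degree = 0
        self.p = NIL
        self.child = NIL
        self.mark = False
        self.left = self.right = self   # one-node circular list

class FibHeap:
    def __init__(self):
        self.min = NIL                  # min[H]
        self.n = 0                      # n[H]

    def insert(self, x):
        """FIB-HEAP-INSERT: splice x into the root list in O(1)."""
        if self.min is NIL:
            self.min = x                # x becomes the whole root list
        else:
            # splice the one-node list x next to min[H]
            x.right = self.min.right
            x.left = self.min
            self.min.right.left = x
            self.min.right = x
            if x.key < self.min.key:
                self.min = x
        self.n += 1
```

No consolidation is attempted; the root list simply grows by one tree, exactly as in the pseudocode.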
To determine the amortized cost, let H be the
original heap and H' be the resulting heap.
Then t(H') = t(H) + 1 and m(H') = m(H), so the
increase in potential is:
((t(H) + 1) + 2m(H)) - (t(H) + 2m(H)) = 1
The amortized cost is O(1) + 1 = O(1), since
the actual cost is O(1).
Finding the minimum node
We can find the minimum node min[H] in O(1)
actual time, and since the potential does not
change, this is the amortized cost too.
Uniting two Fibonacci heaps
FIB-HEAP-UNION(H1,H2) concatenates the root
lists of H1 and H2, destroying them both, and
determines the new minimum node.
FIB-HEAP-UNION(H1,H2)
1 H <- MAKE-FIB-HEAP()
2 min[H] <- min[H1]
3 concatenate root list of H2 with that of H
4 if (min[H1] = NIL) or
     (min[H2] not = NIL and
      key[min[H2]] < key[min[H1]])
5    then min[H] <- min[H2]
6 n[H] <- n[H1] + n[H2]
7 free the objects H1 and H2
8 return H
Lines 1-3 concatenate the root lists of H1
and H2 into the root list of the new heap H.
Lines 2, 4, and 5 compute the minimum node,
and line 6 sets n[H]. Objects H1 and H2 are
then freed, and H is returned. Again, no
consolidation occurs.
Since t(H) = t(H1) + t(H2) and
m(H) = m(H1) + m(H2), the change in potential
is:
Phi(H) - (Phi(H1) + Phi(H2))
 = (t(H)+2m(H)) -
   ( (t(H1)+2m(H1)) + (t(H2)+2m(H2)) )
 = 0,
so the amortized cost is equal to the O(1)
actual cost.
Extracting the minimum node
Extracting the minimum node is the most
complicated operation; it is also where the
delayed work of consolidating the trees is
done. It assumes that pointers remaining in
the root list are updated, but that pointers
in the extracted node are left unchanged for
convenience. It uses the auxiliary procedure
CONSOLIDATE, which we shall see shortly.
FIB-HEAP-EXTRACT-MIN(H)
 1 z <- min[H]
 2 if z not = NIL
 3    then for each child x of z
 4             do add x to the root list of H
 5                p[x] <- NIL
 6         remove z from the root list of H
 7         if z = right[z]
 8            then min[H] <- NIL
 9            else min[H] <- right[z]
10                 CONSOLIDATE(H)
11         n[H] <- n[H] - 1
12 return z
First, a pointer z to the minimum node is
saved and then returned at the end. If z is
NIL, H is empty and we are done. Otherwise we
add all z's children to the root list and
delete z from H. In line 7 if z = right[z], z
was the only node in the root list and it had
no children, so we just make H the empty heap.
Otherwise, we set min[H] to right[z], which
may not be the new minimum, but this will be
fixed when the heap is consolidated in line 10
by the call CONSOLIDATE(H).
CONSOLIDATE(H) repeats the following steps
until all roots have different degree values.
1. Find two roots x and y in the root list of
the same degree, where key[x] <= key[y].
2. Link y to x: remove y from the root list
and make y a child of x. Clear the mark on
y, if any, and increment degree[x].
CONSOLIDATE uses an auxiliary array of root
pointers A[0..D(n[H])]; if A[i] = y, then y is
currently a root with degree[y] = i.
CONSOLIDATE(H)
 1 for i <- 0 to D(n[H])
 2     do A[i] <- NIL
 3 for each node w in the root list of H
 4     do x <- w
 5        d <- degree[x]
 6        while A[d] not = NIL
 7            do y <- A[d]   > Another node with
                             > same degree as x.
 8               if key[x] > key[y]
 9                  then exchange x <-> y
10               FIB-HEAP-LINK(H,y,x)
11               A[d] <- NIL
12               d <- d + 1
13        A[d] <- x
14 min[H] <- NIL            > Empty H's root list
15 for i <- 0 to D(n[H])    > and rebuild it from A.
16     do if A[i] not = NIL
17           then add A[i] to the root list of H
18                if min[H] = NIL or
                     key[A[i]] < key[min[H]]
19                   then min[H] <- A[i]
FIB-HEAP-LINK(H,y,x)
1 remove y from the root list of H
2 make y a child of x, incrementing degree[x]
3 mark[y] <- FALSE
CONSOLIDATE works as follows. After
initializing A, lines 3-13 process each root w
in the root list. After processing w, it ends
up in a tree rooted at some node x, which may
or may not be the same as w. Of the processed
roots, no others will have the same degree as
x, and so we set A[degree[x]] to point to x.
When this for loop terminates, at most one
root of each degree will remain, and A's
entries will point to each remaining root.
The while loop of lines 6-12 repeatedly links
the root x of the tree containing w to another
root with the same degree as x, until no other
root has the same degree. This while loop
maintains the following invariant:
At the start of each iteration, d = degree[x]
The loop invariant is used as follows:
Initialization: Line 5 ensures that the loop
invariant holds when we enter the loop.
Maintenance: In each iteration, A[d] points to
a root y. Because d = degree[x] = degree[y]
we want to make y a child of x (switching x
and y first if key[x] > key[y]); the link
operation increments degree[x] and we also
increment d to maintain the invariant. Note
that y is no longer a root, so we remove its
pointer from A in line 11.
Termination: We repeat the loop until A[d] is
NIL, in which case there is no other root
with the same degree as x.
After the while loop terminates, we set A[d]
to x in line 13 and do another iteration of
the for loop.
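The linking loop of lines 3-13 can be seen in isolation on a simplified representation, where each tree is a (key, children) pair and A is a dictionary keyed by degree. This representation is ours, chosen only to mirror the loop's invariant d = degree[x], and is not the book's pointer-based code:

```python
def link(x, y):
    """Make the tree with the larger root key a child of the other."""
    if x[0] > y[0]:
        x, y = y, x
    key, children = x
    return (key, children + [y])

def consolidate(roots):
    """Link roots of equal degree until all degrees are distinct
    (the while loop of CONSOLIDATE, with A as a dict)."""
    A = {}
    for x in roots:
        d = len(x[1])                  # d = degree[x], the invariant
        while d in A:
            x = link(x, A.pop(d))      # another root with same degree
            d += 1                     # degree[x] grew by 1
        A[d] = x
    return sorted(A.values())          # the new root list
```

Because every linking removes one entry from A and raises d by one, each root is inserted into A exactly once with a degree no other stored root shares.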
Figure 20.3(a), page 484, shows a Fibonacci
heap H. Figure 20.3(b) shows H with its
minimum removed and the minimum's children
added to the root list. Figures 20.3(c)-(e)
show H and A for the first 3 iterations of the
for loop, which only set entries in A, since
the degrees of the roots are different and A
starts with only NILs.
In the next iteration, the degree of the root
containing 7 is the same as that of the root
pointed to by A[0] (the root containing 23),
so the while loop is entered and those roots
are linked, forming a tree of degree 1
(Figure 20.3(f)). This tree has the same
degree as that pointed to by A[1], so we stay
in the while loop and link those trees,
forming a tree of degree 2 (Figure 20.3(g)).
The while loop is iterated one more time,
linking the degree-2 tree with that pointed to
by A[2] (Figure 20.3(h)).
The next 2 iterations of the for loop simply
add pointers to the roots 21 (in A[0]) and 18
(in A[1]), shown in Figures 20.3(i)-(j).
In the next iteration, the degree of the root
containing 52 is the same as that of the root
pointed to by A[0], so the while loop is run
and they are linked, forming a degree-1 tree
with 21 at the root and 52 as the child. The
while loop runs again, linking that degree-1
tree with the one pointed to by A[1], forming
a degree-2 tree, resulting in Figure 20.3(k).
The final iteration of the for loop simply
sets A[1] to point to the root containing 38
(Figure 20.3(l)).
Line 14 empties the root list, and lines 15-19
reconstruct it from the array A. The final
result is shown in Figure 20.3(m).
Note that if, before FIB-HEAP-EXTRACT-MIN is
called, all the trees in the heap are
unordered binomial trees, then they all are
afterward as well. This is because: 1) the
children of the minimum node are all unordered
binomial trees, and 2) CONSOLIDATE only links
two trees of the same degree, and if both are
U_k trees, the resulting tree is a U_(k+1).
We now show that the amortized cost of
extracting the minimum node of an n-node heap
H is O(D(n)). One contribution of O(D(n))
comes from adding the at most D(n) children of
min[H] to the root list in lines 3-5 of
FIB-HEAP-EXTRACT-MIN. Another O(D(n)) comes
from lines 1-2 and 14-19 of CONSOLIDATE.
It remains to account for the contribution of
lines 3-13. The size of the root list upon
calling CONSOLIDATE is at most
D(n) + t(H) - 1, since it consists of the
original t(H) roots, minus the extracted node,
plus the extracted node's children (at most
D(n) of them). Every time through the while
loop of lines 6-12, one of the roots is linked
to another, so the total amount of work done
by the for loop is proportional to
D(n) + t(H), and thus the total actual amount
of work done in extracting the minimum is
O(D(n) + t(H)).
The potential before extracting the minimum
node is t(H) + 2m(H), and the potential
afterward is at most (D(n) + 1) + 2m(H), since
at most D(n) + 1 roots remain and no nodes
become marked during the operation.
Thus, the amortized cost is at most:
O(D(n) + t(H)) + ((D(n) + 1) + 2m(H))
               - (t(H) + 2m(H))        (*)
 = O(D(n)) + O(t(H)) - t(H)
 = O(D(n))
since we can scale up the units of potential
in the last two terms of (*), so that the t(H)
from the last term cancels the O(t(H)) from
the first term of (*). Intuitively, the cost
of performing each link is paid for by the
reduction in potential due to reducing the
number of roots by 1. Since Section 20.4
shows that D(n) = O(lg n), the amortized cost
of extracting the minimum node is O(lg n).
20.3 Decreasing a key and deleting a node
We show how to decrease the key of a node in
O(1) amortized time, and how to delete any
node in O(D(n)) amortized time. These two
operations do not preserve the property that
all trees in the heap are unordered binomial
trees, but the trees remain close enough to
binomial that we can bound D(n) by O(lg n)
(done in Section 20.4). This in turn implies
that FIB-HEAP-EXTRACT-MIN and FIB-HEAP-DELETE
run in O(lg n) amortized time.
Decreasing a key
FIB-HEAP-DECREASE-KEY(H,x,k)
1 if k > key[x]
2    then error "new key is greater than current key"
3 key[x] <- k
4 y <- p[x]
5 if y not = NIL and key[x] < key[y]
6    then CUT(H,x,y)
7         CASCADING-CUT(H,y)
8 if key[x] < key[min[H]]
9    then min[H] <- x
CUT(H,x,y)
1 remove x from the child list of y,
  decrementing degree[y]
2 add x to the root list of H
3 p[x] <- NIL
4 mark[x] <- FALSE
CASCADING-CUT(H,y)
1 z <- p[y]
2 if z not = NIL
3    then if mark[y] = FALSE
4            then mark[y] <- TRUE
5            else CUT(H,y,z)
6                 CASCADING-CUT(H,z)
Lines 1-3 of FIB-HEAP-DECREASE-KEY ensure
that the new key is <= the current key and
assign it to x. If x is a root or
key[x] >= key[y], where y = p[x], no
structural changes need to be made, since
min-heap order still holds; lines 4-5 test for
this. If min-heap order has been violated, we
start by cutting x, making it a root with the
call to CUT in line 6.
We must also cut x's parent, y, if y has
already lost another child since y itself was
made the child of another node. The mark
field of y tells us to do this if it is TRUE.
In other words, if
1. at some time y was a root,
2. then y was linked to another node,
3. then two children of y were cut,
then we must also cut y. When a node is cut,
we clear its mark field (FIB-HEAP-LINK also
clears the mark, since linking returns a node
to step 2 of this history).
But we may not be done, since mark[p[y]] could
be TRUE, and so the cuts can "cascade" up the
tree. The cascade stops at a root, or at a
node that has not previously lost a child.
Once all cascading cuts have occurred, lines
8-9 update min[H] if necessary.
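The cut-and-cascade logic above can be sketched in Python on a simplified node type. Here the children are kept in a plain Python list rather than a circular list, and all names are ours; the sketch shows only the mark discipline, not the full heap:

```python
class Node:
    """Simplified node: parent pointer, child list, mark bit."""
    def __init__(self, key):
        self.key = key
        self.p = None
        self.children = []
        self.mark = False

def cut(roots, x, y):
    """CUT: move x from y's child list to the root list, unmarked."""
    y.children.remove(x)
    x.p = None
    x.mark = False
    roots.append(x)

def cascading_cut(roots, y):
    """CASCADING-CUT: mark y, or cut it and recurse if already marked."""
    z = y.p
    if z is not None:
        if not y.mark:
            y.mark = True        # y has now lost one child
        else:
            cut(roots, y, z)     # y lost a second child: cut it too
            cascading_cut(roots, z)
```

Starting from a chain of marked ancestors, one CUT triggers a cascade that stops at the first unmarked node or at a root, exactly as the text describes.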
Figure 20.4 (page 491) shows the results of
two calls to FIB-HEAP-DECREASE-KEY, starting
with the heap shown in Figure 20.4(a). First,
the node with key 46 has its key decreased to
15, which involves no cascading cuts but marks
its parent, the node with key 24, as in
Figure 20.4(b). Then the node with key 35 has
its key decreased to 5, which causes two
cascading cuts, shown in Figures 20.4(c)-(e).
To find FIB-HEAP-DECREASE-KEY's amortized
cost, we first find its actual cost. It takes
O(1) time, plus the time to perform c
cascading cuts (c >= 0), so the actual cost is
O(c).
To find the change in potential, we note that
each call to CASCADING-CUT, except for the
last one, cuts a marked node and clears its
mark bit. Afterward, there are t(H) + c trees
(the original t(H) trees, c - 1 trees produced
by cascading cuts, and the tree rooted at x),
and at most m(H) - c + 2 marked nodes (c - 1
were unmarked by cascading cuts and the last
call may have marked a node). So the change
in potential is at most:
((t(H)+c) + 2(m(H)-c+2)) - (t(H) + 2m(H))
 = 4 - c
Thus the amortized cost is at most
O(c) + 4 - c = O(1),
since we can scale up the units of potential
so that the c dominates the constant hidden in
O(c).
We can now see why the potential function
includes the term 2m(H). When a marked node
is cut, one unit of potential pays for the cut
(and the clearing of the mark bit), and the
other unit compensates for the increase in
potential due to the node becoming a new root.
Deleting a node
Deleting a node is done in the same way as it
is done in binomial heaps. We assume that
there is no key of -infinity currently in the
heap.
FIB-HEAP-DELETE(H,x)
1 FIB-HEAP-DECREASE-KEY(H,x,-infinity)
2 FIB-HEAP-EXTRACT-MIN(H)
The amortized cost of FIB-HEAP-DELETE is the
sum of the O(1) cost of FIB-HEAP-DECREASE-KEY
and the O(D(n)) cost of FIB-HEAP-EXTRACT-MIN.
Since Section 20.4 shows that D(n) = O(lg n),
the amortized cost of FIB-HEAP-DELETE is also
O(lg n).
20.4 Bounding the maximum degree
To establish the O(lg n) amortized costs of
FIB-HEAP-EXTRACT-MIN and FIB-HEAP-DELETE, we
must show D(n) = O(lg n), where D(n) is the
upper bound on the degree of any node in an
n-node Fibonacci heap. We can establish this
bound because a node is cut from its parent as
soon as it loses a second child. In
particular, we will show that
D(n) <= floor( log_phi(n) ), where
phi = (1 + sqrt(5))/2, the golden ratio.
For any node x, we define size(x) to be the
number of nodes, including x, in the subtree
rooted at x. We shall show that size(x) is
exponential in degree[x].
Lemma 20.1
Let x be any node in a Fibonacci heap, and
suppose degree[x] = k. Let y_1, y_2, ..., y_k
denote the children of x in the order in which
they were linked to x from earliest to latest.
Then degree[y_1] >= 0 and degree[y_i] >= i - 2
for i = 2, 3, ..., k.
Proof: Certainly degree[y_1] >= 0.
For i > 1, when y_i was linked to x, all of
y_1, y_2, ..., y_(i-1) were children of x, so
degree[x] = i - 1 at that time. Node y_i is
linked to x only if degree[y_i] = degree[x],
so degree[y_i] was also i - 1 at that time.
Since then, y_i has lost at most 1 child (or
else it would have been cut), so
degree[y_i] >= i - 2.
Recall (Section 3.2) that for k = 0, 1, 2,...
the kth Fibonacci number is defined by:
        / 0                    if k = 0
F_k =  <  1                    if k = 1
        \ F_(k-1) + F_(k-2)    if k >= 2
Lemma 20.2
For all integers k >= 0,
    F_(k+2) = 1 + Sum_(i=0..k) F_i
Proof: By induction on k. When k = 0,
    1 + Sum_(i=0..0) F_i = 1 + F_0
                         = 1 + 0
                         = 1
                         = F_2
We now assume the induction hypothesis (I.H.):
    F_(k+1) = 1 + Sum_(i=0..k-1) F_i
Then:
    F_(k+2) = F_k + F_(k+1)
            = F_k + (1 + Sum_(i=0..k-1) F_i)
            = 1 + Sum_(i=0..k) F_i
Recall (Exercise 3.2-7, page 57) that
F_(k+2) >= phi^k, where phi = (1 + sqrt(5))/2.
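Both facts are easy to spot-check numerically; the sketch below (our own helper, not from the text) verifies Lemma 20.2 and the bound F_(k+2) >= phi^k for small k:

```python
from math import sqrt

def fib(k):
    """F_0 = 0, F_1 = 1, F_k = F_(k-1) + F_(k-2)."""
    a, b = 0, 1
    for _ in range(k):
        a, b = b, a + b
    return a

phi = (1 + sqrt(5)) / 2

for k in range(20):
    # Lemma 20.2: F_(k+2) = 1 + (F_0 + F_1 + ... + F_k)
    assert fib(k + 2) == 1 + sum(fib(i) for i in range(k + 1))
    # Exercise 3.2-7: F_(k+2) >= phi^k
    assert fib(k + 2) >= phi ** k
```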
Lemma 20.3
Let x be any node in a Fibonacci heap, and let
k = degree[x]. Then size(x) >= F_(k+2), and
so size(x) >= phi^k also.
Proof: Let s_k denote the minimum possible
value of size(z) over all nodes z such that
degree[z] = k. Trivially, s_0 = 1, s_1 = 2,
and s_2 = 3. By the definition of s_k, if
degree[x] = k, then size(x) >= s_k. Also, s_k
increases (strictly) monotonically with k.
As in Lemma 20.1, let y_1, y_2, ..., y_k
denote the children of x in the order in which
they were linked to x. To compute a lower
bound on size(x), we count 1 for x itself and
1 for y_1 (for which size(y_1) >= 1), giving:
    size(x) >= 2 + Sum_(i=2..k) s_degree[y_i]
            >= 2 + Sum_(i=2..k) s_(i-2)
where the last line follows from Lemma 20.1
(so that degree[y_i] >= i - 2) and the
monotonicity of s_k (so that
s_degree[y_i] >= s_(i-2) ).
We use induction on k to show s_k >= F_(k+2)
for k >= 0. The bases, for k = 0 and k = 1,
hold since s_0 = 1 = F_2 and s_1 = 2 = F_3.
For the inductive step, we assume that k >= 2
and that s_i >= F_(i+2) for i = 0, 1,..., k-1.
So we have:
    s_k >= 2 + Sum_(i=2..k) s_(i-2)  (by above)
        >= 2 + Sum_(i=2..k) F_i      (by I.H.)
         = 1 + Sum_(i=0..k) F_i
         = F_(k+2)                   (by Lemma 20.2)
Thus, size(x) >= s_k >= F_(k+2) >= phi^k.
Corollary 20.4: D(n) = O(lg n)
Proof: Let x be a node in an n-node Fibonacci
heap, and let k = degree[x]. By Lemma 20.3,
n >= size(x) >= phi^k. Taking base-phi
logarithms gives log_phi(n) >= k, so the
maximum degree D(n) of any node is O(lg n).
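The corollary's bound is concrete enough to compute directly. The sketch below (the function name is ours) compares floor(log_phi n) with the floor(lg n) bound that applies when only the mergeable-heap operations are used:

```python
from math import floor, log, sqrt

phi = (1 + sqrt(5)) / 2

def max_degree_bound(n):
    """D(n) <= floor(log_phi n) for an n-node Fibonacci heap."""
    return floor(log(n, phi))

# log_phi n is about 1.44 lg n, so the bound is still O(lg n):
print(max_degree_bound(1000))   # -> 14
print(floor(log(1000, 2)))      # -> 9
```

The base-phi bound is weaker by only a constant factor, which is all the amortized analysis requires.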