Chapter 19 Fibonacci Heaps 19.0.1
Fibonacci heaps (1) support "mergeable heap"
operations, and (2) some of those operations
run in Theta(1) amortized time, making
Fibonacci heaps well suited for applications
that use those operations frequently.
Mergeable heaps
Mergeable heaps support these five operations:
MAKEHEAP() creates & returns a new empty heap
INSERT(H,x) inserts element x, whose key field
has been set, into a heap H.
MINIMUM(H) returns a pointer to the element in
H whose key is minimum.
EXTRACTMIN(H) deletes the element from H with
minimum key, returning a pointer to it.
UNION(H1,H2) creates and returns a new heap
containing all elements of H1 and H2.
Fibonacci heaps are mergeable heaps which
also support the two operations:
DECREASEKEY(H,x,k) assigns to element x a new
key value k, assumed to be <= current key.
DELETE(H,x) deletes element x from heap H.
Figure 19.1 19.0.2
              Binary heap    Fibonacci heap
Procedure     (worst-case)   (amortized)

MAKEHEAP      Theta(1)       Theta(1)
INSERT        Theta(lg n)    Theta(1)
MINIMUM       Theta(1)       Theta(1)
EXTRACTMIN    Theta(lg n)    O(lg n)
UNION         Theta(n)       Theta(1)
DECREASEKEY   Theta(lg n)    Theta(1)
DELETE        Theta(lg n)    O(lg n)
Figure 19.1 shows that if we don't need the
UNION operation, binary heaps are reasonable.
We merge two binary heaps by concatenating
their arrays and running BUILDMINHEAP, thus
taking Theta(n) time.
Fibonacci heaps have better time bounds for
INSERT, UNION, and DECREASEKEY, and the same
bounds for the other operations.
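The merge-by-concatenation just described can
be sketched with Python's array-based heapq
module (a sketch of ours; heapify plays the
role of BUILDMINHEAP):

```python
import heapq

def binary_heap_union(h1, h2):
    # Concatenate the two heap arrays and re-heapify;
    # heapify runs bottom-up in Theta(n) time.
    merged = h1 + h2
    heapq.heapify(merged)
    return merged

h = binary_heap_union([3, 17, 24], [7, 23])
assert h[0] == 3   # the root of a min-heap is the minimum
```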
Fibonacci heaps in theory & practice 19.0.3
From a theoretical standpoint Fibonacci heaps
are very useful when the number of EXTRACTMIN
and DELETE operations is small. For example,
some graph algorithms call DECREASEKEY on
each edge, which is useful for dense graphs.
Fast graph algorithms such as finding minimum
spanning trees and single-source shortest
paths make essential use of Fibonacci heaps.
From a practical standpoint, the constant
factors and programming complexity make
Fibonacci heaps less desirable than binary
heaps, except for special applications with
very large n. Thus Fibonacci heaps are mostly
of theoretical interest.
Binary heaps and Fibonacci heaps do not
efficiently support the SEARCH operation, so
we assume DECREASEKEY and DELETE are given
pointers to the element. As mentioned before,
each element often has a handle pointing to
other data associated with the key; and the
application using the heap has a handle that
points to the element.
Fibonacci heaps are based on rooted trees with
each element represented by a node that has a
key attribute, so from now on, we use "node"
instead of "element". We also assume that the
calling application allocates and frees nodes.
Section 19.1 defines a Fibonacci heap and its
potential function. Section 19.2 discusses
the mergeable heap operations and Section 19.3
discusses DECREASEKEY and DELETE. Section
19.4 finishes a key part of the analysis and
explains why we call them "Fibonacci heaps".
19.1 Structure of Fibonacci heaps 19.1.1
A Fibonacci heap is a collection of min-heap-
ordered trees, as shown in Figure 19.2(a):
[Figure 19.2(a): H.min points to the root 3 in
the root list 23, 7, 3, 17, 24. Node 3 has
children 18 (marked), 52, and 38; node 18 has
child 39 (marked); node 38 has child 41; node
17 has child 30; node 24 has children 26
(marked) and 46; node 26 has child 35.]
As shown in Figure 19.2(b), each node x has
a pointer x.p to its parent and x.child to any
one of its children. The children of x are
linked together in a circular, doubly linked
list, called the child list of x. Each child
y has pointers y.left and y.right. If y is an
only child, y.left = y.right = y. The order
of the siblings in a child list is arbitrary.
Circular, doubly linked lists have 19.1.2
two advantages for use in Fibonacci heaps:
1) We can insert or remove a node in O(1) time
2) Two such lists can be concatenated (or
"spliced") into one such list in O(1) time.
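A minimal Python sketch of such a list (the
names ListNode, insert_right, remove, and
splice are ours, not from the text):

```python
class ListNode:
    """A node in a circular, doubly linked list (the shape
    of Fibonacci-heap root lists and child lists)."""
    def __init__(self, key):
        self.key = key
        self.left = self    # a lone node points to itself
        self.right = self

def insert_right(a, x):
    """Insert lone node x to the right of a: O(1)."""
    x.right = a.right
    x.left = a
    a.right.left = x
    a.right = x

def remove(x):
    """Unlink x from its list: O(1)."""
    x.left.right = x.right
    x.right.left = x.left

def splice(a, b):
    """Concatenate the lists containing a and b: O(1)."""
    a_right, b_left = a.right, b.left
    a.right = b
    b.left = a
    b_left.right = a_right
    a_right.left = b_left
```

Both advantages above fall out directly: every
operation touches only a constant number of
pointers, regardless of list length.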
There are two other fields in each node x:
the number of children in the child list of x
is x.degree, and the boolean value in the
field x.mark indicates whether x has lost a
child since the last time x was made the child
of another node. A new node is unmarked, and
a node becomes unmarked whenever it is made
the child of another node (a node can only
become marked in DECREASEKEY and DELETE).
A Fibonacci heap H is accessed by a pointer
H.min to the root of the tree containing a
minimum key; this node is called the minimum
node of H. If H is empty, H.min = NIL.
The roots of all trees in a Fibonacci heap
are also linked using their left and right
pointers into a circular, doubly linked list,
called the root list, in arbitrary order.
Also we maintain the attribute H.n as the
number of nodes currently in H.
Potential function 19.1.3
For a Fibonacci heap H, we let t(H) denote
the number of trees in the root list of H, and
m(H) the number of marked nodes in H. Then
the potential of H is defined by:
Phi(H) = t(H) + 2m(H) (19.1)
(We will see why this is a good choice for the
potential in Section 19.3.) For example, the
potential of the Fibonacci heap in Figure 19.2
is 5 + 2*3 = 11. The potential of a set of
Fibonacci heaps is the sum of the potentials
of its constituent heaps. We assume that one
unit of potential is large enough to cover the
cost of any specific constant-time piece of
work that might be encountered.
We assume that a Fibonacci heap application
begins with no heaps, so that the initial
potential is 0; by equation (19.1), the
potential is nonnegative at all subsequent
times. From equation (17.3), an upper bound
on the total amortized cost is then also an
upper bound on the total actual cost of a
sequence of operations.
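Equation (19.1) is simple enough to check
directly; the numbers below are from the
Figure 19.2 example (t(H) = 5, m(H) = 3):

```python
def potential(t, m):
    # Phi(H) = t(H) + 2*m(H), equation (19.1)
    return t + 2 * m

assert potential(5, 3) == 11   # Figure 19.2 example
assert potential(0, 0) == 0    # an application with no heaps
```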
Maximum degree
In the analysis we will perform, we assume
that we know an upper bound D(n) on the
maximum degree of any node in an n-node heap.
By Exercise 19.2-3, if we use only the
mergeable-heap operations, D(n) <= floor(lg n);
Section 19.3 shows that when we support
DECREASEKEY and DELETE as well, D(n) = O(lg n).
19.2 Mergeable-heap operations 19.2.1
The mergeable-heap operations on Fibonacci
heaps delay work as long as possible. We can
insert a node into the root list in constant
time. Doing this to an empty Fibonacci heap k
times produces a root list of k nodes. Then
if we do an EXTRACTMIN, we would need to go
through the remaining k-1 nodes to find the
new minimum. Since we have to do that anyway,
we consolidate the nodes so that each root has
a unique degree, which reduces the root list
to at most D(n) + 1 roots.
Creating a new Fibonacci heap
MAKEFIBHEAP() allocates and returns an
empty heap H, with H.n = 0 and H.min = NIL.
Since t(H) = 0 and m(H) = 0, the potential of
H is 0, so that the amortized cost is equal to
its O(1) actual cost. We assume that the
calling application frees heaps that are no
longer needed.
Inserting a node
FIBHEAPINSERT(H,x) inserts node x into H
assuming x has been allocated and that x.key
has been filled in (shown in Figure 19.3).
FIBHEAPINSERT(H,x)                      19.2.2
 1  x.degree = 0
 2  x.p = NIL
 3  x.child = NIL
 4  x.mark = FALSE
 5  if H.min == NIL
 6      create a root list for H containing x
 7      H.min = x
 8  else insert x into H's root list
 9      if x.key < H.min.key
10          H.min = x
11  H.n = H.n + 1
Lines 1-4 initialize x; line 5 tests if H is
empty; if so, lines 6-7 make x a root list and
make H.min point to it; otherwise, lines 8-10
insert x into H's root list and update H.min
if needed; finally, line 11 increments H.n.
To determine the amortized cost, let H be the
original heap and H' be the resulting heap.
Then t(H') = t(H) + 1, and m(H') = m(H), so
the increase in potential is:
((t(H) + 1) + 2m(H)) - (t(H) + 2m(H)) = 1
The amortized cost is O(1) + 1 = O(1) since
the actual cost is O(1).
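A Python sketch of the insert procedure, under
the node and heap fields described in Section
19.1 (the class names are ours):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.p = self.child = None
        self.left = self.right = self  # circular sibling links
        self.degree = 0
        self.mark = False

class FibHeap:
    def __init__(self):
        self.min = None   # H.min
        self.n = 0        # H.n

def fib_heap_insert(H, x):
    """Add x to the root list: O(1) actual and amortized."""
    x.degree = 0
    x.p = x.child = None
    x.mark = False
    if H.min is None:
        x.left = x.right = x       # root list containing just x
        H.min = x
    else:                          # splice x into the root list
        x.right = H.min.right
        x.left = H.min
        H.min.right.left = x
        H.min.right = x
        if x.key < H.min.key:
            H.min = x
    H.n += 1

H = FibHeap()
for k in [23, 7, 3]:
    fib_heap_insert(H, Node(k))
assert H.min.key == 3 and H.n == 3
```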
Finding the minimum node
We can find the minimum node H.min in O(1)
actual time, and since the potential does not
change, this is the amortized cost too.
Uniting two Fibonacci heaps 19.2.3
FIBHEAPUNION(H1,H2) concatenates the root
lists of H1 and H2 and determines the new
minimum node. H1 and H2 can then be freed.
FIBHEAPUNION(H1,H2)
1  H = MAKEFIBHEAP()
2  H.min = H1.min
3  concatenate root list of H2 with that of H
4  if H1.min == NIL or
       (H2.min != NIL and H2.min.key < H1.min.key)
5      H.min = H2.min
6  H.n = H1.n + H2.n
7  return H
Lines 1-3 concatenate the root lists of H1
and H2 into a new root list in H. Lines 2 and
4-5 set the minimum node, and line 6 sets H.n.
H is returned; objects H1 and H2 can be freed.
Since t(H) = t(H1) + t(H2) and
m(H) = m(H1) + m(H2), the change in potential
is:
    Phi(H) - (Phi(H1) + Phi(H2))
      = (t(H) + 2m(H))
        - ((t(H1) + 2m(H1)) + (t(H2) + 2m(H2)))
      = 0,
so the amortized cost is equal to the O(1)
actual cost.
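The O(1) logic of FIBHEAPUNION can be modeled
in Python on a toy heap representation
(min_key, root_keys, n) of our own; the real
version splices circular lists in O(1) rather
than concatenating Python lists:

```python
def fib_heap_union(h1, h2):
    # Toy model of FIBHEAPUNION: concatenate root lists,
    # keep the smaller minimum, and add the sizes.
    min1, roots1, n1 = h1
    min2, roots2, n2 = h2
    roots = roots1 + roots2          # line 3: concatenate
    if min1 is None or (min2 is not None and min2 < min1):
        new_min = min2               # lines 4-5
    else:
        new_min = min1               # line 2
    return (new_min, roots, n1 + n2) # line 6

h = fib_heap_union((3, [23, 7, 3], 3), (17, [17, 24], 2))
assert h[0] == 3 and h[2] == 5
```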
Extracting the minimum node 19.2.4
Extracting the minimum node is the most
complicated operation; it is also where the
delayed work of consolidating the trees is
done. It assumes that pointers remaining in
the root list are updated, but that pointers
in the extracted node are left unchanged for
convenience. It uses the auxiliary procedure
CONSOLIDATE, which we shall see shortly.
FIBHEAPEXTRACTMIN(H)
 1  z = H.min
 2  if z != NIL
 3      for each child x of z
 4          add x to the root list of H
 5          x.p = NIL
 6      remove z from the root list of H
 7      if z == z.right
 8          H.min = NIL
 9      else H.min = z.right
10          CONSOLIDATE(H)
11      H.n = H.n - 1
12  return z
First, a pointer z to the minimum 19.2.5
node is saved and then returned at the end.
If z is NIL, H is empty and we are done.
Otherwise we add all z's children to the root
list and delete z from H. In line 7, if z ==
z.right, z was the only node in the root list
and it had no children, so we just make H the
empty heap. Otherwise, we set H.min to
z.right, which may not be the new minimum, but
this will be fixed when we consolidate the
heap in line 10 by the call CONSOLIDATE(H).
CONSOLIDATE(H) repeats the following steps
until all roots have different degree values.
1. Find two roots x and y in the root list of
the same degree, where x.key <= y.key.
2. Link y to x: remove y from the root list
and make y a child of x. Clear the mark on
y, if any, and increment x.degree.
CONSOLIDATE uses an auxiliary array of root
pointers A[0..D(H.n)]; if A[i] = y, then y is
currently a root with y.degree = i.
CONSOLIDATE(H)                          19.2.6
 1  let A[0..D(H.n)] be a new array
 2  for i = 0 to D(H.n)
 3      A[i] = NIL
 4  for each node w in the root list of H
 5      x = w
 6      d = x.degree
 7      while A[d] != NIL
 8          y = A[d]  // another node with the
                      // same degree as x
 9          if x.key > y.key
10              exchange x with y
11          FIBHEAPLINK(H,y,x)
12          A[d] = NIL
13          d = d + 1
14      A[d] = x
15  H.min = NIL             // empty H's root list
16  for i = 0 to D(H.n)     // rebuild H from A
17      if A[i] != NIL
18          if H.min == NIL
19              create a root list for H
                containing just A[i]
20              H.min = A[i]
21          else insert A[i] into H's root list
22              if A[i].key < H.min.key
23                  H.min = A[i]
FIBHEAPLINK(H,y,x)
 1  remove y from the root list of H
 2  make y a child of x, incrementing x.degree
 3  y.mark = FALSE
CONSOLIDATE works as follows: lines 1-3
initialize A, then lines 4-14 process each
node w in the root list. Each w ends up in a
tree rooted at some node x, which may or may
not be the same as w. Of the processed roots,
no other can have the same degree as x, and
so we set A[x.degree] to point to x. When
this for-loop terminates, at most one root of
each degree will remain, and A's entries will
point to each remaining root.
The while-loop of lines 7-13          19.2.7
repeatedly links the root x of the tree
containing w to another root with the same
degree as x, until no other root has the same
degree. This while-loop maintains the
following invariant:
At the start of each iteration, d = x.degree
The loop invariant is used as follows:
Initialization: Line 6 ensures that the loop
invariant holds when we enter the loop.
Maintenance: In each iteration, A[d] points to
a root y. Because d = x.degree = y.degree
we want to make y a child of x (switching x
and y first if x.key > y.key); the link
operation increments x.degree and we also
increment d to maintain the invariant. Note
that y is no longer a root, so we remove its
pointer from A in line 12.
Termination: We repeat the loop until A[d] is
NIL, in which case there is no other root
with the same degree as x.
After the while-loop terminates, we set A[d]
to x in line 14 and move on to the next
iteration of the for-loop.
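The degree-collision loop can be modeled in
Python on a toy tree type of our own (not the
text's node structure); the example links five
single-node trees, leaving one root per
distinct degree:

```python
class Tree:
    # Toy rooted tree: a key plus a list of child Trees.
    def __init__(self, key):
        self.key = key
        self.children = []

    @property
    def degree(self):
        return len(self.children)

def consolidate(roots):
    # Link roots of equal degree until all degrees are
    # distinct (a model of CONSOLIDATE's lines 4-14, with
    # a dict playing the role of the array A).
    A = {}
    for w in roots:
        x = w
        d = x.degree
        while d in A:               # another root of degree d
            y = A.pop(d)
            if x.key > y.key:       # keep the smaller key on top
                x, y = y, x
            x.children.append(y)    # FIBHEAPLINK: y under x
            d += 1
        A[d] = x
    return list(A.values())

roots = consolidate([Tree(k) for k in [23, 7, 21, 18, 52]])
```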
Figure 19.4(a), page 514, shows 19.2.8
a Fibonacci heap H. Figure 19.4(b) shows H
with its min removed and min's children added
to its root list. Figures 19.4(c)-(e) show H
and A for the first 3 iterations of the
for-loop, which only set entries in A, since
the degrees of those roots are distinct and A
starts with only NILs.
In the next iteration, the degree of the root
containing 7 is the same as that of the root
pointed to by A[0] (the root containing 23),
so the while-loop is entered and those roots
are linked, forming a tree of degree 1
(Figure 19.4(f)). This tree has the same
degree as that pointed to by A[1], so we stay
in the while-loop and link those trees,
forming a tree of degree 2 (Figure 19.4(g)).
The while-loop is iterated one more time,
linking the degree-2 tree with that pointed
to by A[2] (Figure 19.4(h)).
The next 2 iterations of the for-loop simply
add pointers to the roots 21 (in A[0]) and 18
(in A[1]), shown in Figures 19.4(i)-(j).
In the next iteration, the degree of  19.2.9
the root containing 52 is the same as the one
pointed to by A[0], so the while-loop is run
and they are linked, forming a degree-1 tree
with 21 at the root and 52 as the child. The
while-loop runs again, linking that degree-1
tree with the one pointed to by A[1], forming
a degree-2 tree, resulting in Figure 19.4(k).
The final iteration of the for-loop simply
sets A[1] to point to the root containing 38
(Figure 19.4(l)).
Line 15 empties the root list, and lines 16-23
reconstruct it from the array A. The final
result is shown in Figure 19.4(m).
We now show that the amortized cost of
extracting the minimum node of an n-node heap
H is O(D(n)). One contribution of O(D(n))
comes from adding at most D(n) children of
H.min to the root list in FIBHEAPEXTRACTMIN
(lines 3-5). Another Theta(D(n)) comes from
lines 2-3 and 15-23 of CONSOLIDATE.
It remains to find the contribution of lines
4-14. The size of the root list upon calling
CONSOLIDATE is at most D(n) + t(H) - 1, since
it consists of the original t(H) roots, minus
the extracted node, plus the extracted node's
children (at most D(n)). Every time through
the while-loop of lines 7-13, one of the
roots is linked to another, so the total
amount of work done by the for-loop is
proportional to D(n) + t(H), and thus the
total actual amount of work done in
extracting the minimum is O(D(n) + t(H)).
The potential before extracting the 19.2.10
minimum node is t(H) + 2m(H) and the potential
after is at most (D(n) + 1) + 2m(H), since at
most D(n) roots remain and no nodes become
marked. Thus, the amortized cost is at most:
    O(D(n) + t(H)) + ((D(n) + 1) + 2m(H))
                   - (t(H) + 2m(H))        (*)
    = O(D(n)) + O(t(H)) - t(H)
= O(D(n))
since we can scale up the potential in the
last two terms of (*), so that the t(H) from
the last term cancels the O(t(H)) from the
first term of (*). Intuitively, the cost of
performing a link is paid for by the reduction
in the potential due to reducing the number of
roots by 1. Since in Section 19.4 we show
D(n) = O(lg n), the amortized cost of
extracting the minimum node is O(lg n).
19.3 Decreasing a key and 19.3.1
deleting a node
We show how to decrease the key of a node in
O(1) amortized time, and how to delete any
node in O(D(n)) amortized time. Since D(n) is
O(lg n) (Section 19.4), FIBHEAPEXTRACTMIN &
FIBHEAPDELETE run in O(lg n) amortized time.
Decreasing a key
FIBHEAPDECREASEKEY(H,x,k)
1  if k > x.key
2      error "new key is greater than current key"
3  x.key = k
4  y = x.p
5  if y != NIL and x.key < y.key
6      CUT(H,x,y)
7      CASCADINGCUT(H,y)
8  if x.key < H.min.key
9      H.min = x
CUT(H,x,y)
1  remove x from the child list of y,
       decrementing y.degree
2  add x to the root list of H
3  x.p = NIL
4  x.mark = FALSE
CASCADINGCUT(H,y)                       19.3.2
1  z = y.p
2  if z != NIL
3      if y.mark == FALSE
4          y.mark = TRUE
5      else CUT(H,y,z)
6          CASCADINGCUT(H,z)
Lines 1-3 of FIBHEAPDECREASEKEY ensure that
the new key is <= the current key and assign
it to x. If x is a root or x.key >= y.key,
where y = x.p, no changes need to be made
since min-heap order still holds; lines 4-5
test for this. If min-heap order has been
violated, we start by cutting x, making it a
root with the call to CUT in line 6.
We also cut x's parent y if y has already
lost another child since y itself was made
the child of another node. The mark field of
y tells us to do this if it is TRUE. In
other words, if
1. at some time y was a root,
2. then y was linked to another node,
3. then two children of y were cut,
then we must also cut y. When a node is cut,
we clear its mark field (FIBHEAPLINK also
clears the mark, since linking is step 2).
But we are not done, since y.p.mark could be
TRUE and the cuts could "cascade" up the
tree. The cascading stops at a root or at a
node that had no previous child cut.
Once all cascading cuts have occurred, lines
8-9 update H.min if necessary.
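The cut/cascading-cut logic can be sketched in
Python, tracking only parent pointers and
marks (child lists and H.min maintenance are
omitted); the keys follow the text's
Figure 19.5-style chain with a marked node on
the path:

```python
class Node:
    def __init__(self, key, parent=None):
        self.key = key
        self.p = parent
        self.mark = False

def cut(roots, x):
    # Move x to the root list; cut nodes are always unmarked.
    roots.append(x)
    x.p = None
    x.mark = False

def cascading_cut(roots, y):
    z = y.p
    if z is not None:
        if not y.mark:
            y.mark = True          # first child lost: just mark y
        else:
            cut(roots, y)          # second child lost: cut y too
            cascading_cut(roots, z)

def decrease_key(roots, x, k):
    assert k <= x.key, "new key is greater than current key"
    x.key = k
    y = x.p
    if y is not None and x.key < y.key:
        cut(roots, x)
        cascading_cut(roots, y)

# chain: root 7 -> 24 -> 26 (marked) -> 35; decrease 35 to 5
r = Node(7)
a = Node(24, r)
b = Node(26, a)
b.mark = True
c = Node(35, b)
roots = [r]
decrease_key(roots, c, 5)
```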
19.3.3
Figure 19.5 (page 521) shows the results of
two calls to FIBHEAPDECREASEKEY, starting
with the heap shown in Figure 19.5(a). First,
the node with key 46 has its key decreased to
15, which involves no cascading cuts but
marks its parent with key 24, as in
Figure 19.5(b). Then the node with key 35 has
its key decreased to 5, which causes two
cascading cuts, shown in Figures 19.5(c)-(e).
To find FIBHEAPDECREASEKEY's amortized
cost, we first find its actual cost. It takes
O(1) time, plus the time to do c cascading
cuts (c >= 0), so the actual cost is O(c).
To find the change in potential, we note that
each CASCADINGCUT, except for the last one,
makes the marked node a root and clears the
mark bit. Afterward, there are t(H) + c trees
(the original t(H) trees, c1 trees produced
by cascading cuts, and the tree rooted at x),
and at most m(H)  c + 2 marked nodes (c  1
were unmarked by cascading cuts and the last
call may have marked a node). So the change
in potential is at most:
((t(H)+c) + 2(m(H)-c+2)) - (t(H) + 2m(H))
= 4 - c
Thus the amortized cost is at most
O(c) + 4 - c = O(1)
since we can scale the potential so that the
c dominates the hidden constant in O(c).
19.3.4
We can now see why the potential function
includes the term 2m(H). When a marked node
is cut, one unit of potential pays for the
cut (and the clearing of the mark bit), and
the other unit compensates for the increase
in potential due to the node becoming a root.
Deleting a node
To delete a node, we assume that there is no
key value of -infinity in the heap.
FIBHEAPDELETE(H,x)
1  FIBHEAPDECREASEKEY(H,x,-infinity)
2  FIBHEAPEXTRACTMIN(H)
The amortized cost of FIBHEAPDELETE is the
sum of the O(1) cost of FIBHEAPDECREASEKEY
and the O(D(n)) cost of FIBHEAPEXTRACTMIN.
Since Section 19.4 shows that D(n) = O(lg n),
the amortized cost of FIBHEAPDELETE is also
O(lg n).
19.4 Bounding the maximum degree 19.4.1
To establish the O(lg n) amortized costs of
FIBHEAPEXTRACTMIN and FIBHEAPDELETE, we
must show D(n) = O(lg n), where D(n) is the
upper bound on the degree of any node in a
Fibonacci heap. We can establish this bound
because a node is cut from its parent as soon
as it loses a second child. In particular, we
will show that D(n) <= floor( log_phi(n) ),
where phi = (1 + sqrt(5))/2, the golden ratio.
For any node x, we define size(x) to be the
number of nodes, including x, in the subtree
rooted at x. We shall show that size(x) is
exponential in x.degree.
Lemma 19.1
Let x be any node in a Fibonacci heap, and
suppose x.degree = k. Let y_1, y_2, ..., y_k
denote the children of x in the order in which
they were linked to x from earliest to latest.
Then y_1.degree >= 0 and y_i.degree >= i - 2
for i = 2, 3, ..., k.
Proof: Certainly y_1.degree >= 0.
For i >= 2, when y_i was linked to x, all of
y_1, y_2, ..., y_(i-1) were children of x, so
x.degree = i - 1 at that time. Node y_i is
linked to x only if y_i.degree = x.degree, so
y_i.degree was also i - 1 then. Since then,
y_i has lost at most 1 child (or else it
would have been cut), so y_i.degree >= i - 2.
19.4.2
Recall (Section 3.2) that for k = 0, 1, 2,...
the kth Fibonacci number is defined by:
        / 0                  if k = 0
  F_k = < 1                  if k = 1
        \ F_(k-1) + F_(k-2)  if k >= 2
Lemma 19.2
For all integers k >= 0,
    F_(k+2) = 1 + Sum_{i=0}^{k} F_i
Proof: By induction on k. When k = 0,
    1 + Sum_{i=0}^{0} F_i = 1 + F_0
                          = 1 + 0
                          = 1
                          = F_2
We now assume the induction hypothesis (I.H.):
    F_(k+1) = 1 + Sum_{i=0}^{k-1} F_i
Then:
    F_(k+2) = F_k + F_(k+1)
            = F_k + (1 + Sum_{i=0}^{k-1} F_i)
            = 1 + Sum_{i=0}^{k} F_i
Lemma 19.3                              19.4.3
F_(k+2) >= phi^k, where phi = (1 + sqrt(5))/2.
Proof: By induction on k.
Base cases: k = 0: F_2 = 1 = phi^0
            k = 1: F_3 = 2 > 1.619 > phi^1
Induction Hypothesis:
    F_(i+2) >= phi^i for 0 <= i <= k-1
Induction step:
    F_(k+2) = F_(k+1) + F_k
            >= phi^(k-1) + phi^(k-2)  by I.H.
            = (phi + 1) * phi^(k-2)
            = phi^2 * phi^(k-2)  by Eq. (3.23)
            = phi^k
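Lemmas 19.2 and 19.3 are easy to spot-check
numerically in Python:

```python
from math import sqrt

def fib(k):
    # F_0 = 0, F_1 = 1, F_k = F_(k-1) + F_(k-2)
    a, b = 0, 1
    for _ in range(k):
        a, b = b, a + b
    return a

phi = (1 + sqrt(5)) / 2
for k in range(20):
    # Lemma 19.2: F_(k+2) = 1 + sum of F_0..F_k
    assert fib(k + 2) == 1 + sum(fib(i) for i in range(k + 1))
    # Lemma 19.3: F_(k+2) >= phi^k
    assert fib(k + 2) >= phi ** k
```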
Lemma 19.4 19.4.4
Let x be any node in a Fibonacci heap, and let
k = x.degree. Then size(x) >= F_(k+2), and
so size(x) >= phi^k also.
Proof: Let s_k denote the minimum possible
value of size(z) over all nodes z such that
z.degree = k. Obviously s_0 = 1 and s_1 = 2
(and s_2 = 3). By definition of s_k, if
x.degree = k, then size(x) >= s_k. Also, s_k
increases (strictly) monotonically with k.
Now suppose z is a node with z.degree = k and
size(z) = s_k. Because s_k <= size(x), we
compute a lower bound on size(x) by computing
a lower bound on s_k. As in Lemma 19.1, let
y_1, y_2, ..., y_k denote the children of z
in the order they were linked to z. To bound
s_k from below, we count 1 for z itself and 1
for y_1 (for which size(y_1) >= 1), giving:
    size(x) >= s_k
            >= 2 + Sum_{i=2}^{k} s_(y_i.degree)
            >= 2 + Sum_{i=2}^{k} s_(i-2)
where the last line follows from Lemma 19.1
(so that y_i.degree >= i-2) and monotonicity
of s_k (so that s_(y_i.degree) >= s_(i-2)).
                                        19.4.5
We use induction on k to show s_k >= F_(k+2)
for all k >= 0. The base cases, k = 0 and
k = 1, hold since s_0 = 1 = F_2 and
s_1 = 2 = F_3. For the inductive step, we
assume that k >= 2 and that s_i >= F_(i+2)
for i = 0, 1, ..., k-1. So we have:
  s_k >= 2 + Sum_{i=2}^{k} s_(i-2)  (by above)
      >= 2 + Sum_{i=2}^{k} F_i  (by ind. hyp.)
      = 1 + Sum_{i=0}^{k} F_i
      = F_(k+2)  (by Lemma 19.2)
Thus, size(x) >= s_k >= F_(k+2) >= phi^k.
Corollary 19.5 D(n) = O(lg n)
Proof: Let x be a node in an nnode Fibonacci
heap, and let k = x.degree. By Lemma 19.4,
n >= size(x) >= phi^k. Taking base-phi
logarithms gives k <= log_phi(n), so the
maximum degree D(n) of any node is O(lg n).
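Corollary 19.5's bound is easy to evaluate;
for example, floor(log_phi 1000) = 14, so no
node in a 1000-node Fibonacci heap has more
than 14 children:

```python
from math import floor, log, sqrt

PHI = (1 + sqrt(5)) / 2

def max_degree_bound(n):
    # D(n) <= floor(log_phi n), Section 19.4
    return floor(log(n, PHI))

assert max_degree_bound(1000) == 14
```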