Trees are extremely important in computer science.
We can represent a tree as a construction consisting of nodes, and edges which each represent a relationship between two nodes.
The node which is at the top (if such a node exists) is termed the root node. The nodes to which it is joined by an edge are its children, and it is the parent of those nodes. Each of these children, in turn, may have one, many, or no children; however, each node always has exactly one parent (except, of course, the root node). Children who share the same parent are called siblings. A node with no children is referred to as a leaf. Nodes which are the same distance from the root (i.e., which have the same number of edges lying between them and the root) are at the same level. The height of the tree is the maximum distance between the root and any node. The degree of a node is the number of its children.
A free tree is a connected, acyclic graph.
It has no root. It is connected in that any node in the graph can be reached from any other node by exactly one path. It does not contain any cycles (a.k.a. circuits, or closed paths), which would imply the existence of more than one path between two nodes.
|
An ordered tree consists of a root node and any number of children, which are ordered from "oldest" to "youngest", hence the name. Each of the children may be the root of a sub-tree.
A position tree superficially resembles an ordered tree. However, the children are identified not by their age, but by their position. For example, two ordered trees consisting of only two nodes are identical: they both consist of a root and an oldest child. Two two-node position trees, on the other hand, may be quite different. They both have roots, but one may have a child in position #1, and the other a child in position #2.
In general, a k-ary position tree is a position tree with a branch factor of k. The most important of these are binary trees (k=2).
A binary tree consists of a root with no children, or a root with one left child, or a root with one right child, or a root with both a left and a right child. Each of these children in turn is the root of a binary tree.
A one-to-one correspondence can be made between any ordered tree of n nodes and a binary tree of n-1 nodes.
I.e., there exists a unique mapping from an ordered tree to a binary tree. The siblings in the original tree become a "linked list" of right children in the corresponding binary tree. Note, however, that in this mapping the first node can never have a right child, because that would imply that the root of the original ordered tree had a sibling! So the mapping is between an ordered tree of n nodes and a binary tree of n-1 nodes.
Further, this method of mapping can be extended to an ordered forest of ordered rooted trees with a total number of nodes equal to n.
This connection is important for the following reason. An ordered rooted tree with an unknown, and possibly very large, branching factor can create problems in implementation and storage (e.g., an array cannot be used since the branch factor is unknown). But the fact that this type of tree can be mapped to a binary tree means that the problem is reduced to that of implementing a binary tree.
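A sketch of this reduction in C (struct and function names are my own): the "oldest child" and "next sibling" pointers of the ordered tree serve directly as the left and right pointers of the corresponding binary tree, so any ordered tree can be stored and traversed using only two pointers per node:

```c
#include <stddef.h>
#include <string.h>

/* One node of an ordered tree stored in "left-child, right-sibling" form:
   'oldestchild' plays the role of the binary tree's left child and
   'nextsibling' the role of its right child. */
struct lcrs_node {
    char key;
    struct lcrs_node *oldestchild;
    struct lcrs_node *nextsibling;
};

/* Append the keys of the ordered tree rooted at t to 'out' in pre-order,
   walking only the two binary-tree pointers. */
void lcrs_preorder(const struct lcrs_node *t, char *out)
{
    if (t == NULL)
        return;
    size_t n = strlen(out);
    out[n] = t->key;
    out[n + 1] = '\0';
    lcrs_preorder(t->oldestchild, out);   /* the node's children ...       */
    lcrs_preorder(t->nextsibling, out);   /* ... then its younger siblings */
}
```

Pre-order on the two-pointer encoding visits the nodes in the same order as pre-order on the original ordered tree, which is one way to see that no information is lost in the mapping.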
A complete tree is a k-ary position tree in which all levels are filled from left to right (so the only possible partially-filled level is the last level). Each full level i contains k^i nodes, counting the root as level 0. The following is a complete binary (k=2) tree.
The great advantage of a complete tree is that pointers are not required to implement its structure: the position of a node is implicit in the structure. Complete trees are, in addition, the "flattest" type of tree, i.e., they have the largest node-to-height ratio, where "height" is the maximum distance (number of edges traversed) between the root and any node in the tree. Height is a property of a tree, not of a node.
In a complete tree there is a connection between the height h and the number of nodes n. For k=2 (a complete binary tree) we have 2^i nodes at each full level i. Noting that the total number of nodes is the sum of the number of nodes at each level:

    2^h <= n <= 2^0 + 2^1 + ... + 2^h = 2^(h+1) - 1

So for a complete binary tree of size n, h is approximately log2 n:
| n | h (approx.) |
|---|---|
| 1 000 | 10 |
| 1 000 000 | 20 |
| 1 000 000 000 | 30 |
We can access any element in a collection of one billion nodes in a maximum of 30 steps. Not too shabby.
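The heights in the table are rounded; the exact values can be checked with a few lines of C (a quick sketch, the function name is mine), since repeated integer halving computes exactly floor(log2 n):

```c
/* Height of a complete binary tree on n >= 1 nodes: floor(log2 n),
   computed by repeated halving. */
unsigned height(unsigned long n)
{
    unsigned h = 0;
    while (n > 1) {
        n /= 2;    /* discard the deepest level's worth of precision */
        h++;
    }
    return h;
}
```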
Exercise:
Prove that for k=2, h=floor(log2 n).
The most common implementation is as a linked structure. Here is an example of a standard implementation for a binary tree (in C):
struct node
{
    datatype key;
    struct node *parent,      /* 'parent' is optional */
                *leftchild,
                *rightchild;
    /* other pointers as required */
};

typedef struct node *tree;
This definition could also be used to implement a rooted ordered tree, by
replacing struct node *leftchild with struct node
*oldestchild and replacing struct node *rightchild with
struct node *nextsibling.
Note that a tree is simply a pointer to a node. The above example is of an endogenous tree, i.e., the data is held in the node itself. For an exogenous tree, the line datatype key; would become datatype *key;. An exogenous implementation is often preferable: since the structure of a tree is often altered in the course of an algorithm, the physical moving about of data is avoided.
A binary complete tree lends itself well to array implementation:
![array2.gif](images/array2.gif)
The root is the first item in the array, its left child is the second, right child is the third, and so on. The position of an element in the tree is implied by its position in the array: it is an implicit data structure.
Formulae for accessing node l (root at index 1):

| To find the parent: | floor(l/2) |
| To find the left child: | 2l |
| To find the right child: | 2l + 1 |
| To determine if l is a leaf: | 2l > n (if true, then l is a leaf) |
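These formulae translate directly into C; a minimal sketch (function names are mine, with the root at index 1 as in the table):

```c
/* Implicit complete binary tree stored in an array, root at index 1. */
unsigned parent(unsigned l)     { return l / 2; }       /* floor(l/2) */
unsigned leftchild(unsigned l)  { return 2 * l; }
unsigned rightchild(unsigned l) { return 2 * l + 1; }

/* A node is a leaf when its would-be left child falls outside the tree
   of n nodes. */
int is_leaf(unsigned l, unsigned n) { return 2 * l > n; }
```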
Another possible implicit data structure would be an n x m matrix, whereby element (i,j) would have an index of i*m + j in an array of length n*m ( i = {0, 1, 2, ..., n-1} and j = {0, 1, 2, ..., m-1} ).
Exercise:
Determine a formula to locate the parent of node l in a complete k-ary
tree.
Determine a formula to locate l's oldest child and youngest child.
Let n0, n1 and n2 be the number of nodes of degree 0, 1 and 2 respectively in a binary tree of n nodes. Counting the nodes and the edges gives:

    (total nodes)  n0 + n1 + n2          = n
    (total edges)  0*n0 + 1*n1 + 2*n2    = n - 1
    ---------------------------------------------
    subtract:      n0 - n2               = 1

I.e., in any binary tree the number of leaves is exactly one more than the number of degree-2 nodes.
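The identity can be checked mechanically; a small C sketch (struct and function names are my own) that counts the degree-0 and degree-2 nodes of a binary tree by traversal:

```c
#include <stddef.h>

struct bin_node {
    struct bin_node *left, *right;
};

/* Add the number of leaves (degree 0) and full nodes (degree 2) in the
   subtree rooted at p to *n0 and *n2. */
void count_degrees(const struct bin_node *p, int *n0, int *n2)
{
    if (p == NULL)
        return;
    if (p->left == NULL && p->right == NULL)
        (*n0)++;
    else if (p->left != NULL && p->right != NULL)
        (*n2)++;
    count_degrees(p->left, n0, n2);
    count_degrees(p->right, n0, n2);
}
```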
Suppose we have to e-mail someone a tree. What is the minimum number of bits required? Generally, to distinguish n objects, ceiling(log2 n) bits will be needed. Is it possible to find a way of numbering trees so that each possible tree is uniquely identified?
To explore this, we examine tree traversal.
A tree traversal is just the listing of the nodes of a tree in a
certain order.
Consider the ordered tree with root A, whose children are B, C and D, and in which C has children E and F. A pre-order traversal of this tree will produce ABCEFD (each node is visited before its children are visited; the root is visited first). A post-order traversal will produce BEFCDA (the children of a node are visited before the node itself; the root is visited last).
Pre-order and post-order are the two standard types of tree traversal. There is also a third type, in-order traversal (see below).
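Both traversals are natural to write recursively. A C sketch for the example tree (the struct layout, with an explicit degree and child array, is my own device for this example):

```c
#include <string.h>

struct onode {
    char key;
    int degree;                  /* number of children */
    struct onode *child[3];      /* enough for this example */
};

/* Pre-order: visit the node, then each child in order. */
void pre_order(const struct onode *p, char *out)
{
    size_t n = strlen(out);
    out[n] = p->key;
    out[n + 1] = '\0';
    for (int i = 0; i < p->degree; i++)
        pre_order(p->child[i], out);
}

/* Post-order: visit each child in order, then the node. */
void post_order(const struct onode *p, char *out)
{
    for (int i = 0; i < p->degree; i++)
        post_order(p->child[i], out);
    size_t n = strlen(out);
    out[n] = p->key;
    out[n + 1] = '\0';
}
```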
Returning to the question of how to represent a tree: a pre-order traversal alone is not sufficient to reconstruct the shape of the original tree. However, in the case of ordered rooted trees, the original tree can be recreated from the pre- and post-order traversals. But this does not hold if the original tree was a binary tree. Why? Given a pre-order traversal of AB and a post-order traversal of BA, one cannot determine whether B was a left child or a right child.
Exercise:
Write a linear time algorithm to reconstruct an ordered
rooted tree from pre- and post-order traversals.
The pre-order traversal and the degree of each node taken together also provide sufficient information to reconstruct a tree (remember, the degree of a node is the number of its children). For the above tree, the pre-order traversal with degrees given in parentheses is A(3) B(0) C(2) E(0) F(0) D(0).

We know that in pre-order traversal all subnodes must be visited before the node is exited. Since C is of degree 2, the next two nodes (E and F) must be its children. This implies that A (which is of degree 3) has B, C and D as its children.
Exercise:
Write a linear time algorithm to reconstruct a tree from the
pre-order traversal and the degree.
But we can do better than this: remembering that any ordered tree can be uniquely mapped to a binary tree, we can represent each node by two bits, where the left-most bit represents the absence or presence of a left child (0 or 1 respectively) and the right-most bit the absence or presence of a right child.
![binarytreetwo.gif](images/binarytreetwo.gif)
If we replace the nodes in a pre-order traversal by their bits, we obtain for the tree given above the bit string 100111010000. If we interpret this bit string as a number in binary, then each ordered rooted tree and each binary tree may be uniquely represented by a number. There are, however, two things to note: a pre-order traversal always ends with a leaf, so the last two bits will always be 00; and the binary tree mapped from the original ordered tree always has a root node with a sole left child, so the first two bits will always be 10. In any case, this means that 2n bits are sufficient to describe the shape of either tree.
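A C sketch of this two-bits-per-node encoding (names are mine; the test tree is reconstructed from the bit string in the text, which determines the shape uniquely):

```c
#include <stddef.h>
#include <string.h>

struct shape_node {
    struct shape_node *left, *right;
};

/* Append two bits per node in pre-order: the first bit records whether a
   left child is present, the second whether a right child is present. */
void shape_bits(const struct shape_node *p, char *out)
{
    if (p == NULL)
        return;
    size_t n = strlen(out);
    out[n]     = p->left  ? '1' : '0';
    out[n + 1] = p->right ? '1' : '0';
    out[n + 2] = '\0';
    shape_bits(p->left, out);
    shape_bits(p->right, out);
}
```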
Is it possible to find a representation in less than 2n bits?
First, we know that to represent N objects we need at least ceiling(log2 N) bits. To represent all binary trees on n nodes, we need at least ceiling (log2 (number of binary trees on n nodes) ) bits.
The number of possible binary trees on n nodes is the Catalan number 1/(n+1) * C(2n, n) = (2n)!/((n+1)! n!). Remembering that ceiling(log2 N) is an absolute lower bound, we need at least

    log2( 1/(n+1) * C(2n, n) ) ~ 2n bits

("~" means that as n goes to infinity the ratio of the left and right sides goes to 1). So a representation in 2n bits is asymptotically optimal.

(The above can be verified by using Stirling's formula, which states that n! ~ (n/e)^n * sqrt(2 * pi * n).)
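For small n, the count of binary trees can be checked directly from the recurrence "a root plus a left subtree on i nodes and a right subtree on n-1-i nodes" (a quick sketch; the function name is mine):

```c
/* Number of distinct binary-tree shapes on n nodes (the Catalan numbers),
   via the recurrence T(0) = 1, T(n) = sum_{i=0}^{n-1} T(i) * T(n-1-i). */
unsigned long tree_count(int n)
{
    if (n <= 1)
        return 1;
    unsigned long total = 0;
    for (int i = 0; i < n; i++)
        total += tree_count(i) * tree_count(n - 1 - i);
    return total;
}
```

The values agree with the closed form 1/(n+1) * C(2n, n): for n = 3, 4 and 10 they are 5, 14 and 16796.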
To examine in-order traversal, consider the following tree (known as an expression tree). It represents the expression (A*3) + B: the root is +, its left child is the subtree for A*3 (the node * with children A and 3), and its right child is the leaf B.
Note that if the operators are all binary (i.e., they take two arguments), the operands are always leaves and the operators are internal nodes, so:

# operands = # operators + 1

A non-leaf node is said to represent the value of the expression of the subtree of which it is the root; in this tree, the node (*) represents the expression (A * 3).
visit(p)                /* called initially with p <- root */
    if (p is a leaf)
        write(key[p])
    else
        write "("
        visit(left[p])
        write ")"
        write(key[p])
        write "("
        visit(right[p])
        write ")"
Given the above tree as input it will produce the output ((A)*(3))+(B).

What is the time complexity of this algorithm? The two recursive calls (visit(left[p]) and visit(right[p])) are executed n-1 times in total (once for each edge in the tree), so this is O(n), or linear time. In general, any traversal algorithm is of order Theta(n).
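A C rendering of the parenthesizing traversal (struct layout and names are my own; operators sit at internal nodes, operands at leaves):

```c
#include <stddef.h>
#include <string.h>

struct expr_node {
    const char *key;                 /* operator or operand, as a string */
    struct expr_node *left, *right;
};

/* In-order traversal that fully parenthesizes every subexpression. */
void write_expr(const struct expr_node *p, char *out)
{
    if (p->left == NULL && p->right == NULL) {
        strcat(out, p->key);         /* a leaf is just an operand */
    } else {
        strcat(out, "(");
        write_expr(p->left, out);
        strcat(out, ")");
        strcat(out, p->key);         /* the operator between the halves */
        strcat(out, "(");
        write_expr(p->right, out);
        strcat(out, ")");
    }
}
```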
How much space do tree traversals take? Consider the following algorithms (note that these all refer to binary trees):
Stack-based Traversal
Implementing a stack-based (non-recursive) traversal algorithm makes it clear how much space is used by the algorithm, since we explicitly store unvisited nodes.
Here, then, is a stack-based algorithm (S is a stack):
makenull(S)
push(root, S)
while not_empty(S) do
    p <- pop(S)
    visit(p)
    if (right[p] != null) push(right[p], S)
    if (left[p] != null) push(left[p], S)
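The same algorithm in C (a sketch; the fixed-size array stands in for a proper stack ADT, and the names are mine):

```c
#include <stddef.h>

struct stk_node {
    char key;
    struct stk_node *left, *right;
};

/* Non-recursive pre-order traversal using an explicit stack. */
void preorder_stack(struct stk_node *root, char *out)
{
    struct stk_node *stack[64];   /* fixed bound stands in for a stack ADT */
    int top = 0;                  /* makenull(S) */
    size_t n = 0;

    if (root != NULL)
        stack[top++] = root;      /* push(root, S) */
    while (top > 0) {             /* while not_empty(S) */
        struct stk_node *p = stack[--top];   /* p <- pop(S) */
        out[n++] = p->key;                   /* visit(p)    */
        /* push right first so that left is popped (visited) first */
        if (p->right != NULL) stack[top++] = p->right;
        if (p->left  != NULL) stack[top++] = p->left;
    }
    out[n] = '\0';
}
```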
But consider a tree in which every node of a long left spine also has a right child which is a leaf. This is just about the worst-case shape for this pre-order traversal, because the stack will grow to size n/2: every right child is pushed, and none is popped until the left spine has been exhausted.
So how do we optimize this?
Parsimonious Traversal
In this kind of traversal we make sure that the root of the larger sub-tree is pushed onto the stack, and process the smaller sub-tree first. This works because if we color (mark) the nodes in the binary tree that are stacked, then the siblings of these nodes exist and have sub-trees that are at most 50% of the size of their parents' sub-trees. Since the colored nodes must lie on a path to the root, we see that if k nodes are stacked (marked), the sub-tree size of the sibling of the deepest such stacked node is at most n/2^k. Now, this is at least one, for otherwise the sibling would not exist. Thus 2^k <= n, or k <= log2 n. So we can absolutely guarantee logarithmic storage requirements.
And how do we know which is the smallest subtree? The trick is to store in each node the size of its sub-tree. (It is a good practice to keep track of sub-tree size since it is an attribute used in a great many algorithms.)
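Computing and storing the sub-tree sizes takes a single post-order pass (a sketch; struct and names are mine):

```c
#include <stddef.h>

struct sized_node {
    struct sized_node *left, *right;
    int size;                        /* number of nodes in this sub-tree */
};

/* Fill in 'size' for every node and return the size of the whole tree. */
int fill_sizes(struct sized_node *p)
{
    if (p == NULL)
        return 0;
    p->size = 1 + fill_sizes(p->left) + fill_sizes(p->right);
    return p->size;
}
```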
Stackless Traversals
To implement in-order or post-order non-recursive stackless traversals, parent pointers are required. The following code is an example:
p <- root
visit(p)
repeat
    while (left[p] != right[p])        /* descend: while p is not a leaf */
        while (left[p] != null)
            p <- left[p]
            visit(p)
        if (right[p] != null)
            p <- right[p]
            visit(p)
    /* end of one phase: p is a leaf; climb until an unvisited right child is found */
    while ( (p != root) AND ( (right[parent[p]] = p) OR (right[parent[p]] = null) ) )
        p <- parent[p]
    if (p != root)
        p <- right[parent[p]]
        visit(p)
    else exit
Now consider a threaded binary tree. This is a tree constructed in such a way that no parent pointers are required. Both child pointers of leaf nodes are set to the node's successor in a pre-order traversal. This greatly simplifies the algorithm and allows us to drop the parent pointers altogether, while not increasing the storage!
For a threaded binary tree, then, the algorithm for stackless pre-order traversal given above can be rewritten as follows:
p <- root
visit(p)
repeat
    while (left[p] != right[p])        /* descend: while p is not a leaf */
        if (left[p] != null)
            p <- left[p]
        else
            p <- right[p]
        visit(p)
    p <- left[p]                       /* end of one phase: p is a leaf, so follow its thread */
    if (p = root) exit                 /* the last leaf threads back to the root */
    visit(p)
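A C sketch of this threaded traversal (struct and names are mine; each leaf's two pointers hold its pre-order successor, and the last leaf is assumed to thread back to the root):

```c
#include <stddef.h>

struct thr_node {
    char key;
    struct thr_node *left, *right;   /* in a leaf, both point to the
                                        pre-order successor (the thread) */
};

/* Pre-order traversal of a threaded binary tree: no stack, no parent
   pointers. */
void preorder_threaded(struct thr_node *root, char *out)
{
    struct thr_node *p = root;
    size_t n = 0;
    out[n++] = p->key;
    for (;;) {
        while (p->left != p->right) {            /* p is internal: descend */
            p = (p->left != NULL) ? p->left : p->right;
            out[n++] = p->key;
        }
        p = p->left;                             /* leaf: follow the thread */
        if (p == root || p == NULL)
            break;
        out[n++] = p->key;
    }
    out[n] = '\0';
}
```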
This web page was created by
Jim Laskaris did the graphics using xfig, xpaint and xv (to convert to .gif format). Eric Bettan and Dimitri Panaritis did the applet. Elizabeth Thomson did the html. Everybody found links and references.