
Computer Science 308-251B, Winter 2000. Final examination, April 19, 2000.



hashsearch(T, k)
  i = 0
  repeat
    j = (h(k) + i) mod m
    if T[j] = k then return j
    else i = i + 1
  until i = m or T[j] = nil
  return "not found"
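The pseudocode above can be sketched in Python as follows. This is a minimal illustration, assuming the table T is a Python list in which empty slots hold None (the "nil" of the pseudocode) and assuming the hash function h(k) = k mod m; both of these are assumptions for the sake of a runnable example.

```python
def hashsearch(T, k):
    """Search for key k in an open-addressed table T with linear probing.
    Returns the slot index, or "not found"."""
    m = len(T)
    h = lambda key: key % m        # assumed hash function
    i = 0
    while True:
        j = (h(k) + i) % m         # probe sequence: h(k), h(k)+1, ... mod m
        if T[j] == k:
            return j
        i += 1
        # stop after m probes, or at an empty slot (k was never inserted)
        if i == m or T[j] is None:
            return "not found"
```

For example, with m = 7 and keys 10 and 17 (both hashing to slot 3, so 17 was pushed to slot 4 at insertion time), the search follows the same probe sequence the insertion did.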







smalltree(x, k)
  [ returns the root of the subtree of x consisting of all nodes with
    key at most k, and fixes that subtree in the process ]
  if x = nil then return nil
  else if key[x] > k then return smalltree(left[x], k)
  else
    right[x] = smalltree(right[x], k)
    return x

Note that each recursive step goes down one level. The number of levels does not exceed h, and thus the worst-case time is O(h).
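A Python sketch of this recursion, assuming a hypothetical Node class whose attributes play the role of key[x], left[x], right[x]:

```python
class Node:
    """A plain BST node; key/left/right mirror key[x], left[x], right[x]."""
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def smalltree(x, k):
    """Return the root of the subtree of x holding all keys <= k,
    repairing right pointers on the way down (O(h) time)."""
    if x is None:
        return None
    if x.key > k:
        # x and its entire right subtree are too large; the answer lies left
        return smalltree(x.left, k)
    # x stays; only its right subtree may still contain keys > k
    x.right = smalltree(x.right, k)
    return x
```

On the BST with keys {1, 3, 6, 8, 10, 14}, calling smalltree(root, 6) returns the subtree holding exactly {1, 3, 6}.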



makenull(Q)
insert((root, key[root]), Q)
while not empty(Q) do
  (x, p) <- deletemin(Q)
  visit(x)
  if left[x] != nil then insert((left[x], p + key[left[x]]), Q)
  if right[x] != nil then insert((right[x], p + key[right[x]]), Q)

The algorithm is efficient, as it visits every node exactly once and performs n inserts and n deletemins, where n is the size of the tree. The complexity is thus O(n log n).
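A runnable sketch of this traversal, assuming heapq as the priority queue and the same hypothetical Node class with key/left/right attributes. Each queue entry carries p, the sum of the keys on the root-to-node path (id(node) is included only as a tie-breaker, since Node objects are not comparable); with nonnegative keys, nodes come out in nondecreasing order of p.

```python
import heapq

class Node:
    """A plain binary-tree node with key/left/right fields."""
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def visit_by_pathsum(root):
    """Visit nodes in increasing order of path-sum p; return keys in visit order."""
    order = []
    Q = []                                 # makenull(Q)
    if root is not None:
        heapq.heappush(Q, (root.key, id(root), root))
    while Q:                               # while not empty(Q)
        p, _, x = heapq.heappop(Q)         # (x, p) <- deletemin(Q)
        order.append(x.key)                # visit(x)
        if x.left is not None:
            heapq.heappush(Q, (p + x.left.key, id(x.left), x.left))
        if x.right is not None:
            heapq.heappush(Q, (p + x.right.key, id(x.right), x.right))
    return order
```

On a small tree with distinct path-sums, the visit order follows those sums rather than the usual left-to-right order.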







The cells of the main red-black tree have the following fields:

  key[x] : label (these are the "u"'s in the question)
  left[x], right[x], p[x], color[x] (as usual)
  list[x] : points to the adjacency list

This setup allows us to perform insert, delete and search in O(log n) time. However, as each adjacency list may hold up to n-1 nodes, the edge operations are Omega(n) in the worst case. Therefore, each adjacency list too is implemented as a red-black tree; that is, list[x] points to the root of a red-black tree. These adjacency-list red-black trees have nodes with the following fields:

  key[x] : label (these are the "v"'s in the question; that is, if we are
           in the red-black tree for u, then the presence of v indicates
           the presence of edge (u,v))
  left[x], right[x], p[x], color[x] (as usual)

It is clear how to perform insert-edge(u,v) and search-edge(u,v): first find u in the main red-black tree, which leads to the root of the red-black tree for u. In this tree, find v for a search, and insert v for insert-edge. For insert-edge, we also insert u in the tree for v. Similarly, delete-edge(u,v) deletes v from the u-tree and u from the v-tree. All of these operations take O(log n) time in the worst case.

We now return to delete(u), the only remaining problem. Deleting the tree for u is simple and takes O(1) time; however, from each v-tree we would also need to eliminate the u-entry, and that is too much work. Instead, we leave the node in the tree, and check the legality of an entry during a search-edge by checking whether u is indeed a valid label (in O(log n) time, as the labels are picked from a set of size at most n).

As a second solution, we may replace the small adjacency-list red-black trees by one red-black tree for all edges. The ordering needed to do this is as follows: first order the nodes of an edge pair, so that u < v for edge (u,v). Then (u,v) < (w,z) if u < w, or if u = w and v < z. (This is called lexicographical ordering.) This tree can have up to n^2 elements, but its height is still O(log n).
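The lazy-deletion idea in the first solution can be illustrated in Python. A caveat: Python has no built-in red-black tree, so sets and a dict stand in for the balanced trees here; this shows the validity-check logic, not the O(log n) bounds, and all names (Graph, insert_node, and so on) are hypothetical.

```python
class Graph:
    """Adjacency structure with lazy node deletion (sets stand in for
    red-black trees)."""
    def __init__(self):
        self.valid = set()     # labels of live nodes (the "main tree")
        self.adj = {}          # u -> set of neighbours (the "u-trees")

    def insert_node(self, u):
        self.valid.add(u)
        self.adj.setdefault(u, set())

    def insert_edge(self, u, v):
        self.adj[u].add(v)     # v goes into the u-tree ...
        self.adj[v].add(u)     # ... and u into the v-tree

    def delete_node(self, u):
        # cheap: drop u's own tree, but leave stale u-entries in the v-trees
        self.valid.discard(u)
        self.adj.pop(u, None)

    def search_edge(self, u, v):
        # an entry only counts if both endpoints are still valid labels
        return (u in self.valid and v in self.valid
                and v in self.adj.get(u, set()))
```

After delete_node(u), the stale entry for u is still physically present in each neighbour's set, but search_edge filters it out via the validity check.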



Write A(x) = A_1(x) + x^(n/2) A_2(x) and B(x) = B_1(x) + x^(n/2) B_2(x), where A_1, A_2, B_1, B_2 are polynomials of degree n/2. Then

  A(x) B(x) = A_1(x) B_1(x) + x^n A_2(x) B_2(x)
              + x^(n/2) ( A_1(x) B_2(x) + A_2(x) B_1(x) ).

The time T_n needed to compute the product polynomial is cn + 3 T_(n/2), where the cn term takes care of the additions and the shifts, and the 3 T_(n/2) accounts for the three recursive calls to polynomial multiplication. The "3" is there because the x^(n/2) term can be computed using one extra multiplication, not two, as shown by this trick (dropping arguments):

  compute A_1 B_1                    (coefficient of x^0)
  compute A_2 B_2                    (coefficient of x^n)
  compute (A_1 + A_2)(B_1 + B_2)
  compute A_1 B_2 + A_2 B_1          (coefficient of x^(n/2))
    as (A_1 + A_2)(B_1 + B_2) - A_1 B_1 - A_2 B_2
    (no more multiplications needed!)

The recurrence T_n = 3 T_(n/2) + cn, T_1 = 1, has solution T_n = Theta(n^(log_2 3)) by the master theorem.
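The three-multiplication trick can be sketched directly in Python, with polynomials represented as coefficient lists (index i holds the coefficient of x^i) and the length n assumed to be a power of two for simplicity; the function name poly_mult is just a label for this sketch.

```python
def poly_mult(A, B):
    """Multiply polynomials A and B (equal-length coefficient lists, length
    a power of two) with three recursive multiplications per level."""
    n = len(A)
    if n == 1:
        return [A[0] * B[0]]
    h = n // 2
    A1, A2 = A[:h], A[h:]            # A = A1 + x^h * A2
    B1, B2 = B[:h], B[h:]
    P1 = poly_mult(A1, B1)                               # low block
    P2 = poly_mult(A2, B2)                               # high block (x^n)
    S = poly_mult([a1 + a2 for a1, a2 in zip(A1, A2)],
                  [b1 + b2 for b1, b2 in zip(B1, B2)])   # (A1+A2)(B1+B2)
    M = [s - p1 - p2 for s, p1, p2 in zip(S, P1, P2)]    # A1*B2 + A2*B1
    C = [0] * (2 * n - 1)            # assemble with shifts by h and n
    for i, c in enumerate(P1): C[i] += c
    for i, c in enumerate(M):  C[i + h] += c
    for i, c in enumerate(P2): C[i + n] += c
    return C
```

For instance, (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2, and (1 + x^3)(1 + x) = 1 + x + x^3 + x^4.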



The answer is ceiling(log_3(2n)). For part B, we have only n possible answers, and an accompanying lower bound of ceiling(log_3(n)).
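These two bounds can be evaluated exactly with integer arithmetic (repeated multiplication by 3 avoids floating-point log); the helper names here are hypothetical, introduced only for this check.

```python
def ceil_log3(x):
    """Smallest t with 3**t >= x, i.e. ceiling(log_3 x), for integer x >= 1."""
    t, power = 0, 1
    while power < x:
        t += 1
        power *= 3
    return t

def bounds(n):
    # part A bound: ceiling(log_3(2n)); part B lower bound: ceiling(log_3 n)
    return ceil_log3(2 * n), ceil_log3(n)
```

For example, n = 5 gives the pair (3, 2), so the two bounds can genuinely differ.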

Copyright © 2000 Luc Devroye. Email: luc@cs.mcgill.ca. 