Lectures in COMP 252--Winter 2026


CLR refers to Cormen, Leiserson, Rivest and Stein: "Introduction to Algorithms (Third Edition)".

Lecture 1 (Jan 6)

General introduction. Definition of an algorithm. Examples of simple computer models: Turing model, RAM model, pointer-based machines, comparison model. RAM model (or uniform cost model), bit model (or logarithmic cost model). Definition and examples of oracles (not done explicitly). The Fibonacci example: introduce recursion, iteration (the array method), and fast exponentiation. Worst-case time complexity. Landau symbols: O, Omega, Theta, small o, small omega. Examples of many complexity classes (linear, quadratic, polynomial, exponential).
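
As a companion to the Fibonacci example, here is a minimal Python sketch (not the code from class) of the three methods: naive recursion, the array (iterative) method, and fast exponentiation of the matrix [[1,1],[1,0]]. The function names are mine.

    def fib_recursive(n):
        # Exponential time in the RAM model: T_n = T_{n-1} + T_{n-2} + O(1).
        if n <= 1:
            return n
        return fib_recursive(n - 1) + fib_recursive(n - 2)

    def fib_array(n):
        # Linear time in the RAM model: fill f[0..n] bottom-up.
        if n <= 1:
            return n
        f = [0] * (n + 1)
        f[1] = 1
        for i in range(2, n + 1):
            f[i] = f[i - 1] + f[i - 2]
        return f[n]

    def fib_fast(n):
        # O(log n) matrix multiplications, using [[1,1],[1,0]]^n = [[F_{n+1}, F_n], [F_n, F_{n-1}]].
        def mult(a, b):
            return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
                    [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]
        def power(m, k):            # fast exponentiation: square, then fix up odd k
            if k == 1:
                return m
            half = power(m, k // 2)
            sq = mult(half, half)
            return mult(sq, m) if k % 2 else sq
        return 0 if n == 0 else power([[1, 1], [1, 0]], n)[0][1]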

Resources:

Exercise mentioned during the lecture:

  • Show that (log n)^1000 = o(n^0.001), n^1000=o(1.001^n), and 1000^n = o(n!).
  • Show that n ln (n) / ln (n!) -> 1 as n -> infinity.

Exercise to chew on, but not mentioned in class:

  • Show that for the fast exponentiation method in the RAM model, T_n = O(log(n)) even if n is not a power of 2.
  • Determine in Big Theta notation the bit model time complexities of the recursion method, the array method, and the fast exponentiation method in the Fibonacci example. In class, we only did the RAM model complexities.

Lecture 2 (Jan 8)

Assignment 1 handed out: this handwritten assignment is due on January 20 at 1:05 pm, in class.

Introduction to the divide-and-conquer principle. Fast exponentiation revisited. Introduction to recurrences. The divide-and-conquer solution to the chip testing problem in CLR. Divide-and-conquer (Karatsuba) multiplication. Mergesort, convex hulls in the plane. We covered all of chapter 2 and part of chapter 3 of CLR.
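
A minimal Python sketch of Karatsuba-style divide-and-conquer multiplication of nonnegative integers (splitting on a power of ten for readability; not the exact presentation from class):

    def karatsuba(x, y):
        # Three recursive multiplications of half-size numbers instead of four,
        # giving T(n) = 3 T(n/2) + O(n) = O(n^{log_2 3}).
        if x < 10 or y < 10:
            return x * y
        m = max(len(str(x)), len(str(y))) // 2
        base = 10 ** m
        x_hi, x_lo = divmod(x, base)
        y_hi, y_lo = divmod(y, base)
        a = karatsuba(x_hi, y_hi)                          # high parts
        b = karatsuba(x_lo, y_lo)                          # low parts
        c = karatsuba(x_hi + x_lo, y_hi + y_lo) - a - b    # cross terms
        return a * base * base + c * base + b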

Resources:

Exercises mentioned during the lecture:

  • Describe in algorithmic form the linear time upper hull merge algorithm outlined in class.

Exercises not mentioned during the lecture:

  • One student asked whether bad chips have to lie. If they must always lie, then the problem can be solved with n-1 tests. Do you see why?

Lecture 3 (Jan 13)

Binary search using a ternary oracle (with analysis by the substitution method). Strassen matrix multiplication. Euclid's method for the greatest common divisor. The master theorem (statement only). Recursion tree to explain the master theorem.
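
A minimal Python sketch of binary search with a ternary oracle, where the oracle is taken to be a three-way comparison of the key against an element of a sorted array (the names and interface are mine, for illustration):

    def binary_search(a, key, lo=0, hi=None):
        # Returns an index i with a[i] == key, or None if key is absent.
        # One three-way comparison halves the range, so the worst case is O(log n).
        if hi is None:
            hi = len(a) - 1
        if lo > hi:
            return None
        mid = (lo + hi) // 2
        if key == a[mid]:                  # ternary oracle: <, =, or >
            return mid
        if key < a[mid]:
            return binary_search(a, key, lo, mid - 1)
        return binary_search(a, key, mid + 1, hi)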

Resources:

Exercises mentioned during the lecture:

  • Write the binary search algorithm using a binary oracle.
  • For the adventurous: Show that the bit complexity of Euclid's algorithm for gcd(n, m) is O(log n · log m).
  • What is the "big theta" order of the solution of T_n = T_{n/2} + log n, n > 0? (This was not mentioned but may make you think about recursion trees.)

Lecture 4 (Jan 15)

The induction method illustrated on mergesort and binary search with a ternary oracle. Linear time selection: finding the k-th smallest by the algorithm of Blum, Floyd, Pratt, Rivest and Tarjan. O(n) complexity proof based on induction.
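
A compact Python sketch of the Blum-Floyd-Pratt-Rivest-Tarjan selection idea (groups of 5, median-of-medians pivot); this is a plain translation of the standard scheme, not the exact pseudocode from class:

    def select(a, k):
        # Returns the k-th smallest element of the list a, for k = 1, ..., len(a).
        if len(a) <= 25:
            return sorted(a)[k - 1]
        # Median of each group of 5, then the median of those medians as pivot.
        groups = [a[i:i + 5] for i in range(0, len(a), 5)]
        medians = [sorted(g)[len(g) // 2] for g in groups]
        pivot = select(medians, (len(medians) + 1) // 2)
        smaller = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        larger = [x for x in a if x > pivot]
        if k <= len(smaller):
            return select(smaller, k)
        if k <= len(smaller) + len(equal):
            return pivot
        return select(larger, k - len(smaller) - len(equal))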

Resources:

Exercises mentioned during the lecture:

  • For the binary search algorithm with binary oracle, show by induction on n that the worst-case number of comparisons T_n satisfies
    T_n ≤ 2 + ⌈log_2 n⌉
    for all n. Convince yourself that T_1 = 2, and that
    T_n = 1 + T_{⌊(1+n)/2⌋}.
  • Using the recursion tree, analyze the median-of-three algorithm and convince yourself that its worst-case time complexity is Θ(n log n).

Lecture 5 (Jan 20)

Assignment 1 due at 1:05 pm in class, handwritten on paper; groups of 2 are allowed (one copy per pair); iPad printouts are permitted. The Fast Fourier Transform. Multiplying polynomials in time O(n log n).

Resources:

  • CLR, chapter 30.
  • Luc's notes given in a link during the lecture.

Exercise: We saw the FFT conversion algorithm from N coefficients to N values, evaluated at ω^0, ω^1, ..., ω^{N-1}, where ω is a primitive N-th root of unity. Take N=4. The algorithm computes a polynomial P at 4 points. It makes two recursive calls, each computing a polynomial at 2 points, and each of these in turn makes two recursive calls, each computing a polynomial at 1 point. This gives a total of 7 function calls. It is useful to work out the outputs of these seven calls, i.e., one vector of size 4, two vectors of size 2, and four vectors of size 1. Express these vectors as a function of ω and the coefficients of P. Then express these same vectors in complex number form.
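
A minimal recursive FFT in Python (coefficients to values at the powers of ω, with N a power of 2; it uses Python's cmath, and the names are mine):

    import cmath

    def fft(p):
        # Evaluate the polynomial with coefficient list p (length N, a power of 2)
        # at omega^0, omega^1, ..., omega^{N-1}, where omega = e^{2*pi*i/N}.
        n = len(p)
        if n == 1:
            return p[:]                       # a constant polynomial: one value
        even = fft(p[0::2])                   # even-index coefficients at the (N/2)-th roots
        odd = fft(p[1::2])                    # odd-index coefficients at the (N/2)-th roots
        out = [0] * n
        for k in range(n // 2):
            w = cmath.exp(2j * cmath.pi * k / n)     # omega^k
            out[k] = even[k] + w * odd[k]            # P(omega^k)
            out[k + n // 2] = even[k] - w * odd[k]   # P(omega^{k + N/2}), since omega^{N/2} = -1
        return out

Calling fft on a list of 4 coefficients performs exactly the 1 + 2 + 4 = 7 recursive calls described in the exercise above.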

Lecture 6 (Jan 22)

Assignment 2 handed out: this handwritten assignment is due on February 3 at 1:05 pm, in class. Note: the k + 1 + k log_2(n/k) complexity in exercise 2 should be replaced by O(k + 1 + k log_2(n/k)).

Dynamic programming. Computing binomial coefficients and partition numbers. The traveling salesman problem. The knapsack problem. The longest common subsequence problem.
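
A minimal Python sketch of the knapsack table in the P[i][k] notation used in the exercises below (the function and variable names are mine; building the table is the lecture material, recovering the items is left as the exercise):

    def knapsack_table(sizes, capacity):
        # P[i][k] = 1 iff some subset of the first i items has total size exactly k.
        n = len(sizes)
        P = [[0] * (capacity + 1) for _ in range(n + 1)]
        for i in range(n + 1):
            P[i][0] = 1                       # the empty subset fills size 0
        for i in range(1, n + 1):
            for k in range(1, capacity + 1):
                P[i][k] = P[i - 1][k]         # item i not used
                if sizes[i - 1] <= k and P[i - 1][k - sizes[i - 1]]:
                    P[i][k] = 1               # item i used
        return P                              # feasibility answer: P[n][capacity]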

Resources:

Exercises mentioned in class:

  • In the shortest path problem, formulate the sub-solutions necessary so that when asked, one can get the shortest path sequence from an arbitrary node j to node 1 in time O(n), where n is the number of nodes. And tell us how to modify or update the TSP algorithm seen in class.
  • Given the longest common subsequence table and the input bit strings of lengths n and m, write an algorithm that outputs the positions of the matched bit pairs in time O(n+m).
  • Modify the knapsack algorithm so that you can output the items that fit in a knapsack given that P[n,k]=1.

Exercises not mentioned in class:

  • Modify the knapsack algorithm for the case that we have an unlimited supply of items of the given sizes.
  • Can you count the number of knapsack solutions by dynamic programming?
  • Can you count the number of ways of paying a bill of n dollars with coins of values c_1 < c_2 < ... < c_k? Here the input is n, k, c_1,...,c_k.

Lecture 7 (Jan 27)

Lower bounds. Oracles. Viewing algorithms as decision trees. Proof of lower bound for worst-case complexity based on decision trees. Examples include sorting and searching. Lower bounds for board games. Finding the median of 5 with 6 comparisons. Merging two sorted arrays using finger lists.
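
For reference, the decision tree calculation for comparison-based sorting can be written out as follows (a standard computation, not copied from the board): a correct sorting algorithm must be able to produce all n! orderings, so its decision tree has at least n! leaves; a binary tree of height h has at most 2^h leaves, so the worst-case number of comparisons is at least ⌈log_2(n!)⌉ ≥ (n/2) log_2(n/2) = Ω(n log n).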

Resources:

Practice: What is meant by the statement that for an algorithm A, the worst-case time T_n (A) is Ω(n^2)? And what is meant when we say that for a given problem, the lower bound is Ω(n^7)?

Lecture 8 (Jan 29)

Lower bounds via witnesses: finding the maximum. Lower bounds via adversaries. Guessing a password. The decision tree method as a special case of the method of adversaries. Finding the maximum and the minimum. Finding the two largest elements. Finding the median.
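
A minimal Python sketch of finding both the maximum and the minimum with about 3n/2 comparisons, processing the elements in pairs (my variable names; it assumes the list is nonempty):

    def max_and_min(a):
        # Compare the two elements of each pair first, then the smaller one
        # with the current min and the larger one with the current max:
        # about 3 comparisons per 2 elements instead of 4.
        n = len(a)
        if n % 2:
            lo = hi = a[0]
            start = 1
        else:
            lo, hi = (a[0], a[1]) if a[0] < a[1] else (a[1], a[0])
            start = 2
        for i in range(start, n - 1, 2):
            x, y = (a[i], a[i + 1]) if a[i] < a[i + 1] else (a[i + 1], a[i])
            if x < lo:
                lo = x
            if y > hi:
                hi = y
        return lo, hi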

Resources:

An exercise:

  • We mentioned that the height of a complete binary tree on n leaves is ⌈log_2 n⌉. Prove this. And show that for any binary tree on n leaves, this is a lower bound on the height.

Lecture 9 (Feb 3)

Assignment 2 due at 1:05 pm, in class. Assignment 3 handed out: this handwritten assignment is due on February 12 at 1:05 pm, in class.

Introductory remarks about ADTs (abstract data types). Lists. Primitive ADTs such as stacks, queues, deques, and lists. Classification and use of linked lists, including representations of large integers and polynomials, window managers, and available space lists. Uses of stacks and queues in recursions and as scratchpads for organizing future work. Introduction to trees: free trees, ordered trees, position trees, binary trees. Bijection between ordered trees (forests) and binary trees via oldest child and next sibling pointers. Relationship between number of leaves and number of internal nodes with two children in a binary tree. Complete binary trees: implicit storage, height.
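
For the binary case seen in class, the implicit (array) navigation rules can be written as follows (1-based array positions, a small sketch with my function names; the k-ary generalization is the first exercise below):

    def parent(i):
        return i // 2 if i > 1 else None        # the root (position 1) has no parent

    def left_child(i, n):
        return 2 * i if 2 * i <= n else None    # n = number of nodes

    def right_child(i, n):
        return 2 * i + 1 if 2 * i + 1 <= n else None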

Resources:

Exercises mentioned in class:

  • We saw the way of navigating a complete binary tree using the locations of the nodes in an array. Generalize the navigation rules by computing for a given node i in a complete k-ary tree with n nodes, the parent, the oldest child and the youngest child (and indicate when any of these do not exist).
  • Prove by induction that in a binary tree, the number of leaves is equal to the number of nodes with two children plus one.
  • Write an O(n) time program for creating the oldest-child-next-sibling binary tree seen in class from a given ordered tree in which the children of a node are stored from oldest to youngest in a linked list. Conversely, from a binary tree in which the root only has a left child, recreate in O(n) time the corresponding ordered tree.

Lecture 10 (Feb 5)

Continuing with trees. Depth, height, external nodes, leaves, preorder numbers, postorder numbers. Preorder, postorder, inorder and level order listings and traversals. Recursive and nonrecursive traversals. Stackless traversal if parent pointers are available. Representing and storing trees: binary trees require about 2n bits. Counting binary trees (Catalan's formula) via counting Dyck paths. Expression trees, infix, postfix (reverse Polish), and prefix (Polish) notation.
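
A minimal Python sketch of a nonrecursive inorder traversal with an explicit stack (it assumes nodes with left and right fields; the stackless parent-pointer traversal from the lecture is not reproduced here):

    def inorder(root, visit):
        # Simulate the recursion with an explicit stack of nodes.
        stack, node = [], root
        while stack or node is not None:
            while node is not None:           # go as far left as possible
                stack.append(node)
                node = node.left
            node = stack.pop()
            visit(node)                       # the node is visited in inorder position
            node = node.right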

Resources:

Exercises mentioned in class:

  • Write linear-time algorithms to convert back and forth between a binary tree and any of its 2n-bit representations.
  • Write an O(n) algorithm that computes for all nodes in an ordered tree of size n the sizes of their subtrees (and stores those sizes in the corresponding cells).

Exercises not mentioned in class:

  • How many bits are needed to store the shape of a binary expression tree on n nodes?
  • How many bits are needed to store the shape of a rooted ordered tree on n nodes?
  • Determine in time O(n) whether a coded bit sequence of length 2n that uses the codes 00, 01, 10 and 11 for the four kinds of nodes is a legal sequence, i.e., one that corresponds to a binary tree on n nodes.

Lecture 11 (Feb 10)

Introduction to the ADT Dictionary. Definition of a binary search tree. Standard operations on binary search trees, all taking time O(h), where h is the height. Generic tree sort. The browsing operations. Definition of red-black trees. Red-black trees viewed as 2-3-4 trees.
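
A minimal Python sketch of search and insert in a binary search tree, each taking time O(h) where h is the height (the Node class and names are mine, for illustration only; no rebalancing):

    class Node:
        def __init__(self, key):
            self.key, self.left, self.right = key, None, None

    def search(root, key):
        # Walk down a single root-to-leaf path: O(h) comparisons.
        while root is not None and root.key != key:
            root = root.left if key < root.key else root.right
        return root                            # None if the key is absent

    def insert(root, key):
        # Returns the (possibly new) root of the subtree; duplicates are ignored.
        if root is None:
            return Node(key)
        if key < root.key:
            root.left = insert(root.left, key)
        elif key > root.key:
            root.right = insert(root.right, key)
        return root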

Resources: