
CLR refers to Cormen, Leiserson, Rivest and Stein: "Introduction to Algorithms (Third Edition)".

Lecture 1 (Jan 6)
General introduction. Definition of an algorithm. Examples of simple computer models: Turing model, RAM model, pointer-based machines, comparison model. RAM model (or: uniform cost model), bit model (or: logarithmic cost model). Definition and examples of oracles (not done explicitly). The Fibonacci example, used to introduce recursion, iteration (the array method), and fast exponentiation. Worst-case time complexity. Landau symbols: O, Omega, Theta, small o, small omega. Examples of many complexity classes (linear, quadratic, polynomial, exponential). Resources:
Exercise mentioned during the lecture:
Exercise to chew on, but not mentioned in class:
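As a sketch of the Fibonacci example, here is the fast-exponentiation method in Python: F(n) is read off from the n-th power of the matrix [[1,1],[1,0]], computed with O(log n) matrix multiplications by repeated squaring. The function name `fib_matrix` is mine, not from the lecture.

```python
def fib_matrix(n):
    """Compute F(n) via fast exponentiation of [[1,1],[1,0]]: O(log n) 2x2 products."""
    def mat_mul(a, b):
        return ((a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]),
                (a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]))
    result = ((1, 0), (0, 1))          # identity matrix
    base = ((1, 1), (1, 0))
    while n > 0:                       # square-and-multiply on the binary digits of n
        if n & 1:
            result = mat_mul(result, base)
        base = mat_mul(base, base)
        n >>= 1
    return result[0][1]                # M^n = [[F(n+1), F(n)], [F(n), F(n-1)]]
```

Contrast with the naive recursion (exponential time) and the array method (linear time): this version uses only O(log n) multiplications.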


Lecture 2 (Jan 8)
Assignment 1 handed out: this handwritten assignment is due on January 20 at 1:05 pm, in class. Introduction to the divide-and-conquer principle. Fast exponentiation revisited. Introduction to recurrences. The divide-and-conquer solution to the chip testing problem in CLR. Divide-and-conquer (Karatsuba) multiplication. Mergesort, convex hulls in the plane. We covered all of chapter 2 and part of chapter 3 of CLR. Resources:
Exercises mentioned during the lecture:
Exercises not mentioned during the lecture:
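A minimal sketch of Karatsuba's divide-and-conquer multiplication: three half-size products instead of four give the recurrence T(n) = 3T(n/2) + O(n), hence O(n^{log_2 3}) time. This is an illustrative Python version on integers, not the code from class.

```python
def karatsuba(x, y):
    """Multiply nonnegative integers using three recursive half-size products."""
    if x < 10 or y < 10:
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)      # split x = xh * 2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)
    a = karatsuba(xh, yh)                    # high * high
    b = karatsuba(xl, yl)                    # low * low
    c = karatsuba(xh + xl, yh + yl)          # one extra product yields the cross terms
    return (a << (2 * m)) + ((c - a - b) << m) + b
```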


Lecture 3 (Jan 13)
Binary search using a ternary oracle (with analysis by the substitution method). Strassen matrix multiplication. Euclid's method for the greatest common divisor. The master theorem (statement only). Recursion tree to explain the master theorem. Resources:
Exercises mentioned during the lecture:
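Euclid's method for the greatest common divisor can be sketched in a few lines; the key fact is gcd(a, b) = gcd(b, a mod b), and termination follows because the second argument strictly decreases.

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) by (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a
```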


Lecture 4 (Jan 15)
The induction method illustrated on mergesort and binary search with a ternary oracle. Linear time selection: finding the k-th smallest by the algorithm of Blum, Floyd, Pratt, Rivest and Tarjan. O(n) complexity proof based on induction. Resources:
Exercises mentioned during the lecture:
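A sketch of linear-time selection in the spirit of Blum, Floyd, Pratt, Rivest and Tarjan: pick the median of the group-of-5 medians as pivot, so each recursive call discards a constant fraction of the input. This illustrative version uses list comprehensions rather than in-place partitioning; the name `select` and the 0-based rank convention are mine.

```python
def select(a, k):
    """Return the k-th smallest element (k from 0) in worst-case O(n) time."""
    if len(a) <= 5:
        return sorted(a)[k]
    # Median of each group of 5, then the median of those medians as pivot.
    medians = [sorted(a[i:i + 5])[len(a[i:i + 5]) // 2] for i in range(0, len(a), 5)]
    pivot = select(medians, len(medians) // 2)
    lo = [x for x in a if x < pivot]
    eq = [x for x in a if x == pivot]
    hi = [x for x in a if x > pivot]
    if k < len(lo):
        return select(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    return select(hi, k - len(lo) - len(eq))
```

The induction proof of O(n) hinges on the pivot being guaranteed to exceed (and be exceeded by) roughly 3n/10 elements.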


Lecture 5 (Jan 20)
Assignment 1 due at 1:05 pm in class, handwritten on paper; groups of 2 are allowed (one copy per pair); iPad printouts are permitted. The Fast Fourier Transform. Multiplying polynomials in time O(n log n). Resources:
Exercise: We saw the FFT conversion algorithm from N coefficients to N values, evaluated at ω^0, ω^1, ..., ω^(N-1), where ω is a primitive N-th root of unity. Take N = 4. The algorithm computes the polynomial P at 4 points. It makes two recursive calls, each computing a polynomial at 2 points, and each of these in turn makes two recursive calls computing a polynomial at 1 point. This gives a total of 7 function calls. It is useful to work out the outputs of these seven calls, i.e., one vector of size 4, two vectors of size 2, and four vectors of size 1. Express these vectors as a function of ω and the coefficients of P. Then express these same vectors in complex number form.
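To experiment with the exercise above, here is a minimal recursive FFT sketch in Python (assuming N is a power of two; coefficient order is lowest degree first, and the name `fft` is mine). It splits P into even- and odd-indexed coefficients exactly as in the recursion described.

```python
import cmath

def fft(coeffs):
    """Evaluate the polynomial with these coefficients at the N N-th roots of unity
    (N a power of two), via the even/odd coefficient split."""
    n = len(coeffs)
    if n == 1:
        return coeffs[:]
    even = fft(coeffs[0::2])
    odd = fft(coeffs[1::2])
    out = [0j] * n
    for j in range(n // 2):
        w = cmath.exp(2j * cmath.pi * j / n)   # omega^j for omega = e^{2*pi*i/n}
        out[j] = even[j] + w * odd[j]
        out[j + n // 2] = even[j] - w * odd[j]
    return out
```

For N = 4 and P(x) = x, the output is (1, i, -1, -i), i.e., P evaluated at ω^0, ω^1, ω^2, ω^3 with ω = i.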

Lecture 6 (Jan 22)
Assignment 2 handed out: this handwritten assignment is due on February 3 at 1:05 pm, in class. Note: the k+1 +k log_2 (n/k) complexity in exercise 2 should be replaced by O(k+1 +k log_2 (n/k)). Dynamic programming. Computing binomial coefficients and partition numbers. The traveling salesman problem. The knapsack problem. The longest common subsequence problem. Resources:
Exercises mentioned in class:
Exercises not mentioned in class:
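As one dynamic programming illustration, here is a sketch of the longest common subsequence length via the usual O(mn) table; the recurrence is dp[i][j] = dp[i-1][j-1] + 1 on a match, else max(dp[i-1][j], dp[i][j-1]).

```python
def lcs_length(s, t):
    """Longest common subsequence length by dynamic programming."""
    m, n = len(s), len(t)
    dp = [[0] * (n + 1) for _ in range(m + 1)]   # dp[i][j] = LCS of s[:i], t[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s[i - 1] == t[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```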


Lecture 7 (Jan 27)
Lower bounds. Oracles. Viewing algorithms as decision trees. Proof of lower bound for worst-case complexity based on decision trees. Examples include sorting and searching. Lower bounds for board games. Finding the median of 5 with 6 comparisons. Merging two sorted arrays using finger lists. Resources:
Practice: What is meant by the statement that for an algorithm A, the worst-case time T_n(A) is Ω(n^2)? And what is meant when we say that for a given problem, the lower bound is Ω(n^7)?

Lecture 8 (Jan 29)
Lower bounds via witnesses: finding the maximum. Lower bounds via adversaries. Guessing a password. The decision tree method as a special case of the method of adversaries. Finding the maximum and the minimum. Finding the two largest elements. Finding the median. Resources:
An exercise:
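A sketch of finding the maximum and the minimum together in about 3n/2 comparisons, matching the adversary lower bound: compare elements in pairs, then compare each pair's winner with the running max and loser with the running min. The function name is mine.

```python
def max_and_min(a):
    """Return (max, min) of a nonempty sequence using about 3n/2 comparisons."""
    it = iter(a)
    if len(a) % 2:                       # odd length: seed with the first element
        hi = lo = next(it)
    else:                                # even length: seed with the first pair
        x, y = next(it), next(it)
        hi, lo = (x, y) if x >= y else (y, x)
    for x in it:
        y = next(it)
        if x < y:                        # 1 comparison decides the pair...
            x, y = y, x
        if x > hi:                       # ...winner challenges the max,
            hi = x
        if y < lo:                       # ...loser challenges the min.
            lo = y
    return hi, lo
```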


Lecture 9 (Feb 3)
Assignment 2 due at 1:05 pm, in class. Assignment 3 handed out: this handwritten assignment is due on February 12 at 1:05 pm, in class. Note added late: in exercise 2, the oracle can ask whether x < A[i,j], x ≤ A[i,j], x > A[i,j] and x ≥ A[i,j]. Introductory remarks about ADTs (abstract data types). Lists. Primitive ADTs such as stacks, queues, deques, and lists. Classification and use of linked lists, including representations of large integers and polynomials, window managers, and available space lists. Uses of stacks and queues in recursions and as scratchpads for organizing future work. Introduction to trees: free trees, ordered trees, position trees, binary trees. Bijection between ordered trees (forests) and binary trees via oldest child and next sibling pointers. Relationship between number of leaves and number of internal nodes with two children in a binary tree. Complete binary trees: implicit storage, height. Resources:
Exercises mentioned in class:
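The implicit storage of a complete binary tree can be summarized by three index formulas (root at index 0): no pointers are needed, and the height of an n-node complete tree is floor(log2 n).

```python
# Implicit array layout of a complete binary tree, root at index 0.

def parent(i):
    return (i - 1) // 2

def left(i):
    return 2 * i + 1

def right(i):
    return 2 * i + 2
```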


Lecture 10 (Feb 5)
Continuing with trees. Depth, height, external nodes, leaves, preorder numbers, postorder numbers. Preorder, postorder, inorder and level order listings and traversals. Recursive and nonrecursive traversals. Stackless traversal if parent pointers are available. Representing and storing trees: binary trees require about 2n bits. Counting binary trees (Catalan's formula) via counting Dyck paths. Expression trees, infix, postfix (reverse Polish), and prefix (Polish) notation. Resources:
Exercises mentioned in class:
Exercises not mentioned in class:
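The count of binary trees (equivalently, of Dyck paths of length 2n) is given by Catalan's formula C(n) = binom(2n, n) / (n + 1), which can be checked directly:

```python
from math import comb

def catalan(n):
    """Number of binary trees with n nodes: C(n) = binom(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)   # the division is always exact
```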


Lecture 11 (Feb 10)
Introduction to the ADT Dictionary. Definition of a binary search tree. Standard operations on binary search trees, all taking time O(h), where h is the height. Generic tree sort. The browsing operations. Definition of red-black trees. Red-black trees viewed as 2-3-4 trees. Resources:
Exercises mentioned in class:
Exercise not mentioned in class:
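A minimal binary search tree sketch illustrating the O(h) dictionary operations and generic tree sort (an inorder traversal lists the keys in sorted order); names and the duplicate-ignoring policy are mine.

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_insert(root, key):
    """Insert a key into a binary search tree; O(h) for height h."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root                          # duplicate keys are ignored

def bst_search(root, key):
    """Standard O(h) search; returns True iff the key is present."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

def inorder(root):
    """Inorder traversal: yields keys in sorted order (generic tree sort)."""
    return inorder(root.left) + [root.key] + inorder(root.right) if root else []
```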


Lecture 12 (Feb 12)
Assignment 3 due at 1:05 pm, in class.
Relationships between rank, height and number of nodes. Insert operation on red-black trees. Standard deletion is not covered in class. Lazy delete. Split and join in red-black trees. Application to list implementations (concatenate and sublist in O(log n) time). Definition of HB-k, AVL and 2-3 trees. Resources:
Exercises mentioned in class:


Midterm examination (Feb 17)
Room LEAR 232, 1-2:30 pm.

Lecture 13 (Feb 19)
Augmented data structures. Examples:
Not done in 2026:
Resources:
Exercises mentioned in class:


Lecture 15 (Feb 24)
Assignment 4 handed out: due at 2:25 pm on March 10, 2026. Cartesian trees. Random binary search trees. Treaps. Analysis of the expected time of the main operations. Quicksort revisited. Resources:
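A minimal treap insertion sketch (not the code from class): insert as in a BST, then rotate the new node up while its random priority beats its parent's, so the tree is shaped as if the keys had arrived in random order and the expected depth is O(log n).

```python
import random

class TreapNode:
    def __init__(self, key):
        self.key = key
        self.priority = random.random()   # random priority determines the heap shape
        self.left = self.right = None

def treap_insert(root, key):
    """BST insert, then rotations restore the heap order on priorities."""
    if root is None:
        return TreapNode(key)
    if key < root.key:
        root.left = treap_insert(root.left, key)
        if root.left.priority > root.priority:       # rotate right
            new_root = root.left
            root.left = new_root.right
            new_root.right = root
            return new_root
    elif key > root.key:
        root.right = treap_insert(root.right, key)
        if root.right.priority > root.priority:      # rotate left
            new_root = root.right
            root.right = new_root.left
            new_root.left = root
            return new_root
    return root
```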


Lecture 16 (Feb 26)
(Lecture halved by a fire alarm.) Priority queues (as abstract data types). Uses, such as in discrete event simulation and operating systems. Binary heaps. k-ary heaps. Resources:
Exercises mentioned in class:
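A sketch of a binary min-heap in an array (children of index i at 2i+1 and 2i+2): push sifts up, pop sifts down, both in O(log n). The class name and method names are mine.

```python
class MinHeap:
    """Binary min-heap stored implicitly in a Python list."""
    def __init__(self):
        self.a = []

    def push(self, x):
        self.a.append(x)
        i = len(self.a) - 1
        while i > 0 and self.a[(i - 1) // 2] > self.a[i]:      # sift up
            self.a[i], self.a[(i - 1) // 2] = self.a[(i - 1) // 2], self.a[i]
            i = (i - 1) // 2

    def pop(self):
        a = self.a
        a[0], a[-1] = a[-1], a[0]        # move the minimum out of the way
        smallest = a.pop()
        i = 0
        while True:                      # sift down the displaced element
            c = 2 * i + 1
            if c + 1 < len(a) and a[c + 1] < a[c]:
                c += 1                   # pick the smaller child
            if c >= len(a) or a[i] <= a[c]:
                break
            a[i], a[c] = a[c], a[i]
            i = c
        return smallest
```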


Lecture 17 (Mar 10)
Assignment 4 due at 2:25 pm, in class. Heapsort. Bottom-up heapsort. Tournament trees. Amortization. Potential functions. Examples:
Resources:
Suggested exercise: in class, we implemented a lazy deletemin in a tournament tree. How would you implement a true deletemin?
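A heapsort sketch: build a max-heap bottom-up in O(n), then repeatedly swap the root with the last element of the shrinking prefix and sift down, for O(n log n) total. (This is the standard version, not necessarily the bottom-up variant from class.)

```python
def heapsort(a):
    """In-place heapsort: bottom-up heap construction, then n extract-max steps."""
    n = len(a)

    def sift_down(i, size):
        while True:
            c = 2 * i + 1
            if c + 1 < size and a[c + 1] > a[c]:
                c += 1                       # pick the larger child
            if c >= size or a[i] >= a[c]:
                return
            a[i], a[c] = a[c], a[i]
            i = c

    for i in range(n // 2 - 1, -1, -1):      # heapify in O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):          # move the max to its final place
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```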

Lecture 18 (Mar 12)
Amortized analysis, continued. Further examples:
Resources:
Exercise mentioned in class: consider a 1-step splay tree in which splay steps use one-level rotations instead of the two-step rotation seen in class. Exhibit a sequence of n standard operations (search, insert, delete), starting from an empty tree, such that the total complexity is Ω(n^2), and not O(n log n).
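One classic amortized-analysis example (whether or not it was among the further examples in class) is the binary counter: a single increment may flip many bits, yet n increments flip at most 2n bits in total, since the potential function "number of 1-bits" prepays every cascade of resets.

```python
def increment(bits):
    """Increment a binary counter stored as a list of bits, least significant first.
    Amortized O(1) per call: each 1->0 reset was prepaid when that bit was set."""
    i = 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0                      # cascading carry, paid for by the potential
        i += 1
    if i == len(bits):
        bits.append(1)
    else:
        bits[i] = 1
    return bits
```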

Lecture 19 (Mar 17)
Assignment 5 handed out, due 2:25 pm on March 26, 2026. Stringology. Data structures for strings and compression. Tries, PATRICIA trees, digital search trees. Suffix tries, suffix trees, suffix arrays. String searching in a large file via the Knuth-Morris-Pratt algorithm. Resources:
Exercises:
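A sketch of Knuth-Morris-Pratt string searching: the failure function fail[i] records the length of the longest proper border of pattern[:i+1], so the text pointer never moves backwards and the whole search runs in O(|text| + |pattern|). The function name and the convention of returning all match positions are mine.

```python
def kmp_search(text, pattern):
    """Return the starting indices of all occurrences of pattern in text."""
    m = len(pattern)
    fail = [0] * m
    k = 0
    for i in range(1, m):                    # build the failure function
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    hits, k = [], 0
    for i, c in enumerate(text):             # scan the text once
        while k > 0 and c != pattern[k]:
            k = fail[k - 1]
        if c == pattern[k]:
            k += 1
        if k == m:                           # full match ending at position i
            hits.append(i - m + 1)
            k = fail[k - 1]                  # allow overlapping matches
    return hits
```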


Lecture 20 (Mar 19)
Introduction to compression and coding. Fixed and variable width codes. Prefix codes. Huffman trees. Huffman code. An O(n log n) algorithm for finding the Huffman tree. Definition of a greedy algorithm. Data structures for Huffman coding. Merging sorted lists by the greedy method. Generating a random integer. Resources:
Exercises mentioned in class:
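A sketch of the O(n log n) greedy Huffman construction using a min-heap: repeatedly merge the two lightest trees. The tie-breaking counter in the heap entries is an implementation detail of mine; any Huffman tree it produces has optimal weighted code length.

```python
import heapq

def huffman_codes(freqs):
    """Build an optimal prefix code from a {symbol: frequency} dict."""
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    n = len(heap)
    if n == 1:                                   # degenerate one-symbol alphabet
        return {heap[0][2]: "0"}
    while len(heap) > 1:                         # greedy merge of the two lightest
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        n += 1
        heapq.heappush(heap, (w1 + w2, n, (t1, t2)))
    codes = {}
    def walk(tree, prefix):                      # read codewords off the tree
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

On the frequency set (45, 13, 12, 16, 9, 5) from CLR, any Huffman code has total weighted length 224.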


Lecture 21 (Mar 24)
Definition of entropy and Shannon's result on best possible compression ratios. Entropy of an input source. Kraft's inequality. Proof of Shannon's theorem. The Shannon-Fano code. Lempel-Ziv compression. Data structures for Lempel-Ziv compression. Digital search tree. Resources:
Exercises:
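Two of the lecture's definitions can be computed directly: the entropy H = -Σ p log2 p, which lower-bounds the expected length per symbol of any prefix code, and the Kraft sum Σ 2^(-l), which is at most 1 exactly when a prefix code with codeword lengths l exists.

```python
from math import log2

def entropy(probs):
    """Shannon entropy -sum p*log2(p) of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft's inequality: a prefix code with these lengths exists iff this is <= 1."""
    return sum(2.0 ** -l for l in lengths)
```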


Lecture 22 (Mar 26)
Introduction to graphs: notation, definitions, adjacency matrix, adjacency lists. Graph traversals as the fundamental procedures for most graph algorithms: depth first search. DFS forest and DFS properties. Classification of edges. Detecting cycles. Resources:
Exercises:
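Cycle detection by depth first search can be sketched with the usual white/gray/black coloring: a back edge to a gray vertex (one still on the recursion stack) closes a cycle. This assumes adjacency lists as a dict mapping each vertex to its out-neighbors.

```python
def has_cycle(adj):
    """Detect a cycle in a directed graph given as adjacency lists."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {v: WHITE for v in adj}
    def visit(u):
        color[u] = GRAY                    # u is on the recursion stack
        for v in adj[u]:
            if color[v] == GRAY:           # back edge: cycle found
                return True
            if color[v] == WHITE and visit(v):
                return True
        color[u] = BLACK                   # u fully explored
        return False
    return any(color[v] == WHITE and visit(v) for v in adj)
```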


Lecture 23 (Mar 31)
Euler tour. Breadth first search. BFS tree. Shortest path problem. Shortest path properties. Dijkstra's greedy algorithm for shortest paths. Resources:
Exercises:
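A sketch of Dijkstra's greedy algorithm with a binary heap (nonnegative edge weights assumed; adj[u] is a list of (neighbor, weight) pairs). Stale heap entries are skipped instead of using a decrease-key operation.

```python
import heapq

def dijkstra(adj, source):
    """Single-source shortest path distances; O((V + E) log V) with a binary heap."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                         # stale entry: u already settled closer
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd                 # relax edge (u, v)
                heapq.heappush(heap, (nd, v))
    return dist
```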


Lecture 24 (Apr 2)
Minimal spanning tree. Main properties. Greedy approach for the MST: the Prim-Dijkstra algorithm. Application in hierarchical clustering. Kruskal's algorithm. A two-approximation for the traveling salesman problem. The all-pairs shortest path algorithm of Floyd and Warshall. Transitive closure of a graph and matrix multiplication. Resources:
Exercises:
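A sketch of the Floyd-Warshall all-pairs shortest path algorithm: after round k, dist[i][j] is the shortest path using only intermediate vertices 0..k, giving O(n^3) overall. The input is an n x n matrix with inf where there is no edge.

```python
def floyd_warshall(dist):
    """All-pairs shortest path distances by dynamic programming over intermediates."""
    n = len(dist)
    d = [row[:] for row in dist]             # do not mutate the caller's matrix
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```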


Lecture 25 (Apr 7)
Directed acyclic graphs (or: dags). Topological sorting. Activity (PERT) networks. NIM games. Strongly connected components in directed graphs. Resources:
Exercises:


Lecture 26 (Apr 9)
Network flows. The Ford-Fulkerson method. The Edmonds-Karp algorithm. Resources:
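A sketch of the Edmonds-Karp algorithm: Ford-Fulkerson with BFS-chosen (shortest) augmenting paths, which bounds the number of augmentations by O(VE). Capacities are given as a dict of dicts and the residual network is updated in place; the name `max_flow` is mine.

```python
from collections import deque

def max_flow(capacity, s, t):
    """Maximum s-t flow; capacity[u][v] holds residual capacities, mutated in place."""
    for u in list(capacity):                     # ensure reverse residual edges exist
        for v in list(capacity[u]):
            capacity.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:         # BFS for a shortest augmenting path
            u = queue.popleft()
            for v, c in capacity[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:                      # no augmenting path: flow is maximum
            return flow
        bottleneck = float("inf")
        v = t
        while parent[v] is not None:             # find the bottleneck capacity
            bottleneck = min(bottleneck, capacity[parent[v]][v])
            v = parent[v]
        v = t
        while parent[v] is not None:             # push flow along the path
            capacity[parent[v]][v] -= bottleneck
            capacity[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck
```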