\documentclass{tufte-handout}
%\geometry{showframe}% for debugging purposes -- displays the margins
\newcommand{\bra}[1]{\left(#1\right)}
\usepackage{clrscode3e}
\usepackage{tikz}
\usepackage{amsmath,amsthm}
\usetikzlibrary{shapes}
\usetikzlibrary{positioning}
% Set up the images/graphics package
\usepackage{graphicx}
\setkeys{Gin}{width=\linewidth,totalheight=\textheight,keepaspectratio}
\graphicspath{{graphics/}}
\title{A Lecture on Cartesian Trees}
\author[Kaylee Kutschera, Pavel Kondratyev, Ralph Sarkis]{Kaylee Kutschera, Pavel Kondratyev, Ralph Sarkis}
\date{\today} % if the \date{} command is left out, the current date will be used
% The following package makes prettier tables. We're all about the bling!
\usepackage{booktabs}
% The units package provides nice, non-stacked fractions and better spacing
% for units.
\usepackage{units}
% The fancyvrb package lets us customize the formatting of verbatim
% environments. We use a slightly smaller font.
\usepackage{fancyvrb}
\fvset{fontsize=\normalsize}
% Small sections of multiple columns
\usepackage{multicol}
%--------Theorem Environments--------
%theoremstyle{plain} --- default
\newtheorem{thm}{Theorem}
\newtheorem{cor}[thm]{Corollary}
\newtheorem{prop}[thm]{Proposition}
\newtheorem{lem}[thm]{Lemma}
\newtheorem{conj}[thm]{Conjecture}
\newtheorem{quest}[thm]{Question}
\theoremstyle{definition}
\newtheorem{defn}[thm]{Definition}
\newtheorem{defns}[thm]{Definitions}
\newtheorem{con}[thm]{Construction}
\newtheorem{exmp}[thm]{Example}
\newtheorem{exmps}[thm]{Examples}
\newtheorem{notn}[thm]{Notation}
\newtheorem{notns}[thm]{Notations}
\newtheorem{addm}[thm]{Addendum}
\newtheorem{exer}[thm]{Exercise}
\theoremstyle{remark}
\newtheorem{rem}[thm]{Remark}
\newtheorem{rems}[thm]{Remarks}
\newtheorem{warn}[thm]{Warning}
\newtheorem{sch}[thm]{Scholium}
% These commands are used to pretty-print LaTeX commands
\newcommand{\doccmd}[1]{\texttt{\textbackslash#1}}% command name -- adds backslash automatically
\newcommand{\docopt}[1]{\ensuremath{\langle}\textrm{\textit{#1}}\ensuremath{\rangle}}% optional command argument
\newcommand{\docarg}[1]{\textrm{\textit{#1}}}% (required) command argument
\newenvironment{docspec}{\begin{quote}\noindent}{\end{quote}}% command specification environment
\newcommand{\docenv}[1]{\textsf{#1}}% environment name
\newcommand{\docpkg}[1]{\texttt{#1}}% package name
\newcommand{\doccls}[1]{\texttt{#1}}% document class name
\newcommand{\docclsopt}[1]{\texttt{#1}}% document class option name
\begin{document}
\maketitle
\begin{abstract}
This is the augmented transcript of a lecture given by Luc Devroye on the 23rd of February 2017 for a Data Structures and Algorithms class (COMP 252). The subject was Cartesian trees and quicksort.
\end{abstract}
\section{Cartesian Trees}\label{sec:cartTree}
\begin{defn}\label{def:ct}
A \textbf{Cartesian tree}\sidenote{\citet{Vuillemin-1980}} is a binary tree with the following properties:
\begin{enumerate}
\item The data are points in $\mathbb{R}^{2}$: $(x_1, y_1),\ldots,(x_n, y_n)$. We will refer to the $x_i$'s as keys and the $y_i$'s as time stamps.
\item Each node contains a single pair of coordinates.
\item It is a binary search tree with respect to the $x$-coordinates.
\item All paths from the root down have increasing $y$-coordinate.
\end{enumerate}
\end{defn}
\begin{marginfigure}
\centering
\begin{tikzpicture}
[scale=.9,auto=center,every node/.style={circle,fill=black!10, minimum size=25pt}]
\node (n01) at (0,3) {(0,1)};
\node (n102) at (3,2) {(10,2)};
\node (n34) at (1,1) {(3,4)};
\node (n78) at (2,0) {(7,8)};
\node (n153) at (5,1) {(15,3)};
\draw (n01) -- (n102);
\draw (n34) -- (n102);
\draw (n153) -- (n102);
\draw (n34) -- (n78);
\end{tikzpicture}
\caption{Example of a Cartesian tree with 5 nodes. No other configuration of these nodes satisfies the properties of a Cartesian tree.}
\end{marginfigure}
The Cartesian tree was introduced in 1980 by Jean Vuillemin in his paper ``A Unifying Look at Data Structures.'' It is easy to see that if all $x_i$'s and $y_i$'s are distinct, then the Cartesian tree is unique --- its shape is completely determined by the data.
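To make the uniqueness concrete, here is a minimal Python sketch (not part of the lecture): the pair with the smallest time stamp must be the root, and the $x$-order splits the remaining pairs between the two subtrees, so the whole tree follows by recursion.

```python
class Node:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.left = self.right = None

def build(points):
    """points: (x, y) pairs sorted by x-coordinate; returns the root."""
    if not points:
        return None
    # The pair with the smallest time stamp must be the root.
    i = min(range(len(points)), key=lambda j: points[j][1])
    root = Node(*points[i])
    root.left = build(points[:i])       # keys smaller than root.x
    root.right = build(points[i + 1:])  # keys larger than root.x
    return root

# The five points of the margin figure:
root = build(sorted([(0, 1), (3, 4), (7, 8), (10, 2), (15, 3)]))
```

Running this on the points of the margin figure reproduces exactly the tree drawn there.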
\subsection{Ordinary Binary Search Tree}
An ordinary binary search tree for $x_1,\ldots,x_n$ can be obtained by using the values $1$ to $n$ as the $y$-coordinates, in the order in which the data were given: $(x_1, 1),\ldots,(x_n, n)$. Unfortunately, this can result in a very unbalanced search tree of height $n-1$.
\subsection{Random Binary Search Tree (RBST)}
Let $(\sigma_1,\ldots,\sigma_n)$ be a uniformly random permutation of $(1,\ldots,n)$. Then the data of an RBST are $(x_{\sigma_1}, 1),\ldots,(x_{\sigma_n},n)$, or equivalently $(x_1,\tau_1),\ldots,(x_n,\tau_n)$, where $(\tau_1,\ldots,\tau_n)$ is the inverse permutation of $(\sigma_1,\ldots,\sigma_n)$. The inverse permutation is also uniformly random, since inversion is a bijection on the set of permutations.
\subsection{Treap}
Suppose we have data $x_1,\ldots,x_n$ that we want to store. To build a treap, we generate $n$ random numbers $T_1,\ldots,T_n$, called random time stamps, and pair each $x_i$ with $T_i$, giving the data pairs $(x_i, T_i)$. A treap is a combination of a binary search tree and a heap, where the $x_i$'s are the keys of the binary search tree and the $T_i$'s are the keys of the heap. Although treaps are not perfectly balanced trees, they have expected height $O(\log n)$ and are typical Cartesian trees. The treap was first described by \citet{Aragon-1989}. Treaps are random binary search trees, but are easier to maintain as data are added and deleted.
\section{Operations on Cartesian Trees}\label{sec:ops}
Compared to other data structures and especially other search trees that we have seen in class, a Cartesian tree has a nice structure that lets us define complex operations in terms of two atomic operations. We will first see the two FIX operations and then look at how they are used to construct the others.\sidenote{In definition \ref{def:ct}, the fourth property comes from the heap data structure. We give two simpler formulations for this property (henceforth called the heap property):
\begin{itemize}
\item For any node, the parent has a smaller $y$ coordinate.
\item For any node, the children have a greater $y$ coordinate.
\end{itemize}
We will use these simpler formulations because they correspond more directly to the operations we will describe.}
\subsection{Atomic operations}
Firstly, we informally define what we mean by a fix. The input is a Cartesian tree in which exactly one node, $v$, does not satisfy the heap property. We fix the tree by moving $v$ within it, keeping the invariant that only $v$ may violate the heap property. During this procedure, the node $v$ moves in only one direction, either upwards (FIX-UP) or downwards (FIX-DOWN). Since a Cartesian tree is finite, the procedure must halt, and when it does, $v$ is fixed (why this is so will be explained in the following paragraphs).
When we say that we are moving a node in the tree, we cannot do so in an arbitrary manner: each change should preserve the invariant mentioned above. We will use a simple move called a tree rotation\sidenote{More information on rotations can be found in \citet{Sleator-1988}.} to push a node either above its parent or below its children. The key property of a rotation is that it maintains all the properties of the Cartesian tree. This is easier to describe pictorially, so we refer to Figure \ref{fig:treerot}.
\begin{figure}
\centering
\begin{tikzpicture}
[scale=.6,auto=center,circ/.style={circle,fill=black!30}, triangle/.style={regular polygon, regular polygon sides=3,draw,inner sep=1pt,minimum size=25pt}]
\node[circ] (par) at (3,2) {};
\node[circ, fill=black!100] (tag) at (5,1) {};
\node[triangle] (leftp) at (1,1) {1};
\node[triangle] (left) at (4,0) {2};
\node[triangle] (right) at (6,0) {3};
\coordinate (top) at (3,2.5);
\draw (par) -- (top);
\draw (par) -- (tag);
\draw (par) -- (leftp);
\draw (tag) -- (left);
\draw (tag) -- (right);
\node[circ, fill=black!100] (par2) at (12,2) {};
\node[circ] (tag2) at (10,1) {};
\node[triangle] (leftp2) at (14,1) {3};
\node[triangle] (left2) at (9,0) {1};
\node[triangle] (right2) at (11,0) {2};
\coordinate (top2) at (12,2.5);
\coordinate (arr1l) at (7,2);
\coordinate (arr1r) at (8,2);
\coordinate (arr2l) at (6.5,1.5);
\coordinate (arr2r) at (8.5,1.5);
\draw (par2) -- (top2);
\draw (par2) -- (tag2);
\draw (par2) -- (leftp2);
\draw (tag2) -- (left2);
\draw (tag2) -- (right2);
\draw [->] (arr2l) -- (arr2r);
\end{tikzpicture}
\caption{Example of a tree rotation performed during a FIX-UP operation. The black node is misplaced and should be above the gray node.}\label{fig:treerot}
\end{figure}
In this diagram, representing one iteration of a FIX-UP operation, we have drawn two nodes and three subtrees which would be part of a bigger tree. The tagged node (in black) is misplaced: the $y$-coordinate of its parent is bigger, but all of its descendants are well ordered. We use a rotation to fix it without disturbing the $x$-wise order. We put the black node above the gray one by replacing the left child of the tagged node with its parent, and we reattach subtree 2 as the right child of the gray node. Subtree 2 remains to the left of the black node and to the right of the gray node, so the previous ordering is maintained. This is called a right rotation because the tagged node is the right child of its parent.\sidenote{Some references use the opposite naming convention and would call this a left rotation.} By reversing the direction of the arrow and swapping the colors of the nodes in the diagram above, we obtain the completely symmetric left rotation.
Secondly, we describe the FIX-UP operation in more depth and argue for its correctness. As mentioned above, the input is a node $v$ whose parent has a larger $y$-coordinate. Moreover, it is the only unsorted node, in the sense that if you list the $y$-coordinates from the root down to a descendant leaf of $v$, you obtain an increasing list with one unsorted element. Viewed as a list of $y$-coordinates, the procedure is very similar to an insertion sort. We loop, halting when the $y$-coordinate of $v$ is greater than that of its parent or when $v$ is the root of the tree. At each iteration, we perform either a right rotation or a left rotation to move $v$ up. There are two ways for the loop to stop. If $v.y > v.parent.y$, then $v$ must satisfy the heap property: its $y$-coordinate is smaller than all the $y$-coordinates of its descendants (which form a well-ordered Cartesian tree) and greater than those of its ancestors\sidenote{Recall the insertion sort comparison.} (well ordered as well). If the loop stops because $v$ is at the root, then its $y$-coordinate is smaller than that of every other node in the tree, and hence $v$ also satisfies the heap property.
\begin{codebox}
\Procname{$\proc{FIX-UP}(\id{node})$}
\li \While $\id{node} \neq \id{root}$ \&\& $\attrib{\id{node}}{y} < \attrib{\attrib{\id{node}}{parent}}{y}$ \Do
\li \Comment We assume the parent pointers are updated as well
\li \If $\id{node} \isequal \attrib{\attrib{\id{node}}{parent}}{right}$ \Then
\li $\attrib{\attrib{\id{node}}{parent}}{right} \gets \attrib{\id{node}}{left}$
\li $\attrib{\id{node}}{left} \gets \attrib{\id{node}}{parent}$
\li \Else \Comment Completely symmetric process to above
\li $\attrib{\attrib{\id{node}}{parent}}{left} \gets \attrib{\id{node}}{right}$
\li $\attrib{\id{node}}{right} \gets \attrib{\id{node}}{parent}$
\End \End
\end{codebox}
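As an implementation sketch (an alternative to the lecture's parent-pointer version, and an assumption of ours rather than the lecture's code), FIX-UP can also be written without parent pointers: descend to the misplaced node by its key, and perform the rotations while unwinding the recursion.

```python
class Node:
    def __init__(self, x, y, left=None, right=None):
        self.x, self.y = x, y
        self.left, self.right = left, right

def fix_up(root, x):
    """Rotate the node with key x upward until the heap property holds.
    Assumes that node is the only one violating the property.
    Returns the (possibly new) root of the subtree."""
    if root is None or x == root.x:
        return root
    if x < root.x:
        root.left = fix_up(root.left, x)
        child = root.left
        if child is not None and child.y < root.y:
            # promote the left child above its parent
            root.left = child.right
            child.right = root
            return child
    else:
        root.right = fix_up(root.right, x)
        child = root.right
        if child is not None and child.y < root.y:
            # promote the right child above its parent
            root.right = child.left
            child.left = root
            return child
    return root

# The tree of the margin figure, with (7, 8) replaced by (7, 0),
# so that this node violates the heap property:
t = Node(0, 1, right=Node(10, 2, Node(3, 4, right=Node(7, 0)), Node(15, 3)))
t = fix_up(t, 7)
```

After the call, $(7,0)$ has been rotated all the way to the root, and the $x$-order of the keys is unchanged.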
For the FIX-DOWN operation, we will use the same rotation as described above but we want to make the tagged node move down, as shown in Figure \ref{fig:treerot2}.
\newpage
\begin{figure}
\centering
\begin{tikzpicture}
[scale=.6,auto=center,circ/.style={circle,fill=black!30}, triangle/.style={regular polygon, regular polygon sides=3,draw,inner sep=1pt,minimum size=25pt}]
\node[circ, fill=black!100] (par) at (3,2) {};
\node[circ] (tag) at (5,1) {};
\node[triangle] (leftp) at (1,1) {1};
\node[triangle] (left) at (4,0) {2};
\node[triangle] (right) at (6,0) {3};
\coordinate (top) at (3,2.5);
\draw (par) -- (top);
\draw (par) -- (tag);
\draw (par) -- (leftp);
\draw (tag) -- (left);
\draw (tag) -- (right);
\node[circ] (par2) at (12,2) {};
\node[circ, fill=black!100] (tag2) at (10,1) {};
\node[triangle] (leftp2) at (14,1) {3};
\node[triangle] (left2) at (9,0) {1};
\node[triangle] (right2) at (11,0) {2};
\coordinate (top2) at (12,2.5);
\coordinate (arr1l) at (7,2);
\coordinate (arr1r) at (8,2);
\coordinate (arr2l) at (6.5,1.5);
\coordinate (arr2r) at (8.5,1.5);
\draw (par2) -- (top2);
\draw (par2) -- (tag2);
\draw (par2) -- (leftp2);
\draw (tag2) -- (left2);
\draw (tag2) -- (right2);
\draw [->] (arr2l) -- (arr2r);
\end{tikzpicture}
\caption{Example of a tree rotation performed during a FIX-DOWN operation. The only difference is that the tagged node starts as the parent and moves down the tree.}\label{fig:treerot2}
\end{figure}
The algorithm for this operation is quite similar, but it loops until $v$ becomes a leaf or until its $y$-coordinate becomes smaller than those of its children. There is a slight catch when choosing between a left rotation and a right rotation: since one of the children will be placed above the other, we must rotate at the child with the smaller $y$-coordinate in order to maintain the heap property.
\begin{codebox}
\Procname{$\proc{FIX-DOWN}(\id{node})$}
\li \While $\id{node}$ is not leaf \&\& ($\attrib{\id{node}}{y} > \attrib{\attrib{\id{node}}{left}}{y}$ || $\attrib{\id{node}}{y} > \attrib{\attrib{\id{node}}{right}}{y}$) \Do
\li \Comment We assume that the $y$ coordinate of a non-existing node is $\infty$
\li \If $\attrib{\attrib{\id{node}}{right}}{y} > \attrib{\attrib{\id{node}}{left}}{y}$ \Then
\li $\attrib{\id{node}}{left} \gets \attrib{\attrib{\id{node}}{left}}{right}$
\li $\attrib{\attrib{\id{node}}{left}}{right} \gets \id{node}$
\li \Else \Comment Completely symmetric process to above
\li $\attrib{\id{node}}{right} \gets \attrib{\attrib{\id{node}}{right}}{left}$
\li $\attrib{\attrib{\id{node}}{right}}{left} \gets \id{node}$
\End \End
\end{codebox}
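A corresponding Python sketch of FIX-DOWN (a recursive formulation of our own, with a missing child treated as having time stamp $\infty$, as in the pseudocode's comment):

```python
import math

class Node:
    def __init__(self, x, y, left=None, right=None):
        self.x, self.y = x, y
        self.left, self.right = left, right

def fix_down(node):
    """Push `node` down until the heap property holds below it.
    Assumes only `node` may violate the property; returns the new root."""
    def ts(n):  # time stamp of a possibly missing node
        return n.y if n is not None else math.inf
    if min(ts(node.left), ts(node.right)) < node.y:
        # rotate at the child with the smaller time stamp
        if ts(node.left) < ts(node.right):
            child = node.left
            node.left = child.right
            child.right = fix_down(node)  # node may still violate lower down
        else:
            child = node.right
            node.right = child.left
            child.left = fix_down(node)
        return child
    return node

# A tree where only the root (10, 9) violates the heap property:
t = Node(10, 9, Node(5, 2, Node(3, 4), Node(7, 8)), Node(15, 3))
t = fix_down(t)
```

The misplaced node sinks until both of its (possibly missing) children have larger time stamps, while the inorder key sequence is preserved.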
\subsection{Other operations}
We will briefly describe four operations that follow almost immediately from the two atomic operations. We leave the details of the pseudocode as an exercise.
\begin{enumerate}
\item INSERT($t$,$(x,y)$): Inputs are a Cartesian tree $t$ and a node $(x,y)$ to be added to the tree. First, insert the node $(x, \infty)$ exactly as in a binary search tree; it becomes a leaf, and the infinite time stamp means the heap property still holds. Second, change the node to $(x,y)$ and use FIX-UP to fix the Cartesian tree.
\item DELETE($t$, $node$): Inputs are a Cartesian tree $t$ and a pointer $node$ to the node to be removed. First, change $node.y$ to $\infty$ and use FIX-DOWN to move the node down the tree. It ends up as a leaf, so the second step is simply to remove it.
\item JOIN($t_1$,$t_2$): Inputs are two Cartesian trees to be joined into a single tree, with the understanding that all keys in $t_1$ are smaller than all keys in $t_2$. First, create a temporary node $(k,-\infty)$, where $k$ is any value between the largest key of $t_1$ and the smallest key of $t_2$, and let its left child be the root of $t_1$ and its right child the root of $t_2$. Second, delete the temporary node; the tree fixes itself up.
\item SPLIT($t$,$k$): Inputs are a Cartesian tree $t$ and a value $k$ that splits the tree into two trees whose $x$-coordinates are smaller than $k$ and bigger than $k$, respectively. Insert a temporary node $(k,-\infty)$; it ends up as the root, and its left and right subtrees are the two trees we are looking for.
\end{enumerate}
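To make SPLIT concrete, here is a sketch (one of several possible implementations, chosen by us rather than taken from the lecture) whose effect matches inserting the temporary node $(k,-\infty)$, rotating it to the root, and reading off the two subtrees:

```python
class Node:
    def __init__(self, x, y, left=None, right=None):
        self.x, self.y = x, y
        self.left, self.right = left, right

def split(t, k):
    """Split treap t into (keys < k, keys > k); k is assumed absent from t."""
    if t is None:
        return None, None
    if t.x < k:
        # t and its left subtree belong to the "smaller" tree
        smaller, bigger = split(t.right, k)
        t.right = smaller
        return t, bigger
    else:
        # t and its right subtree belong to the "bigger" tree
        smaller, bigger = split(t.left, k)
        t.left = bigger
        return smaller, t

# Split the tree of the margin figure at k = 5:
t = Node(0, 1, right=Node(10, 2, Node(3, 4, right=Node(7, 8)), Node(15, 3)))
left, right = split(t, 5)
```

Both returned trees remain Cartesian trees: the keys keep their relative $x$-order and every path from a root down still has increasing time stamps.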
\section{Probability review}\label{sec:prob}
To give a probabilistic analysis of treaps, we first need some background on probability.\sidenote{\citet{Grimmett-2001}} We define a \textbf{sample space} (usually denoted $\Omega$) as a set of \textbf{elementary events}, or possible outcomes of an experiment. We denote by $\mathcal{S}$ the set of all \textbf{events} on $\Omega$, namely all subsets of $\Omega$.
\begin{exmp}
If a coin is flipped: $\Omega = \{H, T\} $, $\mathcal S = \{\{H\}, \{T\}, \{H,T\}, \emptyset\}$
\end{exmp}
\begin{defn}
A \textbf{probability} ($\mathbb{P}$) is a mapping from the set of events $\mathcal S$ to $\mathbb{R}$ satisfying the following properties:
\begin{itemize}
\item[1] $\forall A \in \mathcal S, \mathbb P (A) \geq 0$
\item[2] $\mathbb P(\Omega) = 1$
\item[3] If $A_i \in \mathcal S$ for $i \in \mathbb N$ and $A_i \cap A_j = \emptyset$ for all $i \neq j$, then $\mathbb P \bra{\bigcup_i A_i} = \sum_i \mathbb P(A_i)$
\end{itemize}
\end{defn}
\begin{exmp}
Given the coin flip example from above, $\mathbb P (\{H\}) = 1/2$, $\mathbb P (\{H,T\}) = 1$, $\mathbb P (\emptyset) = 0$.
\end{exmp}
\begin{defn}
A \textbf{random variable} is a function that associates each outcome in the sample space with a real number.
\end{defn}
\begin{exmp}
If we want to model an experiment with two outcomes, we can express this as $\mathbb P(X=1)=p$ and $\mathbb P(X=0)=1-p$. In other words, the probability of the first event is $p$ and the probability of the second is $1-p$. This particular example is often referred to as the Bernoulli random variable.
\end{exmp}
Lastly, we need to develop the idea of an expected value.
\begin{defn}
The \textbf{expectation} (we can think of this as the ``mean'' or ``average'') of a discrete random variable $X$ taking values $x_1, x_2, \ldots$ is defined to be
\[\mathbb E[X] = \sum_{n=1}^\infty x_n \mathbb P(X = x_n).\]
\end{defn}
\begin{exmp}
If $X$ is the Bernoulli random variable as above, with $p \in (0,1)$ and $X \in \{0,1\}$, then
\[\mathbb E[X] = 0(1-p) + 1\cdot p = p.\]
\end{exmp}
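As a quick numerical illustration (our own, not from the lecture), the empirical average of repeated Bernoulli draws approaches the expectation $p$:

```python
import random

def bernoulli(p, rng):
    """One Bernoulli(p) draw: 1 with probability p, else 0."""
    return 1 if rng.random() < p else 0

rng = random.Random(42)  # fixed seed so the experiment is reproducible
p = 0.3
n = 100_000
mean = sum(bernoulli(p, rng) for _ in range(n)) / n  # close to p
```

This is the law of large numbers at work: the sample mean concentrates around $\mathbb E[X] = p$.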
\section{Expected value analysis for treaps}\label{analysis}
In a treap, the data are represented as $(x_1, T_1),\ldots,(x_n, T_n)$, where the $T_i$ are random time stamps. Equivalently, and for simplicity, let the data be $(1, T_1),\ldots,(n, T_n)$, where $1,\ldots,n$ will be referred to as the ranks of the nodes/pairs.
Define $D_k$ to be the depth of the node with rank $k$, namely $(k,T_k)$. Since the depth of a node equals its number of ancestors, we can write \[ D_k = \sum_{j\neq k} X_{jk} \textnormal{ where } X_{jk}=
\begin{cases}
1, & \textnormal{ if $j$ is an ancestor of $k$} \\
0, & \textnormal{ otherwise }
\end{cases}
\]
Next, we calculate the expected value of $D_k$.\marginnote{We denote by $H_k$ the $k$-th harmonic number, $\sum _{n=1}^{k }\,{\frac {1}{n}} = 1+\frac{1}{2}+ \cdots +\frac{1}{k}$. It is approximately $\ln(k)$; more precisely, it can be bounded as $H_k \in [\ln(k+1),\ln(k)+1]$. }
\[
\begin{aligned}
\mathbb E[D_k] =& \sum_{j\neq k}\mathbb P((j,T_j) \textnormal{ is an ancestor of } (k, T_k)) \\
=& \sum_{j\neq k}\mathbb P(T_j \textnormal{ is the smallest of } T_{\min(j,k)}, \ldots, T_{\max(j,k)}) \\
=& \sum_{j < k} \frac{1}{k-j+1} + \sum_{j > k} \frac{1}{j-k+1}\\
=& \left(\frac{1}{2} + \cdots + \frac{1}{k}\right) + \left(\frac{1}{2} + \cdots + \frac{1}{n-k+1}\right)\\
=& (H_{k} - 1) + (H_{n-k+1} - 1)\\
\leq & 2\ln(n)\\
\simeq & 1.39\log_2(n).
\end{aligned}
\]
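The closed form $\mathbb E[D_k] = H_k + H_{n-k+1} - 2$ can be checked exactly by summing the ancestor probabilities $1/(|j-k|+1)$ directly (a verification sketch of ours, not part of the lecture):

```python
from fractions import Fraction

def H(m):
    """m-th harmonic number, computed exactly as a fraction."""
    return sum(Fraction(1, i) for i in range(1, m + 1))

def expected_depth(n, k):
    # E[D_k] = sum over j != k of P(j is an ancestor of k) = 1/(|j-k|+1)
    return sum(Fraction(1, abs(j - k) + 1) for j in range(1, n + 1) if j != k)

n, k = 100, 37
ok = expected_depth(n, k) == H(k) + H(n - k + 1) - 2
```

Using exact fractions avoids any floating-point doubt: the two sides agree term by term for every rank $k$.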
If instead we set $T_k = \infty$, the node of rank $k$ sits at the bottom of its search path (this is the position of a freshly inserted leaf). Since $T_k$ is never the minimum, the ancestor probability becomes $\frac{1}{k-j}$ for $j < k$ and $\frac{1}{j-k}$ for $j > k$, so
\[
\mathbb E[D_k] = H_{k-1} + H_{n-k}.
\]
\begin{marginfigure}
\includegraphics[scale=0.3]{Expected.png}
\caption{Representation of a treap with the expected height difference when inserting a new node.}
\end{marginfigure}
Next, we compute the expected number of FIX-UP steps when inserting $(k, T_k)$. Each rotation moves the inserted node up by one level, so the expected number of rotations is the expected depth of the leaf $(k,\infty)$ minus the expected depth of the node once fixed. Using the two results above, this equals
\[
(H_{k-1} + H_{n-k}) - (H_{k} + H_{n-k+1} - 2) = 2 - \frac{1}{k} - \frac{1}{n-k+1} \leq 2.
\]
As the figure in the margin illustrates, we therefore expect at most two rotations to fix the Cartesian tree after an insertion.
We have shown that search, insert, and delete take $O(\log(n))$ expected time, making the treap a viable data structure for the abstract data type ``dictionary''.
\section{Quicksort}\label{sec:qsort}
Given a set of numbers, Quicksort\sidenote{This algorithm was first introduced by Tony \citet{Hoare-1962}; a full analysis of the algorithm was later carried out by Robert \citet{Sedgewick1977} in his Ph.D. thesis.} picks a pivot at random from the set, partitions the remaining numbers into elements smaller than the pivot and elements bigger than the pivot, and recurses on the two new sets. There are many different algorithms to partition the elements of the set, but the partition must run in $O(n)$ time for a set of $n$ elements. A simple pseudocode follows.
\begin{codebox}
\Procname{$\proc{Quicksort}(\id{list}, \id{low}, \id{high})$}
\li \If $\id{high} > \id{low}$ \Then
\li $\id{i} \gets$ random index between $\id{low}$ and $\id{high}$
\li $\id{a} \gets \id{list}[\id{i}]$
\li $\id{j} \gets$ index obtained by partitioning $\id{list}[\id{low} \ldots \id{high}]$ so that \\ elements before index $\id{j}$ are smaller than $\id{a}$ and \\ elements after index $\id{j}$ are greater than $\id{a}$
\li $\proc{Quicksort}(\id{list},\id{low},\id{j}-1)$
\li $\proc{Quicksort}(\id{list},\id{j}+1,\id{high})$
\End
\end{codebox}
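The pseudocode translates directly into Python. This sketch uses a simple Lomuto-style partition after swapping the random pivot to the front; the partition scheme is our choice, not one prescribed by the lecture.

```python
import random

def quicksort(lst, low, high, rng=random.Random(0)):
    """Sort lst[low..high] in place, choosing pivots at random."""
    if high > low:
        i = rng.randint(low, high)           # random pivot index
        lst[low], lst[i] = lst[i], lst[low]  # move pivot to the front
        a = lst[low]
        # Lomuto partition: after the loop, j is the pivot's final index
        j = low
        for m in range(low + 1, high + 1):
            if lst[m] < a:
                j += 1
                lst[j], lst[m] = lst[m], lst[j]
        lst[low], lst[j] = lst[j], lst[low]
        quicksort(lst, low, j - 1, rng)
        quicksort(lst, j + 1, high, rng)

data = [10, 3, 15, 0, 7, 8, 2, 1]
quicksort(data, 0, len(data) - 1)
```

Whatever pivots the random source produces, the partition step runs in $O(n)$ time, matching the requirement stated above.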
One can see that the Quicksort procedure is equivalent to building a random binary search tree: the number of comparisons needed to build a random binary search tree is the same as the number needed by Quicksort.
\subsection{Comparison Complexity}
Let $C_n$ be the number of comparisons, so that $C_n = \sum_{i=1}^{n} D_i$.
\[
\begin{aligned}
\mathbb E[C_n] =& \sum_{i=1}^{n} \mathbb E[D_i] \\
=& \sum_{i=1}^{n} (H_i + H_{n-i+1} - 2) \\
=& 2\sum_{i=1}^{n} (H_i -1)\\
=& 2\sum_{i=1}^{n} H_i - 2n\\
=& 2\sum_{i=1}^{n} \sum_{j=1}^{i} \frac{1}{j} - 2n \\
=& 2\sum_{j=1}^{n} \frac{1}{j} \sum_{i=j}^{n} 1 - 2n \\
=& 2\sum_{j=1}^{n} \frac{1}{j} (n-j+1) - 2n \\
=& 2(n+1)H_n - 4n \\
\sim & 2n\ln(n) \\
=& (2\ln 2)\, n\log_2(n) \approx 1.39\, n\log_2(n). \\
\end{aligned}
\]
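The closed form $2(n+1)H_n - 4n$ relies on the identity $\sum_{i=1}^n H_i = (n+1)H_n - n$; both can be verified exactly (a verification sketch of ours, not part of the lecture):

```python
from fractions import Fraction

def H(m):
    """m-th harmonic number, computed exactly as a fraction."""
    return sum(Fraction(1, i) for i in range(1, m + 1))

n = 200
# sum of expected depths, term by term, as in the derivation above
expected_comparisons = sum(H(i) + H(n - i + 1) - 2 for i in range(1, n + 1))
closed_form = 2 * (n + 1) * H(n) - 4 * n
```

Exact fractions again make the check airtight: the summed expectation and the closed form coincide for every $n$ tried.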
It is noteworthy that this is about 39\% more comparisons than mergesort requires, and mergesort is essentially the best possible comparison-based sorting method.
\subsection{Improvements}
One can improve Quicksort by picking three random numbers and taking their median as the pivot. The resulting number of comparisons is approximately $\frac{12}{7}n\ln(n)$. If we take the median of five rather than three random numbers, the number of comparisons drops to approximately $\frac{60}{37}n\ln(n)$.
\begin{exer}
Show that the expected number of FIX-DOWN rotations needed for the operation JOIN on two treaps of sizes $m$ and $n$, respectively, is $H_n + H_m$.
\end{exer}
\newpage
\bibliographystyle{plainnat}
\bibliography{biblio}
\end{document}