An algorithm is a well-defined procedure to solve a computational problem.
T(n) \in O(f(n)) if and only if \exists two positive constants c, n_0 such that \forall n \ge n_0, T(n) \le c\cdot f(n).
Example: T(n) = \sum_{i=0}^k a_i n^i \in O(n^k), a_k > 0.
\begin{aligned}T(n) &\le \sum_{i=0}^k|a_i|n^i \\ &\le n^k\sum_{i=0}^k|a_i|\end{aligned} since n^i \le n^k for all i \le k when n \ge 1. So for c = \sum_{i=0}^k|a_i|, n_0 = 1, T(n) \in O(n^k).
T(n) \in \Omega(f(n)) if and only if \exists c, n_0 > 0 such that \forall n \ge n_0, T(n) \ge c\cdot f(n).
Example: T(n) = \sum_{i=0}^k a_i n^i \in \Omega(n^k), a_k > 0.
Let us guess c = \frac{a_k}{2}. We need \begin{aligned}\sum_{i=0}^ka_in^i &\ge \frac{a_k}{2}n^k \\ \iff n &\ge \sum_{i=0}^{k-1} \frac{-2a_in^i}{a_k n^{k-1}}.\end{aligned} The problem is that the right-hand side is not a constant, so it cannot serve directly as n_0. But for n \ge 1 we have \frac{n^i}{n^{k-1}} \le 1 for every i \le k - 1, so it suffices to take n_0 = \max\left(1,\ \sum_{i=0}^{k-1}\left|\frac{2a_i}{a_k}\right|\right).
T(n) \in \Theta(f(n)) if and only if T(n) \in O(f(n)) and T(n) \in \Omega(f(n)).
T(n) \in o(f(n)) if and only if \forall c > 0, \exists n_0 such that \forall n \ge n_0, T(n) \le c \cdot f(n).
T(n) \in \omega(f(n)) if and only if \forall c > 0, \exists n_0 such that \forall n \ge n_0, T(n) \ge c\cdot f(n).
T(n) \in o(f(n)) iff \lim_{n\to \infty}\frac{T(n)}{f(n)} = 0.
T(n) \in \omega(f(n)) iff \lim_{n\to \infty}\frac{T(n)}{f(n)} = \infty.
If \lim_{n\to\infty}\frac{T(n)}{f(n)} = c for some constant c > 0, then T(n) \in \Theta(f(n)). (The converse need not hold, since the limit may not exist.)
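Example: n\log(n) \in o(n^2), since \lim_{n\to\infty}\frac{n\log(n)}{n^2} = \lim_{n\to\infty}\frac{\log(n)}{n} = 0.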
A reduction solves a computational problem C_1 by using an algorithm that solves another problem C_2.
A recurrence is an equation or an inequality that describes a function T(n) in terms of T's values on inputs smaller than n, together with a base case.
Example: MergeSort.
T(n) \le 2T(\frac{n}{2}) + 7n, T(2) = 2.
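For concreteness, a minimal Python sketch of MergeSort: the two half-size recursive calls plus the linear-time merge are what produce a recurrence of this form (the specific constant 7 and the base case T(2) = 2 come from the notes' cost accounting).

```python
def merge_sort(arr):
    """Sort a list; two recursive calls on halves plus a linear-time
    merge give a recurrence of the form T(n) <= 2T(n/2) + O(n)."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    # Merge the two sorted halves: the O(n) combine step.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```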
Example: Median-of-Medians.
T(n) \le T(\frac{n}{5}) + T(\frac{7n}{10}) + n, T(1) = 1
Claim: T(n) \le 10n.
Base Case: T(1) = 1 < 10.
Induction Hypothesis: \forall k < n, T(k) \le 10k.
Applying the induction hypothesis to both recursive terms: \begin{aligned}T(n) &\le 10\cdot\frac{n}{5} + 10\cdot\frac{7n}{10} + n\\ &= 2n + 7n + n\\ &= 10n\end{aligned}
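A minimal Python sketch of Median-of-Medians selection, annotated with the source of each term in the recurrence (the function name and 0-indexing are our choices):

```python
def select(arr, k):
    """Return the k-th smallest element (0-indexed) of arr in O(n) worst case.

    The median-of-medians pivot gives the recurrence from the notes:
    T(n) <= T(n/5) + T(7n/10) + O(n).
    """
    if len(arr) <= 5:
        return sorted(arr)[k]
    # Medians of groups of 5: about n/5 values.
    medians = [sorted(arr[i:i + 5])[len(arr[i:i + 5]) // 2]
               for i in range(0, len(arr), 5)]
    # Recurse to find the median of the medians: the T(n/5) term.
    pivot = select(medians, len(medians) // 2)
    lo = [x for x in arr if x < pivot]
    hi = [x for x in arr if x > pivot]
    num_eq = len(arr) - len(lo) - len(hi)
    # Each side of the partition has at most ~7n/10 elements: the T(7n/10) term.
    if k < len(lo):
        return select(lo, k)
    if k < len(lo) + num_eq:
        return pivot
    return select(hi, k - len(lo) - num_eq)
```

For example, select([9, 1, 7, 3, 5], 2) returns 5, the median.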
Using the recursion tree method, we observed that there are three possibilities in the overall runtime analysis.
Assume all subproblems are of equal size.
T(n) \le aT(\frac{n}{b}) + O(n^d), T(1) \le 1, where a is the number of recursive calls, b is the factor by which the input size shrinks, and d is the exponent of the work done at the combine step.
T(n) \in \begin{cases} O(n^d \log(n)),\ &a = b^d \\ O(n^{\log_b(a)}),\ &a > b^d \\ O(n^d),\ &a < b^d \\ \end{cases}
Proof. Assume for simplicity that n is a power of b. The recursion tree has levels j = 0, 1, \dots, \log_b(n), with a^{\log_b(n)} leaves. At level j there are a^j subproblems, each of size \frac{n}{b^j}, so the work done at level j is a^j \cdot c\left(\frac{n}{b^j}\right)^d and the total work is
\begin{aligned}\sum_{j=0}^{\log_b(n)}c\,a^j\frac{n^d}{b^{jd}} &= cn^d\sum_{j=0}^{\log_b(n)} \left(\frac{a}{b^d}\right)^j.\end{aligned}
We obtain the three runtimes from the behavior of this geometric sum, worked out below.
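Writing r = \frac{a}{b^d} for the ratio of the series, the three cases resolve as:
\begin{aligned}
a = b^d\ (r = 1):&\quad cn^d(\log_b(n) + 1) \in O(n^d\log(n))\\
a < b^d\ (r < 1):&\quad cn^d\sum_{j=0}^{\log_b(n)} r^j \le \frac{cn^d}{1-r} \in O(n^d)\\
a > b^d\ (r > 1):&\quad cn^d\sum_{j=0}^{\log_b(n)} r^j = O\left(n^d \cdot r^{\log_b(n)}\right) = O\left(a^{\log_b(n)}\right) = O\left(n^{\log_b(a)}\right)
\end{aligned}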
Input: Set P of n 2D points.
Output: All maximal points. A point p is maximal if there is no p^\prime \in P that dominates p (p^\prime.x > p.x \land p^\prime.y > p.y).
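The notes do not spell out an algorithm here; as one illustrative solution (a sort-and-sweep, not necessarily the divide-and-conquer approach the notes may intend), a Python sketch assuming distinct x-coordinates:

```python
def maximal_points(points):
    """Return the maximal points of a list of (x, y) pairs.

    Sweep from largest x to smallest, tracking the largest y seen so far:
    a point is dominated iff some point with strictly larger x also has
    strictly larger y. O(n log n) from the sort. Assumes distinct
    x-coordinates; illustrative only.
    """
    best_y = float("-inf")
    result = []
    for x, y in sorted(points, reverse=True):  # descending x
        if y >= best_y:  # no point to the right strictly exceeds this y
            result.append((x, y))
            best_y = max(best_y, y)
    return result
```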
Input: Two n-digit integers X, Y.
Output: Z = XY.
Let X = a \cdot 10^{\frac{n}{2}} + b, Y = c \cdot 10^{\frac{n}{2}} + d. Then XY = ac \cdot 10^n + (ad + bc) \cdot 10^{\frac{n}{2}} + bd, which yields a divide and conquer algorithm.
T(n) \le 4T\left(\frac{n}{2}\right) + O(n)
a = 4, b = 2, d = 1, so T(n) \in O(n^{\log_2(4)}) = O(n^2) by the Master Theorem, which is no faster than the naive approach.
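A minimal Python sketch of this four-product scheme, using Python's exact integers and assuming n is a power of 2 (the helper name is ours):

```python
def dc_multiply(x, y, n):
    """Multiply two n-digit integers via X = a*10^(n/2) + b, Y = c*10^(n/2) + d.

    Four recursive products give T(n) <= 4T(n/2) + O(n), i.e. O(n^2)."""
    if n <= 1:
        return x * y  # single-digit base case
    half = n // 2
    a, b = divmod(x, 10 ** half)  # x = a * 10^half + b
    c, d = divmod(y, 10 ** half)  # y = c * 10^half + d
    ac = dc_multiply(a, c, half)
    ad = dc_multiply(a, d, half)
    bc = dc_multiply(b, c, half)
    bd = dc_multiply(b, d, half)
    return ac * 10 ** (2 * half) + (ad + bc) * 10 ** half + bd
```

For example, dc_multiply(1234, 5678, 4) returns 7006652, matching 1234 * 5678.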