Basic-Algorithms
Introduction to Algorithm Analysis
Big O, Big Ω (Omega), and Big Θ (Theta).
Big O Notation
Big O describes the upper bound of an algorithm’s growth rate. It gives the worst-case scenario of how the algorithm performs as the input size $n$ grows.
Mathematically:
$$
O(g(n)) = \{ f(n) \mid \exists c > 0, n_0 > 0 \text{ such that } 0 \leq f(n) \leq c \cdot g(n) \text{ for all } n \geq n_0 \}.
$$
Example:
For $f(n) = 3n^2 + 2n + 1$, the Big O is $O(n^2)$.
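As a quick check against the definition, one possible (not unique) choice of constants is $c = 6$ and $n_0 = 1$:
$$
3n^2 + 2n + 1 \leq 3n^2 + 2n^2 + n^2 = 6n^2 \quad \text{for all } n \geq 1,
$$
so $0 \leq f(n) \leq 6 \cdot n^2$ for all $n \geq 1$.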
Big Ω (Omega) Notation
Big Ω provides a lower bound for the algorithm’s growth rate. It describes the best-case scenario.
Mathematically:
$$
\Omega(g(n)) = \{ f(n) \mid \exists c > 0, n_0 > 0 \text{ such that } 0 \leq c \cdot g(n) \leq f(n) \text{ for all } n \geq n_0 \}.
$$
Example:
For $f(n) = 3n^2 + 2n + 1$, the Big Ω is $\Omega(n^2)$, since the $n^2$ term dominates and $f(n)$ is at least a constant multiple of $n^2$ for large $n$.
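Here one possible choice of constants is $c = 3$ and $n_0 = 1$ (other choices also work):
$$
3n^2 + 2n + 1 \geq 3n^2 \quad \text{for all } n \geq 1,
$$
so $0 \leq 3 \cdot n^2 \leq f(n)$ for all $n \geq 1$.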
Big Θ (Theta) Notation
Big Θ provides a tight bound for the algorithm’s growth rate, meaning it is both an upper and lower bound.
Mathematically:
$$
\Theta(g(n)) = \{ f(n) \mid \exists c_1, c_2 > 0, n_0 > 0 \text{ such that } 0 \leq c_1 \cdot g(n) \leq f(n) \leq c_2 \cdot g(n) \text{ for all } n \geq n_0 \}.
$$
Example:
For $f(n) = 3n^2 + 2n + 1$, the Big Θ is $\Theta(n^2)$.
This is because $f(n)$ is bounded both above and below by constant multiples of $n^2$ for large $n$.
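Combining the two bounds above, one possible choice is $c_1 = 3$, $c_2 = 6$, and $n_0 = 1$:
$$
3n^2 \leq 3n^2 + 2n + 1 \leq 6n^2 \quad \text{for all } n \geq 1.
$$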
Limit Lemma
What Is the Limit Lemma?
The Limit Lemma provides a way to compare two functions $f(n)$ and $g(n)$ by examining the limit of their ratio as $n \to \infty$.
Mathematical Definition
If the limit of the ratio $\frac{f(n)}{g(n)}$ exists as $n \to \infty$, the relationship between $f(n)$ and $g(n)$ can be classified as:
$$
\lim_{n \to \infty} \frac{f(n)}{g(n)} =
\begin{cases}
0 & \Rightarrow f(n) \in O(g(n)) \ (\text{Big-O}) \\
c \neq 0 & \Rightarrow f(n) \in \Theta(g(n)) \ (\text{Big-Theta}) \\
\infty & \Rightarrow f(n) \in \Omega(g(n)) \ (\text{Big-Omega})
\end{cases}
$$
Examples
Example 1: Comparing $f(n) = 2n^2$ and $g(n) = n^2$
Compute the limit:
$$
\lim_{n \to \infty} \frac{f(n)}{g(n)} = \lim_{n \to \infty} \frac{2n^2}{n^2} = 2
$$
- Since the limit is $c = 2 \neq 0$, $f(n) \in \Theta(g(n))$.
($f(n)$ grows at the same rate as $g(n)$)
Example 2: Comparing $f(n) = n$ and $g(n) = n^2$
Compute the limit:
$$
\lim_{n \to \infty} \frac{f(n)}{g(n)} = \lim_{n \to \infty} \frac{n}{n^2} = \lim_{n \to \infty} \frac{1}{n} = 0
$$
- Since the limit is $0$, $f(n) \in O(g(n))$.
($f(n)$ grows slower or at the same rate as $g(n)$)
Example 3: Comparing $f(n) = n^3$ and $g(n) = n^2$
Compute the limit:
$$
\lim_{n \to \infty} \frac{f(n)}{g(n)} = \lim_{n \to \infty} \frac{n^3}{n^2} = \lim_{n \to \infty} n = \infty
$$
- Since the limit is $\infty$, $f(n) \in \Omega(g(n))$.
($f(n)$ grows faster than or at the same rate as $g(n)$)
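The three limits above can also be checked symbolically. The sketch below uses SymPy (assuming the `sympy` package is installed; the library choice is an illustration, not part of these notes):
```python
# Sanity-check the three limit computations with SymPy.
from sympy import symbols, limit, oo

n = symbols('n', positive=True)

print(limit(2*n**2 / n**2, n, oo))  # 2   -> Theta
print(limit(n / n**2, n, oo))       # 0   -> Big-O
print(limit(n**3 / n**2, n, oo))    # oo  -> Big-Omega
```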
Recursion Tree, Substitution, Master Theorem
Recursion Tree
The recursion tree method expands a recurrence level by level and sums the work done at each level. Consider, for example, the recurrence that arises in merge sort:
$$
T(n) = 2T\left(\frac{n}{2}\right) + O(n)
$$
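A sketch of the tree argument, assuming the $O(n)$ term is at most $cn$: the root does $cn$ work, its two children do $c\frac{n}{2}$ each ($cn$ per level in total), and halving $n$ down to the base case gives about $\log_2 n$ levels, so
$$
T(n) \approx \underbrace{cn + cn + \cdots + cn}_{\approx \log_2 n \text{ levels}} = O(n \log n).
$$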

Substitution
Steps for Substitution
- Guess the form of the solution
- Verify the guess via induction, using the definition of Big-O
- Solve for the constants $c$ and $n_0$
For example, consider the recurrence:
$$
T(n) = T\left(\frac{n}{6}\right) + T\left(\frac{2n}{6}\right) + O(n)
$$
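A sketch of the verification step for this recurrence, assuming the $O(n)$ term is at most $an$: guess $T(n) \leq cn$, assume the guess holds for all smaller inputs, and substitute:
$$
T(n) \leq c \cdot \frac{n}{6} + c \cdot \frac{2n}{6} + an = \frac{c}{2}n + an \leq cn \quad \text{whenever } c \geq 2a,
$$
so the guess $T(n) = O(n)$ holds, for example with $c = 2a$ and $n_0 = 1$.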
Algorithm Design
Divide and Conquer
Steps
- Divide: Divide the problem into smaller subproblems, where each subproblem is a smaller instance of the same problem.
- Conquer: Solve the smaller instances recursively by calling the same function.
- If a subproblem is small enough (the base case), solve it directly without recursion.
- Combine: Combine the solutions of the smaller subproblems to produce the final solution for the original problem.
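A minimal sketch of these three steps in Python, using a "maximum of a list" problem as the illustration (an example chosen here, not one from these notes):
```python
def dc_max(arr, lo, hi):
    # Base case: a single element is its own maximum.
    if lo == hi:
        return arr[lo]
    # Divide: split the index range into two halves.
    mid = (lo + hi) // 2
    # Conquer: solve each half recursively.
    left_max = dc_max(arr, lo, mid)
    right_max = dc_max(arr, mid + 1, hi)
    # Combine: the overall maximum is the larger of the two sub-results.
    return max(left_max, right_max)

print(dc_max([3, 1, 4, 1, 5, 9, 2, 6], 0, 7))  # 9
```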
Examples in Sorting and Searching
- Merge Sort (see the sketch after this list)
- Divide: Split the array into two halves
- Conquer: Recursively sort both halves
- Combine: Merge the two sorted halves into a single sorted array.
- Time Complexity: $O(n\log n)$
- Quick Sort (see the sketch after this list)
- Divide: Choose a pivot element and partition the array into two subarrays:
- Elements less than or equal to the pivot
- Elements greater than the pivot
- Conquer: Recursively sort the two subarrays
- Combine: Combine the sorted subarrays and the pivot
- Time Complexity:
- Best/Average Case: $O(n\log n)$
- Worst Case: $O(n^2)$
- Binary Search (see the sketch after this list)
- Divide: Find the middle element of a sorted array
- Conquer: Compare the target value with the middle element:
- If equal, the target is found
- If less, search the left half
- If greater, search the right half
- Repeat until the target is found or the subarray is empty
- Time Complexity: $O(\log n)$
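A minimal Python sketch of merge sort following the divide/conquer/combine steps above (the function name and the top-down, non-in-place style are illustrative choices):
```python
def merge_sort(arr):
    # Base case: a list of length 0 or 1 is already sorted.
    if len(arr) <= 1:
        return arr
    # Divide: split the array into two halves.
    mid = len(arr) // 2
    # Conquer: recursively sort both halves.
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```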
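A minimal Python sketch of quick sort; using the last element as the pivot and a non-in-place partition are illustrative assumptions, not requirements of the notes:
```python
def quick_sort(arr):
    # Base case: nothing to partition.
    if len(arr) <= 1:
        return arr
    # Divide: choose a pivot (here, the last element) and partition.
    pivot = arr[-1]
    rest = arr[:-1]
    less_or_equal = [x for x in rest if x <= pivot]
    greater = [x for x in rest if x > pivot]
    # Conquer: recursively sort the two subarrays.
    # Combine: sorted "<= pivot" part, then the pivot, then the "> pivot" part.
    return quick_sort(less_or_equal) + [pivot] + quick_sort(greater)

print(quick_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```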
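A minimal Python sketch of binary search, written iteratively ("repeat until the target is found or the subarray is empty"):
```python
def binary_search(sorted_arr, target):
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        # Divide: find the middle element of the current range.
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid          # target found
        elif target < sorted_arr[mid]:
            hi = mid - 1        # search the left half
        else:
            lo = mid + 1        # search the right half
    return -1                   # range is empty: target not present

print(binary_search([1, 2, 5, 7, 9], 7))  # 3
```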