Time Complexity
What is Time Complexity?
Time complexity describes how an algorithm's running time grows as a function of the input size n. It does not measure actual clock time; instead, it counts the number of basic operations (comparisons, assignments, arithmetic) the algorithm performs.
We express time complexity using Big-O notation to describe the worst-case growth rate, ignoring constants and lower-order terms.
How to Calculate Time Complexity
Rules for counting operations:
- Simple statements (assignment, comparison, arithmetic) → O(1)
- A loop running n times → O(n)
- Two nested loops each running n times → O(n²)
- A loop that halves the problem each iteration → O(log n)
- Sequential blocks → add them (take the dominant term)
- Recursive calls → use recurrence relations
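The "sequential blocks" rule deserves a concrete sketch. The class and method names below are illustrative, not from the text above: an O(n) block followed by an O(n²) block costs n + n² operations, and the dominant term wins.

```java
public class SequentialBlocks {
    // An O(n) pass followed by an O(n²) pass → O(n) + O(n²) = O(n²)
    static long countOps(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) ops++;         // first block: n operations
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) ops++;     // second block: n² operations
        return ops;                                // exactly n + n², dominated by n²
    }

    public static void main(String[] args) {
        System.out.println(countOps(100)); // 10100 = 100 + 100²
    }
}
```

For n = 100 the first block contributes under 1% of the work, which is why Big-O keeps only the n² term.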
```java
public class TimeComplexity {

    // O(1) — constant: no loops, fixed operations
    static int getElement(int[] arr, int i) {
        return arr[i]; // 1 operation → O(1)
    }

    // O(n) — linear: one loop over n elements
    static int sum(int[] arr) {
        int total = 0;      // O(1)
        for (int x : arr) { // runs n times
            total += x;     // O(1) per iteration
        }
        return total;       // O(1)
        // Total: O(1) + O(n) * O(1) + O(1) = O(n)
    }

    // O(n²) — quadratic: nested loops
    static void printAllPairs(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n; i++) {     // n iterations
            for (int j = 0; j < n; j++) { // n iterations each
                System.out.print(arr[i] + "," + arr[j] + " ");
            }
        }
        // Total: n * n = n² operations → O(n²)
    }

    // O(log n) — logarithmic: problem halves each step
    static int binarySearch(int[] arr, int target) {
        int lo = 0, hi = arr.length - 1;
        while (lo <= hi) { // log₂(n) iterations
            int mid = lo + (hi - lo) / 2; // avoids int overflow of (lo + hi) / 2
            if (arr[mid] == target) return mid;
            else if (arr[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
        // Each iteration halves the search space → O(log n)
    }

    // O(n log n) — merge sort (divide + merge)
    static void mergeSort(int[] arr, int l, int r) {
        if (l < r) {
            int mid = l + (r - l) / 2;
            mergeSort(arr, l, mid);     // T(n/2)
            mergeSort(arr, mid + 1, r); // T(n/2)
            merge(arr, l, mid, r);      // O(n)
        }
        // Recurrence: T(n) = 2T(n/2) + O(n) → O(n log n)
    }

    static void merge(int[] arr, int l, int mid, int r) {
        int n1 = mid - l + 1, n2 = r - mid;
        int[] L = new int[n1], R = new int[n2];
        for (int i = 0; i < n1; i++) L[i] = arr[l + i];
        for (int j = 0; j < n2; j++) R[j] = arr[mid + 1 + j];
        int i = 0, j = 0, k = l;
        while (i < n1 && j < n2) arr[k++] = (L[i] <= R[j]) ? L[i++] : R[j++];
        while (i < n1) arr[k++] = L[i++];
        while (j < n2) arr[k++] = R[j++];
    }
}
```
Best, Average, and Worst Case
| Algorithm | Best Case | Average Case | Worst Case |
|---|---|---|---|
| Linear Search | O(1) | O(n) | O(n) |
| Binary Search | O(1) | O(log n) | O(log n) |
| Bubble Sort | O(n) | O(n²) | O(n²) |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) |
| Quick Sort | O(n log n) | O(n log n) | O(n²) |
| Insertion Sort | O(n) | O(n²) | O(n²) |
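The best-versus-worst gap in the table can be observed directly by counting comparisons. The class below is an illustrative sketch (the comparison counter is an instrumentation detail, not part of the algorithms above): linear search does 1 comparison when the target is first, and n comparisons when the target is absent.

```java
public class LinearSearchCases {
    static int comparisons; // instrumentation: how many element comparisons the last call made

    static int linearSearch(int[] arr, int target) {
        comparisons = 0;
        for (int i = 0; i < arr.length; i++) {
            comparisons++;
            if (arr[i] == target) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] arr = {7, 3, 9, 1, 5};
        linearSearch(arr, 7);  // best case: target at index 0
        System.out.println(comparisons); // 1 comparison → O(1)
        linearSearch(arr, 42); // worst case: target absent
        System.out.println(comparisons); // n = 5 comparisons → O(n)
    }
}
```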
Master Theorem
The Master Theorem solves recurrences of the form T(n) = aT(n/b) + f(n) where a ≥ 1, b > 1:
| Case | Condition | Result |
|---|---|---|
| Case 1 | f(n) = O(n^(log_b a − ε)) for some ε > 0 | T(n) = Θ(n^(log_b a)) |
| Case 2 | f(n) = Θ(n^(log_b a)) | T(n) = Θ(n^(log_b a) · log n) |
| Case 3 | f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and a·f(n/b) ≤ c·f(n) for some c < 1 (regularity) | T(n) = Θ(f(n)) |
Example: Merge Sort — T(n) = 2T(n/2) + O(n). Here a=2, b=2, f(n)=n. log_b a = log₂2 = 1. Since f(n) = Θ(n¹) → Case 2 → T(n) = Θ(n log n).
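A second worked example (not taken from the text above, but a standard application): recursive binary search makes one call on half the range plus constant work, so T(n) = T(n/2) + O(1). Here a=1, b=2, f(n) = Θ(1) = Θ(n⁰), and log_b a = log₂1 = 0, so Case 2 gives T(n) = Θ(n⁰ · log n) = Θ(log n). A minimal recursive sketch:

```java
public class RecursiveBinarySearch {
    // Recurrence: T(n) = T(n/2) + O(1) → Θ(log n) by Master Theorem Case 2
    static int search(int[] arr, int target, int lo, int hi) {
        if (lo > hi) return -1;                      // O(1) base case
        int mid = lo + (hi - lo) / 2;                // O(1)
        if (arr[mid] == target) return mid;          // O(1)
        if (arr[mid] < target)
            return search(arr, target, mid + 1, hi); // one call on half the range
        return search(arr, target, lo, mid - 1);     // one call on half the range
    }

    public static void main(String[] args) {
        int[] arr = {1, 3, 5, 7, 9, 11};
        System.out.println(search(arr, 7, 0, arr.length - 1)); // found at index 3
        System.out.println(search(arr, 4, 0, arr.length - 1)); // absent → -1
    }
}
```

Note the contrast with merge sort: merge sort makes two half-size calls (a=2), binary search makes only one (a=1), which is exactly what drops the cost from Θ(n log n) to Θ(log n).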