Big O Notation

What is Big O Notation?

Big O Notation is a mathematical notation used to describe the upper bound of an algorithm's running time or space usage as the input size grows. It answers the question: "How does performance scale?"

Big O focuses on the worst-case scenario and describes the growth rate, not the exact number of operations. It lets you compare algorithms independently of hardware or implementation details.

Time Complexity vs Space Complexity

Type              What it measures                             Example
Time Complexity   How execution time grows with input size n   Sorting n elements takes O(n log n)
Space Complexity  How memory usage grows with input size n     Storing n elements takes O(n) space
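
The trade-off shows up when two solutions to the same problem differ only in memory use. As a rough sketch (function names here are illustrative, not from any library), reversing a list can be done with O(n) extra space or O(1) extra space, both in O(n) time:

```python
def reversed_copy(arr):
    # O(n) time, O(n) extra space: builds a second list of size n
    return [arr[i] for i in range(len(arr) - 1, -1, -1)]

def reverse_in_place(arr):
    # O(n) time, O(1) extra space: swaps pairs inside the same list
    lo, hi = 0, len(arr) - 1
    while lo < hi:
        arr[lo], arr[hi] = arr[hi], arr[lo]
        lo += 1
        hi -= 1
    return arr

print(reversed_copy([1, 2, 3, 4]))     # [4, 3, 2, 1]
print(reverse_in_place([1, 2, 3, 4]))  # [4, 3, 2, 1]
```

Both produce the same result; the in-place version is preferable when memory is tight, the copy when the original order must be preserved.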

Big O Rules

  • Drop constants — O(2n) simplifies to O(n); constants don't matter at scale
  • Drop non-dominant terms — O(n² + n) simplifies to O(n²); the dominant term wins
  • Different inputs = different variables — iterating over two separate arrays of sizes a and b is O(a + b)
  • Nested loops multiply — a loop inside a loop is O(n × m), or O(n²) if both run over the same input
  • Sequential steps add — two separate loops over n give O(n + n) = O(2n) = O(n)
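
The last two rules can be checked by counting operations directly. A minimal sketch (the counter functions are illustrative): sequential loops add their counts, nested loops multiply them.

```python
def count_sequential(n):
    # Two separate loops over n: n + n = 2n operations -> O(n)
    ops = 0
    for _ in range(n):
        ops += 1
    for _ in range(n):
        ops += 1
    return ops

def count_nested(n):
    # A loop inside a loop: n * n operations -> O(n^2)
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

print(count_sequential(10), count_nested(10))    # 20 100
print(count_sequential(100), count_nested(100))  # 200 10000
```

Growing n by 10x grows the sequential count by 10x but the nested count by 100x, which is exactly the gap between O(n) and O(n²).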

Common Complexities (Fastest to Slowest)

Notation     Name          n=10       n=100     n=1000      Example
O(1)         Constant      1          1         1           Array index access, hash lookup
O(log n)     Logarithmic   3          7         10          Binary search, balanced BST
O(n)         Linear        10         100       1,000       Linear search, array traversal
O(n log n)   Linearithmic  33         664       9,966       Merge sort, heap sort
O(n²)        Quadratic     100        10,000    1,000,000   Bubble sort, nested loops
O(2^n)       Exponential   1,024      ~10^30    ~10^301     Recursive Fibonacci, subsets
O(n!)        Factorial     3,628,800  ~10^157   impossible  Permutations, traveling salesman brute force
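
The polynomial rows of this table are easy to reproduce yourself (the helper name below is illustrative); the logarithms are base 2 and rounded to the nearest integer:

```python
import math

def growth(n):
    # Operation counts for the common polynomial-and-below complexities
    return {
        "O(1)": 1,
        "O(log n)": round(math.log2(n)),
        "O(n)": n,
        "O(n log n)": round(n * math.log2(n)),
        "O(n^2)": n * n,
    }

for n in (10, 100, 1000):
    print(n, growth(n))
# n=1000 -> O(log n)=10, O(n log n)=9966, O(n^2)=1000000, matching the table
```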

Best / Average / Worst Case

  • Best case (Ω — Omega) — minimum time needed (e.g., linear search finds target at index 0)
  • Average case (Θ — Theta) — expected time over all inputs
  • Worst case (O — Big O) — maximum time needed (e.g., linear search target not found)

In practice, we almost always care about worst case because it gives a guaranteed upper bound.
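
The gap between best and worst case is easy to observe by instrumenting linear search to count its comparisons (a sketch; the function name is illustrative):

```python
def linear_search_count(arr, target):
    # Returns (index, number of comparisons made)
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

arr = list(range(1000))
print(linear_search_count(arr, 0))   # best case:  (0, 1)     -- target at index 0
print(linear_search_count(arr, -1))  # worst case: (-1, 1000) -- target absent, n comparisons
```

Same algorithm, same input size, but 1 comparison in the best case and n in the worst: Omega(1) and O(n) for the same function.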

Amortized Analysis

Amortized analysis averages the cost of operations over a sequence. A dynamic array (like C++ std::vector) doubles its capacity when full: most appends are O(1), but an occasional resize costs O(n). Averaged over the whole sequence, each append still costs O(1) amortized.
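
A minimal sketch of such a doubling array (the class and its counter are illustrative, not a real library API) makes the claim measurable: the total number of element copies caused by all resizes stays below 2n, so the average cost per append is a constant.

```python
class DynamicArray:
    # Doubling dynamic array; counts element copies performed during resizes
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None]
        self.copies = 0

    def append(self, x):
        if self.size == self.capacity:       # full: O(n) resize
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
                self.copies += 1
            self.data = new_data
        self.data[self.size] = x             # normal O(1) append
        self.size += 1

da = DynamicArray()
n = 1024
for i in range(n):
    da.append(i)
print(da.copies)  # 1023: copies at sizes 1+2+4+...+512 < 2n -> O(1) amortized per append
```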

O(1) vs O(n) vs O(n²) Examples

C++

#include <iostream>
#include <vector>
using namespace std;

// O(1) — Constant time: does NOT depend on input size
int getFirst(const vector<int>& arr) {
    return arr[0];  // always 1 operation
}

// O(n) — Linear time: grows proportionally with n
int linearSearch(const vector<int>& arr, int target) {
    for (int i = 0; i < arr.size(); i++) {  // up to n iterations
        if (arr[i] == target) return i;
    }
    return -1;
}

// O(n²) — Quadratic time: nested loops over same input
bool hasDuplicate(const vector<int>& arr) {
    for (int i = 0; i < arr.size(); i++) {
        for (int j = i + 1; j < arr.size(); j++) {  // n*(n-1)/2 comparisons
            if (arr[i] == arr[j]) return true;
        }
    }
    return false;
}

// O(log n) — Logarithmic: halves search space each step
int binarySearch(const vector<int>& arr, int target) {
    int lo = 0, hi = arr.size() - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (arr[mid] == target) return mid;
        else if (arr[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}

int main() {
    vector<int> arr = {1, 3, 5, 7, 9, 11, 13, 15};
    cout << "O(1) first: " << getFirst(arr) << endl;          // 1
    cout << "O(n) search: " << linearSearch(arr, 9) << endl;  // 4
    cout << "O(log n) bs: " << binarySearch(arr, 9) << endl;  // 4
    cout << boolalpha << "O(n²) dup: " << hasDuplicate(arr) << endl; // false
    return 0;
}

Java

public class BigOExamples {

    // O(1) — Constant time
    static int getFirst(int[] arr) {
        return arr[0];
    }

    // O(n) — Linear time
    static int linearSearch(int[] arr, int target) {
        for (int i = 0; i < arr.length; i++) {
            if (arr[i] == target) return i;
        }
        return -1;
    }

    // O(n²) — Quadratic time
    static boolean hasDuplicate(int[] arr) {
        for (int i = 0; i < arr.length; i++) {
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[i] == arr[j]) return true;
            }
        }
        return false;
    }

    // O(log n) — Logarithmic time
    static int binarySearch(int[] arr, int target) {
        int lo = 0, hi = arr.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (arr[mid] == target) return mid;
            else if (arr[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] arr = {1, 3, 5, 7, 9, 11, 13, 15};
        System.out.println("O(1) first: "    + getFirst(arr));          // 1
        System.out.println("O(n) search: "   + linearSearch(arr, 9));   // 4
        System.out.println("O(log n) bs: "   + binarySearch(arr, 9));   // 4
        System.out.println("O(n²) dup: "     + hasDuplicate(arr));      // false
    }
}

Python

# O(1) — Constant time
def get_first(arr):
    return arr[0]  # always 1 operation regardless of arr size

# O(n) — Linear time
def linear_search(arr, target):
    for i, x in enumerate(arr):  # up to n iterations
        if x == target:
            return i
    return -1

# O(n²) — Quadratic time
def has_duplicate(arr):
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):  # n*(n-1)/2 comparisons
            if arr[i] == arr[j]:
                return True
    return False

# O(log n) — Logarithmic time
def binary_search(arr, target):
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:   return mid
        elif arr[mid] < target:  lo = mid + 1
        else:                    hi = mid - 1
    return -1

arr = [1, 3, 5, 7, 9, 11, 13, 15]
print("O(1) first:",    get_first(arr))          # 1
print("O(n) search:",   linear_search(arr, 9))   # 4
print("O(log n) bs:",   binary_search(arr, 9))   # 4
print("O(n²) dup:",     has_duplicate(arr))      # False

# O(n) duplicate check using a set — much better!
def has_duplicate_on(arr):
    seen = set()
    for x in arr:
        if x in seen: return True
        seen.add(x)
    return False

JavaScript

// O(1) — Constant time
function getFirst(arr) {
    return arr[0];  // always 1 operation
}

// O(n) — Linear time
function linearSearch(arr, target) {
    for (let i = 0; i < arr.length; i++) {  // up to n iterations
        if (arr[i] === target) return i;
    }
    return -1;
}

// O(n²) — Quadratic time
function hasDuplicate(arr) {
    for (let i = 0; i < arr.length; i++) {
        for (let j = i + 1; j < arr.length; j++) {  // n*(n-1)/2 comparisons
            if (arr[i] === arr[j]) return true;
        }
    }
    return false;
}

// O(log n) — Logarithmic time
function binarySearch(arr, target) {
    let lo = 0, hi = arr.length - 1;
    while (lo <= hi) {
        const mid = Math.floor((lo + hi) / 2);
        if (arr[mid] === target) return mid;
        else if (arr[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}

const arr = [1, 3, 5, 7, 9, 11, 13, 15];
console.log("O(1) first:",    getFirst(arr));          // 1
console.log("O(n) search:",   linearSearch(arr, 9));   // 4
console.log("O(log n) bs:",   binarySearch(arr, 9));   // 4
console.log("O(n²) dup:",     hasDuplicate(arr));      // false

// O(n) duplicate check using a Set — much better!
const hasDuplicateOn = arr => new Set(arr).size !== arr.length;
