Time complexity

8 Terms

1

Constant Time – O(1)

The execution time remains constant regardless of input size.

Example: Accessing an element in an array

def get_first_element(arr):
    return arr[0]  # Always takes the same amount of time

arr = [1, 2, 3, 4, 5]

print(get_first_element(arr))  # Output: 1

2

Logarithmic Time – O(log n)

The execution time grows logarithmically with the input size, often seen in divide-and-conquer algorithms that halve the problem at each step.

Example: Binary Search

def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1  # Element not found

arr = [1, 3, 5, 7, 9, 11]
print(binary_search(arr, 7))  # Output: 3

3

Linear Time – O(n)

The execution time increases linearly with the input size.

Example: Linear Search

def find_element(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1  # Element not found

arr = [4, 2, 9, 1, 5]
print(find_element(arr, 9))  # Output: 2

4

Linearithmic Time – O(n log n)

The execution time grows in proportion to n log n, often seen in efficient sorting algorithms such as Merge Sort or QuickSort (on average).

Example: Merge Sort

def merge_sort(arr):
    if len(arr) <= 1:
        return arr

    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

arr = [5, 2, 9, 1, 5, 6]
print(merge_sort(arr))  # Output: [1, 2, 5, 5, 6, 9]
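
Since the card also names QuickSort, here is a minimal sketch of an out-of-place quicksort for comparison (the pivot choice and helper names are illustrative, not from the original card). Its average running time is O(n log n), though the worst case degrades to O(n²).

def quick_sort(arr):
    # Average case O(n log n): each level partitions the list around a pivot
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    smaller = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    larger = [x for x in arr if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)

arr = [5, 2, 9, 1, 5, 6]
print(quick_sort(arr))  # Output: [1, 2, 5, 5, 6, 9]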

5

Quadratic Time – O(n²)

Execution time grows quadratically with input size, often seen in nested loops.

Example: Bubble Sort

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]
    return arr

arr = [5, 3, 8, 2, 1]
print(bubble_sort(arr))  # Output: [1, 2, 3, 5, 8]

6

Cubic Time – O(n³)

Execution time grows cubically, often seen in algorithms with three nested loops.

Example: Printing all triplets in an array

def find_triplets(arr):
    n = len(arr)
    for i in range(n):
        for j in range(i+1, n):
            for k in range(j+1, n):
                print(arr[i], arr[j], arr[k])

arr = [1, 2, 3, 4]
find_triplets(arr)  # Prints all C(4, 3) = 4 triplets: 1 2 3, 1 2 4, 1 3 4, 2 3 4

7

Exponential Time – O(2ⁿ)

Execution time roughly doubles with each additional unit of input size, often seen in brute-force recursive algorithms.

Example: Fibonacci Sequence (Recursive)

def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

print(fibonacci(5))  # Output: 5
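
To make the roughly-doubling growth concrete, one option (a small sketch added for illustration; the call-counting helper is not part of the original card) is to count how many recursive calls the naive version makes:

def fibonacci_calls(n):
    # Returns (fib(n), number of recursive calls made), to show the near-exponential growth
    if n <= 1:
        return n, 1
    a, calls_a = fibonacci_calls(n - 1)
    b, calls_b = fibonacci_calls(n - 2)
    return a + b, calls_a + calls_b + 1

for n in (5, 10, 15):
    value, calls = fibonacci_calls(n)
    print(n, value, calls)  # The call count grows by roughly a constant factor per step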

8

Factorial Time – O(n!)

Execution time grows factorially with the input size, seen in brute-force solutions to problems like the Traveling Salesman Problem (TSP).

Example: Generating all permutations of a string

from itertools import permutations

def generate_permutations(s):
    return list(permutations(s))

print(generate_permutations("abc"))  # 3! = 6 tuples: ('a', 'b', 'c'), ('a', 'c', 'b'), ...
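
The TSP reference above can be made concrete with a brute-force route search built on the same itertools.permutations. This is a sketch for illustration only, and the 4-city distance matrix is made up, not from the original card; trying every ordering of the remaining cities examines (n - 1)! routes, which is what makes the approach factorial.

from itertools import permutations

def tsp_brute_force(dist):
    # Try every ordering of the cities after city 0: (n - 1)! candidate routes
    n = len(dist)
    best = float("inf")
    for route in permutations(range(1, n)):
        path = (0,) + route + (0,)
        cost = sum(dist[path[i]][path[i + 1]] for i in range(n))
        best = min(best, cost)
    return best

dist = [  # Hypothetical symmetric distance matrix for 4 cities
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]
print(tsp_brute_force(dist))  # Output: 80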