A contiguous block of memory storing elements of the same type, allowing O(1) random access. Used for problems needing indexed access like finding the max element in a list.
Linked List
A sequence of nodes where each node points to the next, allowing efficient insertions and deletions. Commonly used in implementing stacks, queues, and adjacency lists.
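As a hedged illustration (not part of the original card), a minimal singly linked list in Python; the class and method names (ListNode, SinglyLinkedList, push_front) are placeholders chosen for this sketch:

    class ListNode:
        def __init__(self, value, nxt=None):
            self.value = value
            self.next = nxt

    class SinglyLinkedList:
        def __init__(self):
            self.head = None

        def push_front(self, value):
            # O(1) insertion at the head
            self.head = ListNode(value, self.head)

        def delete_after(self, node):
            # O(1) deletion, given a reference to the preceding node
            if node.next is not None:
                node.next = node.next.next

        def to_list(self):
            out, cur = [], self.head
            while cur:
                out.append(cur.value)
                cur = cur.next
            return out

    lst = SinglyLinkedList()
    for v in (3, 2, 1):
        lst.push_front(v)
    print(lst.to_list())  # [1, 2, 3]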
Doubly Linked List
A linked list where nodes have pointers to both previous and next nodes. Used in problems like LRU Cache or browser history navigation.
Stack
A LIFO (Last-In-First-Out) data structure supporting push and pop operations. Used in expression evaluation, DFS, and undo functionality.
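A small Python sketch of the classic stack use case of matching brackets, using a list for push and pop; the function name balanced and the sample inputs are illustrative:

    def balanced(expr):
        # Push opening brackets; on a closer, pop and check the match.
        pairs = {')': '(', ']': '[', '}': '{'}
        stack = []
        for ch in expr:
            if ch in '([{':
                stack.append(ch)                           # push
            elif ch in pairs:
                if not stack or stack.pop() != pairs[ch]:  # pop
                    return False
        return not stack                                   # balanced iff nothing is left

    print(balanced('([]{})'))  # True
    print(balanced('(]'))      # False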
Queue
A FIFO (First-In-First-Out) data structure where elements are processed in arrival order. Used in BFS and scheduling problems.
Deque
A double-ended queue supporting insertion and deletion from both ends. Used in sliding window problems and monotonic queue algorithms.
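For illustration, a sketch of the sliding-window maximum pattern using collections.deque as a monotonic queue; the function name and sample input are assumptions made for this example:

    from collections import deque

    def sliding_window_max(nums, k):
        # Keep a deque of indices with decreasing values; the front is the window maximum.
        dq = deque()
        out = []
        for i, x in enumerate(nums):
            while dq and nums[dq[-1]] <= x:   # drop smaller values from the back
                dq.pop()
            dq.append(i)
            if dq[0] <= i - k:                # front index has left the window
                dq.popleft()
            if i >= k - 1:
                out.append(nums[dq[0]])
        return out

    print(sliding_window_max([1, 3, -1, -3, 5, 3, 6, 7], 3))  # [3, 3, 5, 5, 6, 7]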
Priority Queue
A structure where each element has a priority and the highest-priority element is processed first. Used in Dijkstra’s shortest path and scheduling problems.
Heap
A complete binary tree maintaining the heap property (min or max at the root). Gives O(1) access to the minimum or maximum element and O(log n) insertion and removal.
Binary Heap
A heap implemented as a complete binary tree stored in an array. Used to implement priority queues and heapsort.
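One Python sketch covering the priority queue, heap, and binary heap entries above: the standard heapq module maintains an array-based binary min-heap in a plain list (a max-heap is usually simulated by pushing negated keys); the values below are illustrative:

    import heapq

    nums = [5, 1, 8, 3]
    heap = []
    for x in nums:
        heapq.heappush(heap, x)   # O(log n) insertion

    print(heap[0])                # O(1) peek at the minimum -> 1
    print(heapq.heappop(heap))    # O(log n) removal of the minimum -> 1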
Binary Search Tree (BST)
A tree where left < node < right for every node, allowing sorted traversal. Used for dynamic data that needs fast search, insert, and delete.
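A minimal BST insert-and-search sketch in Python (unbalanced, so worst-case O(n)); the class and function names are placeholders:

    class BSTNode:
        def __init__(self, key):
            self.key, self.left, self.right = key, None, None

    def insert(root, key):
        # Walk down keeping left < node < right; returns the (possibly new) root.
        if root is None:
            return BSTNode(key)
        if key < root.key:
            root.left = insert(root.left, key)
        elif key > root.key:
            root.right = insert(root.right, key)
        return root

    def search(root, key):
        while root and root.key != key:
            root = root.left if key < root.key else root.right
        return root is not None

    root = None
    for k in (8, 3, 10, 1, 6):
        root = insert(root, k)
    print(search(root, 6), search(root, 7))  # True False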
AVL Tree
A self-balancing BST that keeps the height difference between any node’s subtrees ≤ 1. Used when consistent O(log n) operations are needed, as in ordered maps.
Red-Black Tree
A balanced BST using red/black coloring rules to maintain balance. Used in C++ STL maps and sets for efficient ordering and lookup.
Segment Tree
A binary tree for range queries and updates like sum or minimum. Used in range sum or range minimum query problems.
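A compact iterative segment tree sketch for range sums in Python; the class name and the half-open query convention are choices made for this example, not part of the card:

    class SegmentTree:
        # Leaves sit at indices n..2n-1; parents hold the sum of their children.
        def __init__(self, data):
            self.n = len(data)
            self.tree = [0] * self.n + list(data)
            for i in range(self.n - 1, 0, -1):
                self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

        def update(self, i, value):
            # Point update, then refresh ancestors.
            i += self.n
            self.tree[i] = value
            while i > 1:
                i //= 2
                self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

        def query(self, l, r):
            # Sum over the half-open range [l, r).
            res = 0
            l += self.n
            r += self.n
            while l < r:
                if l % 2:
                    res += self.tree[l]
                    l += 1
                if r % 2:
                    r -= 1
                    res += self.tree[r]
                l //= 2
                r //= 2
            return res

    st = SegmentTree([2, 1, 5, 3])
    print(st.query(1, 3))  # 1 + 5 = 6
    st.update(2, 10)
    print(st.query(0, 4))  # 2 + 1 + 10 + 3 = 16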
Fenwick Tree (Binary Indexed Tree)
A compact data structure for prefix sums and updates. Used in frequency counting and cumulative sum problems.
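A minimal 1-indexed Fenwick tree sketch in Python; the class name and sample values are illustrative:

    class FenwickTree:
        def __init__(self, n):
            self.n = n
            self.bit = [0] * (n + 1)

        def add(self, i, delta):
            # Point update: add delta at index i.
            while i <= self.n:
                self.bit[i] += delta
                i += i & -i          # jump to the next node responsible for i

        def prefix_sum(self, i):
            # Sum of elements 1..i.
            s = 0
            while i > 0:
                s += self.bit[i]
                i -= i & -i          # strip the lowest set bit
            return s

    ft = FenwickTree(5)
    for idx, val in enumerate([3, 2, 0, 6, 1], start=1):
        ft.add(idx, val)
    print(ft.prefix_sum(4))          # 3 + 2 + 0 + 6 = 11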
Trie
A prefix tree storing strings by characters in sequence. Used in autocomplete, dictionary, and prefix search problems.
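A short trie sketch in Python supporting insertion and prefix search; the class names are placeholders:

    class TrieNode:
        def __init__(self):
            self.children = {}
            self.is_word = False

    class Trie:
        def __init__(self):
            self.root = TrieNode()

        def insert(self, word):
            # Walk character by character, creating nodes as needed.
            node = self.root
            for ch in word:
                node = node.children.setdefault(ch, TrieNode())
            node.is_word = True

        def starts_with(self, prefix):
            node = self.root
            for ch in prefix:
                if ch not in node.children:
                    return False
                node = node.children[ch]
            return True

    t = Trie()
    for w in ("car", "card", "care"):
        t.insert(w)
    print(t.starts_with("car"), t.starts_with("cat"))  # True False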
Hash Table
Stores key-value pairs using hashing for O(1) average access. Used in frequency counting and fast lookup problems.
Hash Function
Maps keys to numerical indices in a hash table. Used to distribute keys uniformly and minimize collisions.
Hash Collision
Occurs when two keys hash to the same index. Managed using separate chaining or open addressing.
Open Addressing
Resolves hash collisions by probing for the next available slot. Used in linear and quadratic probing hash tables.
Separate Chaining
Resolves collisions using a list or chain at each hash index. Commonly used in hash maps for efficient insertion.
Graph
A set of vertices and edges representing connections. Used to model networks, social graphs, and pathfinding problems.
Adjacency List
Stores graph connections as lists of neighbors for each node. Used for sparse graphs and efficient traversal.
Adjacency Matrix
Stores edges in a 2D matrix for constant-time edge lookups. Used for dense graphs and algorithms like Floyd–Warshall.
Directed Graph
A graph where edges have direction. Used in dependency graphs and topological sorting problems.
Undirected Graph
A graph where edges have no direction. Used in social networks and connectivity problems.
Weighted Graph
A graph where edges have associated weights or costs. Used in shortest path and minimum spanning tree problems.
Breadth-First Search (BFS)
Explores nodes level by level using a queue. Used in shortest path problems on unweighted graphs.
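A hedged sketch of BFS distances on an unweighted graph stored as an adjacency list in Python; the graph, function name, and node labels are made up for illustration:

    from collections import deque

    def bfs_distances(graph, source):
        # graph: adjacency list {node: [neighbors]}; returns hop counts from source.
        dist = {source: 0}
        q = deque([source])
        while q:
            u = q.popleft()
            for v in graph[u]:
                if v not in dist:          # first visit = shortest path in an unweighted graph
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    g = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
    print(bfs_distances(g, 'A'))  # {'A': 0, 'B': 1, 'C': 1, 'D': 2}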
Depth-First Search (DFS)
Explores as deep as possible before backtracking. Used in pathfinding, cycle detection, and connected components.
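For illustration, an iterative DFS that collects the connected components of an undirected graph; the function name and sample graph are assumptions made for this sketch:

    def connected_components(graph):
        # Iterative DFS with an explicit stack over an undirected adjacency list.
        seen, components = set(), []
        for start in graph:
            if start in seen:
                continue
            stack, comp = [start], []
            seen.add(start)
            while stack:
                u = stack.pop()
                comp.append(u)
                for v in graph[u]:
                    if v not in seen:
                        seen.add(v)
                        stack.append(v)
            components.append(comp)
        return components

    g = {1: [2], 2: [1], 3: [4], 4: [3], 5: []}
    print(connected_components(g))  # [[1, 2], [3, 4], [5]]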
Topological Sort
Orders vertices of a DAG so dependencies come first. Used in build order and task scheduling problems.
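A sketch of Kahn's queue-based algorithm in Python; returning None on a cycle is a convention chosen for this example, and the sample dependency graph is illustrative:

    from collections import deque

    def topological_sort(graph):
        # Kahn's algorithm: repeatedly remove nodes with in-degree 0.
        indeg = {u: 0 for u in graph}
        for u in graph:
            for v in graph[u]:
                indeg[v] += 1
        q = deque(u for u in graph if indeg[u] == 0)
        order = []
        while q:
            u = q.popleft()
            order.append(u)
            for v in graph[u]:
                indeg[v] -= 1
                if indeg[v] == 0:
                    q.append(v)
        return order if len(order) == len(graph) else None  # None signals a cycle

    deps = {'shirt': ['tie'], 'tie': ['jacket'], 'pants': ['jacket'], 'jacket': []}
    print(topological_sort(deps))  # ['shirt', 'pants', 'tie', 'jacket']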
Dijkstra’s Algorithm
Finds shortest paths from a source in a weighted graph. Used in routing and network optimization.
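A minimal Dijkstra sketch in Python using heapq as the priority queue, with lazy skipping of stale entries; the graph format and names are illustrative:

    import heapq

    def dijkstra(graph, source):
        # graph: {u: [(v, weight), ...]} with non-negative weights.
        dist = {source: 0}
        pq = [(0, source)]                      # priority queue of (distance, node)
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist.get(u, float('inf')):   # stale entry, skip
                continue
            for v, w in graph[u]:
                nd = d + w
                if nd < dist.get(v, float('inf')):
                    dist[v] = nd
                    heapq.heappush(pq, (nd, v))
        return dist

    g = {'A': [('B', 4), ('C', 1)], 'B': [('D', 1)], 'C': [('B', 2), ('D', 5)], 'D': []}
    print(dijkstra(g, 'A'))  # {'A': 0, 'B': 3, 'C': 1, 'D': 4}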
Bellman-Ford Algorithm
Finds shortest paths from a single source while supporting negative edge weights. Also used to detect negative cycles in graphs.
Floyd–Warshall Algorithm
Computes shortest paths between all pairs of nodes. Used for dense graphs and all-pairs routing.
Minimum Spanning Tree (MST)
Connects all vertices with minimum total edge weight. Used in designing least-cost networks.
Kruskal’s Algorithm
Builds MST by adding smallest edges without forming cycles. Used for sparse graph MST problems.
Prim’s Algorithm
Builds MST starting from one node and adding minimal edges. Used for dense graph MST construction.
Union-Find / Disjoint Set
Keeps track of connected components with union and find operations. Used in Kruskal’s MST and cycle detection.
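A compact disjoint-set sketch in Python with union by rank and path halving (a common variant of path compression); the class and method names are placeholders:

    class DisjointSet:
        def __init__(self, n):
            self.parent = list(range(n))
            self.rank = [0] * n

        def find(self, x):
            # Path halving: make nodes skip to their grandparent while walking to the root.
            while self.parent[x] != x:
                self.parent[x] = self.parent[self.parent[x]]
                x = self.parent[x]
            return x

        def union(self, a, b):
            ra, rb = self.find(a), self.find(b)
            if ra == rb:
                return False                 # already connected (would form a cycle)
            if self.rank[ra] < self.rank[rb]:
                ra, rb = rb, ra
            self.parent[rb] = ra             # attach the shorter tree under the taller one
            if self.rank[ra] == self.rank[rb]:
                self.rank[ra] += 1
            return True

    ds = DisjointSet(4)
    print(ds.union(0, 1), ds.union(2, 3), ds.union(1, 0))  # True True False
    print(ds.find(0) == ds.find(1))                        # True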
Recursion
A function calling itself on smaller subproblems until a base case is reached. Used in divide-and-conquer, DFS, and backtracking problems.
Divide and Conquer
Solves problems by dividing into subproblems, solving, and combining results. Used in merge sort, quicksort, and binary search.
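As an illustration of divide, solve, and combine, a merge sort sketch in Python; the function name is a placeholder:

    def merge_sort(arr):
        # Divide the list, sort each half recursively, then merge the sorted halves.
        if len(arr) <= 1:
            return arr
        mid = len(arr) // 2
        left = merge_sort(arr[:mid])
        right = merge_sort(arr[mid:])
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        return merged + left[i:] + right[j:]

    print(merge_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]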
Dynamic Programming
Stores subproblem results to avoid recomputation. Used in problems like knapsack, LIS, and Fibonacci.
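A short 0/1 knapsack sketch in Python using a one-dimensional DP table; the sample weights, values, and function name are made up for illustration:

    def knapsack(weights, values, capacity):
        # dp[c] = best value achievable with capacity c; iterate capacities downward
        # so each item is used at most once.
        dp = [0] * (capacity + 1)
        for w, v in zip(weights, values):
            for c in range(capacity, w - 1, -1):
                dp[c] = max(dp[c], dp[c - w] + v)
        return dp[capacity]

    print(knapsack([1, 3, 4], [15, 20, 30], 4))  # 35 (take the items of weight 1 and 3)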
Greedy Algorithm
Makes the locally optimal choice at each step in the hope of reaching a global optimum. Used in interval scheduling, coin change with canonical coin systems, and MST problems.
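For illustration, the earliest-finish-time greedy rule for selecting a maximum set of non-overlapping intervals; the function name and data are assumptions made for this sketch:

    def max_non_overlapping(intervals):
        # Greedy rule: always keep the interval that finishes earliest.
        intervals.sort(key=lambda iv: iv[1])
        count, last_end = 0, float('-inf')
        for start, end in intervals:
            if start >= last_end:
                count += 1
                last_end = end
        return count

    print(max_non_overlapping([(1, 3), (2, 4), (3, 5), (6, 7)]))  # 3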
Big-O Notation
Expresses asymptotic upper bound of algorithm growth. Used to compare efficiency across input sizes.
Time Complexity
Measures how runtime scales with input size. Used to evaluate and optimize algorithm performance.
Space Complexity
Measures additional memory usage relative to input size. Used to analyze memory efficiency.
Amortized Analysis
Averages the cost of operations over a whole sequence, so occasional expensive operations are offset by many cheap ones. Used in dynamic array resizing and union-find operations.
Cache Locality
Accessing nearby memory locations improves performance due to caching. Used to optimize loops over arrays.
Memory Fragmentation
Occurs when free memory becomes split into small blocks. Affects performance in dynamic allocation systems.
Pointer
A variable storing a memory address of another variable. Used in linked lists, trees, and dynamic memory management.
Reference
An alias for another variable allowing direct access. Used to simplify pointer syntax and avoid null-dereference issues.
Node
A basic unit of data structures storing value and links to others. Used in linked lists, trees, and graphs.
Iterator
An object providing sequential access to elements in a container. Used in traversing arrays, lists, or trees efficiently.