DATA STRUCTURES AND ALGORITHMS
ALGORITHMS TOPICS
- important for programming
Algorithm - step-by-step procedure for solving a computational problem
Program - implementation of an algorithm, written in a programming language so a computer can execute it
Phases of developing a software project
2 phases
Design Phase - make your design understandable so that you can visualize what you are going to develop
Implementation Phase - take your design and develop a program/algorithm
you CAN NOT develop programs through trial and error (:O)
In the design phase, we write the algorithm in simple English-like statements (usually with no strict syntax, since it only serves as a framework) so that we can visualize how the program will work.
In the implementation phase, that framework is turned into the desired program.
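As a small illustration of moving between the two phases: below is an English-like design for finding the largest element, followed by a Python implementation (the function name find_max is my own, just for this sketch).

```python
# Design phase (English-like pseudocode, no strict syntax):
#   set largest to the first element
#   for each remaining element:
#       if element > largest, set largest to element
#   return largest

# Implementation phase (Python):
def find_max(items):
    """Return the largest element in a non-empty list."""
    largest = items[0]
    for item in items[1:]:
        if item > largest:
            largest = item
    return largest

print(find_max([3, 7, 2, 9, 4]))  # prints 9
```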
DIFFERENCE BETWEEN THE TWO
Algorithms:
ALGORITHMS ARE WRITTEN AT DESIGN PHASE
Writer should have DOMAIN KNOWLEDGE (understand the problem area the algorithm solves)
Can be created by using ANY LANGUAGE as long as it is understandable by people using it
Hardware and Software independent (usable on any platform)
ANALYZE if algorithm is correct and efficient in time and space
Programs:
PROGRAMS ARE WRITTEN AT IMPLEMENTATION PHASE
Programmer should have PROGRAMMING KNOWLEDGE (know how to create program)
Can be created by using PROGRAMMING LANGUAGE (C, C++, Java, Python, etc.)
Hardware and Software dependent (written for a specific platform)
TESTING if program works (test case ehehheheheh)
TIME COMPLEXITY:
relationship between growth of input size and growth of operations executed
communicated through Big O notation (e.g. O(n))
O Notation: a mathematical notation used to classify algorithms according to how their run time or space requirements grow as the input size grows
Common Complexities:
Constant: O(1), runtime stays the same no matter the input size uwu
Linear: O(n), runtime grows in direct proportion to input size
Quadratic: O(n^2), runtime grows with the square of the input size; inefficient for large input sizes
Logarithmic: O(log n), grows very slowly; typical when the input is halved at each step
n log n: O(n log n), typical of efficient sorting algorithms (e.g. merge sort)
Exponential: O(2^n)
Factorial: O(n!)
For Big O notation, drop constants (O(N/10) and O(10 x N) are still equivalent to O(N))
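To see the difference between linear and quadratic growth concretely, here is a small sketch (the helper names are my own) that counts operations as input size doubles:

```python
def count_linear_ops(n):
    """O(n): one operation per element."""
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def count_quadratic_ops(n):
    """O(n^2): one operation per pair of elements (nested loops)."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

# Doubling n doubles the linear count but quadruples the quadratic count:
print(count_linear_ops(10), count_linear_ops(20))        # prints 10 20
print(count_quadratic_ops(10), count_quadratic_ops(20))  # prints 100 400
```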
SPACE COMPLEXITY:
relationship between growth of input size and growth of space needed
works the same way as time complexity
LOGS:
inverse of exponential functions; a logarithmic term in an algorithm's complexity means the work is cut down by a constant factor (usually halved) at each step, so runtime grows very slowly relative to input size
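Binary search is the classic O(log n) example: each comparison halves the remaining search range. A minimal sketch:

```python
def binary_search(sorted_items, target):
    """O(log n): each comparison halves the remaining search range."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1  # not found

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # prints 3
```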
SORTING ALGORITHMS:
are used to arrange a set of elements in a set order
How most sorting algorithms work:
2 pointers (one tracks the current position, one performs a linear scan ahead of it)
As the pointers move through the list, if the scanning pointer finds an element smaller than the element before it, the two elements are swapped
The passes are repeated until the list is sorted
(obviously there are way more efficient sorting algorithms, but this is how i understand most sorting algorithms to follow these basic concepts)
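The swap-as-you-scan idea above is basically bubble sort; a minimal sketch of it:

```python
def bubble_sort(items):
    """Repeatedly scan the list, swapping adjacent out-of-order pairs,
    until a full pass makes no swaps. O(n^2) in the worst case."""
    items = list(items)  # work on a copy
    swapped = True
    while swapped:
        swapped = False
        for i in range(1, len(items)):
            if items[i] < items[i - 1]:  # element ahead is smaller: swap
                items[i - 1], items[i] = items[i], items[i - 1]
                swapped = True
    return items

print(bubble_sort([5, 2, 9, 1, 7]))  # prints [1, 2, 5, 7, 9]
```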
ARRAYS:
Most fundamental data structure
Indexable, contiguous chunk of memory
Fixed size (cannot append/add an element to a full array, e.g. inserting a 9th element into an 8-element array)
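Python lists resize automatically, so to mimic a true fixed-size array (like C's `int arr[8]`) here is a small sketch of my own, backed by a preallocated list:

```python
class FixedArray:
    """Sketch of a fixed-size, indexable array.
    Capacity is set at creation and never grows."""
    def __init__(self, size):
        self._slots = [None] * size  # preallocate `size` slots up front

    def __getitem__(self, index):
        return self._slots[index]    # O(1) indexed access

    def __setitem__(self, index, value):
        self._slots[index] = value   # raises IndexError past capacity

arr = FixedArray(8)
arr[0] = 42
print(arr[0])   # prints 42
# arr[8] = 1    # IndexError: no room for a 9th element
```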
LINKED LIST:
Traversable data structure
Uses Nodes to store elements, where each node contains data and a reference to the next node in the sequence, allowing dynamic memory allocation and flexible memory usage.
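A minimal node-and-traversal sketch of a singly linked list (class and variable names are my own):

```python
class Node:
    """A linked-list node: holds data plus a reference to the next node."""
    def __init__(self, data):
        self.data = data
        self.next = None  # no next node yet

# Build the list 1 -> 2 -> 3 by hand, then traverse it:
head = Node(1)
head.next = Node(2)
head.next.next = Node(3)

current = head
while current is not None:   # traversal follows the `next` references
    print(current.data)      # prints 1, then 2, then 3
    current = current.next
```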
BINARY TREES:
Similar to linked lists, except each node points to a left and a right child node.
Node with no child: Leaf Node
Main Node: Parent Node/Root Node
Binary Search Tree:
The value of the key of the left sub-tree is less than the value of its parent (root) node’s key
and
The value of the key of the right sub-tree is greater than or equal to the value of its parent (root) node’s key
Binary trees’ time complexity in the worst case is O(H), where H is the height.
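A minimal BST sketch showing both ordering rules (smaller keys go left, greater-or-equal keys go right); the function names are my own:

```python
class TreeNode:
    """A binary-tree node with a key and left/right children."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_insert(root, key):
    """Smaller keys go into the left sub-tree, greater-or-equal into the right."""
    if root is None:
        return TreeNode(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

def bst_search(root, key):
    """Each step descends one level, so search costs O(H), H = tree height."""
    if root is None:
        return False
    if key == root.key:
        return True
    return bst_search(root.left if key < root.key else root.right, key)

root = None
for k in [8, 3, 10, 1, 6]:
    root = bst_insert(root, k)
print(bst_search(root, 6))   # prints True
print(bst_search(root, 7))   # prints False
```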
HEAP:
primary diff. between a Heap and a Binary Tree is that in a heap the parent nodes have greater than or equal priority to their child nodes (that’s a max-heap; in a min-heap it’s the reverse).
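Python ships a heap in the standard library: heapq maintains a min-heap (each parent <= its children), so the heap property above is just flipped. A quick demo:

```python
import heapq

# heapq maintains a MIN-heap: each parent <= its children,
# so the smallest (highest-priority) element is always at index 0.
heap = []
for value in [5, 1, 9, 3]:
    heapq.heappush(heap, value)

print(heap[0])              # prints 1 (root of the heap)
print(heapq.heappop(heap))  # prints 1
print(heapq.heappop(heap))  # prints 3
# For a max-heap (parent >= children, as described above),
# a common trick is to push negated values.
```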