Lecture 10 & 11: Understanding Program Efficiency


45 Terms

1
New cards

Why we want to understand efficiency of programs

knowt flashcard image
2
New cards

Constant Complexity

knowt flashcard image
3
New cards

Logarithmic Complexity

knowt flashcard image
4
New cards

Bisection Search

knowt flashcard image
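Since this card's answer is an image, a hedged Python sketch of bisection search may help (the function name and the index-based, copy-free variant are my choices, not necessarily the lecture's first version):

```python
def bisect_search(L, e):
    """Return True if e is in sorted list L, halving the search range each pass."""
    lo, hi = 0, len(L)
    while lo < hi:
        mid = (lo + hi) // 2   # midpoint of the remaining range
        if L[mid] == e:
            return True
        elif L[mid] < e:
            lo = mid + 1       # discard the lower half
        else:
            hi = mid           # discard the upper half
    return False
```

Each iteration cuts the range in half, so the loop runs O(log n) times.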
5
New cards

Bisection Search Complexity Analysis

knowt flashcard image
6
New cards

Complexity of First Bisection Search Method

knowt flashcard image
7
New cards

Bisection Search Alternative

knowt flashcard image
8
New cards

Complexity of 2nd Bisection Search Method

knowt flashcard image
9
New cards

Linear Complexity

  • searching a list in sequence to see if an element is present

  • iterative loops
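The first bullet can be sketched as a simple loop (function name is mine):

```python
def linear_search(L, e):
    # check each element in sequence: O(len(L)) comparisons in the worst case
    for item in L:
        if item == e:
            return True
    return False
```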

10
New cards

O() for Iterative factorial

knowt flashcard image
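The card's answer is an image; a plausible sketch of iterative factorial (name is mine) is:

```python
def fact_iter(n):
    # one multiplication per loop iteration -> O(n)
    prod = 1
    for i in range(1, n + 1):
        prod *= i
    return prod
```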
11
New cards

O() for Recursive Factorial

knowt flashcard image
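The card's answer is an image; a plausible sketch of recursive factorial (name is mine) is:

```python
def fact_recur(n):
    # n recursive calls, constant work per call -> O(n) time
    # (plus O(n) call-stack space, unlike the iterative version)
    if n <= 1:
        return 1
    return n * fact_recur(n - 1)
```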
12
New cards

Log-Linear Complexity

  • many practical algorithms are this

  • very commonly used algorithm of this complexity is merge sort
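As a hedged illustration of the merge sort mentioned above (this particular slicing implementation is my sketch):

```python
def merge_sort(L):
    # divide: O(log n) levels of halving; merge at each level: O(n)
    # -> O(n log n) overall
    if len(L) <= 1:
        return L[:]
    mid = len(L) // 2
    left, right = merge_sort(L[:mid]), merge_sort(L[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```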

13
New cards

Polynomial Complexity

  • most common algorithms of this complexity are quadratic, i.e., complexity grows with the square of the size of the input

  • commonly occurs when we have nested loops or recursive function calls
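A typical quadratic example, per the nested-loops bullet (function name is mine):

```python
def is_subset(L1, L2):
    # for each element of L1, scan all of L2:
    # len(L1) * len(L2) comparisons -> O(n**2) when both lists have length n
    for e1 in L1:
        matched = False
        for e2 in L2:
            if e1 == e2:
                matched = True
                break
        if not matched:
            return False
    return True
```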

14
New cards

Exponential Complexity

  • recursive functions where more than one recursive call for each size of problem (ex. Tower of Hanoi)

  • many important problems are inherently this type of complexity (unfortunately, the cost is high; this will lead us to consider approximate solutions, which may provide a reasonable answer more quickly)
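The Tower of Hanoi example can be sketched as follows (function and parameter names are mine); the two recursive calls on problems of size n − 1 give 2^n − 1 moves:

```python
def towers(n, frm, to, spare, moves):
    # two recursive calls for each problem size -> O(2**n) moves
    if n == 1:
        moves.append((frm, to))
    else:
        towers(n - 1, frm, spare, to, moves)  # move n-1 disks out of the way
        towers(1, frm, to, spare, moves)      # move the largest disk
        towers(n - 1, spare, to, frm, moves)  # move n-1 disks back on top

moves = []
towers(3, 'A', 'C', 'B', moves)
```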

15
New cards

Power Set - Concept

knowt flashcard image
16
New cards

Complexity of Power Set

knowt flashcard image
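Both power set cards answer with images; as a hedged sketch (function name is mine): each element is either in or out of a subset, so a set of n elements has 2^n subsets, making the generation exponential:

```python
def gen_powerset(L):
    # doubling at each level: 2**len(L) subsets -> O(2**n)
    if not L:
        return [[]]
    rest = gen_powerset(L[1:])              # subsets without L[0]
    return rest + [[L[0]] + s for s in rest]  # plus each of those with L[0]
```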
17
New cards

Why do efficient programs matter?

Even though computers are getting faster, the sizes of problems/data sets are getting larger

18
New cards

Tradeoff between time and space efficiency of a program

  • can sometimes pre-compute and store results; then use “lookup” to retrieve them (e.g., memoization for Fibonacci)

    ◦ will focus on time efficiency

19
New cards

Challenges in understanding efficiency of solution to a computational problem

  • a program can be implemented in many different ways

  • can solve a problem using only a handful of different algorithms

  • would like to separate choices of implementation from choices of more abstract algorithm

20
New cards

How to evaluate efficiency of programs

  • measure with a timer

  • count the operations

  • abstract the notion of growth (most appropriate)

21
New cards

Timing a Program

  • use time module

  • start clock

  • call function

  • stop clock

  • time = difference between start clock time and stop clock time

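The steps above can be sketched as follows (the function being timed is my placeholder; `time.perf_counter` is a higher-resolution alternative to `time.time` for this purpose):

```python
import time

def f(n):
    # placeholder function to time
    total = 0
    for i in range(n):
        total += i
    return total

start = time.perf_counter()            # start clock
f(10000)                               # call function
elapsed = time.perf_counter() - start  # stop clock; time = difference
```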
22
New cards

Timing a Program is Inconsistent

23
New cards

Counting Operations

  • assume that mathematical operations, comparisons, assignments, and accessing objects in memory are all steps that take constant time

  • then count the number of operations executed as function of size of input

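Counting can be made concrete with small examples (function names are mine; the exact tally depends on what you decide counts as one step):

```python
def c_to_f(c):
    # roughly 3 constant-time operations: multiply, divide, add
    return c * 9.0 / 5 + 32

def mysum_with_count(x):
    # count operations: 1 initial assignment, then 2 ops (add, assign)
    # per loop iteration -> 1 + 2*(x+1) total, i.e. O(x)
    total = 0
    ops = 1
    for i in range(x + 1):
        total += i
        ops += 2
    return total, ops
```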
24
New cards

Why Counting Operations is better than Timing a Program

knowt flashcard image
25
New cards

Goals of Order of Growth

  • want to evaluate program’s efficiency when input is very big

  • want to express the growth of program’s run time as input size grows

  • want to put an upper bound on growth - as tight as possible

  • do not need to be precise: “order of” not “exact” growth

  • will look at largest factors in run time ( which section of the program will take the longest to run?)

  • generally we want tight upper bound on growth, as function of size of input, in worst case

26
New cards

Big Oh Notation

measures an upper bound on the asymptotic growth (often called order of growth); used to describe worst case (occurs often and is the bottleneck when a program runs); express rate of growth of program relative to the input size; evaluate algorithm NOT machine or implementation

27
New cards

What does O(N) measure?

Interested in describing how amount of time needed grows as size of (input to) problem grows → given an expression for the number of operations needed to compute an algorithm, want to know asymptotic behavior as size of problem gets large → focus on term that grows most rapidly in a sum of terms & will ignore multiplicative constants (want to know how rapidly time required increases as increase size of input)

28
New cards

Analyzing Programs and Their Complexity

combine complexity classes by analyzing inside functions and apply some rules, focus on dominant term

29
New cards

Law of Addition for O()

  • used with sequential statements

  • O(f(n)) + O(g(n)) is O(f(n) + g(n))

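A small illustration of the addition rule (function name is mine):

```python
def sequential(n):
    # first loop: O(n); second (nested) loop: O(n**2)
    # O(n) + O(n**2) = O(n + n**2) = O(n**2): the dominant term wins
    steps = 0
    for i in range(n):
        steps += 1
    for i in range(n):
        for j in range(n):
            steps += 1
    return steps  # n + n**2 total steps
```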
30
New cards

Law of Multiplication for O()

  • used with nested statements/loops

  • O (f(n)) * O(g(n)) is O(f(n) times g(n))

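A small illustration of the multiplication rule (function name is mine):

```python
def nested(n):
    # outer loop runs n times; inner loop runs n times per outer pass
    # O(n) * O(n) = O(n * n) = O(n**2)
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1
    return steps  # n**2 total steps
```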
31
New cards

O(1)

denotes constant running time; code does not depend on size of problem

32
New cards

O(log n)

denotes logarithmic running time; reduce problem in half each time through process

33
New cards

O(n)

denotes linear running time; simple iterative or recursive programs

34
New cards

O(n log n)

denotes log-linear running time

35
New cards

O(n^c)

denotes polynomial running time (c is a constant); nested loops or recursive calls

36
New cards

O(c^n)

denotes exponential running time (c is a constant being raised to a power based on the size of the input); multiple recursive calls at each level

37
New cards

Complexity Ordered from Low to High

knowt flashcard image
38
New cards

Linear Search on Unsorted List

knowt flashcard image
39
New cards

Constant Time List Access

knowt flashcard image
40
New cards

Linear Search on Sorted List

knowt flashcard image
41
New cards

O() for Nested Loops

knowt flashcard image
42
New cards

Big Oh Summary

knowt flashcard image
43
New cards

Complexity of Common Python Functions: Lists (n is len(L))

knowt flashcard image
44
New cards

Complexity of Common Python Functions: Dictionaries (n is len(d)) (worst case)

knowt flashcard image
45
New cards

Complexity of Common Python Functions: Dictionaries (n is len(d)) (average case)

knowt flashcard image