Numerical Analysis FINAL

31 Terms

1
New cards

Numerical Analysis

The analysis and development of methods used to approximate solutions to mathematical problems.

2
New cards

In what ways is theory intertwined with numerical analysis?

Theory guarantees that the methods actually work (e.g., convergence results and error bounds).

3
New cards

In what ways are applications intertwined with numerical analysis?

  1. Applications motivate the development of new theories

  2. Applications test and validate theories

4
New cards

How can you check whether a Python package is producing a reasonable result?

  1. Compare a known (exact) result vs. the package's output

  2. Code the method yourself and compare

5
New cards

Algorithmic error

error arising from using approx. method instead of exact

6
New cards

Floating-Point error

“round-off error”: error introduced because the computer rounds numbers to finitely many digits
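
A quick illustration in plain Python (nothing assumed beyond standard 64-bit floats): 0.1 and 0.2 have no exact binary representation, so the computer stores rounded values.

```python
import sys

# 0.1 and 0.2 are rounded to the nearest representable doubles,
# so their sum is not exactly 0.3.
print(0.1 + 0.2)           # 0.30000000000000004
print(0.1 + 0.2 == 0.3)    # False

# Machine epsilon: the gap between 1.0 and the next larger double.
print(sys.float_info.epsilon)   # 2.220446049250313e-16
```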

7
New cards

Truncation error

using finite number of terms to approx. infinite process
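
For instance, truncating the infinite Taylor series e^x = sum of x^k/k! after finitely many terms; a minimal sketch (the term counts are illustrative):

```python
import math

def exp_partial(x, n_terms):
    """Approximate e**x using only the first n_terms of its Taylor series."""
    return sum(x**k / math.factorial(k) for k in range(n_terms))

x = 1.0
for n in (2, 5, 10):
    approx = exp_partial(x, n)
    # The truncation error shrinks as more terms are kept.
    print(n, approx, abs(math.exp(x) - approx))
```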

8
New cards

Bisection Method

split [a,b] in half, choose the half where the root lies (where F changes sign), and repeat
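
A minimal sketch of the method, assuming F is continuous and F(a), F(b) have opposite signs (the test function and tolerance are illustrative):

```python
def bisection(F, a, b, tol=1e-10, max_iter=200):
    """Find a root of F in [a, b], assuming F(a) and F(b) have opposite signs."""
    fa = F(a)
    if fa * F(b) > 0:
        raise ValueError("F(a) and F(b) must have opposite signs")
    for _ in range(max_iter):
        m = (a + b) / 2
        fm = F(m)
        if fm == 0 or (b - a) / 2 < tol:
            return m
        if fa * fm < 0:        # root lies in the left half
            b = m
        else:                  # root lies in the right half
            a, fa = m, fm
    return (a + b) / 2

# Example: the root of x^2 - 2 on [1, 2] is sqrt(2).
print(bisection(lambda x: x**2 - 2, 1.0, 2.0))
```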

9
New cards

Advantages of Bisection method

always converges (given a sign change on [a,b]); easy to implement

10
New cards

Disadvantages of Bisection method

slow

doesn’t work for even-multiplicity roots (no sign change)

unlikely to hit the exact solution

linear order of convergence

11
New cards

Fixed-Point Iteration

rewrite F(x) = 0 as x = g(x), then iterate p_n+1 = g(p_n)
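
A minimal sketch; the rearrangement g(x) = (x + 2/x)/2 for solving x^2 = 2 and the starting point are illustrative choices:

```python
def fixed_point(g, p0, tol=1e-12, max_iter=100):
    """Iterate p_{n+1} = g(p_n) until successive iterates agree to tol."""
    p = p0
    for _ in range(max_iter):
        p_next = g(p)
        if abs(p_next - p) < tol:
            return p_next
        p = p_next
    return p

# Solve x^2 = 2 by rewriting it as x = (x + 2/x) / 2.
print(fixed_point(lambda x: (x + 2 / x) / 2, p0=1.0))
```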

12
New cards

Advantages of FP Iteration

only one function evaluation is required per iteration

quadratic order if g'(p) = 0 at the fixed point and the second derivative is continuous

13
New cards

Disadvantages of FP Iteration

requires |g'(x)| < 1 near the fixed point to guarantee a unique solution and convergence

14
New cards

Newton’s Method

uses the tangent line at the current approximation to find the next one:
p_n+1 = p_n - F(p_n) / F'(p_n)
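
A minimal sketch, assuming the derivative F' is available in closed form (the test function and tolerance are illustrative):

```python
def newton(F, dF, p0, tol=1e-12, max_iter=50):
    """Newton's method: p_{n+1} = p_n - F(p_n) / F'(p_n)."""
    p = p0
    for _ in range(max_iter):
        p_next = p - F(p) / dF(p)
        if abs(p_next - p) < tol:
            return p_next
        p = p_next
    return p

# Root of x^2 - 2 starting from p0 = 1.
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, p0=1.0))
```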

15
New cards

Advantages of Newtons method

very fast, quadratic convergence

16
New cards

Disadvantages of Newtons method

often requires a good initial guess

two function evaluations (F and F') at each iteration

17
New cards

Secant Method

Newton's method with the derivative replaced by the slope of the secant through the last two iterates:
p_n+1 = p_n - F(p_n)(p_n - p_n-1) / (F(p_n) - F(p_n-1))
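
A minimal sketch (test function and tolerance are illustrative); note that only one new evaluation of F is needed per iteration:

```python
def secant(F, p0, p1, tol=1e-12, max_iter=50):
    """Secant method: Newton's method with F' replaced by a difference quotient."""
    f0, f1 = F(p0), F(p1)
    for _ in range(max_iter):
        p2 = p1 - f1 * (p1 - p0) / (f1 - f0)
        if abs(p2 - p1) < tol:
            return p2
        p0, f0 = p1, f1
        p1, f1 = p2, F(p2)    # only one new evaluation of F per iteration
    return p1

# Root of x^2 - 2 from two initial guesses.
print(secant(lambda x: x**2 - 2, 1.0, 2.0))
```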

18
New cards

Advantages of Secant method

fast

order of convergence (1 + sqrt(5))/2 ≈ 1.618 (superlinear)

only one new function evaluation per iteration

19
New cards

Disadvantages of Secant method

needs two good initial guesses to start

20
New cards

What is the purpose of interpolation?

estimate values between known data points

build useful approximations from limited data

21
New cards

Pros of the Global Method (Lagrange)

very accurate for smooth functions

one unified formula

22
New cards

Cons of the Global Method (Lagrange)

high-degree polynomials can oscillate badly (the Runge phenomenon; see the sketch below)

sensitive to small changes in the data

not good for discontinuous behavior
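
A minimal sketch of evaluating the Lagrange interpolant directly from its basis polynomials, using Runge's classic example 1/(1 + 25x^2) to show the oscillation (node count and test point are illustrative):

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        # L_i(x): the basis polynomial equal to 1 at xs[i] and 0 at other nodes.
        Li = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                Li *= (x - xj) / (xi - xj)
        total += yi * Li
    return total

# Runge's function on 11 equally spaced nodes: the degree-10
# interpolant oscillates badly near the ends of [-1, 1].
n = 11
xs = [-1 + 2 * i / (n - 1) for i in range(n)]
ys = [1 / (1 + 25 * x**2) for x in xs]
print(lagrange_eval(xs, ys, 0.95), 1 / (1 + 25 * 0.95**2))
```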

23
New cards

Pros of the Local Method (Piecewise)

stable

efficient for large datasets

good for irregular behavior

24
New cards

Cons of Local Method

Piecewise

not one unified formula

derivatives may not be smooth in simpler methods

requires keeping track of intervals
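
For contrast with the global method, a minimal piecewise-linear sketch using NumPy's np.interp on the same Runge example (one simple local method; splines would give smoother derivatives):

```python
import numpy as np

# Same Runge example: a piecewise-linear interpolant stays stable,
# at the cost of kinks (non-smooth derivative) at the nodes.
xs = np.linspace(-1, 1, 11)
ys = 1 / (1 + 25 * xs**2)

x_new = 0.95
# np.interp locates the interval containing x_new and interpolates linearly.
print(np.interp(x_new, xs, ys), 1 / (1 + 25 * x_new**2))
```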

25
New cards

Why do truncation and round-off errors affect the derivative approx.?

finite-difference formulas approximate a limit with a finite step size h (truncation error, which grows as h increases) and subtract nearly equal numbers (round-off error, which grows as h shrinks), so the total error depends on balancing the two effects
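
A minimal demonstration with the forward difference (F(x+h) - F(x))/h for F(x) = e^x at x = 1, where the exact derivative is e (the step sizes are illustrative):

```python
import math

F, dF_exact, x = math.exp, math.exp(1.0), 1.0
for h in (1e-1, 1e-4, 1e-8, 1e-12):
    approx = (F(x + h) - F(x)) / h
    # Large h: truncation error dominates. Tiny h: round-off error
    # from subtracting nearly equal numbers dominates.
    print(f"h={h:.0e}  error={abs(approx - dF_exact):.2e}")
```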

26
New cards

Richardson Extrapolation

improves the accuracy of an approximation by combining two estimates of the same quantity—each computed with different step sizes—so that the leading error term cancels out
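
A minimal sketch for the central difference D(h) ≈ F'(x), whose leading error term is O(h^2); combining as (4 D(h/2) - D(h)) / 3 cancels that term (the test function and h are illustrative):

```python
import math

def central_diff(F, x, h):
    """Central-difference approximation to F'(x); leading error is O(h^2)."""
    return (F(x + h) - F(x - h)) / (2 * h)

def richardson(F, x, h):
    """Combine step sizes h and h/2 so the O(h^2) error term cancels."""
    return (4 * central_diff(F, x, h / 2) - central_diff(F, x, h)) / 3

x, h = 1.0, 0.1
exact = math.exp(1.0)                                # d/dx e^x = e^x
print(abs(central_diff(math.exp, x, h) - exact))     # ~4.5e-3
print(abs(richardson(math.exp, x, h) - exact))       # ~6e-7, far smaller
```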

27
New cards

Advantages of Richardson Extrapolation

increases accuracy

reduces truncation error

helps choose a good step size

28
New cards

Disadvantages of Richardson Extrapolation

sensitive to round-off error

assumes smooth error behavior

requires extra computations

not ideal for extremely small h

29
New cards

Gradient Descent

finds the minimum of a function by iteratively moving in the direction of steepest decrease, opposite to the gradient, until the iterates converge to an acceptable approximation of the minimum
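
A minimal one-dimensional sketch (the objective, learning rate, and stopping rule are illustrative choices):

```python
def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10_000):
    """Minimize a function of one variable by stepping opposite its gradient."""
    x = x0
    for _ in range(max_iter):
        x_next = x - lr * grad(x)      # move against the gradient
        if abs(x_next - x) < tol:      # stop when the steps become tiny
            return x_next
        x = x_next
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is x = 3.
print(gradient_descent(lambda x: 2 * (x - 3), x0=0.0))
```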

30
New cards

Challenges of gradient descent for a cost function with many variables

lots of data (each iteration takes time)

computing the full gradient every step is very expensive

getting stuck in local minima or saddle points

31
New cards

Ways to mitigate these GD challenges

pre-coding analysis to choose a good initial learning rate

use a random sample of the data to “train” (stochastic/minibatch gradient descent; see the sketch below)
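
A minimal sketch of the random-sampling idea, often called stochastic or minibatch gradient descent, on a made-up one-parameter least-squares fit (all data and hyperparameters are illustrative):

```python
import random

random.seed(0)
# Made-up data for the model y ≈ w * x with true w = 2.
data = [(x / 100, 2.0 * (x / 100) + random.uniform(-0.01, 0.01))
        for x in range(1, 101)]

w, lr, batch_size = 0.0, 0.1, 10
for _ in range(2000):
    batch = random.sample(data, batch_size)   # a random subset, not all data
    # Gradient of the mean squared error (w*x - y)^2 over the batch.
    grad = sum(2 * (w * x - y) * x for x, y in batch) / batch_size
    w -= lr * grad
print(w)   # close to the true value 2.0
```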
