Module 5 Textbook readings

17 Terms

1

Estimator/estimate

  • Estimator

    • a rule/function used to estimate a population parameter from sample data

      • the sample mean Xbar is an estimator of the population mean mu

      • e.g., to estimate the mean almond weight, we sum 100 almonds’ weights, divide by 100, and use this value to estimate the population mean weight

      • changes depending on the sample

  • Estimate

    • the actual numerical value obtained by applying an estimator to a given sample

      • if Xbar = 5.2, then 5.2 is the estimate of mu

  • an estimator is a formula; on its own it does not return a number
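The distinction can be sketched in R (the almond weights below are made-up illustrative numbers):

```r
# The estimator is the rule itself: a function of the sample
sample_mean <- function(x) sum(x) / length(x)

# Applying the estimator to a concrete sample returns an estimate (a number)
almond_weights <- c(5.1, 5.4, 5.0, 5.3)  # hypothetical sample
estimate <- sample_mean(almond_weights)
estimate  # 5.2
```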

2

Unbiased vs biased estimator

  1. An unbiased estimator is one whose expected value equals the population parameter, e.g. E(Xbar) = mu

    1. on average, the sample MEANS will equal the population mean

  2. A biased estimator is one whose expected value does not equal the population parameter: on average, the sample means miss it
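A quick simulation sketch of unbiasedness (the population here is made up: normal, mean 5):

```r
set.seed(1)
population <- rnorm(100000, mean = 5, sd = 0.7)  # hypothetical almond weights

# The average of many sample means lands on the population mean
sample_means <- replicate(2000, mean(sample(population, 100)))
mean(sample_means) - mean(population)  # close to 0: the sample mean is unbiased
```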

3

SD in normal distribution

  1. One SD covers 68.27%

  2. Two SDs cover 95.45%

  3. Three SDs cover 99.73%
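These percentages can be checked with R's normal CDF, pnorm():

```r
# Area within k standard deviations of the mean for a standard normal
within <- function(k) pnorm(k) - pnorm(-k)
round(within(1) * 100, 2)  # 68.27
round(within(2) * 100, 2)  # 95.45
round(within(3) * 100, 2)  # 99.73
```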

4

(LC8.4) What does the population standard deviation (σ) represent?

B. It measures the average difference between each almond’s weight and the population mean weight.

5

Z score and SD’s

  1. A z-score is how far a value is from the mean, measured in SDs

    1. in the picture, if x = 11, it is 3 standard deviations above the mean

  2. 68% → 1

  3. 95% → 1.96

  4. 99.7% → 3

  5. So for a 95% CI, the z-score distance from the sample statistic to the population parameter is less than 1.96
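A minimal z-score sketch (the mean of 8 and SD of 1 are assumed values for illustration, since the referenced picture isn't reproduced here):

```r
# z-score: distance from the mean in units of SD
z_score <- function(x, mu, sigma) (x - mu) / sigma
z_score(11, mu = 8, sigma = 1)  # 3: x = 11 sits 3 SDs above the mean
```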

6

Point estimate vs Confidence Interval fishing example

  1. point estimate is spear fishing

  2. CI is like throwing a net around the ripples

7

(LC8.6) How is the standard error of the sample mean weight of almonds calculated?

Answer: B. By dividing the population standard deviation by the square root of the sample size.
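A one-line sketch of the formula (the population SD and sample size here are hypothetical):

```r
# Standard error of the sample mean: sigma / sqrt(n)
sigma <- 3.6  # hypothetical population SD
n <- 100      # sample size
sigma / sqrt(n)  # 0.36
```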

8

Sample mean relation to CLT facts

  1. the sampling distribution of the sample mean is approximately normal with mean equal to the population mean and standard deviation given by standard error sigma/sqrt(n)

    1. this holds because, for sufficiently large sample sizes, the sampling distribution of the mean tends toward a normal distribution regardless of the population's shape
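A simulation sketch of the CLT using a deliberately skewed population (exponential, mean 1, SD 1):

```r
set.seed(42)
population <- rexp(100000, rate = 1)  # skewed population: mean 1, sd 1

# Sampling distribution of the mean for n = 100
sample_means <- replicate(5000, mean(sample(population, 100)))
mean(sample_means)  # close to the population mean, 1
sd(sample_means)    # close to the standard error sigma/sqrt(n) = 1/10 = 0.1
```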

9

Reason for t-dist, and benefits

  1. required since we don’t know the population SD, must use sample SD

  2. allows us to create a 95% confidence interval based entirely on our sample information using sample mean and sample SD

10

(LC8.8) Why does the t distribution have thicker tails compared to the standard normal distribution?

  • D. Because it accounts for the extra uncertainty that comes from using the sample standard deviation instead of the population standard deviation.

(LC8.9) What is the effect of increasing the degrees of freedom on the t distribution?

  • B. The tails of the distribution become thinner.

11

Degrees of freedom for sample mean problems

  1. n-1

12

Creating a t-dist code and finding lower and upper bounds

# sample mean and SD alone
almonds_sample_100 |>
  summarize(sample_mean = mean(weight), sample_sd = sd(weight))

# add 95% CI bounds; 1.98 is the t critical value qt(0.975, df = 99)
almonds_sample_100 |>
  summarize(sample_mean = mean(weight), sample_sd = sd(weight),
            lower_bound = mean(weight) - 1.98 * sd(weight) / sqrt(length(weight)),
            upper_bound = mean(weight) + 1.98 * sd(weight) / sqrt(length(weight)))

13

Comparison between construction of CI with and without known sigma

  1. for a 95% CI, the critical value is about 1.98 in a t-dist since the degrees of freedom are n - 1 = 99 (vs 1.96 for the standard normal)

  2. t_critical <- qt(0.975, df)
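The two critical values can be compared directly in R:

```r
# 95% critical values with df = n - 1 = 99
qt(0.975, df = 99)  # ~1.984, the "1.98" used for the t-based CI
qnorm(0.975)        # ~1.960, the familiar normal value
```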

14

qnorm function

  1. used to find the appropriate critical values (standard error multipliers) in CI calculations

    1. qnorm(CI level + half of the remaining area)

      1. equivalently, with alpha = 1 - CI, use qnorm(1 - alpha/2)

    2. so for 90% it would be qnorm(0.9 + 0.05) = qnorm(0.95)
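A small helper illustrating the rule (the function name z_critical is mine, not from the text):

```r
# Critical value for confidence level C: qnorm(1 - (1 - C) / 2)
z_critical <- function(C) qnorm(1 - (1 - C) / 2)
z_critical(0.90)  # ~1.645
z_critical(0.95)  # ~1.960
z_critical(0.99)  # ~2.576
```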

15

Relationship between margin of error and sample size

  1. If the sample size increases, the margin of error decreases in proportion to 1/√n. For example, with a random sample of size 25, 1/√25 = 0.2, and with a sample of size 100, 1/√100 = 0.1.

  2. an increase in sample size means a decrease in margin of error
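The 1/√n shrinkage can be checked directly:

```r
# Margin of error scales like 1/sqrt(n), holding sigma and confidence fixed
1 / sqrt(25)   # 0.2
1 / sqrt(100)  # 0.1 -> quadrupling n halves the margin of error
```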

16

Why larger samples are better and tradeoffs

  1. larger samples = narrower CI

    1. however, larger samples create logistical problems with data collection, cost, etc.
