Econometrics: Chapters 5 and 6

32 Terms

1

The dummy variable trap is an example of:

perfect multicollinearity.
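
A minimal sketch of why the dummy variable trap produces perfect multicollinearity, on simulated data (the variable names and numbers are illustrative assumptions, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
male = rng.integers(0, 2, size=100)  # 1 for males, 0 otherwise
female = 1 - male                    # 1 for females, 0 otherwise

# Design matrix with an intercept plus a dummy for EVERY category:
# the constant column equals male + female in every row, so the
# columns are linearly dependent (perfect multicollinearity).
X = np.column_stack([np.ones(100), male, female])

print(np.linalg.matrix_rank(X))  # 2, not 3: X'X is singular
```

Dropping one of the dummies (or the intercept) restores full column rank.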

2

The following OLS assumption is most likely violated by omitted variable bias:

E(ui|Xi) = 0

3

Under the least squares assumptions for the multiple regression problem (zero conditional mean for the error term, all Xi and Yi being i.i.d., all Xi and ui having finite fourth moments, no perfect multicollinearity), the OLS estimators for the slopes and intercept:

are unbiased and consistent.

4

If you had a two-regressor regression model, then omitting one variable which is relevant

can result in a negative value for the coefficient of the included variable, even though the coefficient would have a significant positive effect on Y if the omitted variable were included.

5

When you have an omitted variable problem, the assumption that E(ui|Xi) = 0 is violated. This implies that

the OLS estimator is no longer consistent.

6

When there are omitted variables in the regression, which are determinants of the dependent variable, then

the OLS estimator is biased if the omitted variable is correlated with the included variable.
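
To make cards 5 and 6 concrete, here is a small simulation sketch (all coefficients and names are illustrative assumptions): the true model contains X1 and X2, but X2 is omitted even though it is correlated with X1, so the OLS slope on X1 is biased and the bias does not shrink as the sample grows.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)            # omitted variable, correlated with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# "Short" regression that omits x2
X_short = np.column_stack([np.ones(n), x1])
beta_short = np.linalg.lstsq(X_short, y, rcond=None)[0]

print(beta_short[1])  # ~4.4 = 2.0 + 3.0 * 0.8, not the true slope 2.0
```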

7

Imperfect multicollinearity:

implies that it will be difficult to estimate precisely one or more of the partial effects using the data at hand.

8

Imperfect multicollinearity:

means that two or more of the regressors are highly correlated.

9

In a two-regressor regression model, if you exclude one of the relevant variables then

you are no longer controlling for the influence of the other variable.

10

The intercept in the multiple regression model

determines the height of the regression line.

11

In a multiple regression framework, the slope coefficient on the regressor X2i

is measured in the units of Yi divided by units of X2i.

12

In the multiple regression model, the least squares estimator is derived by

minimizing the sum of squared prediction mistakes.

13

The sample regression line estimated by OLS

is the line that minimizes the sum of squared prediction mistakes.

14

The OLS residuals in the multiple regression model

can be calculated by subtracting the fitted values from the actual values.
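
Cards 12-14 describe the same mechanics; here is a minimal sketch on simulated data (names and values are illustrative assumptions): OLS solves the normal equations that minimize the sum of squared prediction mistakes, and the residuals are the actual values minus the fitted values.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

# Minimizing sum((y - X b)^2) over b yields the normal equations X'X b = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

fitted = X @ beta_hat
residuals = y - fitted        # actual values minus fitted values
print(beta_hat, (residuals ** 2).sum())
```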

15

The main advantage of using multiple regression analysis over differences-in-means testing is that the regression technique

gives you a quantitative estimate of the effect of a unit change in X.

16

Consider the multiple regression model with two regressors X1 and X2, where both variables are determinants of the dependent variable. When omitting X2 from the regression, there will be omitted variable bias for β̂1

if X1 and X2 are correlated.
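
The standard omitted-variable-bias result behind this card (a sketch of the usual textbook formula, not quoted from the source): omitting X2 from the two-regressor model sends the short-regression slope estimator to

```latex
\[
  \hat{\beta}_1 \;\xrightarrow{p}\; \beta_1
    + \beta_2 \,\frac{\operatorname{Cov}(X_{1i}, X_{2i})}{\operatorname{Var}(X_{1i})},
\]
```

so the bias vanishes only if β2 = 0 (X2 is irrelevant) or Cov(X1, X2) = 0 (the regressors are uncorrelated).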

17

Consider the multiple regression model with two regressors X1 and X2, where both variables are determinants of the dependent variable. You first regress Y on X1 only and find no relationship. However, when regressing Y on X1 and X2, the slope coefficient β̂1 changes by a large amount. This suggests that your first regression suffers from

omitted variable bias.

18

You have to worry about perfect multicollinearity in the multiple regression model because

the OLS estimator cannot be computed in this situation.

19

One of the least squares assumptions in the multiple regression model is that you have random variables which are "i.i.d." This stands for

independently and identically distributed.

20

Under the least squares assumptions for the multiple regression problem (zero conditional mean for the error term, all Xi and Yi being i.i.d., all Xi and ui having finite fourth moments, no perfect multicollinearity), the OLS estimators for the slopes and intercept

are unbiased and consistent.

21

Imagine you regressed earnings of individuals on a constant, a binary variable ("Male") which takes on the value 1 for males and is 0 otherwise, and another binary variable ("Female") which takes on the value 1 for females and is 0 otherwise. Because females typically earn less than males, you would expect

none of the OLS estimators to exist because there is perfect multicollinearity.

22

If you wanted to test, using a 5% significance level, whether or not a specific slope coefficient is equal to one, then you should:

subtract 1 from the estimated coefficient, divide the difference by the standard error, and check if the resulting ratio is larger than 1.96.

23

In the multiple regression model, the t-statistic for testing that the slope is significantly different from zero is calculated:

by dividing the estimate by its standard error.
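
A worked sketch of cards 22-23 with made-up numbers (β̂ = 1.46 and SE = 0.21 are assumptions for illustration only, not estimates from real data):

```python
beta_hat, se = 1.46, 0.21  # hypothetical estimate and standard error

t_zero = beta_hat / se            # card 23: testing H0: beta = 0
t_one = (beta_hat - 1.0) / se     # card 22: testing H0: beta = 1

print(round(t_zero, 2), round(t_one, 2))  # 6.95 and 2.19
# Both exceed 1.96, so both nulls are rejected at the 5% level.
```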

24

The homoskedasticity-only F-statistic and the heteroskedasticity-robust F-statistic typically are:

different.

25

The critical value of F(4, ∞) at the 5% significance level is:

2.37
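
A quick way to verify the 2.37 figure (assuming SciPy is available): an F(q, ∞) random variable is a chi-squared(q) variable divided by q, so the 5% critical value follows from the chi-squared quantile.

```python
from scipy.stats import chi2

# 5% critical value of F(4, infinity) = (95th percentile of chi2(4)) / 4
print(chi2.ppf(0.95, df=4) / 4)  # ~2.37
```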

26

When there are two coefficients, the resulting confidence sets are:

ellipses.

27

Suppose you run a regression of test scores against parking lot area per pupil. Is the R² likely to be high or low?

High, because parking lot area is correlated with student-teacher ratio, with whether the school is in a suburb or a city, and possibly with district income.

28

Are the OLS estimators likely to be biased and inconsistent?

The OLS estimators are likely biased and inconsistent because there are omitted variables correlated with parking lot area per pupil that also explain test scores, such as ability.

29

The confidence interval for a single coefficient in a multiple regression

contains information from a large number of hypothesis tests.

30

When testing a joint hypothesis, you should

use the F-statistic and reject at least one of the hypotheses if the statistic exceeds the critical value.

31

The overall regression F-statistic tests the null hypothesis that

all slope coefficients are zero.

32

For a single restriction (q = 1), the F-statistic

is the square of the t-statistic.
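
A minimal sketch checking card 32 on simulated data (illustrative names and numbers; assumes statsmodels with its default homoskedasticity-only standard errors):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 0.5 * x + rng.normal(size=200)

res = sm.OLS(y, sm.add_constant(x)).fit()
t = res.tvalues[1]                 # t-statistic for the slope
F = res.f_test("x1 = 0").fvalue    # F-statistic for the same single restriction

print(t ** 2, F)  # equal up to floating-point rounding
```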