"Stationary, weakly dependent time series are ideal for use in multiple regression analysis." Define what is meant by covariance stationarity in the context of time series analysis.
Covariance stationarity means that the statistical properties of a time series, such as the mean, variance and autocovariance, do not change over time. Alternatively, a time series is covariance stationary if its mean and variance are constant over time, and the covariance between observations in two time periods depends only on the time lag between them, not on the actual time at which the covariance is calculated.
What is strict stationarity in time series econometrics?
Strict stationarity refers to a property of a stochastic process {xt : t = 1, 2, …} where the joint distribution of any set of observations at different time indices remains unchanged when shifted in time.
What is the practical role of stationarity in time series econometrics?
Stationarity helps to ensure that the relationships observed are stable and interpretable over time, making it a key concept in regression analysis for time series data. Without it, the results of the analysis may not be reliable.
Explain what it means for a time series to be weakly dependent.
A time series is weakly dependent if the correlation between observations diminishes as the time lag between them increases. Alternatively, observations that are far apart in time have little to no correlation with each other.
How does weak dependence influence the ability to draw valid conclusions from time series data?
Weak dependence ensures that correlations between observations in time series data diminish over time, allowing aggregated observations to converge to a normal distribution. This convergence is essential for making valid statistical inferences, such as hypothesis tests and constructing confidence intervals, enabling accurate analysis of relationships between variables.
Describe the characteristics of an autoregressive process of order one AR(1) with respect to weak dependence.
An autoregressive process of order one (AR(1)) exhibits weak dependence if the absolute value of the autoregressive coefficient ρ is less than one (|ρ| < 1).
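As a simulated sketch of this property (illustrative only; the coefficient and sample size are made up): for a stable AR(1), the correlation between observations h periods apart is ρ^h, which dies out geometrically as the lag grows.

```python
import numpy as np

# Simulated sketch: for a stable AR(1), x_t = rho * x_{t-1} + e_t with |rho| < 1,
# the correlation between x_t and x_{t+h} is rho**h, shrinking as h grows.
rng = np.random.default_rng(0)
rho, n = 0.5, 200_000
e = rng.standard_normal(n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = rho * x[t - 1] + e[t]

def autocorr(series, lag):
    """Sample correlation between the series and its lag-periods-back values."""
    return np.corrcoef(series[lag:], series[:-lag])[0, 1]

for h in (1, 2, 5):
    print(h, autocorr(x, h))  # roughly rho**h: 0.5, 0.25, 0.03
```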
Consider the finite sample properties of OLS under classical assumptions in time series analysis and their associated asymptotic properties. What is the difference between Assumption TS.1 and Assumption TS.1'?
The important extra restriction in Assumption TS.1’ as compared with Assumption TS.1 is the weak dependence assumption.
Consider the finite sample properties of OLS under classical assumptions in time series analysis and their associated asymptotic properties. Is Assumption TS.3' weaker than Assumption TS.3? Justify your answer.
Yes. Assumption TS.3’ is weaker than Assumption TS.3 because it places no restrictions on how μt is related to the explanatory variables in other time periods.
If we assume stationarity, would there be a difference between Assumption TS.3 and Assumption TS.3'?
No. Under stationarity, if contemporaneous exogeneity holds for one time period, it holds for all time periods; that is, there will be strict exogeneity under both Assumption TS.3 and Assumption TS.3’.
Consider the finite sample properties of OLS under classical assumptions in time series analysis and their associated asymptotic properties. Under Assumptions TS.1, TS.2 and TS.3, we conclude that the OLS estimators are unbiased, i.e. E(β̂j) = βj. What conclusion do we make under TS.1’, TS.2’ and TS.3’?
Under TS.1’, TS.2’ and TS.3’, we conclude that plim(β̂j) = βj. The key difference between E(β̂j) = βj and plim(β̂j) = βj lies in the concepts of unbiasedness and consistency in statistical estimation.
Unbiasedness means that the expected value of the OLS estimator β̂j equals the true parameter βj: on average, across many samples, the OLS estimator gives the correct parameter value.
Consistency (plim refers to the probability limit) means that as the sample size n goes to infinity, the OLS estimator β̂j converges in probability to the true value βj. As the sample size increases, the estimator becomes more accurate and tends toward the true parameter value, even though it may be biased in small samples.
In time series analysis, what is contemporaneous homoskedasticity?
Contemporaneous homoskedasticity refers to the condition where the variance of the error term does not vary with the explanatory variables at the same point in time, though the variance of the error term in the current period could vary with explanatory variables in other time periods. Specifically, for a time series model with error terms μt, contemporaneous homoskedasticity implies that:
Var(μt | xt) = σ² for all t
What is serial correlation?
The correlation of a time series with its own past values. Alternatively, it occurs when the residuals (errors) are correlated across time periods, violating the assumption of independence.
Under TS.1' through TS.5', what do we conclude about the distribution of the OLS estimators? How does this affect the prospects for statistical inference?
Under TS.1’ through TS.5’, the OLS estimators are asymptotically normally distributed. The usual OLS standard errors, t statistics, F statistics, and LM statistics are asymptotically valid, meaning we may conduct statistical inference in the usual way.
What are highly persistent time series?
Those where shocks or deviations from the mean have long-lasting effects, meaning that the series tends to “remember” past values for an extended period.
Briefly explain what a random walk process is
A random walk process is a time series model where the current value of the series is equal to the previous value plus a random shock or innovation. Alternatively, the series “walks” randomly, with no tendency to revert to a mean or trend over time.
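A minimal simulated sketch of this definition (illustrative only): the series is just the cumulative sum of its shocks, so every past innovation stays in the current value.

```python
import numpy as np

# Simulated sketch: a random walk y_t = y_{t-1} + e_t is the cumulative sum
# of its shocks, so past innovations never die out of the current value.
rng = np.random.default_rng(1)
e = rng.standard_normal(1000)
y = np.cumsum(e)  # y_t = e_1 + e_2 + ... + e_t

# Its sample first-order autocorrelation sits close to one, the hallmark
# of a highly persistent process.
rho_hat = np.corrcoef(y[1:], y[:-1])[0, 1]
print(rho_hat)  # close to 1
```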
What is a unit root process?
A unit root process is a time series that has a stochastic trend, meaning it is non-stationary and exhibits a high degree of persistence. Alternatively, the characteristic equation for an autoregressive model has a root equal to one, implying that shocks to the series have a permanent effect.
Give an example of an economic or financial variable that could be modelled by a unit root process.
Gross Domestic Product (GDP) or a stock price series.
Distinguish between processes that are I(0) and I(1)
A process is I(0) (integrated of order zero) if it is stationary and does not require differencing to achieve stationarity. Alternatively, an I(0) process has a constant mean, variance and short memory.
In contrast, a process is I(1) (integrated of order one) if it is non-stationary and becomes stationary after first differencing. Alternatively, an I(1) process typically has a unit root and exhibits random-walk behaviour.
What does it mean to say that yt is integrated of order 2?
Saying that yt is integrated of order 2 indicates that it needs to be differenced twice (i.e. two differencing operations are necessary) to transform it into a stationary series.
For large samples, the sample first-order autocorrelation ρ̂ can be used to assess whether a variable has a unit root. What rule of thumb would you apply to conclude that the variable exhibits a unit root? Additionally, what steps would you take to eliminate the unit root from the variable?
To determine whether a variable has a unit root, apply the rule of thumb that if the sample first-order autocorrelation coefficient ρ̂ is greater than 0.8, the variable likely has a unit root.
To eliminate the unit root, difference the series.
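A simulated sketch of both steps (the 0.8 cutoff is the rule of thumb stated above, not a formal test; the data are made up):

```python
import numpy as np

# Simulated sketch: apply the rho_hat > 0.8 rule of thumb to a random walk,
# then take first differences and check that the persistence is gone.
rng = np.random.default_rng(2)
y = np.cumsum(rng.standard_normal(2000))  # an I(1) series (random walk)

def rho1(series):
    """Sample first-order autocorrelation."""
    return np.corrcoef(series[1:], series[:-1])[0, 1]

print(rho1(y))   # close to 1, above the 0.8 rule-of-thumb cutoff
dy = np.diff(y)  # first difference: dy_t = y_t - y_{t-1} = e_t
print(rho1(dy))  # close to 0: the differenced series has no unit root
```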
Why is it ordinarily not a good idea to use time series with strong persistence in a regression equation?
Using time series with strong persistence in a regression equation can lead to spurious regression results, where the estimated relationships between variables appear significant even when no true relationship exists, because of the non-stationarity of the series.
How can inertia or sluggishness cause serial correlation in the data?
Inertia or sluggishness in a time series can create dependencies on past values, thereby causing serial correlation in the data.
How might omitted variables and incorrect functional forms lead to serial correlation in a time series model?
Both omitted variables and incorrect functional forms can cause the model’s residuals to exhibit systematic relationships over time, leading to serial correlation, which violates the assumption of independent errors in regression analysis.
Distinguish between first-order, second-order, and third-order serial correlation in the context of time series analysis
First-order serial correlation refers to the correlation between a variable and its immediate past value, indicating how current values are influenced by the previous observation.
Second-order serial correlation measures the correlation between a variable and its value from two periods ago, capturing longer lag effects.
Third-order serial correlation assesses the correlation between a variable and its value from three periods ago, indicating even longer-term dependencies.
Each order of serial correlation provides insight into the persistence and relationships within the time series data.
What are the properties of Ordinary Least Squares (OLS) estimators when errors exhibit serial correlation, and how does this affect their efficiency and validity in hypothesis testing?
When errors exhibit serial correlation, OLS estimators remain linear, unbiased and consistent, but their standard errors become biased, leading to invalid hypothesis tests and loss of efficiency, as they are no longer the Best Linear Unbiased Estimators (BLUE).
What are the appropriate Stata commands for running regressions of y on x1 and x2 with robust standard errors to correct for heteroskedasticity alone, and for both heteroskedasticity and serial correlation?
regress y x1 x2, robust
newey y x1 x2, lag(#)
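Outside Stata, the heteroskedasticity-robust correction can be sketched by hand. Below is a hedged NumPy sketch of the White/HC0 sandwich covariance that `regress ..., robust` is based on (simulated data; the coefficient values and sample size are made up for illustration):

```python
import numpy as np

# Hedged sketch (not Stata itself): White/HC0 heteroskedasticity-robust
# standard errors computed by hand on simulated data.
rng = np.random.default_rng(3)
n = 500
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
u = rng.standard_normal(n) * (1 + np.abs(x1))  # heteroskedastic errors
y = 1.0 + 2.0 * x1 - 1.0 * x2 + u              # made-up true coefficients

X = np.column_stack([np.ones(n), x1, x2])
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y  # OLS coefficients
resid = y - X @ beta

# Sandwich estimator: (X'X)^-1 [sum of u_hat_t^2 * x_t x_t'] (X'X)^-1
meat = X.T @ (X * resid[:, None] ** 2)
robust_se = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))
print(beta, robust_se)
```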
What are the Durbin-Watson statistic values associated with ρ = 1, ρ = 0, and ρ = -1, where ρ is the coefficient on μt-1 in an AR(1) model of μt?
When ρ = 1, the Durbin-Watson (DW) statistic is 0, indicating strong positive serial correlation.
When ρ = 0, the DW statistic is 2, indicating no serial correlation.
When ρ = -1, the DW statistic is 4, indicating strong negative serial correlation.
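These three values follow from the approximation DW ≈ 2(1 − ρ). A simulated sketch (the ρ values and sample size are made up for illustration):

```python
import numpy as np

# Simulated sketch: DW = sum((e_t - e_{t-1})^2) / sum(e_t^2) ≈ 2 * (1 - rho),
# so rho near 1 drives DW toward 0, rho = 0 gives DW near 2, and rho near -1
# drives DW toward 4.
def durbin_watson(resid):
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

def ar1(rho, n, rng):
    """Simulate an AR(1) series with coefficient rho."""
    e = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = e[0]
    for t in range(1, n):
        x[t] = rho * x[t - 1] + e[t]
    return x

rng = np.random.default_rng(4)
for rho in (0.9, 0.0, -0.9):
    print(rho, durbin_watson(ar1(rho, 20_000, rng)))  # roughly 0.2, 2.0, 3.8
```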