How to estimate and interpret VAR models in EViews - Vector Autoregression model

Video Statistics and Information

Captions
Hello, welcome back to another EViews video tutorial. My name is Juan, and today I'm going to be covering vector autoregression (VAR) models in EViews.

Let's begin with an overview. VAR models generalize univariate autoregressive models by allowing multivariate time series. Recall that a univariate autoregression is a single-equation model in which a variable's current value is explained by its own lagged values, whereas a VAR is an m-variable, m-equation model that expresses each variable as a linear function of its own past values, the past values of all the other variables in the system, and a serially uncorrelated error term. The popularity of VAR models was driven by Christopher Sims and his 1980 paper "Macroeconomics and Reality." The main premise is that VAR models provide a coherent and credible approach to data description, forecasting, structural inference, and policy analysis. The main uses of VAR models are forecasting macroeconomic variables such as GDP and inflation, and policy analysis.

Now let's look at a formal representation. This is an example of a bivariate VAR(1): it contains two variables, y and x, and one lag, which is the reason for the 1 in brackets. As I explained before, a VAR is a system of m equations, so here we have two. Each equation has a constant term, the lagged value of y_t, and the lagged value of x_t; that is, y_t is explained by a constant, its own past value, the past value of the other variable in the model, and an uncorrelated error term. In matrix form, the first vector contains the dependent variables, the second vector contains the constants, and the coefficient matrix multiplies the vector of lagged values y_{t-1} and x_{t-1}; the same structure holds for the x_t equation.

What are some of the assumptions of the VAR model? The variables y_t and x_t are stationary; logs and differences can be applied if required to make them stationary. The error terms u_t and v_t are white-noise disturbances, commonly called innovations or shocks. Finally, the coefficients in the matrix are estimated by OLS.

Now let's turn to our example, the Stock and Watson (2001) paper "Vector Autoregressions." We are going to replicate this paper step by step. As a summary, the authors review how VAR models perform the four tasks that econometricians do — data description, forecasting, structural inference, and policy analysis — by setting up a VAR to see how monetary policy affects inflation and unemployment, and they also review the Phillips curve. The variables in their model are inflation, unemployment, and the fed funds rate; the data is quarterly, from 1960 to 2000:Q4, and is available in the description of the video, so feel free to download it and follow the steps one by one. In their representation, inflation is explained by past realizations of inflation itself, past values of unemployment, and past values of the fed funds rate, and similarly unemployment and the fed funds rate are each explained by past values of all three variables.

Let's set up this model in EViews. We select our variables — inflation, unemployment, and the fed funds rate — right-click, choose Open, and open as a VAR. EViews asks what type of VAR we want; we go with the standard VAR, but you could also work with vector error correction, Bayesian VAR, switching VAR, or mixed-frequency VAR, which I may cover in the future. The endogenous variables are inflation, unemployment, and the fed funds rate, and then we specify the lags. For now we stick with two, which is the default; later we'll see how to select the number of lags optimally. Under exogenous variables we only have a constant, which is what the authors did in their paper; other authors include dummy variables or other exogenous variables here, but we won't need that. We hit OK and get our estimates: because we selected two lags, the output shows two lags each of inflation, unemployment, and the fed funds rate, plus the constant, along with the usual OLS outputs such as the R-squared, the log likelihood, and model selection criteria like Akaike and Schwarz.

Now, lag length criteria. An appropriate lag length matters because if the lag length is too small the model is misspecified, while if it is too large degrees of freedom are wasted. There are two options to work around this. Option 1: let the frequency of the data determine the lag length — annual data gets 1 lag, monthly data 12 lags, quarterly data 4 lags. Option 2: use one of the information criteria, such as Akaike, Schwarz, or Hannan-Quinn. Just as a note, there should be no autocorrelation at the selected lag, so whichever option we choose, we have to verify that there is no autocorrelation at that specific lag.

Back in EViews, I'll show you how to re-estimate the model. We select Estimate; the authors used four lags — I invite you to download the paper and follow along — and there is no autocorrelation at the lag they selected, which I'll show you later. We enter 4, hit OK, and the model is re-estimated with four lags. If you want to use a selection criterion instead, go to View, then Lag Structure, and select Lag Length Criteria. The results report several criteria — the most commonly used are Akaike, Schwarz, and Hannan-Quinn — for lags 0 through 8, which is what we specified earlier, though you could select more lags without trouble. The star marks the lag each criterion selects: Schwarz suggests two lags, Hannan-Quinn suggests three, and Akaike suggests five. The difference between these criteria is how much you are penalized for adding lags; that's why Schwarz selects fewer lags than Akaike, with Hannan-Quinn somewhere in between. The authors, however, used four lags, and as I'll show a little later, that decision is most probably related to autocorrelation in the residuals.

Now let's look at the VAR stability condition and residual diagnostics. Stability of the VAR system implies stationarity, and in the literature the stability condition is also referred to as the stationarity condition. If all inverse roots of the characteristic AR polynomial have modulus less than one and lie inside the unit circle, the estimated VAR is stable. If the VAR is not stable, various tests conducted on the model may be invalid, and impulse response standard errors are not valid either.

Back in EViews, we select View, go to Lag Structure, and choose the AR Roots Graph. It shows the unit circle, and all the roots lie inside it. If there are some values you are not sure about, you can instead go to View, Lag Structure, and report the table. The table lists the moduli from largest to smallest, and the largest value here is 0.96, which is smaller than one — close, but still acceptable. If you have values that lie outside the circle, you should definitely re-estimate a different model with a different number of lags.

Now that we have determined that our model is stable, it's time to check the residuals. In EViews we go to View, then Residual Tests, then Correlograms. We leave 12 lags as the default and hit OK. What you have to check is that the autocorrelations lie within the two-standard-error bounds, and we can see that overall all of these values do, which is a good signal. If many values lay outside the bounds, you would definitely want to re-estimate the model with a different lag selection. Next, back in View, Residual Tests, we go to the Autocorrelation LM Test, select four lags, and hit OK. This reports the residual serial correlation test; the null hypothesis is that there is no serial correlation at the lag specified — in this case up to four lags — along with the p-values. Using the 5% significance level as a default: if the p-value is bigger than 0.05 we do not reject the null hypothesis, while if it is smaller than 0.05 we reject it and conclude there is serial correlation. As you can see, at lag one there is serial correlation, so using one lag is not appropriate. At lag two — remember the Schwarz criterion suggested two lags — there is no autocorrelation, so that would have been okay. At three lags there would have been autocorrelation, and at four lags there is no autocorrelation, so four is appropriate as well. That's one of the reasons the authors went with four; it also has to do with the data being quarterly — there is no autocorrelation at this lag, and it's a decent number of lags for the system to be able to show some dynamics.

Back to our slides: we have decided that our model is stable and that there is no autocorrelation at the selected lag, so now we can look at the Granger causality test. The test of causality examines whether the lagged values of one variable help to predict the other variables in the model. The null hypothesis of the Granger test is that x does not Granger-cause y, and the alternative is that x does Granger-cause y; if we reject the null hypothesis, we are saying that x Granger-causes y. In EViews we go to View, Lag Structure, and select the Granger Causality Test. It shows the dependent variables — inflation, unemployment, and the fed funds rate — the explanatory variables, and the corresponding p-values. Remember that the null hypothesis is that a given variable does not Granger-cause, say, inflation. For example: does unemployment not cause inflation? Because the p-value is smaller than 0.05, we reject this null hypothesis; consequently, unemployment does help to predict inflation. For the fed funds rate the p-value is bigger than 0.05, so the fed funds rate does not help us to predict inflation. For unemployment, similarly, inflation does not help us to predict unemployment, but the fed funds rate does. And for the fed funds rate, both inflation and unemployment help to predict changes in the fed funds rate — if inflation or unemployment changes, the Fed reacts and changes the rate. Again, what the Granger causality test shows us is whether or not these variables help us to predict current values of the dependent variable.

This is going to be all the content for this video — I don't want to make it too long. In the next video I'm going to show you impulse response functions and the variance decomposition. I hope you found this video useful, and I invite you to subscribe to my channel to keep getting content related to economics research and how to use EViews, Stata, and LaTeX with Overleaf. If you're going to be writing assignments or a research paper, it's really important to know LaTeX, because it will help you present your work in a more formal way, the way it's normally done in academics. Once again, thank you very much for watching, and have a great day.
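The bivariate VAR(1) system written out earlier can be sketched in a few lines of Python. This is a minimal pure-Python simulation, not EViews output; the constants and coefficients are made-up illustrative values (not estimates from the paper), chosen so the system is stable.

```python
import random

def simulate_var1(n=200, seed=42):
    """Simulate the bivariate VAR(1) from the slides:
        y_t = c1 + a11*y_{t-1} + a12*x_{t-1} + u_t
        x_t = c2 + a21*y_{t-1} + a22*x_{t-1} + v_t
    All coefficient values below are illustrative, not taken from the paper.
    """
    rng = random.Random(seed)
    c1, a11, a12 = 0.5, 0.6, 0.2
    c2, a21, a22 = 0.3, 0.1, 0.5
    y, x = [0.0], [0.0]
    for _ in range(n - 1):
        u = rng.gauss(0, 1)  # white-noise shock (innovation) to the y equation
        v = rng.gauss(0, 1)  # white-noise shock (innovation) to the x equation
        y_new = c1 + a11 * y[-1] + a12 * x[-1] + u
        x_new = c2 + a21 * y[-1] + a22 * x[-1] + v
        y.append(y_new)
        x.append(x_new)
    return y, x

y, x = simulate_var1()
```

Each simulated variable depends on a constant, one lag of itself, one lag of the other variable, and its own shock — exactly the structure of the two-equation system on the slide.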
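The lag-length criteria discussed above can be written down compactly. The sketch below uses one common textbook form of the three criteria (up to additive constants, based on the log determinant of the estimated residual covariance matrix); the penalty terms are what make Schwarz pick fewer lags than Akaike, with Hannan-Quinn in between.

```python
import math

def var_info_criteria(log_det_sigma, T, m, p):
    """Information criteria for a VAR(p) with m variables on T observations,
    in one common textbook form (up to additive constants): log|Sigma_hat|
    plus a penalty on k, the total number of estimated coefficients
    (m equations, each with m*p lag terms and a constant)."""
    k = m * (m * p + 1)
    aic = log_det_sigma + 2 * k / T                       # Akaike
    sic = log_det_sigma + k * math.log(T) / T             # Schwarz
    hq = log_det_sigma + 2 * k * math.log(math.log(T)) / T  # Hannan-Quinn
    return aic, sic, hq
```

Because log(T) > 2*log(log(T)) > 2 for any reasonably long sample, the Schwarz penalty is the heaviest and the Akaike penalty the lightest, which is why Schwarz tends to choose the fewest lags and Akaike the most.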
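The stability condition has a direct hand check in the bivariate VAR(1) case: the VAR is stable when both eigenvalues of the coefficient matrix have modulus below one (equivalently, the inverse roots of the characteristic AR polynomial lie inside the unit circle). A sketch for the 2x2 case, using the quadratic formula:

```python
import cmath

def var1_stable(a11, a12, a21, a22):
    """Stability check for a bivariate VAR(1) with coefficient matrix
    [[a11, a12], [a21, a22]]: both eigenvalues must have modulus < 1.
    For a 2x2 matrix the eigenvalues solve lambda^2 - trace*lambda + det = 0,
    so the quadratic formula gives them directly."""
    tr = a11 + a22
    det = a11 * a22 - a12 * a21
    disc = cmath.sqrt(tr * tr - 4 * det)  # complex sqrt handles complex roots
    moduli = [abs((tr + disc) / 2), abs((tr - disc) / 2)]
    return all(mod < 1 for mod in moduli), moduli
```

For a VAR(p) one would instead check the eigenvalues of the stacked companion matrix, which is what the EViews AR roots table reports from largest modulus to smallest.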
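The correlogram check — residual autocorrelations inside the approximate plus-or-minus 2/sqrt(T) standard-error bounds — can be sketched for a single residual series. This is a simplified univariate version of what EViews plots for each equation, not a replacement for the multivariate LM test:

```python
import math

def residual_autocorr_ok(resids, max_lag=12):
    """Return True if all sample autocorrelations of the residual series up
    to max_lag lie within the approximate two-standard-error bounds
    +/- 2/sqrt(T) expected under white noise."""
    T = len(resids)
    mean = sum(resids) / T
    var = sum((e - mean) ** 2 for e in resids)
    bound = 2 / math.sqrt(T)
    for k in range(1, max_lag + 1):
        rho = sum((resids[t] - mean) * (resids[t - k] - mean)
                  for t in range(k, T)) / var
        if abs(rho) > bound:
            return False  # autocorrelation spike outside the bounds
    return True
```

A strongly autocorrelated series fails immediately at lag one, which is the signal that the lag selection should be revisited.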
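Reading the Granger causality p-values follows the same decision rule throughout: the null hypothesis is "x does not Granger-cause y," rejected at the 5% level when the p-value is below 0.05. A tiny helper (hypothetical names, not an EViews function) makes the rule explicit:

```python
def granger_verdict(p_value, x_name, y_name, alpha=0.05):
    """Decision rule for the Granger causality test: reject the null
    hypothesis 'x does not Granger-cause y' when p_value < alpha."""
    if p_value < alpha:
        return f"{x_name} Granger-causes {y_name}"
    return f"no evidence that {x_name} Granger-causes {y_name}"
```

With the pattern reported in the video, a small p-value for unemployment in the inflation equation reads as "unemployment Granger-causes inflation," i.e. past unemployment helps predict current inflation.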
Info
Channel: JD Economics
Views: 10,014
Rating: 4.95 out of 5
Keywords: How to estimate and interpret VAR models, Vector autoregression model in eviews, eviews var model, Vector autoregression model eviews, var model in eviews, var model eviews, var in eviews, var eviews tutorial, var eviews, estimate var in eviews, stock and watson 2001, Monetary policy shock eviews, estimating a var(p) in eviews, how to run var model in eviews, var model explained, Vector autoregression model, Vector Autoregression, var model, what is the var model, Var, eviews
Id: SbE8ns0oOTs
Length: 14min 57sec (897 seconds)
Published: Sat Jan 23 2021