(EViews10): Estimate and Interpret VECM (1) #var #vecm #causality #lags #Johansen #innovations

Video Statistics and Information

Captions
It's another interesting session on CrunchEconometrix. Today I will take you through the basic steps of estimating and interpreting a vector error correction model (VECM). If you are a beginner or an intermediate user of econometric tools or of analytical software like EViews and Stata, I encourage you to stay with my channel; CrunchEconometrix is dedicated to beginner and intermediate users. Whenever you are on my channel, try to load in your own data and follow my procedure; with that, you can begin to gather some confidence in how to analyze your data.

Okay, the basic steps for estimating a VECM. What are they? You can see them on the screen. Step 1: make sure the series are stationary at first difference, and definitely not at second difference. Step 2: go ahead and determine the optimal lag length for the model; I call the optimal lag length in this case p. Step 3: perform the Johansen cointegration test by imputing that lag length into the Johansen model. Step 4: if, after running Johansen, there is no cointegration, simply estimate the unrestricted VAR (you can call it the basic VAR); but if there is cointegration, go ahead and specify the VECM. Now you are going to reduce the number of lags by one, so you use p − 1 lags in the VECM model. After that, perform some diagnostic tests. In this video, because I only want to explain the first part to you, I'm going to take steps one, two and three; I'll conclude steps four, five and six in the next video.

Okay, this is the data we will be using: PDI, PCE and GDP. Let's start with step one: perform the stationarity test. I double-click PDI, I go to View and then Unit Root Test. I select Level; I'm using AIC, so I change this to Akaike info criterion. Because I have quarterly data I can use up to four lags, so I'll just use 4. I click OK. So we have our results for the LPDI unit root test. You can see the null hypothesis is that this variable has a unit root. So how do we reject or fail to reject the null hypothesis? You have to look at the t-statistic: if the absolute
value of the Dickey–Fuller test statistic is higher than the critical values indicated here, then we reject the null; but if that absolute value is not higher than the critical values indicated here, then we fail to reject the null. From what we are seeing here, the absolute value for the LPDI series is lower than the indicated critical values even at the 10% level, so this tells us that LPDI has a unit root. How do we correct this? We go back to View, we click on Unit Root Test, now we choose First Difference, we leave the maximum lags at 4, and we click OK. Now you can see the prob value is highly significant; it is well below the 1% level. So this is now a stationary series at first difference.

We do the same thing for the second variable. I double-click it, I go to View, click on Unit Root Test; the button is on Level, I change this to AIC, I reduce the maximum lags to 4, and I click OK. Using the same argument I gave you earlier: if the absolute value of the t-statistic of the Dickey–Fuller test is lower than the critical value here, then we cannot reject the null, and from what we see, the absolute value is lower. That means LPCE has a unit root. To correct that, we go back to View, Unit Root Test, now we select First Difference, and click OK. Looking at the absolute value, at 3.2 it is clearly higher than the 5% critical value, so this is now a first-difference stationary series.

Then we take the last variable, GDP, and do the same thing: View, Unit Root Test, check the Level button, I change the maximum lags back to 4, I click OK. You can see it has a unit root. To correct that: View, Unit Root Test, now change it to First Difference, nothing else is affected, and I click OK. Clearly it is now a first-difference stationary series. So step one is done.

Step two: determine the optimal lags for the model. To do that I'm going to open all the variables as an unrestricted VAR. I highlight them all, then I right-click and select Open as VAR. You can see all the variables are listed
here, not in first difference but in their levels, and the Standard VAR button is checked. I'm going to change the lag interval from 2 to 4, just arbitrarily, then I click OK. Now these are the unrestricted VAR results, but this is not where we are going; we need to determine the appropriate lags to use. So we go to View, we click on Lag Structure, we select Lag Length Criteria, and we leave the number of lags to include as it is. I click OK. I'm using the AIC, so for consistency I'm looking at the AIC column, and the AIC has an asterisk at lag 2, so I'm using that 2 for this model. Wherever an information criterion puts an asterisk, that is the optimal lag chosen by that criterion. You may decide to use SC, you may decide to use HQ or AIC; it depends on your choice. Well, I'll be using AIC for this analysis. So step two is done: the optimal lag is 2.

Step three: perform the Johansen test. To do that we go to Quick, Group Statistics, Johansen Cointegration Test. Now I'm going to list all the variables. There's something I need to tell you here: the variable you list first will be your target variable, that is, the variable on which the normalization will be conducted. So be careful how you arrange them, depending on what your hypotheses are or what your research is all about. Know that the variable you list first will be the target variable; in this case my target variable is PDI. I click OK. Now we are in the Johansen cointegration test dialog box. By default EViews has indicated case 3, and case 3 is often the standard practice, so I'm not going to change that. Case 3 simply means the model will use an unrestricted constant and has no trend: the constant will be both in the cointegrating equation and in the VAR equation, but there will be no trend in the model. So most users leave it at this; it is the standard practice, unless you have a reason to include a trend in your model, in which case you can choose option 4, or in a very extreme case you can choose
option 5. Now coming to the lag intervals: remember our optimal lag indicated by AIC is 2, so this stays the way it is. Make sure that you impute the optimal lag indicated by the information criterion you have chosen; that is what you put here. You don't put an arbitrary lag in this space; you put the optimal lag.

So I click OK, and this is the result of our Johansen cointegration test. How do you interpret all this? First, you can see the series are listed exactly the way you put them into the Johansen cointegration model, and you have two results: the trace result up here, and the result from the maximum eigenvalue below. Remember, anywhere you see the asterisk sign it tells you something is happening, and in this case it tells you that you are rejecting that null hypothesis. Everything listed here is a null hypothesis of the Johansen cointegration test, and the same goes for the max-eigenvalue panel here. So whenever one of the hypotheses has an asterisk, you are rejecting that null hypothesis. Now, because "None" is asterisked, the null hypothesis of no cointegration is rejected; it is also rejected for the maximum eigenvalue. And when you look below the table you can see that the trace test indicates we have one cointegrating equation, and the max-eigenvalue test also indicates one cointegrating equation. This is how you read the Johansen cointegration test and interpret it accordingly. At "At most 1" you can see we cannot reject the null hypothesis; the statistic is at 19.07. Likewise at "At most 2" we cannot reject the null hypothesis, because its p-value is also higher than the 5% level. But overall the test indicates we have one cointegrating equation, and the evidence is here: "None" is asterisked, so we are rejecting that there is no cointegration in this model.

Let's scroll down a bit; I need to show you the normalized cointegrating equation and how you can interpret it. So here we are: you can see here one cointegrating equation,
and this is the normalized cointegrating equation. I've already copied it out neatly to a PowerPoint slide, so let me move away from here to the PowerPoint slide. Here you can see the result I showed you earlier from EViews. Students often make mistakes when they are interpreting the Johansen normalization results. You interpret the normalization results by reversing the signs. Like I told you before, because I put PDI as the first variable in the Johansen cointegration model, normalization took place on the PDI variable, so the signs of the coefficients of PCE and GDP must be reversed during interpretation. Even though PCE shows a negative sign here, when you are interpreting you must interpret it as having a positive relationship with PDI. That is how you read the Johansen normalized cointegrating equation.

Just a few notes I wrote here. I said LPDI is positioned as the dependent variable, so by interpretation you can say something like this: in the long run, LPCE has a positive impact while log of GDP has a negative impact on PDI, on average, ceteris paribus. You have to say "ceteris paribus" because these are long-run estimates. This also tells us that the coefficients are statistically significant at the 1% level. How do we know that? We can easily see it from the standard errors, from which we can compute the t-statistics. How do you compute the t-statistics? You divide the coefficients by their respective standard errors, and if you do that for the LPCE variable you obtain a t-statistic of 7.23, and 5.84 for GDP. These are clearly above 2, so this indicates significance at the 1% level. In conclusion, the null hypothesis of no cointegration is rejected against the alternative that a cointegrating relationship exists in the model. So again, when you are interpreting the Johansen normalization, please reverse the signs; don't read it exactly the way you see it. Here, PCE has a positive relationship with PDI, and GDP has a negative
relationship with PDI.

So, to wrap up part one: remember that the target variable is placed first. Determining the maximum lag length is an empirical issue that depends on the structure of your data, whether you have monthly, quarterly, weekly or yearly data, so you cannot determine the lags arbitrarily. If you do that and you put in too many lags, you will lose degrees of freedom, your coefficients may turn out to be statistically insignificant, and you may end up with multicollinearity. Again, too few lags could lead to specification errors. The only way out of that is to choose your optimal lags using one of the information criteria. After that, perform the Johansen test, then estimate the VECM with p − 1 lags. Interpret your coefficients as ceteris paribus effects, and never forget to perform diagnostics. Please read up in the textbooks (there are so many out there; I'm showing some on the screen) and you can also download journal articles to see how they went about theirs. Subscribe to my channel if you have not done so. Don't go away; I'll be right back with how you can go ahead and estimate the VECM, interpret it, and also perform diagnostics.
Info
Channel: CrunchEconometrix
Views: 67,072
Rating: 4.8751445 out of 5
Keywords: how to estimate and interpret vecm, vecm stata, vecm eviews, vecm interpretation eviews, vecm interpretation in eviews, vecm results eviews, vecm results in eviews, estimate vector error correction mo
Id: yv5vwQhmH2Y
Length: 13min 44sec (824 seconds)
Published: Fri May 25 2018