Gauss-Markov assumptions part 2 - YouTube

Channel: unknown

[0]
Hi, thanks for joining me. Today we are going to be talking about the
[5]
second half of the Gauss-Markov assumptions.
[7]
If you missed the first half you may want to have a look at the previous video
[10]
which looks through assumptions one to three.
[13]
So just to reiterate, the Gauss-Markov assumptions are the set of conditions which if
[19]
they are upheld
[20]
then least-squares estimators are BLUE.
[25]
So, that means that they are the best, linear, unbiased estimators
[28]
which are possible.
[30]
So the fourth Gauss-Markov assumption is something which we refer to as no
[36]
perfect
[37]
collinearity.
[39]
And this is
[41]
referring to our particular sample, but by deduction it also refers to
[46]
the population.
[47]
So, what does it actually mean? Well no perfect collinearity - in regressors
[51]
I should say -
[53]
that means that if I have some sort of model where y equals alpha plus 'beta
[58]
one' times 'x one'
[60]
plus 'beta two' times 'x two',
[62]
plus some sort of an error.
[65]
It means that there cannot be an exact relationship between 'x one' and 'x two',
[70]
so I cannot write down in an equation that 'x one' is equal to 'delta
[75]
nought',
[75]
plus 'delta one'
[77]
times 'x two'.
[79]
If I could, then knowing 'x two' would mean I
[82]
know 'x one' exactly.
[83]
In a sense 'x one' and 'x two' are
[86]
exactly the same variable.
[88]
So, an example of this might be, if I was trying to determine
[92]
which factors affect the house price of a given house
[97]
from its attributes,
[98]
then if I was to run a regression
[102]
which included the square meterage of that house,
[106]
and also the square footage.
[110]
Well, obviously if I know square meterage,
[114]
I actually know square footage - they are both essentially the same thing.
[117]
Square footage is essentially equal to roughly 10.8 times
[122]
the square meterage of the house.
[124]
So, obviously within a regression, I am going to have a hard time unpicking
[129]
square footage from square meterage,
[131]
because they're exactly the same thing.
[133]
And, the assumption of no perfect collinearity among regressors means that
[138]
I cannot include both of these things in my regression.
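This failure can be sketched numerically. Below is a minimal illustration (the 10.76 square-feet-per-square-metre conversion and the simulated house sizes are my own assumptions, not from the video): once both size measures enter the design matrix, the matrix is rank-deficient, so the least-squares normal equations have no unique solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
sq_metres = rng.uniform(50, 200, n)   # hypothetical house sizes
sq_feet = sq_metres * 10.7639         # exact linear function of sq_metres

# Design matrix with an intercept and BOTH size measures
X = np.column_stack([np.ones(n), sq_metres, sq_feet])

# Perfect collinearity: X has only 2 linearly independent columns,
# so X'X is singular and OLS cannot separate the two coefficients.
print(np.linalg.matrix_rank(X))        # 2, not 3
print(np.linalg.cond(X.T @ X) > 1e10)  # True: effectively singular
```

Dropping either one of the two size variables restores full rank, which is why the assumption only rules out including both.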
[141]
Assumption five
[143]
is
[144]
called 'homoskedastic errors'.
[147]
So, homoskedastic errors means that if I was to draw a process -
[153]
so let's say that I have the number of years of education
[157]
and the wage rate,
[159]
and this
[160]
again is referring to population rather than to the sample.
[165]
If I have errors which
[168]
look something like this when I draw the population line -
[172]
whereby the distribution of errors away from the line remains relatively constant,
[179]
lying between the error bands which I draw here.
[182]
There's no increase or decrease in the spread of the errors along the
[187]
education variable,
[189]
then that means that errors are
[191]
homoskedastic.
[192]
So, mathematically that just means that I can write that the variance of our
[196]
error in
[198]
the population process
[200]
is equal to some constant, 'sigma squared'. Or, writing it a little bit more
[205]
completely: the variance of 'u i' given 'x i' is equal to 'sigma squared'.
[209]
In other words the variance - how far the points are away from the line -
[213]
does not vary systematically
[215]
with x.
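Here is a rough numerical sketch of what constant error variance looks like in the education-and-wage example (the slope, the sigma of 2, and the 8-to-20 range of education years are made-up illustration values): the spread of the errors is the same whether education is low or high.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
education = rng.uniform(8, 20, n)   # hypothetical years of education
sigma = 2.0                         # constant error std dev: homoskedasticity
u = rng.normal(0, sigma, n)         # Var(u | x) = sigma^2 for EVERY x
wage = 1.0 + 1.5 * education + u    # hypothetical population line

# The spread of the errors in the low- and high-education halves matches
low_spread = u[education < 14].std()
high_spread = u[education >= 14].std()
print(round(low_spread, 1), round(high_spread, 1))  # both close to 2.0
```

Under heteroskedasticity you would instead see the spread grow or shrink with education, which is exactly what the assumption rules out.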
[218]
The last Gauss-Markov assumption is
[221]
called 'no serial
[224]
correlation'.
[225]
What this means is mathematically that the covariance between
[230]
a given error 'u i' and
[232]
another error 'u j',
[236]
must be equal to zero,
[239]
unless i equals j. In that case we are considering the covariance of the error with itself -
[244]
that is, its variance - which is the subject of assumption five.
[247]
So,
[248]
this last assumption of 'no serial correlation' means that
[251]
the errors essentially have to be independent of one another.
[255]
So, knowing one of the errors
[258]
doesn't help me predict another error. So in other words if I know this error here in
[263]
my diagram this doesn't help me predict the error here for a higher level of
[267]
education.
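A quick sketch of this zero-covariance condition (purely illustrative, using standard-normal errors of my own choosing): independently drawn errors have sample covariance near zero, while the covariance of an error with itself is just its variance, which links back to assumption five.

```python
import numpy as np

rng = np.random.default_rng(2)
draws = 100_000
# Many repeated samples of two independent errors u_i and u_j
u_i = rng.normal(0, 1, draws)
u_j = rng.normal(0, 1, draws)

# Cov(u_i, u_j) is approximately zero: knowing u_i says nothing about u_j
cov_ij = np.cov(u_i, u_j)[0, 1]
print(abs(cov_ij) < 0.02)        # True

# Cov(u_i, u_i) is just Var(u_i): the quantity in assumption five
var_i = np.cov(u_i, u_i)[0, 1]
print(abs(var_i - 1.0) < 0.02)   # True
```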
[269]
This concludes my video summarising the Gauss-Markov assumptions.
[273]
I'm going to go and examine each of these assumptions in detail
[277]
in the next few videos.
[278]
I'll see you then.