## From convergence in distribution to convergence in probability

Posted: December 21, 2018 in Cross-Validated relay, Educational Material, stats.stackexchange relay

When does $\hat \theta \to_d D(\theta, v)$ imply $\hat\theta \to_p \theta$?

We know that, in general, convergence in distribution does not imply convergence in probability. But for the case of most interest in econometrics, where we examine a sequence of estimators, convergence in distribution does imply convergence in probability to a constant, under two regularity conditions that are themselves satisfied in most cases of interest.

This post of mine on stats.stackexchange.com has the proof. Essentially, under these regularity conditions we can prove the even stronger result that convergence in distribution implies convergence in quadratic mean to a constant (which in turn implies convergence in probability to that constant).
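To see the statement in action, here is a small simulation sketch (my own illustration, not part of the post; the Uniform example and all sample sizes are assumptions): the sample mean of i.i.d. Uniform(0, 1) draws satisfies $\sqrt{n}(\hat\theta - \theta) \to_d N(0, 1/12)$, and the simulated probability $P(|\hat\theta - \theta| > \varepsilon)$ indeed shrinks toward zero as $n$ grows, as convergence in probability requires.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 0.5          # true mean of Uniform(0, 1)
eps = 0.05           # tolerance in the convergence-in-probability statement
reps = 20_000        # Monte Carlo replications per sample size

# sqrt(n)*(theta_hat - theta) ->_d N(0, 1/12), and theta_hat ->_p theta:
# the probability P(|theta_hat - theta| > eps) should shrink to 0 with n.
for n in (10, 100, 1000, 10000):
    theta_hat = rng.uniform(0, 1, size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(theta_hat - theta) > eps)
    print(f"n = {n:>5}:  P(|theta_hat - theta| > {eps}) = {prob:.4f}")
```

With these settings the printed probability drops from roughly one half at $n = 10$ to essentially zero at $n = 10000$.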

## Marginal and Joint Normality

Posted: December 9, 2018 in Educational Material

Time and again I encounter people confused about marginal and joint Normality, and I could not find a single internet or book source that lists together the main cases of interest. So here they are:

1. Subject to the usual regularity conditions, the statements below also hold when we talk about asymptotic (limiting) marginal/joint Normality.
2. If two random variables are not each marginally Normal, then they are certainly not jointly Normal.
3. If two independent random variables are not each marginally Normal, then their sum is not marginally Normal either (by Cramér's decomposition theorem, Normality of a sum of independent variables forces Normality of each summand). Without independence this can fail: if $X$ is non-Normal and $Y = Z - X$ with $Z$ Normal and independent of $X$, then neither marginal is Normal, yet $X + Y = Z$ is.
4. If $X$ and $Y$ have Normal marginals and they are independent, then they are also jointly Normal. By the Cramér-Wold theorem, it then follows that all linear combinations of them (such as their sum or difference) are also marginally Normal. If we want to consider more than two random variables, then for the above to hold they must be jointly independent, not merely pair-wise independent. If they are only pair-wise independent, they may or may not be jointly Normal.
5. If $X$ and $Y$ have Normal marginals but are dependent, then it is not necessarily the case that they are jointly Normal: they may or may not be. It follows that a linear combination of them may or may not be Normal; this must be proven from something more than marginal Normality. This is important to remember when one wants to show asymptotic Normality of a test statistic that is a linear combination of two random variables that are each asymptotically Normal but dependent, with the dependence not dying out asymptotically. This is the case, for example, in all “Hausman endogeneity tests” in econometrics, where the test statistic is the difference of two estimators that use the same data, and so are in general dependent. Even if each is asymptotically Normal, the asymptotic Normality of their difference does not necessarily follow. Indeed, in his original paper Hausman (1978) explicitly assumed asymptotic Normality of the test statistic; he did not prove it.
6. If $X$ and $Y$ have Normal marginals and are uncorrelated (i.e. their covariance is zero), but have some higher-order/“non-linear” form of dependence, then they are certainly NOT jointly Normal (because under joint Normality, any dependence is always expressed as non-zero covariance, so zero covariance would imply independence). Their linear combinations may or may not be marginally Normal. Again, if a linear combination of them is the object of interest, its asymptotic Normality must be proven from something more than marginal Normality.
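The standard counterexample behind points 5 and 6 can be checked numerically. The sketch below is my own illustration (the construction and all names are assumptions, not from the post): take $X$ standard Normal and $Y = SX$, with $S$ an independent random sign. Then $Y$ is also standard Normal by symmetry, $\mathrm{Cov}(X, Y) = E[S]E[X^2] = 0$, yet $X$ and $Y$ are strongly dependent ($|Y| = |X|$) and not jointly Normal, since $X + Y = (1 + S)X$ has a point mass at zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X is standard Normal; S is an independent random sign (Rademacher).
x = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)
y = s * x                        # Y = S*X is also standard Normal, by symmetry

# Marginals agree: empirical quantiles of Y track those of X closely.
qs = [0.05, 0.25, 0.5, 0.75, 0.95]
print("X quantiles:", np.quantile(x, qs).round(2))
print("Y quantiles:", np.quantile(y, qs).round(2))

# Uncorrelated: Cov(X, Y) = E[S] E[X^2] = 0.
print("sample corr(X, Y):", round(float(np.corrcoef(x, y)[0, 1]), 3))

# ...yet strongly dependent (|Y| = |X|), and NOT jointly Normal:
# the linear combination X + Y = (1 + S) X equals exactly 0 whenever S = -1,
# i.e. with probability 1/2 -- a point mass no Normal variable can have.
print("share of X + Y exactly zero:", np.mean(x + y == 0.0))
```

The sample correlation comes out near zero while roughly half of the draws of $X + Y$ are exactly zero, which is incompatible with joint Normality.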