# Useful Formulae

## Statistics

### Variance and Covariance

$$ Cov(X,Y) = E\left[(X-E(X))(Y-E(Y))\right] = E(XY) - E(X)E(Y) $$

If $$X,Y$$ are independent then $$E(XY) = E(X)E(Y)$$ and the covariance is zero. NB: the converse does **not** hold: $$Cov(X,Y) = 0$$ does not imply that $$X,Y$$ are independent.
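As a concrete illustration (my own example, not from the notes): take $$X \sim N(0,1)$$ and $$Y = X^{2}$$. Then $$Cov(X,Y) = E(X^{3}) - E(X)E(X^{2}) = 0$$ by symmetry, yet $$Y$$ is a deterministic function of $$X$$. A minimal numerical sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ N(0, 1) and Y = X^2: the sample covariance is near zero
# even though Y is completely determined by X.
x = rng.standard_normal(1_000_000)
y = x ** 2

sample_cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(f"sample Cov(X, X^2) ~ {sample_cov:.4f}")  # close to 0

# The dependence shows up elsewhere, e.g. Cov(X^2, Y) = Var(X^2) > 0.
print(f"sample Var(X^2)   ~ {np.var(x ** 2):.4f}")
```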

The variance is the covariance of X with itself:

$$ Var(X) = Cov(X,X) = E((X-E(X))^{2}) = E(X^{2}) - E(X)^{2} $$
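Because the shortcut $$E(X^{2}) - E(X)^{2}$$ is an algebraic identity, it holds exactly for sample moments as well as population moments, which makes a quick sanity check easy (illustrative snippet, not part of the original notes):

```python
import numpy as np

# Any distribution works here; the exponential is an arbitrary choice.
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=100_000)

definition = np.mean((x - np.mean(x)) ** 2)   # E((X - E(X))^2)
shortcut = np.mean(x ** 2) - np.mean(x) ** 2  # E(X^2) - E(X)^2

print(definition, shortcut)  # identical up to floating-point error
```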

### Moment Generating Functions

The moment-generating function for a random variable $$X$$ is:

$$ M_{X}(t) = E(e^{tX}) $$

Useful MGFs:

$$ X \sim N(\mu, \sigma^{2}): \quad M(t) = \exp\left[\mu t + \frac{\sigma^{2} t^{2}}{2}\right] $$

This is particularly valuable because the log-normal distribution ($$Y = e^{X}$$ where $$X \sim N(\mu, \sigma^{2})$$) is ubiquitous in economics. The moments of $$Y$$ follow directly from the MGF of $$X$$: $$E(Y^{n}) = E(e^{nX}) = M_{X}(n)$$.
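For instance, $$E(Y) = M_{X}(1) = \exp(\mu + \sigma^{2}/2)$$. A sketch checking this numerically (the parameter values are arbitrary, chosen for illustration):

```python
import numpy as np

mu, sigma = 0.5, 0.8
rng = np.random.default_rng(2)

# Log-normal sample: Y = e^X with X ~ N(mu, sigma^2).
y = np.exp(rng.normal(mu, sigma, size=2_000_000))

closed_form = np.exp(mu + sigma ** 2 / 2)  # M_X(1) from the normal MGF
print(f"sample E(Y) = {y.mean():.3f}, M_X(1) = {closed_form:.3f}")
```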

$$ X \sim \textrm{t-distn}: \quad M(t) \textrm{ undefined; } E(e^{tX}) = \infty \textrm{ for all } t \neq 0 $$

(The heavy polynomial tails of the t-distribution make the defining integral diverge.)

### Martingales

A discrete-time stochastic process (i.e. a sequence of random variables $$X_{1}, X_{2}, \ldots$$) is a martingale if:

- Expectations are bounded: $$E(|X_{n}|) < \infty$$ for all $$n$$.

- Expectations are 'stationary' given history (equivalently, the expected change given history is zero): $$E(X_{n+1} \mid X_{1}, \ldots, X_{n}) = X_{n}$$

Convergence Theorem (informal): if the $$\{X_{k}\}$$ satisfy $$\sup_{n} E(|X_{n}|) < \infty$$, then the sequence converges almost surely to a random variable $$X_{\infty}$$ which is finite with prob. 1.
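A bounded example where the theorem visibly applies is the Pólya urn (my example, not from the notes): draw a ball, replace it together with another of the same colour. The fraction of red balls is a martingale confined to $$[0,1]$$, so the theorem says it must settle to a limit along each sample path:

```python
import numpy as np

rng = np.random.default_rng(3)

# Polya urn: start with one red and one black ball; each draw adds a
# ball of the drawn colour. X_n = fraction of red balls is a martingale
# with |X_n| <= 1, so sup_n E|X_n| < infinity and convergence applies.
red, black = 1, 1
fractions = []
for _ in range(20_000):
    if rng.random() < red / (red + black):
        red += 1
    else:
        black += 1
    fractions.append(red / (red + black))

# Along this path the fraction barely moves once n is large.
late = np.array(fractions[-1000:])
print(f"range over last 1000 steps: {late.max() - late.min():.4f}")
```

Note the limit $$X_{\infty}$$ is random: different runs of the urn settle at different fractions, which is all the theorem promises.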