Math Rules and Formulas

Formulas

These are all of the important formulas for this course. All of these will appear on the back page of both midterms and the final exam.

Probability

Let \(X\) be a discrete random variable, \(x_i\) a potential outcome for \(X\), and \(p_i\) the probability that outcome occurs. Then:

  1. Expected value of a discrete random variable: \(E[X] = \mu_X = \sum_{i=1}^n x_i p_i\)
  2. Variance of a discrete random variable: \(Var(X) = \sigma^2_X = E[(X - \mu_X)^2] = \sum_i (x_i - \mu_X)^2 p_i\)
  3. If \(Y\) is another random variable, the covariance of \(X\) and \(Y\): \(Cov(X, Y) = \sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)]\)
  4. Correlation of two random variables: \(\rho_{XY} = \frac{\sigma_{XY}}{\sqrt{\sigma_X^2\sigma_Y^2}}\)
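As a quick numerical check of formulas 1–4 (the joint distribution below is made up purely for illustration):

```python
# A made-up joint distribution: each row is (x, y, p),
# a joint outcome of (X, Y) and its probability.
joint = [(0, 0, 0.2), (0, 1, 0.3), (1, 0, 0.1), (1, 1, 0.4)]

mu_x = sum(x * p for x, y, p in joint)                          # E[X]
mu_y = sum(y * p for x, y, p in joint)                          # E[Y]
var_x = sum((x - mu_x) ** 2 * p for x, y, p in joint)           # Var(X)
var_y = sum((y - mu_y) ** 2 * p for x, y, p in joint)           # Var(Y)
cov_xy = sum((x - mu_x) * (y - mu_y) * p for x, y, p in joint)  # Cov(X, Y)
rho = cov_xy / (var_x * var_y) ** 0.5                           # correlation

print(mu_x, var_x, cov_xy, rho)
```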

Statistics

Let \(X\) be a random variable and let \(x_1, \dots, x_n\) be a sample of \(n\) observations of \(X\).

  1. The estimator of the expected value of \(X\) is the sample mean: \(\bar{x} = \frac{1}{n} \sum_{i = 1}^n x_i\)
  2. The estimator for \(Var(X)\) is \(\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})^2\)
  3. If \(y_i\) are observations of a second variable \(Y\), the estimator for \(Cov(X, Y)\) is \(\frac{1}{n-1}\sum_{i=1}^n(x_i - \bar{x})(y_i - \bar{y})\)
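These estimators agree with NumPy's built-ins when the \(n - 1\) (degrees-of-freedom) correction is used; a sketch with made-up data:

```python
import numpy as np

# Made-up paired sample, purely for illustration.
x = np.array([2.0, 4.0, 6.0, 8.0])
y = np.array([1.0, 3.0, 2.0, 5.0])
n = len(x)

xbar = x.sum() / n                                    # sample mean
s2_x = ((x - xbar) ** 2).sum() / (n - 1)              # sample variance
s_xy = ((x - xbar) * (y - y.mean())).sum() / (n - 1)  # sample covariance

# NumPy applies the same n-1 correction via ddof=1 (and np.cov's default):
assert np.isclose(xbar, x.mean())
assert np.isclose(s2_x, x.var(ddof=1))
assert np.isclose(s_xy, np.cov(x, y)[0, 1])
```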

Simple Regression

The true model: \(y_i = \beta_0 + \beta_1 x_i + u_i\)

The estimated model: \(y_i = \hat{\beta}_0 + \hat{\beta}_1 x_i + e_i\)

Formulas for simple regression coefficients:

\[\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}\]

\[\hat{\beta}_1 = \frac{(\sum_i x_i y_i) - n\bar{x}\bar{y}}{(\sum_i x_i^2) - n\bar{x}^2}\]

The \(R^2\) of a regression: \(\frac{\sum_i (\hat{y}_i - \bar{y})^2}{\sum_i (y_i - \bar{y})^2} = 1 - \frac{\sum_i e_i^2}{\sum_i (y_i - \bar{y})^2}\)
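The coefficient and \(R^2\) formulas can be checked against a least-squares fit; a sketch with made-up, roughly linear data:

```python
import numpy as np

# Made-up data (the numbers carry no special meaning).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(x)
xbar, ybar = x.mean(), y.mean()

# Slope and intercept from the formulas above:
b1 = (np.sum(x * y) - n * xbar * ybar) / (np.sum(x ** 2) - n * xbar ** 2)
b0 = ybar - b1 * xbar

# R^2 via the residual form:
e = y - (b0 + b1 * x)
r2 = 1 - np.sum(e ** 2) / np.sum((y - ybar) ** 2)

# np.polyfit fits the same line by least squares:
slope, intercept = np.polyfit(x, y, 1)
assert np.isclose(b1, slope) and np.isclose(b0, intercept)
```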

Useful Math Rules

These rules will not be included in the formulas sheet on the exams, but you should know all of these math rules by heart.

Summation Rules

Let \(x\) and \(y\) be vectors of length \(n\).

  1. Summation definition: \(\sum_{i = 1}^{n} x_i \equiv x_1 + x_2 + ... + x_n\)

  2. The sum of x + y is the same as the sum of x + the sum of y: \(\sum_i (x_i + y_i) = \sum_i x_i + \sum_i y_i\)

  3. For any constant c, the sum of c * x is the same as c times the sum of x: \(\sum_i c x_i = c \sum_i x_i\)

  4. In general, the sum of x times y is not equal to the sum of x times the sum of y: \(\sum_i x_i y_i \neq \sum_i x_i \sum_i y_i\)
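Rules 2–4 are easy to verify on concrete vectors, for instance:

```python
# Two made-up vectors and a constant.
x = [1, 2, 3, 4]
y = [5, 6, 7, 8]
c = 3

# Rule 2: the sum distributes over element-wise addition.
assert sum(xi + yi for xi, yi in zip(x, y)) == sum(x) + sum(y)
# Rule 3: constants factor out of a sum.
assert sum(c * xi for xi in x) == c * sum(x)
# Rule 4: the sum of products is NOT the product of sums in general.
assert sum(xi * yi for xi, yi in zip(x, y)) != sum(x) * sum(y)  # 70 vs 260
```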

Variance Rules

  • The variance of a constant is zero: \(Var(c) = 0\)
  • The variance of a constant times a random variable: \(Var(cX) = c^2 Var(X)\)
  • The variance of a constant plus a random variable: \(Var(c + X) = Var(X)\)
  • The variance of the sum of two random variables: \(Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)\)
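These identities also hold exactly for the sample versions (with the same \(n - 1\) correction throughout), so they can be checked on arbitrary data rather than by simulation; a sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)  # arbitrary data; any vectors work
y = rng.normal(size=100)
c = 2.5

var = lambda a: np.var(a, ddof=1)
cov = lambda a, b: np.cov(a, b)[0, 1]

assert np.isclose(var(np.full(100, c)), 0.0)      # Var(c) = 0
assert np.isclose(var(c * x), c ** 2 * var(x))    # Var(cX) = c^2 Var(X)
assert np.isclose(var(c + x), var(x))             # Var(c + X) = Var(X)
assert np.isclose(var(x + y),                     # Var(X + Y)
                  var(x) + var(y) + 2 * cov(x, y))
```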

Covariance Rules

  • The covariance of a random variable with a constant is 0: \(Cov(X, c) = 0\)
  • The covariance of a random variable with itself is its variance: \(Cov(X, X) = Var(X)\)
  • You can bring constants outside of the covariance: \(Cov(X, c Y) = c Cov(X, Y)\)
  • If Z is a third random variable: \(Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z)\)
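The same trick works here, since the covariance rules also hold exactly for their sample versions:

```python
import numpy as np

rng = np.random.default_rng(1)
x, y, z = rng.normal(size=(3, 50))  # arbitrary data
c = 4.0

var = lambda a: np.var(a, ddof=1)
cov = lambda a, b: np.cov(a, b)[0, 1]

assert np.isclose(cov(x, np.full(50, c)), 0.0)   # Cov(X, c) = 0
assert np.isclose(cov(x, x), var(x))             # Cov(X, X) = Var(X)
assert np.isclose(cov(x, c * y), c * cov(x, y))  # constants come outside
assert np.isclose(cov(x, y + z), cov(x, y) + cov(x, z))
```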

\(plim\) rules

Let \(c\) be a constant. Let \(x_n\) and \(y_n\) be sequences of random variables where \(plim(x_n) = x\) and \(plim(y_n) = y\). That is, for large \(n\), the probability density function of \(x_n\) collapses to a spike at the value \(x\), and likewise for \(y_n\) and \(y\). Then:

  1. The probability limit of a constant is the constant: \(plim(c) = c\)
  2. \(plim(x_n + y_n) = x + y\)
  3. \(plim(x_n y_n) = x y\)
  4. \(plim(\frac{x_n}{y_n}) = \frac{x}{y}\) (provided \(y \neq 0\))
  5. \(plim(g(x_n, y_n)) = g(x, y)\) for any continuous function \(g\).
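A small simulation illustrates the idea: the sample mean of \(n\) iid draws is itself a random variable, and its distribution collapses toward the true mean as \(n\) grows (the distribution parameters below are made up):

```python
import numpy as np

rng = np.random.default_rng(42)
true_mean = 3.0

# The sample mean for growing n: a sequence of random variables
# whose plim is the true mean (the law of large numbers).
means = {n: rng.normal(loc=true_mean, scale=2.0, size=n).mean()
         for n in (10, 1_000, 100_000)}
for n, m in means.items():
    print(f"n = {n:>7}: sample mean = {m:.4f}")
```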

Expectations

Let A and B be random variables, and let c be a constant.

  1. \(E[A + B] = E[A] + E[B]\)

  2. In general, \(E[A B] \neq E[A] E[B]\); equality does hold when \(A\) and \(B\) are independent.

  3. Constants can pass outside of an expectation: \(E[c A] = c E[A]\)

Continuing from rule 3: since \(E[A]\) is a constant, \(E[B \, E[A]] = E[A] E[B]\).
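All three rules (including the caveat in rule 2) can be checked exactly on a small made-up joint distribution of \(A\) and \(B\):

```python
# Made-up joint distribution: each row is (a, b, p).
joint = [(1, 0, 0.5), (2, 1, 0.3), (3, 1, 0.2)]

# Expectation of any function of (A, B) under this distribution.
E = lambda f: sum(f(a, b) * p for a, b, p in joint)

# Rule 1: expectation is additive.
assert abs(E(lambda a, b: a + b) - (E(lambda a, b: a) + E(lambda a, b: b))) < 1e-12
# Rule 3: constants pass outside.
assert abs(E(lambda a, b: 3 * a) - 3 * E(lambda a, b: a)) < 1e-12
# Rule 2: E[AB] differs from E[A]E[B] here, because A and B are dependent.
assert abs(E(lambda a, b: a * b) - E(lambda a, b: a) * E(lambda a, b: b)) > 0.1
```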

Conditional Expectations

If the conditional expectation of something is a constant, then the unconditional expectation is that same constant:

If \(E[A | B] = c\), then \(E[A] = c\).

Why? The law of iterated expectations:

\[\begin{align*} E[A] &= E \left [ E[A | B] \right ] \\ &= E[c] \\ &= c \end{align*}\]
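The derivation can be mirrored numerically: average the conditional expectation \(E[A \mid B = b]\) over the distribution of \(B\), and compare with the direct computation (the joint distribution below is made up):

```python
# Made-up joint distribution: keys are (a, b) outcomes, values are probabilities.
joint = {(1, 0): 0.2, (2, 0): 0.2, (2, 1): 0.3, (4, 1): 0.3}

# Marginal distribution of B.
p_b = {b: sum(p for (a, bb), p in joint.items() if bb == b) for b in (0, 1)}

# Conditional expectation E[A | B = b].
e_a_given_b = {
    b: sum(a * p for (a, bb), p in joint.items() if bb == b) / p_b[b]
    for b in (0, 1)
}

# Law of iterated expectations: E[E[A | B]] ...
outer = sum(e_a_given_b[b] * p_b[b] for b in (0, 1))
# ... equals the unconditional expectation E[A].
e_a = sum(a * p for (a, b), p in joint.items())
assert abs(outer - e_a) < 1e-12
```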

Log rules

  1. \(\log_e(e) = 1\)
  2. \(\log(a b) = \log(a) + \log(b)\)
  3. \(\log(\frac{a}{b}) = \log(a) - \log(b)\)
  4. \(\log(a^b) = b \log(a)\)
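A quick check of rules 1–4 with Python's `math` module (the values of \(a\) and \(b\) are arbitrary positives):

```python
import math

a, b = 5.0, 3.0  # any positive values work

assert math.isclose(math.log(math.e), 1.0)                       # rule 1
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))  # rule 2
assert math.isclose(math.log(a / b), math.log(a) - math.log(b))  # rule 3
assert math.isclose(math.log(a ** b), b * math.log(a))           # rule 4
```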