Math Rules and Formulas

Formulas

These are all of the important formulas for this course. All of them will be printed on the back page of both midterms and the final exam.

Probability

Let $X$ be a discrete random variable, $x_i$ a potential outcome for $X$, and $p_i$ the probability that outcome occurs. Then:

  1. Expected value of a discrete random variable: $E[X] = \mu_X = \sum_{i=1}^{n} x_i p_i$
  2. Variance of a discrete random variable: $\mathrm{Var}(X) = \sigma_X^2 = E[(X - \mu_X)^2] = \sum_i (x_i - \mu_X)^2 p_i$
  3. If $Y$ is another random variable, $\mathrm{Cov}(X, Y) = \sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)]$
  4. Correlation of two random variables: $\rho_{XY} = \frac{\sigma_{XY}}{\sqrt{\sigma_X^2 \sigma_Y^2}}$
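These four formulas can be sanity-checked numerically. A minimal sketch in plain Python; the joint distribution of $(X, Y)$ below is made up purely for illustration:

```python
import math

# Made-up joint distribution of (X, Y): each entry is (x, y, probability).
joint = [(0, 0, 0.2), (0, 1, 0.3), (1, 0, 0.1), (1, 1, 0.4)]

mu_x = sum(x * p for x, y, p in joint)                          # E[X]
mu_y = sum(y * p for x, y, p in joint)                          # E[Y]
var_x = sum((x - mu_x) ** 2 * p for x, y, p in joint)           # Var(X)
var_y = sum((y - mu_y) ** 2 * p for x, y, p in joint)           # Var(Y)
cov_xy = sum((x - mu_x) * (y - mu_y) * p for x, y, p in joint)  # Cov(X, Y)
rho_xy = cov_xy / math.sqrt(var_x * var_y)                      # correlation
```

For this distribution, $E[X] = 0.5$, $\mathrm{Var}(X) = 0.25$, and $\mathrm{Cov}(X, Y) = 0.05$, so the correlation is small but positive.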

Statistics

Let $X$ be a random variable and let $x_i$ be an observation in a sample of $X$.

  1. The estimator of the expected value of $X$ is the sample mean: $\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i$
  2. The estimator for $\mathrm{Var}(X)$ is $\frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2$
  3. The estimator for $\mathrm{Cov}(X, Y)$ is $\frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})$
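The estimators translate directly into code. A minimal sketch with made-up observations:

```python
# Made-up paired observations of X and Y.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 3.0, 5.0, 4.0]
n = len(xs)

x_bar = sum(xs) / n  # sample mean of X
y_bar = sum(ys) / n  # sample mean of Y

# Sample variance and covariance, with the n - 1 degrees-of-freedom correction.
var_hat = sum((x - x_bar) ** 2 for x in xs) / (n - 1)
cov_hat = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / (n - 1)
```

Note the divisor is $n - 1$, not $n$, which makes these estimators unbiased.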

Simple Regression

The true model: $y_i = \beta_0 + \beta_1 x_i + u_i$

The estimated model: $y_i = \hat{\beta}_0 + \hat{\beta}_1 x_i + e_i$

Formulas for simple regression coefficients:

$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$

$\hat{\beta}_1 = \frac{\sum_i x_i y_i - n\bar{x}\bar{y}}{\sum_i x_i^2 - n\bar{x}^2}$

The $R^2$ of a regression: $R^2 = \frac{\sum_i (\hat{y}_i - \bar{y})^2}{\sum_i (y_i - \bar{y})^2} = 1 - \frac{\sum_i e_i^2}{\sum_i (y_i - \bar{y})^2}$
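The coefficient formulas and the $R^2$ can be computed by hand in a few lines. A minimal sketch with made-up data:

```python
# Made-up data for a simple regression of y on x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 3.0, 5.0, 4.0]
n = len(xs)

x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Slope and intercept from the formulas above.
b1 = (sum(x * y for x, y in zip(xs, ys)) - n * x_bar * y_bar) / (
    sum(x * x for x in xs) - n * x_bar ** 2
)
b0 = y_bar - b1 * x_bar

# Fitted values, residuals, and R^2.
y_hat = [b0 + b1 * x for x in xs]
resid = [y - yh for y, yh in zip(ys, y_hat)]
ss_res = sum(e ** 2 for e in resid)
ss_tot = sum((y - y_bar) ** 2 for y in ys)
r2 = 1 - ss_res / ss_tot
```

For this data the fitted line is $\hat{y} = 1.5 + 0.8x$ with $R^2 = 0.64$; the residuals also sum to zero, as they must when an intercept is included.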

Useful Math Rules

These rules will not be included in the formula sheet on the exams, but you should know all of them by heart.

Summation Rules

Let x and y be vectors of length n.

  1. Summation definition: $\sum_{i=1}^{n} x_i \equiv x_1 + x_2 + \dots + x_n$

  2. The sum of $x + y$ is the same as the sum of $x$ plus the sum of $y$: $\sum_i (x_i + y_i) = \sum_i x_i + \sum_i y_i$

  3. For any constant $c$, the sum of $c \cdot x$ is the same as $c$ times the sum of $x$: $\sum_i c x_i = c \sum_i x_i$

  4. In general, the sum of $x$ times $y$ is not equal to the sum of $x$ times the sum of $y$: $\sum_i x_i y_i \neq \left(\sum_i x_i\right)\left(\sum_i y_i\right)$
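These rules are easy to verify numerically. A minimal sketch with made-up vectors:

```python
# Made-up vectors of length 3 and a constant.
xs = [1, 2, 3]
ys = [4, 5, 6]
c = 10

# Rule 2: summation distributes over addition.
sum_of_sums = sum(x + y for x, y in zip(xs, ys))   # 21

# Rule 3: constants pull out of a summation.
sum_cx = sum(c * x for x in xs)                    # 60

# Rule 4: the sum of products is NOT the product of sums in general.
sum_xy = sum(x * y for x, y in zip(xs, ys))        # 32
prod_of_sums = sum(xs) * sum(ys)                   # 90
```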

Variance Rules

  • The variance of a constant is zero: $\mathrm{Var}(c) = 0$
  • The variance of a constant times a random variable: $\mathrm{Var}(cX) = c^2 \mathrm{Var}(X)$
  • The variance of a constant plus a random variable: $\mathrm{Var}(c + X) = \mathrm{Var}(X)$
  • The variance of the sum of two random variables: $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\mathrm{Cov}(X, Y)$
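The first three rules can be checked directly on a small made-up distribution (the fourth needs a joint distribution of $X$ and $Y$, since it involves their covariance):

```python
# Made-up distribution for X: (outcome, probability) pairs.
dist = [(1.0, 0.5), (3.0, 0.5)]
c = 2.0

def var(outcomes):
    """Population variance of a list of (value, probability) pairs."""
    mu = sum(v * p for v, p in outcomes)
    return sum((v - mu) ** 2 * p for v, p in outcomes)

var_x = var(dist)                                    # Var(X)
var_cx = var([(c * v, p) for v, p in dist])          # Var(cX): scaled by c^2
var_c_plus_x = var([(c + v, p) for v, p in dist])    # Var(c + X): unchanged
```

Scaling every outcome by $c$ multiplies the variance by $c^2$; shifting every outcome by $c$ leaves it unchanged.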

Covariance Rules

  • The covariance of a random variable with a constant is zero: $\mathrm{Cov}(X, c) = 0$
  • The covariance of a random variable with itself is its variance: $\mathrm{Cov}(X, X) = \mathrm{Var}(X)$
  • You can bring constants outside of the covariance: $\mathrm{Cov}(X, cY) = c\,\mathrm{Cov}(X, Y)$
  • If $Z$ is a third random variable: $\mathrm{Cov}(X, Y + Z) = \mathrm{Cov}(X, Y) + \mathrm{Cov}(X, Z)$
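All four rules can be checked on one small joint distribution. A sketch with made-up outcomes and probabilities for $(X, Y, Z)$:

```python
# Made-up joint distribution of (X, Y, Z): each entry is (x, y, z, probability).
joint = [(0, 0, 0, 0.4), (1, 1, 0, 0.3), (1, 0, 1, 0.2), (0, 1, 1, 0.1)]
c = 3.0

def cov(f, g):
    """Cov(f, g), where f and g map one joint outcome tuple to a value."""
    mu_f = sum(f(o) * o[3] for o in joint)
    mu_g = sum(g(o) * o[3] for o in joint)
    return sum((f(o) - mu_f) * (g(o) - mu_g) * o[3] for o in joint)

X = lambda o: o[0]
Y = lambda o: o[1]
Z = lambda o: o[2]

cov_X_c = cov(X, lambda o: c)              # Cov(X, c) = 0
cov_X_X = cov(X, X)                        # Cov(X, X) = Var(X)
cov_X_cY = cov(X, lambda o: c * Y(o))      # = c * Cov(X, Y)
cov_X_YZ = cov(X, lambda o: Y(o) + Z(o))   # = Cov(X, Y) + Cov(X, Z)
```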

plim rules

Let $c$ be a constant. Let $x_n$ and $y_n$ be sequences of random variables where $\mathrm{plim}(x_n) = x$ and $\mathrm{plim}(y_n) = y$. That is, for large $n$, the probability density function of $x_n$ collapses to a spike at the value $x$, and the same holds for $y_n$ and $y$. Then:

  1. The probability limit of a constant is the constant: $\mathrm{plim}(c) = c$
  2. $\mathrm{plim}(x_n + y_n) = x + y$
  3. $\mathrm{plim}(x_n y_n) = xy$
  4. $\mathrm{plim}(x_n / y_n) = x / y$, provided $y \neq 0$
  5. $\mathrm{plim}(g(x_n, y_n)) = g(x, y)$ for any continuous function $g$.
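A natural sequence with a plim is the sample mean of $n$ i.i.d. draws: by the law of large numbers it collapses to the population mean as $n$ grows. A simulation sketch (the distributions, seed, and sample size are arbitrary choices for illustration):

```python
import random

random.seed(42)  # arbitrary seed so the sketch is reproducible
n = 200_000

# x_n and y_n are sample means of n i.i.d. draws, so
# plim(x_n) = E[Uniform(0, 2)] = 1 and plim(y_n) = E[Uniform(0, 4)] = 2.
x_n = sum(random.uniform(0, 2) for _ in range(n)) / n
y_n = sum(random.uniform(0, 4) for _ in range(n)) / n
```

For this $n$, $x_n + y_n$ is close to $3$, $x_n y_n$ close to $2$, and $x_n / y_n$ close to $0.5$, consistent with rules 2 through 4.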

Expectations

Let A and B be random variables, and let c be a constant.

  1. $E[A + B] = E[A] + E[B]$

  2. In general, $E[AB] \neq E[A]E[B]$

  3. Constants can pass outside of an expectation: $E[cA] = cE[A]$

And continuing from 3), since $E[A]$ is a constant, $E[B \cdot E[A]] = E[A]E[B]$.
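A small dependent pair $(A, B)$ makes rule 2 concrete: expectations add, but they do not multiply. The joint distribution below is made up for illustration:

```python
# Made-up joint distribution of (A, B): each entry is (a, b, probability).
# A and B are dependent here, which is what makes E[AB] != E[A]E[B].
joint = [(1, 2, 0.5), (3, 4, 0.5)]
c = 5.0

def E(f):
    """Expectation of f(A, B) under the joint distribution."""
    return sum(f(a, b) * p for a, b, p in joint)

E_A = E(lambda a, b: a)            # E[A] = 2.0
E_B = E(lambda a, b: b)            # E[B] = 3.0
E_sum = E(lambda a, b: a + b)      # E[A + B] = E[A] + E[B]
E_cA = E(lambda a, b: c * a)       # E[cA] = c E[A]
E_AB = E(lambda a, b: a * b)       # E[AB] = 7.0, but E[A]E[B] = 6.0
```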

Conditional Expectations

If the conditional expectation of something is a constant, then the unconditional expectation is that same constant:

If $E[A \mid B] = c$, then $E[A] = c$.

Why? The law of iterated expectations:

$E[A] = E[E[A \mid B]] = E[c] = c$
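The law of iterated expectations also holds when $E[A \mid B]$ is not constant: averaging the conditional expectations over the distribution of $B$ recovers $E[A]$. A sketch on a made-up joint distribution:

```python
# Made-up joint distribution of (A, B): each entry is (a, b, probability).
# Here E[A | B=0] = 2 and E[A | B=1] = 3, so E[A|B] is not constant.
joint = [(1, 0, 0.25), (3, 0, 0.25), (0, 1, 0.125), (4, 1, 0.375)]

def cond_exp_A(b):
    """E[A | B = b], computed from the joint distribution."""
    p_b = sum(p for a, bb, p in joint if bb == b)
    return sum(a * p for a, bb, p in joint if bb == b) / p_b

p_B = {0: 0.5, 1: 0.5}  # marginal distribution of B

iterated = sum(cond_exp_A(b) * p for b, p in p_B.items())  # E[E[A|B]]
e_A = sum(a * p for a, b, p in joint)                      # E[A] directly
```

Both routes give $E[A] = 2.5$.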

Log rules

  1. $\log_e(e) = 1$
  2. $\log(ab) = \log(a) + \log(b)$
  3. $\log(a/b) = \log(a) - \log(b)$
  4. $\log(a^b) = b \log(a)$
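All four rules can be checked with Python's `math.log`, which is the natural logarithm (base $e$); the values of `a` and `b` below are arbitrary positive numbers:

```python
import math

a, b = 2.5, 4.0  # arbitrary positive numbers

rule1 = math.log(math.e)                 # = 1
rule2 = math.log(a * b)                  # = log(a) + log(b)
rule3 = math.log(a / b)                  # = log(a) - log(b)
rule4 = math.log(a ** b)                 # = b * log(a)
```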