Conditioning on an Event

Conditional PMF and Expectation

We consider conditioning on an event $A$. Assuming that the probability that event $A$ occurs is positive, i.e., $P(A) > 0$, the conditional PMF of a random variable $X$ given that $A$ occurred is

$$p_{X\mid A}(x) = P(X = x \mid A).$$

Like ordinary probabilities, it is normalized, i.e.,

$$\sum_x p_{X\mid A}(x) = 1.$$

The conditional expectation of $X$ given $A$ is then

$$E[X \mid A] = \sum_x x\, p_{X\mid A}(x).$$

More generally, for a function $g(X)$ of $X$, the conditional expectation is

$$E[g(X) \mid A] = \sum_x g(x)\, p_{X\mid A}(x).$$
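As a concrete check of these definitions, here is a minimal Python sketch. It assumes, purely for illustration, a fair six-sided die and the event $A$ = "the roll is even" (neither appears in the text above); it builds $p_{X\mid A}$, verifies the normalization, and computes $E[X\mid A]$ and $E[g(X)\mid A]$ for $g(x) = x^2$.

```python
from fractions import Fraction

# PMF of a fair six-sided die (a hypothetical example, not from the text).
p_X = {x: Fraction(1, 6) for x in range(1, 7)}

# Event A: the roll is even.
A = {2, 4, 6}
P_A = sum(p_X[x] for x in A)  # P(A) = 1/2 > 0

# Conditional PMF: p_{X|A}(x) = P(X = x | A) = p_X(x) / P(A) if x in A, else 0.
p_X_given_A = {x: (p_X[x] / P_A if x in A else Fraction(0)) for x in p_X}

# Normalization: the conditional PMF sums to 1.
assert sum(p_X_given_A.values()) == 1

# Conditional expectations E[X | A] and E[g(X) | A] with g(x) = x**2.
E_X_given_A = sum(x * p for x, p in p_X_given_A.items())
E_g_given_A = sum(x**2 * p for x, p in p_X_given_A.items())

print(E_X_given_A, E_g_given_A)  # 4, 56/3
```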

Total Expectation Theorem

If we partition the sample space into $n$ disjoint events $A_1, \ldots, A_n$, each with positive probability, then the expectation of $X$ is

$$E[X] = P(A_1)\,E[X \mid A_1] + \cdots + P(A_n)\,E[X \mid A_n].$$
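The sketch below verifies the total expectation theorem numerically, again using the hypothetical fair die and the partition of the sample space into odd and even outcomes.

```python
from fractions import Fraction

p_X = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

# Partition the sample space into A1 = {odd rolls} and A2 = {even rolls}.
partition = [{1, 3, 5}, {2, 4, 6}]

E_X = sum(x * p for x, p in p_X.items())  # direct computation: 7/2

total = Fraction(0)
for A in partition:
    P_A = sum(p_X[x] for x in A)
    E_X_given_A = sum(x * p_X[x] / P_A for x in A)
    total += P_A * E_X_given_A

# Total expectation theorem: both sides equal 7/2.
assert total == E_X
```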

Conditioning on Another Random Variable

We can also condition a random variable on another random variable. The conditional PMF of $X$ given that $Y = y$ is defined by

$$p_{X\mid Y}(x\mid y) = \frac{p_{X,Y}(x,y)}{p_Y(y)},$$

assuming that $p_Y(y) > 0$. We can likewise condition on two random variables, with the conditional PMF

$$p_{X\mid Y,Z}(x\mid y,z) = \frac{p_{X,Y,Z}(x,y,z)}{p_{Y,Z}(y,z)},$$

provided that $p_{Y,Z}(y,z) > 0$.
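Here is a small Python sketch of these formulas, using an arbitrarily chosen joint PMF (the numbers are assumptions for illustration, not taken from the text). It computes the marginal $p_Y$, the conditional PMF $p_{X\mid Y}$, and checks that each conditional PMF is normalized.

```python
from fractions import Fraction

# A small joint PMF p_{X,Y}(x, y), chosen arbitrarily for illustration.
p_XY = {
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8),
}

def p_Y(y):
    """Marginal PMF of Y."""
    return sum(p for (x, yy), p in p_XY.items() if yy == y)

def p_X_given_Y(x, y):
    """Conditional PMF p_{X|Y}(x|y) = p_{X,Y}(x,y) / p_Y(y); requires p_Y(y) > 0."""
    return p_XY.get((x, y), Fraction(0)) / p_Y(y)

# Each conditional PMF is itself normalized.
for y in (0, 1):
    assert sum(p_X_given_Y(x, y) for x in (0, 1)) == 1

print(p_X_given_Y(1, 1))  # (3/8) / (5/8) = 3/5
```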

Conditional Expectation

The conditional expectation of a random variable $X$ given $Y = y$ is

$$E[X \mid Y = y] = \sum_x x\, p_{X\mid Y}(x\mid y).$$

Moreover, for any function $g(X)$,

$$E[g(X) \mid Y = y] = \sum_x g(x)\, p_{X\mid Y}(x\mid y).$$
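Continuing with the same hypothetical joint PMF as above, the sketch below computes $E[X \mid Y = y]$ and $E[g(X) \mid Y = y]$ directly from the definition.

```python
from fractions import Fraction

# Same illustrative joint PMF as above (an assumed example, not from the text).
p_XY = {
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8),
}

def E_X_given_Y(y, g=lambda x: x):
    """E[g(X) | Y = y] = sum_x g(x) p_{X|Y}(x|y)."""
    p_y = sum(p for (x, yy), p in p_XY.items() if yy == y)
    return sum(g(x) * p / p_y for (x, yy), p in p_XY.items() if yy == y)

print(E_X_given_Y(0))                    # E[X | Y=0] = (1/8) / (3/8) = 1/3
print(E_X_given_Y(1, g=lambda x: x**2))  # E[X^2 | Y=1] = (3/8) / (5/8) = 3/5
```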

Independence

Several useful identities hold when two random variables $X$ and $Y$ are independent. First,

$$E[XY] = E[X]\,E[Y].$$

Moreover, for any functions $g$ and $h$, the random variables $g(X)$ and $h(Y)$ are also independent, so

$$E[g(X)h(Y)] = E[g(X)]\,E[h(Y)].$$

Finally, the variance of their sum is

$$\mathrm{var}(X+Y) = \mathrm{var}(X) + \mathrm{var}(Y).$$
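The sketch below checks all three identities exactly (using fractions) for a pair of independent random variables; the particular marginals for $X$ and $Y$, and the functions $g$ and $h$, are assumptions chosen for illustration.

```python
from fractions import Fraction
from itertools import product

# Two independent random variables, defined by their marginal PMFs
# (an assumed example: X uniform on {0,1,2}, Y Bernoulli(1/4)).
p_X = {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}
p_Y = {0: Fraction(3, 4), 1: Fraction(1, 4)}

# Under independence the joint PMF factors: p_{X,Y}(x,y) = p_X(x) p_Y(y).
p_XY = {(x, y): p_X[x] * p_Y[y] for x, y in product(p_X, p_Y)}

def E(f):
    """Expectation of f(X, Y) under the joint PMF."""
    return sum(f(x, y) * p for (x, y), p in p_XY.items())

def var(f):
    """Variance of f(X, Y): E[f^2] - (E[f])^2."""
    return E(lambda x, y: f(x, y) ** 2) - E(f) ** 2

g = lambda x: x ** 2      # an arbitrary function of X
h = lambda y: 3 * y + 1   # an arbitrary function of Y

# E[XY] = E[X] E[Y] and E[g(X) h(Y)] = E[g(X)] E[h(Y)]
assert E(lambda x, y: x * y) == E(lambda x, y: x) * E(lambda x, y: y)
assert E(lambda x, y: g(x) * h(y)) == E(lambda x, y: g(x)) * E(lambda x, y: h(y))

# var(X + Y) = var(X) + var(Y)
assert var(lambda x, y: x + y) == var(lambda x, y: x) + var(lambda x, y: y)
```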