Conditional Expectation: 7 Facts You Should Know

Random variables that depend on one another require the calculation of conditional probabilities, which we have already discussed. We now discuss some further parameters for such random variables, namely the conditional expectation and conditional variance, for different types of random variables.

Conditional Expectation

The conditional probability mass function of the discrete random variable X given Y is defined as

$$p_{X|Y}(x \mid y) = P(X = x \mid Y = y) = \frac{p(x,y)}{p_Y(y)}$$

where $p_Y(y) > 0$. The conditional expectation of the discrete random variable X given Y = y, whenever $p_Y(y) > 0$, is therefore

$$E[X \mid Y = y] = \sum_{x} x \, P(X = x \mid Y = y) = \sum_{x} x \, p_{X|Y}(x \mid y)$$

where the probability appearing in this expectation is the conditional probability.

In a similar way, if X and Y are continuous, then the conditional probability density function of the random variable X given Y is

$$f_{X|Y}(x \mid y) = \frac{f(x,y)}{f_Y(y)}$$

where f(x, y) is the joint probability density function and $f_Y(y) > 0$; the conditional expectation of the random variable X given Y = y is then

$$E[X \mid Y = y] = \int_{-\infty}^{\infty} x \, f_{X|Y}(x \mid y) \, dx$$

for all y such that $f_Y(y) > 0$.

Just as the properties of probability carry over to conditional probability, the properties of mathematical expectation are satisfied by conditional expectation. For example, the conditional expectation of a function of a random variable is

$$E[g(X) \mid Y = y] = \begin{cases} \displaystyle\sum_{x} g(x) \, p_{X|Y}(x \mid y) & \text{discrete case} \\[6pt] \displaystyle\int_{-\infty}^{\infty} g(x) \, f_{X|Y}(x \mid y) \, dx & \text{continuous case} \end{cases}$$

and the conditional expectation of a sum of random variables is

$$E\left[\sum_{i=1}^{n} X_i \;\Big|\; Y = y\right] = \sum_{i=1}^{n} E[X_i \mid Y = y]$$
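As a concrete illustration, here is a minimal Python sketch (the joint pmf table is made up for the example, not taken from the article) that computes $E[X \mid Y = y]$ directly from the definitions above.

```python
import numpy as np

# Hypothetical joint pmf p(x, y): rows index the values of X, columns the values of Y.
xs = np.array([0, 1, 2])
ys = np.array([0, 1])
p = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.05, 0.25]])   # entries sum to 1

for j, y in enumerate(ys):
    p_y = p[:, j].sum()               # marginal P(Y = y)
    cond_pmf = p[:, j] / p_y          # conditional pmf p_{X|Y}(x | y)
    cond_exp = (xs * cond_pmf).sum()  # E[X | Y = y] = sum_x x * p_{X|Y}(x | y)
    print(f"E[X | Y = {y}] = {cond_exp:.4f}")
```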

Conditional Expectation for the sum of binomial random variables

To find the conditional expectation of the sum of binomial random variables X and Y with parameters n and p which are independent, recall that X + Y is also a binomial random variable, with parameters 2n and p. For the random variable X given X + Y = m, the conditional expectation is obtained by calculating the conditional probability

$$P(X = k \mid X + Y = m) = \frac{P(X = k, \, X + Y = m)}{P(X + Y = m)} = \frac{P(X = k)\,P(Y = m - k)}{P(X + Y = m)}$$

since we know that X and Y are binomial with parameters (n, p) and X + Y is binomial with parameters (2n, p), so

$$P(X = k \mid X + Y = m) = \frac{\binom{n}{k} p^{k} (1-p)^{n-k} \, \binom{n}{m-k} p^{m-k} (1-p)^{n-m+k}}{\binom{2n}{m} p^{m} (1-p)^{2n-m}} = \frac{\binom{n}{k}\binom{n}{m-k}}{\binom{2n}{m}}$$

which is the hypergeometric distribution, whose mean gives the conditional expectation of X given X + Y = m as

$$E[X \mid X + Y = m] = \frac{m}{2}$$
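A quick Monte Carlo check of this result (a sketch with arbitrarily chosen n, p and m, not part of the derivation): the average of X over samples with X + Y = m should be close to m/2.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, m = 10, 0.3, 6
x = rng.binomial(n, p, size=1_000_000)   # X ~ Binomial(n, p)
y = rng.binomial(n, p, size=1_000_000)   # Y ~ Binomial(n, p), independent of X
mask = (x + y) == m                      # keep only samples with X + Y = m
print("simulated E[X | X+Y=m]:", x[mask].mean())   # should be close to m/2 = 3
```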

Example:

Find the conditional expectation

$$E[X \mid Y = y]$$

if the joint probability density function of the continuous random variables X and Y is given as

$$f(x, y) = \frac{e^{-x/y} \, e^{-y}}{y}, \qquad 0 < x < \infty, \; 0 < y < \infty$$

Solution:

To calculate the conditional expectation we require the conditional probability density function, so

$$f_{X|Y}(x \mid y) = \frac{f(x,y)}{f_Y(y)} = \frac{e^{-x/y}\, e^{-y}/y}{\int_0^{\infty} \left(e^{-x/y}\, e^{-y}/y\right) dx} = \frac{e^{-x/y}\, e^{-y}/y}{e^{-y}} = \frac{1}{y}\, e^{-x/y}$$

since for a continuous random variable the conditional expectation is

$$E[X \mid Y = y] = \int_{-\infty}^{\infty} x \, f_{X|Y}(x \mid y) \, dx$$

hence for the given density function the conditional expectation is

$$E[X \mid Y = y] = \int_0^{\infty} \frac{x}{y} \, e^{-x/y} \, dx = y$$
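To sanity-check this example numerically (a sketch, assuming the density above): the marginal of Y is Exp(1) and X given Y = y is exponential with mean y, so we can sample the pair and confirm that E[X | Y ≈ y₀] is about y₀.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.exponential(1.0, size=1_000_000)   # Y ~ Exp(1), the marginal f_Y(y) = e^{-y}
x = rng.exponential(y)                     # X | Y = y ~ exponential with mean y
y0 = 2.0
band = np.abs(y - y0) < 0.05               # samples whose Y value is near y0
print("simulated E[X | Y ≈ 2]:", x[band].mean())   # should be close to 2
```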

Expectation by conditioning || Expectation by conditional expectation

We can calculate the mathematical expectation with the help of the conditional expectation of X given Y as

$$E[X] = E\big[E[X \mid Y]\big]$$

for discrete random variables this is

$$E[X] = \sum_{y} E[X \mid Y = y] \, P(Y = y)$$

which can be obtained as

$$\sum_{y} E[X \mid Y = y]\, P(Y = y) = \sum_{y} \sum_{x} x \, P(X = x \mid Y = y)\, P(Y = y) = \sum_{x} x \sum_{y} P(X = x, Y = y) = \sum_{x} x \, P(X = x) = E[X]$$

and for continuous random variables we can similarly show

$$E[X] = \int_{-\infty}^{\infty} E[X \mid Y = y] \, f_Y(y) \, dy$$
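Reusing the joint density from the previous example (a sketch, under the same assumptions), where E[X | Y] = Y, the tower property predicts E[X] = E[Y] = 1, which is easy to confirm by simulation.

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.exponential(1.0, size=1_000_000)   # Y ~ Exp(1)
x = rng.exponential(y)                     # X | Y = y has mean y, so E[X | Y] = Y
print("E[X] directly:   ", x.mean())       # ≈ 1
print("E[E[X|Y]] = E[Y]:", y.mean())       # ≈ 1, matching E[X] = E[E[X|Y]]
```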

Example:

A person is trapped underground in his building because the entrance is blocked by a heavy load. Fortunately there are three pipelines through which he can try to escape: the first pipe takes him safely out after 3 hours, while the second and third return him to his starting point after 5 hours and 7 hours respectively. If he chooses any of these pipelines with equal probability, what is the expected time until he comes outside safely?

Solution:

Let X be the random variable denoting the time in hours until the person comes out safely and let Y denote the pipe he chooses initially, so

$$E[X] = E[X \mid Y=1]\,P(Y=1) + E[X \mid Y=2]\,P(Y=2) + E[X \mid Y=3]\,P(Y=3) = \frac{1}{3}\big(E[X \mid Y=1] + E[X \mid Y=2] + E[X \mid Y=3]\big)$$

since

$$E[X \mid Y=1] = 3$$

If the person chooses the second pipe, he spends 5 hours in it and then returns to his starting point, from which his expected remaining time is again E[X]; similarly for the third pipe with 7 hours, so

$$E[X \mid Y=2] = 5 + E[X], \qquad E[X \mid Y=3] = 7 + E[X]$$

so the expectation will be

$$E[X] = \frac{1}{3}\big(3 + 5 + E[X] + 7 + E[X]\big) \quad\Longrightarrow\quad E[X] = 15$$
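A direct simulation of the process (a sketch, not in the original article) confirms the 15-hour answer.

```python
import numpy as np

rng = np.random.default_rng(3)

def time_to_escape():
    """Walk the pipes until pipe 1 is chosen; pipes 2 and 3 waste 5 and 7 hours."""
    total = 0.0
    while True:
        pipe = rng.integers(1, 4)   # 1, 2 or 3, equally likely
        if pipe == 1:
            return total + 3        # pipe 1 leads out after 3 hours
        total += 5 if pipe == 2 else 7

print("simulated E[X]:", np.mean([time_to_escape() for _ in range(200_000)]))  # ≈ 15
```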

Expectation of the sum of a random number of random variables using conditional expectation

Let $X_1, X_2, \ldots$ be a sequence of independent and identically distributed random variables, and let N be a nonnegative integer-valued random variable, independent of the $X_i$, giving the random number of terms in the sum $\sum_{i=1}^{N} X_i$; then the expectation of this random sum is

$$E\left[\sum_{i=1}^{N} X_i\right] = E\left[E\left[\sum_{i=1}^{N} X_i \;\Big|\; N\right]\right]$$

since, by the independence of N and the $X_i$,

$$E\left[\sum_{i=1}^{N} X_i \;\Big|\; N = n\right] = E\left[\sum_{i=1}^{n} X_i \;\Big|\; N = n\right] = E\left[\sum_{i=1}^{n} X_i\right] = n\,E[X]$$

so that

$$E\left[\sum_{i=1}^{N} X_i \;\Big|\; N\right] = N\,E[X]$$

thus

$$E\left[\sum_{i=1}^{N} X_i\right] = E\big[N\,E[X]\big] = E[N]\,E[X]$$
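A numerical check of this identity (a sketch with arbitrarily chosen distributions): with N ~ Poisson(4) independent of X_i exponential with mean 2, the identity predicts E[Σ X_i] = 4 · 2 = 8.

```python
import numpy as np

rng = np.random.default_rng(4)
trials = 200_000
n = rng.poisson(4, size=trials)                                     # N for each trial
totals = np.array([rng.exponential(2.0, size=k).sum() for k in n])  # sum of N draws
print("simulated E[sum]:", totals.mean())   # ≈ 8
print("E[N] * E[X]     :", 4 * 2.0)
```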

Correlation of the bivariate normal distribution

If the probability density function of the bivariate normal random variables X and Y is

$$f(x,y) = \frac{1}{2\pi \sigma_x \sigma_y \sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \left(\frac{x-\mu_x}{\sigma_x}\right)^2 + \left(\frac{y-\mu_y}{\sigma_y}\right)^2 - \frac{2\rho (x-\mu_x)(y-\mu_y)}{\sigma_x \sigma_y} \right] \right\}$$

where

$$X \sim N(\mu_x, \sigma_x^2), \qquad Y \sim N(\mu_y, \sigma_y^2), \qquad -1 < \rho < 1$$

then the correlation between the random variables X and Y for this bivariate distribution can be found as follows. Correlation is defined as

$$\mathrm{Corr}(X,Y) = \frac{\mathrm{Cov}(X,Y)}{\sigma_x \sigma_y} = \frac{E[XY] - \mu_x \mu_y}{\sigma_x \sigma_y}$$

and the expectation E[XY] using conditional expectation is

$$E[XY] = E\big[E[XY \mid Y]\big]$$

For the normal distribution, the conditional distribution of X given Y is normal with mean

$$E[X \mid Y = y] = \mu_x + \rho \, \frac{\sigma_x}{\sigma_y} (y - \mu_y)$$

now the expectation of XY given Y is

$$E[XY \mid Y] = Y \, E[X \mid Y] = \mu_x Y + \rho \, \frac{\sigma_x}{\sigma_y} \, Y (Y - \mu_y)$$

this gives

$$E[XY] = \mu_x E[Y] + \rho \, \frac{\sigma_x}{\sigma_y} \, E\big[Y(Y - \mu_y)\big] = \mu_x \mu_y + \rho \, \frac{\sigma_x}{\sigma_y} \, \sigma_y^2 = \mu_x \mu_y + \rho \, \sigma_x \sigma_y$$

hence

$$\mathrm{Corr}(X,Y) = \frac{\mu_x \mu_y + \rho \, \sigma_x \sigma_y - \mu_x \mu_y}{\sigma_x \sigma_y} = \rho$$
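As a check (a sketch with arbitrarily chosen parameters), sampling from a bivariate normal and computing the sample correlation should recover the ρ that appears in the density.

```python
import numpy as np

rng = np.random.default_rng(5)
mu_x, mu_y, sd_x, sd_y, rho = 1.0, -2.0, 2.0, 3.0, 0.6
cov = [[sd_x**2,           rho * sd_x * sd_y],
       [rho * sd_x * sd_y, sd_y**2          ]]   # covariance matrix of (X, Y)
xy = rng.multivariate_normal([mu_x, mu_y], cov, size=500_000)
print("sample correlation:", np.corrcoef(xy[:, 0], xy[:, 1])[0, 1])   # ≈ 0.6
```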

Variance of geometric distribution

In the geometric distribution we perform successive independent trials, each resulting in success with probability p. If N represents the number of the trial on which the first success occurs, then the variance of N by definition is

$$\mathrm{Var}(N) = E[N^2] - \big(E[N]\big)^2$$

Let the random variable Y = 1 if the first trial results in success and Y = 0 if the first trial results in failure. To find the expectation $E[N^2]$ we apply conditional expectation as

$$E[N^2] = E\big[E[N^2 \mid Y]\big] = p \, E[N^2 \mid Y=1] + (1-p)\, E[N^2 \mid Y=0]$$

since

$$E[N^2 \mid Y=1] = 1$$

if success occurs on the first trial then N = 1 and N² = 1. If a failure occurs on the first trial, then to get the first success the total number of trials has the same distribution as 1 (the first trial, which resulted in failure) plus the necessary number of additional trials; that is,

$$E[N^2 \mid Y=0] = E\big[(1+N)^2\big]$$

Thus the expectation will be

$$E[N^2] = p + (1-p)\,E\big[(1+N)^2\big] = 1 + (1-p)\,E\big[2N + N^2\big] = 1 + 2(1-p)\,E[N] + (1-p)\,E[N^2]$$

since the expectation of the geometric distribution is

$$E[N] = \frac{1}{p}$$

hence

$$E[N^2] = 1 + \frac{2(1-p)}{p} + (1-p)\,E[N^2]$$

and solving for $E[N^2]$ gives

$$E[N^2] = \frac{2-p}{p^2}$$

so the variance of the geometric distribution will be

$$\mathrm{Var}(N) = E[N^2] - \big(E[N]\big)^2 = \frac{2-p}{p^2} - \frac{1}{p^2} = \frac{1-p}{p^2}$$
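A simulation sketch (with an arbitrary p) comparing the sample variance of the trial count against (1 − p)/p²:

```python
import numpy as np

rng = np.random.default_rng(6)
p = 0.25
# numpy's geometric counts the trials up to and including the first success
n = rng.geometric(p, size=1_000_000)
print("simulated Var(N):", n.var())           # ≈ 12
print("(1 - p) / p**2  :", (1 - p) / p**2)    # = 12
```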

Expectation of the minimum number of uniform random variables whose sum exceeds x

Consider a sequence of uniform random variables U1, U2, … over the interval (0, 1), and define N as

$$N = \min\left\{ n : \sum_{i=1}^{n} U_i > 1 \right\}$$

To find the expectation of N, consider more generally, for any x ∈ [0, 1], the random variable

$$N(x) = \min\left\{ n : \sum_{i=1}^{n} U_i > x \right\}$$

and set its expectation as

$$m(x) = E[N(x)]$$

To find this expectation we use the definition of conditional expectation for continuous random variables:

$$m(x) = E[N(x)] = \int_0^1 E[N(x) \mid U_1 = y] \, dy$$

Now, conditioning on the value of the first term of the sequence, we have

$$E[N(x) \mid U_1 = y] = \begin{cases} 1 & \text{if } y > x \\ 1 + m(x-y) & \text{if } y \le x \end{cases}$$

here, if the first uniform value is y ≤ x, then the remaining number of uniform random variables has the same distribution as if we were starting afresh and adding uniform random variables until their sum surpassed x − y; this accounts for the term m(x − y).

so using this conditional expectation the value of the integral will be

$$m(x) = \int_x^1 1 \, dy + \int_0^x \big[1 + m(x-y)\big] \, dy = 1 + \int_0^x m(x-y)\, dy = 1 + \int_0^x m(u)\, du$$

if we differentiate this equation we get

$$m'(x) = m(x)$$

and

$$\frac{m'(x)}{m(x)} = 1$$

now integrating this gives

$$\log m(x) = x + c$$

hence

$$m(x) = k e^{x}$$

since m(0) = 1 (a single uniform value already surpasses x = 0), we get k = 1, so

$$m(x) = e^{x}$$

and m(1) = e: the expected number of uniform random variables over the interval (0, 1) that need to be added until their sum surpasses 1 is equal to e.
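This surprising result is easy to verify by simulation (a sketch, not in the original article): count Uniform(0, 1) draws until the running sum exceeds 1 and average the counts.

```python
import numpy as np

rng = np.random.default_rng(7)

def count_until_exceeds(x=1.0):
    """Number of Uniform(0,1) draws needed for the running sum to exceed x."""
    total, k = 0.0, 0
    while total <= x:
        total += rng.random()
        k += 1
    return k

print("simulated E[N]:", np.mean([count_until_exceeds() for _ in range(200_000)]))
# ≈ 2.718..., i.e. e
```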

Probability using conditional expectation || Probabilities using conditioning

We can also find probabilities by using conditional expectation, just as we found expectations with conditional expectation. To see this, consider an event E and define a random variable X as

$$X = \begin{cases} 1 & \text{if } E \text{ occurs} \\ 0 & \text{if } E \text{ does not occur} \end{cases}$$

from the definition of this random variable and of expectation, clearly

$$E[X] = P(E), \qquad E[X \mid Y = y] = P(E \mid Y = y)$$

now, computing the expectation by conditioning on Y, we have

$$P(E) = \sum_{y} P(E \mid Y=y)\, P(Y=y) \quad (Y \text{ discrete}), \qquad P(E) = \int_{-\infty}^{\infty} P(E \mid Y=y)\, f_Y(y)\, dy \quad (Y \text{ continuous})$$

Example:

Compute the probability mass function of the random variable X if U is a uniform random variable on the interval (0, 1), and the conditional distribution of X given U = p is binomial with parameters n and p.

Solution:

For the value of U, the probability by conditioning is

$$P(X=i) = \int_0^1 P(X=i \mid U=p)\, f_U(p)\, dp = \int_0^1 \binom{n}{i} p^{i} (1-p)^{n-i}\, dp$$

we have the result

$$\int_0^1 p^{i} (1-p)^{n-i}\, dp = \frac{i!\,(n-i)!}{(n+1)!}$$

so we will get

$$P(X=i) = \binom{n}{i} \frac{i!\,(n-i)!}{(n+1)!} = \frac{1}{n+1}, \qquad i = 0, 1, \ldots, n$$
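So X is uniform over {0, 1, …, n}. A quick simulation sketch (with an arbitrary n) of this two-stage experiment shows the flat pmf:

```python
import numpy as np

rng = np.random.default_rng(8)
n, trials = 5, 1_000_000
u = rng.random(trials)        # U ~ Uniform(0, 1) for each trial
x = rng.binomial(n, u)        # X | U = p ~ Binomial(n, p)
print("empirical pmf:", np.bincount(x, minlength=n + 1) / trials)
print("predicted    :", 1 / (n + 1))   # each of the n+1 values ≈ 0.1667
```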

Example:

What is the probability of X < Y, if X and Y are continuous random variables with probability density functions fX and fY respectively?

Solution:

By using conditional expectation and conditional probability,

$$P(X<Y) = \int_{-\infty}^{\infty} P(X<Y \mid Y=y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} P(X<y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} F_X(y)\, f_Y(y)\, dy$$

where

$$F_X(y) = \int_{-\infty}^{y} f_X(x)\, dx$$
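A sketch with assumed densities: take X exponential with rate 1 and Y exponential with rate 2, for which P(X < Y) = 1/3; both the direct estimate and the conditioning formula ∫ F_X(y) f_Y(y) dy agree.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.exponential(1.0, size=1_000_000)   # scale 1   -> rate 1
y = rng.exponential(0.5, size=1_000_000)   # scale 1/2 -> rate 2
print("Monte Carlo P(X < Y):", (x < y).mean())            # ≈ 1/3
print("E[F_X(Y)]           :", (1 - np.exp(-y)).mean())   # same integral, ≈ 1/3
```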

Example:

Calculate the distribution of the sum of the continuous independent random variables X and Y.

Solution:

To find the distribution of X + Y, we compute the probability of the sum by conditioning on the value of Y as follows:

$$P(X+Y \le a) = \int_{-\infty}^{\infty} P(X+Y \le a \mid Y=y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} P(X \le a-y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} F_X(a-y)\, f_Y(y)\, dy$$
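A numerical sketch with assumed inputs: for X, Y independent Uniform(0, 1), F_X(t) = min(max(t, 0), 1), and the conditioning formula F_{X+Y}(a) = E[F_X(a − Y)] matches the direct estimate.

```python
import numpy as np

rng = np.random.default_rng(10)
a = 1.2
x = rng.random(1_000_000)   # X ~ Uniform(0, 1)
y = rng.random(1_000_000)   # Y ~ Uniform(0, 1), independent of X
print("direct P(X + Y <= a):", (x + y <= a).mean())              # ≈ 0.68
print("E[F_X(a - Y)]       :", np.clip(a - y, 0.0, 1.0).mean())  # ≈ 0.68
```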

Conclusion:

The conditional expectation for discrete and continuous random variables was discussed with various examples, covering several types of such random variables and using independent random variables and joint distributions under different conditions. How to find expectations and probabilities using conditional expectation was also explained with examples. If you require further reading, go through the books below, or for more articles on probability, follow our Mathematics pages.

https://en.wikipedia.org/wiki/probability_distribution

A First Course in Probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An Introduction to Probability and Statistics by Rohatgi and Saleh