Maybe you have already had to multiply means or expected values. If you know, for instance, how often people go shopping on average and how much money they spend per shopping trip on average, then you can multiply both to obtain the average amount spent overall. In this post I will explain why multiplying means and expected values is a valid operation.
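
For instance, if people go shopping, say, 4 times a month on average and spend €25 per trip on average, the two averages multiply to 4 \times 25 = €100 spent per month, provided (as we will see) that the two quantities are independent.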

As in “Why you can add means and expected values”, my reasoning will be based on expected values. You may want to read that post first to get familiar with the key ideas, which I will build on directly in this post. With that said, let’s start with the definition of the expected value of the product of two discrete random variables X and Y:

\displaystyle \mathrm{E}\big[X \times Y\big]=\sum\limits_{x} \sum\limits_{y} x \times y \times p_{x,y}(x,y)

Above you see that we sum all possible products that result when X and Y take specific values, each weighted by the joint probability p_{x,y}(x,y) of that combination. Now we assume – and this is important! – that X and Y are independent. Under the condition of independence, the joint probability p_{x,y}(x,y) factors into p_{x}(x)\times p_{y}(y). Watch what this does to our calculation!
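
For a quick example: if X is a fair coin showing 0 or 1 and Y is a fair die showing 1 to 6, independence means p_{x,y}(1,6) = p_{x}(1)\times p_{y}(6) = 1/2 \times 1/6 = 1/12, and likewise for every other combination. With the factorization in place, the expected value becomes: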

\displaystyle \mathrm{E}\big[X Y\big]=\sum\limits_{x} \sum\limits_{y} x \times y \times p_{x}(x) \times p_{y}(y)

\displaystyle \mathrm{E}\big[X Y\big]=\sum\limits_{x} x \times p_{x}(x) \times \sum\limits_{y} y \times p_{y}(y)

With our simplification we were able to pull x and p_{x}(x) out of the inner sum, because both factors are constant with respect to y. Have you noticed that the inner sum, which runs over all values of y, is precisely the expected value \mathrm{E}[Y]?
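
Indeed, it matches the one-variable definition of the expected value:

\displaystyle \mathrm{E}\big[Y\big]=\sum\limits_{y} y \times p_{y}(y)

Substituting it gives: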

\displaystyle \mathrm{E}\big[X Y\big]=\sum\limits_{x} x \times p_{x}(x) \times \mathrm{E}\big[Y\big]

\mathrm{E}[Y] is constant with respect to x and can therefore be moved in front of the remaining sum:

\displaystyle \mathrm{E}\big[X Y\big]= \mathrm{E}\big[Y\big] \times \sum\limits_{x} x \times p_{x}(x)

Surprise! 😀 The remaining sum gives the expected value of X.

\displaystyle \mathrm{E}\big[X Y\big]= \mathrm{E}\big[Y\big] \times \mathrm{E}\big[X\big]

Et voilà! The above line is what we wanted to show in the first place. Keep in mind that you could replace all the sums by integrals to show that the result also holds for continuous random variables. Further, multiplication also works for means, as they are approximations of the expected value of a distribution. So it is valid to state \overline{xy}=\overline{x}\times \overline{y} (at least approximately, up to sampling noise), as long as both variables are independent!
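
If you want to convince yourself numerically, here is a minimal sketch in Python (the distributions, values, and sample size are illustrative assumptions, not part of the derivation). It first computes \mathrm{E}[XY] as the double sum over a joint pmf built from two independent distributions and compares it with \mathrm{E}[X]\times\mathrm{E}[Y]; it then repeats the check with sample means:

```python
import numpy as np

# Two independent discrete random variables (values and probabilities
# are illustrative assumptions): X = shopping trips per month,
# Y = money spent per trip.
x_vals = np.array([1, 2, 3, 4])
p_x = np.array([0.1, 0.4, 0.3, 0.2])  # pmf of X, sums to 1

y_vals = np.array([10.0, 20.0, 50.0])
p_y = np.array([0.5, 0.3, 0.2])       # pmf of Y, sums to 1

# E[XY] as the double sum over the joint pmf; by independence,
# p(x, y) = p(x) * p(y).
e_xy = sum(x * y * px * py
           for x, px in zip(x_vals, p_x)
           for y, py in zip(y_vals, p_y))

e_x = np.dot(x_vals, p_x)  # E[X]
e_y = np.dot(y_vals, p_y)  # E[Y]

print(e_xy, e_x * e_y)  # both print (essentially) 54.6

# The same holds approximately for sample means of independent draws.
rng = np.random.default_rng(seed=42)
x = rng.choice(x_vals, size=100_000, p=p_x)
y = rng.choice(y_vals, size=100_000, p=p_y)
print((x * y).mean(), x.mean() * y.mean())  # approximately equal
```

The first print shows the exact identity from the derivation; the second shows that \overline{xy}=\overline{x}\times \overline{y} only holds up to sampling noise, since sample means merely approximate expected values.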