Maybe you have had to multiply means or expected values already. If you know, for instance, how often people go shopping on average and how much money they spend per shopping trip on average, then you can multiply the two to obtain the average total amount spent. In this post I will explain why multiplying means and expected values is a valid operation.
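To put some (made-up) numbers on it: if people go shopping $4$ times per month on average and spend $50$ € per trip on average, the average total monthly spending would be

$4 \cdot 50 \, \text{€} = 200 \, \text{€}$

– provided, as we will see, that the two quantities are independent.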
As in “Why you can add means and expected values”, my reasoning will be based on expected values. You may want to read that post first to gain an understanding of the key ideas, which I will use directly in this post. Otherwise, let’s start with the definition of the expected value for two multiplied discrete random variables $X$ and $Y$:

$E(X \cdot Y) = \sum_x \sum_y x \cdot y \cdot P(X = x, Y = y)$
Above you see that we add up all possible products that result when $X$ and $Y$ take specific values. The probability of each combination is given by $P(X = x, Y = y)$. Now we assume – and this is important! – that $X$ and $Y$ are independent. Under the condition of independence, $P(X = x, Y = y)$ can be expanded to $P(X = x) \cdot P(Y = y)$. Wait until you spot the implications for our calculation!

$E(X \cdot Y) = \sum_x \sum_y x \cdot y \cdot P(X = x) \cdot P(Y = y)$
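If the expansion step feels abstract, here is a quick (made-up) example: for two fair dice rolled independently, $P(X = 6, Y = 6) = \frac{1}{36} = \frac{1}{6} \cdot \frac{1}{6} = P(X = 6) \cdot P(Y = 6)$.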
With our simplification we can take $x$ and $P(X = x)$ out of the inner sum, because both are constant in terms of $y$:

$E(X \cdot Y) = \sum_x x \cdot P(X = x) \sum_y y \cdot P(Y = y)$

Have you noticed that the inner sum, which runs over all values of $y$, is exactly the expected value $E(Y)$?

$E(X \cdot Y) = \sum_x x \cdot P(X = x) \cdot E(Y)$
$E(Y)$ is constant in terms of $x$ and, therefore, can be taken in front of the remaining sum:

$E(X \cdot Y) = E(Y) \cdot \sum_x x \cdot P(X = x)$
Surprise! 😀 The remaining sum gives the expected value of $X$:

$E(X \cdot Y) = E(Y) \cdot E(X) = E(X) \cdot E(Y)$
Et voilà! The above line is what we wanted to show in the first place. Keep in mind that you could replace all the sum signs by integrals to show that you can also multiply continuous random variables. Furthermore, multiplication also works for means, as they are approximations of the expected value of a distribution. So, it is valid to state $E(X \cdot Y) = E(X) \cdot E(Y)$ – as long as both random variables are independent!
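If you want to convince yourself numerically, here is a minimal simulation sketch in Python (the distributions, parameters, and sample size are arbitrary choices for illustration): for independent samples the mean of the products closely matches the product of the means, while for dependent samples it generally does not.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1_000_000  # sample size, arbitrary

# Two independent random variables (example distributions chosen freely):
# x ~ number of shopping trips, y ~ money spent per trip.
x = rng.poisson(lam=4, size=n)
y = rng.normal(loc=50, scale=10, size=n)

print(np.mean(x * y))           # mean of the products, ~200
print(np.mean(x) * np.mean(y))  # product of the means, almost identical

# Counterexample with dependent variables: z is built from x,
# so E(X * Z) != E(X) * E(Z) in general.
z = 10 * x + rng.normal(loc=0, scale=1, size=n)
print(np.mean(x * z))           # ~200, since E(X * Z) = 10 * E(X^2)
print(np.mean(x) * np.mean(z))  # ~160, clearly different
```

The last two numbers only disagree because $z$ was deliberately constructed to depend on $x$ – exactly the situation our independence assumption rules out.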