One of the foundations of statistics is the fact that you can add variances. This may puzzle you a little, because the formula for the variance does not look like it permits that at first glance. This article shows why, and under which circumstances, adding variances is a valid practice. If you have not worked with these concepts yet, please check the information in my articles on the addition and multiplication of expected values first.

First, consider the definition of the variance for a random variable Z .

\displaystyle \mathrm{Var}\big(Z\big)=\mathrm{E}\Big[\big(Z-\mathrm{E}[Z]\big)^{2}\Big]
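
If you like to sanity-check such formulas numerically, here is a minimal sketch (assuming NumPy is available; the sample size and seed are arbitrary choices of mine) that evaluates this definition on a large sample and compares it with NumPy's built-in variance:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
z = rng.normal(loc=3.0, scale=2.0, size=100_000)  # a sample standing in for Z

# Var(Z) = E[(Z - E[Z])^2], with E[.] approximated by the sample mean
var_by_definition = np.mean((z - np.mean(z)) ** 2)

print(var_by_definition)  # close to 4.0, since scale = 2.0
print(np.var(z))          # NumPy's (population) variance gives the same value
```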

As we want to demonstrate that \mathrm{Var}(X+Y)=\mathrm{Var}(X)+\mathrm{Var}(Y), let’s substitute Z=X+Y. Expected values are additive, so we can afterwards replace \mathrm{E}[X+Y] by \mathrm{E}[X]+\mathrm{E}[Y].

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{E}\Big[\big(X+Y-\mathrm{E}[X+Y]\big)^{2}\Big]=\mathrm{E}\Big[\big(X+Y-\mathrm{E}[X]-\mathrm{E}[Y]\big)^{2}\Big]

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{E}\Big[\big(\big(X-\mathrm{E}[X]\big)+\big(Y-\mathrm{E}[Y]\big)\big)^{2}\Big]

Unfortunately, multiplying out this mess is some work, but we have to 😀 I did this below. Note that, in the second step, I split the expected value on the right-hand side into three separate expected values.

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{E}\Big[\big(X-\mathrm{E}[X]\big)^{2}+2\big(X-\mathrm{E}[X]\big)\big(Y-\mathrm{E}[Y]\big)+\big(Y-\mathrm{E}[Y]\big)^{2}\Big]

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{E}\Big[\big(X-\mathrm{E}[X]\big)^{2}\Big]+\mathrm{E}\Big[2\big(X-\mathrm{E}[X]\big)\big(Y-\mathrm{E}[Y]\big)\Big]+\mathrm{E}\Big[\big(Y-\mathrm{E}[Y]\big)^{2}\Big]
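
The splitting in the second step is again just the additivity of expected values, \mathrm{E}[A+B]=\mathrm{E}[A]+\mathrm{E}[B], which holds even for dependent variables. A quick numerical illustration (a sketch under the same NumPy assumption as above, with sample means standing in for expected values):

```python
import numpy as np

rng = np.random.default_rng(seed=1)
a = rng.normal(size=100_000)
b = a ** 2 + rng.normal(size=100_000)  # deliberately dependent on a

print(np.mean(a + b))           # E[A + B] ...
print(np.mean(a) + np.mean(b))  # ... equals E[A] + E[B], dependence or not
```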

It is not obvious, but the first and the third summand are variances themselves! Therefore, further simplification gives:

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{Var}\big(X\big)+\mathrm{E}\Big[2\big(X-\mathrm{E}[X]\big)\big(Y-\mathrm{E}[Y]\big)\Big]+\mathrm{Var}\big(Y\big)

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{Var}\big(X\big)+2\,\mathrm{E}\Big[XY+\mathrm{E}[X]\mathrm{E}[Y]-X\mathrm{E}[Y]-Y\mathrm{E}[X]\Big]+\mathrm{Var}\big(Y\big)

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{Var}\big(X\big)+\mathrm{Var}\big(Y\big)+2\Big(\mathrm{E}\Big[XY\Big]+\mathrm{E}\Big[\mathrm{E}[X]\mathrm{E}[Y]\Big]-\mathrm{E}\Big[X\mathrm{E}[Y]\Big]-\mathrm{E}\Big[Y\mathrm{E}[X]\Big]\Big)

I reorganized the right-hand side of the equation; the term we are working on is now the one in the big parentheses. As you know, we can pull constant factors out of an expected value. For example, \mathrm{E}\big[X\mathrm{E}[Y]\big] becomes \mathrm{E}[Y]\,\mathrm{E}[X], because \mathrm{E}[Y] is a constant with respect to X. With this technique we get the following.

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{Var}\big(X\big)+\mathrm{Var}\big(Y\big)+2\big(\mathrm{E}[XY]+\mathrm{E}[X]\mathrm{E}[Y]-\mathrm{E}[X]\mathrm{E}[Y]-\mathrm{E}[X]\mathrm{E}[Y]\big)

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{Var}\big(X\big)+\mathrm{Var}\big(Y\big)+2\big(\mathrm{E}[XY]-\mathrm{E}[X]\mathrm{E}[Y]\big)
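
Note that we have not assumed anything about X and Y so far, so this identity must hold even for correlated samples. A small check (again a sketch assuming NumPy; the correlated construction of y is my own example):

```python
import numpy as np

rng = np.random.default_rng(seed=2)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)  # correlated with x on purpose

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)  # E[XY] - E[X]E[Y]

print(np.var(x + y))                       # Var(X + Y) ...
print(np.var(x) + np.var(y) + 2 * cov_xy)  # ... matches the identity exactly
```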


The term in parentheses is known as the covariance of X and Y. Assuming that X and Y are independent, this term equals zero. Why? Because then you can make use of the multiplication rule \mathrm{E}[XY]=\mathrm{E}[X]\mathrm{E}[Y]; for merely uncorrelated variables the covariance is zero by definition, so the conclusion is the same:

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{Var}\big(X\big)+\mathrm{Var}\big(Y\big)+2\big(\mathrm{E}[X]\mathrm{E}[Y]-\mathrm{E}[X]\mathrm{E}[Y]\big)

\displaystyle \mathrm{Var}\big(X+Y\big)=\mathrm{Var}\big(X\big) +\mathrm{Var}\big(Y\big)

That’s why you can add the variances of uncorrelated random variables. Eureka!
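
As a final sanity check, a short simulation with independent draws (a sketch, once more assuming NumPy) shows the covariance term vanishing, so the variances simply add:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
x = rng.normal(loc=0.0, scale=1.0, size=1_000_000)  # Var(X) = 1
y = rng.uniform(low=0.0, high=1.0, size=1_000_000)  # Var(Y) = 1/12, independent of X

print(np.var(x + y))          # close to 1 + 1/12 ≈ 1.0833
print(np.var(x) + np.var(y))  # essentially the same number
```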