corpact90
Member
Hi everyone, I have a doubt regarding normal random variables.
I know it should be really easy to solve, but I still can't figure it out.
So, if you have to sum two Gaussian variables as follows:
aN(0,1)+bN(0,1)
where a and b are constants, we can proceed as follows:
aN(0,1)+bN(0,1)=N(0,a^2)+N(0,b^2)=N(0,a^2+b^2)
Why is this different from:
aN(0,1)+bN(0,1)=(a+b)N(0,1)=N(0,(a+b)^2)
Which of the two approaches is correct?
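In case it helps, here is a minimal sketch I put together to compare the two candidate variances numerically, assuming the two N(0,1) terms are independent draws (the constants a and b are arbitrary picks, not from anywhere in particular):

import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 3.0          # arbitrary constants
n = 1_000_000            # number of samples

# two independent standard normal samples
x = rng.standard_normal(n)
y = rng.standard_normal(n)

s = a * x + b * y        # the sum aN(0,1) + bN(0,1)

print("empirical variance:", s.var())
print("a^2 + b^2         :", a**2 + b**2)
print("(a + b)^2         :", (a + b)**2)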
Thanks