Hi, may I ask: are there any circumstances where this type of bias does not matter?
great explanation!
Why is Var(X1*) = Var(X1) + Var(W) ? Did you show this in the video before proclaiming it at 19:55?
It comes from the formula for the variance of the sum of two random variables
Var(X+Y) = Var(X) + 2Cov(X,Y) + Var(Y)
X1* = X1 + W
Var(X1*)= Var(X1+W), applying the formula above,
Var(X1*)= Var(X1) + 2Cov(X1,W) + Var(W)
given that Cov(X1,W) = 0, because of assumption 2 (6:54), i.e., no systematic measurement error, then
Var(X1*) = Var(X1) + Var(W)
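In case a numerical check helps: here is a small Python sketch (my own, not from the video) that simulates classical measurement error and verifies Var(X1*) = Var(X1) + Var(W). The distributions and sample size are illustrative assumptions.

```python
import random

random.seed(0)
n = 200_000
# True regressor X1 (Var ≈ 4) and independent measurement error W (Var ≈ 1)
x1 = [random.gauss(0.0, 2.0) for _ in range(n)]
w = [random.gauss(0.0, 1.0) for _ in range(n)]
# Mismeasured regressor X1* = X1 + W
x1_star = [a + b for a, b in zip(x1, w)]

def var(v):
    """Population variance of a sample list."""
    m = sum(v) / len(v)
    return sum((u - m) ** 2 for u in v) / len(v)

print(var(x1_star))       # ≈ 5
print(var(x1) + var(w))   # ≈ 5, matching Var(X1) + Var(W)
```

Because X1 and W are drawn independently, the sample covariance term 2Cov(X1,W) is close to zero, so the two printed values agree.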
same question, did you figure it out?
@@youshengtang3997 Simply Var(x*) = Var(x + w) = Var(x) + Var(w) + 2Cov(x, w), which equals Var(x) + Var(w) since Cov(x, w) is zero by assumption. However, I can't prove that Cov(x*, w) = Var(w). Any ideas?
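That identity follows from bilinearity of covariance: Cov(x*, w) = Cov(x + w, w) = Cov(x, w) + Cov(w, w) = 0 + Var(w). A quick Python simulation of this (my own sketch; distributions and sample size are illustrative assumptions):

```python
import random

random.seed(1)
n = 200_000
x = [random.gauss(0.0, 1.5) for _ in range(n)]
w = [random.gauss(0.0, 1.0) for _ in range(n)]  # independent of x, Var ≈ 1
x_star = [a + b for a, b in zip(x, w)]          # x* = x + w

def cov(u, v):
    """Population covariance of two sample lists."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

print(cov(x_star, w))   # ≈ 1, i.e. Var(w)
```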