A. Var(X) + Var(Y)
B. Var(X) − Var(Y)
C. Var(X) ±
D. Zero
Related MCQs:
- Let X1, X2, …, Xn be a random sample from a density f(x | θ), where θ is a value of the random variable Θ with known density gΘ(θ). Then the estimator of τ(θ) with respect to the prior gΘ(θ), defined as E[τ(θ) | X1, X2, …, Xn], is called _________? (see the Bayes-estimator sketch after this list)
A. Posterior Bayes estimator
B. Minimax estimator
C. Bayes estimator
D. Sufficient estimator
- Let Z1, Z2, …, Zn be independent and identically distributed random variables satisfying E[|Zi|] < ∞. Let N be an integer-valued random variable whose value n depends only on the values of the first n Zi's. Suppose E(N) < ∞; then E(Z1 + Z2 + … + ZN) = E(N)E(Z) is called _________? (see the simulation sketch after this list)
A. Independence Equation
B. Sequential Probability Likelihood Equation
C. Neyman Pearson Lemma
D. Wald’s Equation
- If a and b are constants, then Var(a + bX) is equal to _________________? (a numerical check appears after this list)
A. Var(bX) + a
B. b² Var(X)
C. b Var(X)
D. None of these
- Var(kY) = ____________?
A. k Var(Y)
B. k² Var(Y)
C. Var(Y)
D. k²
- For a random variable X, the variance is Var(X) = E[(X − E(X))²] = E[________]?
A. (X − A)²
B. E(X)
C. (X − μ)²
D. (X − 4)²
- If a and b are two constants, then Var(a + bX) is _______________?
A. a ± b Var(X)
B. Var(a) ± Var(X)
C. ± b Var(X)
D. b² Var(X)
E. (a ± b) Var(X)
- Var(2X + 3) is ________________?
A. 5 Var(X)
B. 4 Var(X)
C. 4 Var(X) + 3
D. 5 Var(X) + 3
- If Var(θ̂) → 0 as n → ∞, then θ̂ is said to be ______________? (see the consistency sketch after this list)
A. Sufficient
B. Efficient
C. Unbiased
D. Consistent
- If Var(T2) _______________?
A. Efficient
B. Sufficient
C. Unbiased
D. Consistent
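For the Bayes-estimator question above, the sketch below may help make E[τ(θ) | X1, …, Xn] concrete. It is only an illustration and assumes a Beta prior on a Bernoulli success probability (a standard conjugate pair; the prior parameters, simulated data, and NumPy calls are assumptions, not part of the question): under squared-error loss the Bayes estimator is the posterior mean.

```python
# Illustrative sketch (assumed Beta-Bernoulli setup): with a Beta(a, b) prior on theta
# and Bernoulli(theta) data, the posterior is Beta(a + s, b + n - s), and the Bayes
# estimator of theta under squared-error loss is the posterior mean E[theta | X1..Xn].
import numpy as np

rng = np.random.default_rng(0)

a, b = 2.0, 2.0                            # assumed prior parameters
theta_true = 0.3                           # assumed "true" value, used only to simulate data
x = rng.binomial(1, theta_true, size=50)   # Bernoulli sample X1, ..., Xn

n, s = x.size, x.sum()
posterior_mean = (a + s) / (a + b + n)     # E[theta | X1, ..., Xn]
print("Posterior (Bayes) estimate of theta:", posterior_mean)
```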
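The Wald's-equation question can be checked by simulation. The sketch below is a rough illustration under assumed distributions (a geometric N and exponential Zi, chosen only for convenience); it shows E(Z1 + … + ZN) tracking E(N)E(Z).

```python
# Simulation sketch of Wald's equation, E(Z1 + ... + ZN) = E(N) E(Z),
# with N drawn independently of the Zi (an assumption made for this illustration).
import numpy as np

rng = np.random.default_rng(1)

def random_sum():
    n = rng.geometric(0.2)            # N with E(N) = 1/0.2 = 5
    z = rng.exponential(2.0, size=n)  # i.i.d. Zi with E(Z) = 2
    return z.sum(), n

sums, counts = zip(*(random_sum() for _ in range(100_000)))
print("E(Z1 + ... + ZN) ≈", np.mean(sums))           # close to 10
print("E(N) * E(Z)      =", np.mean(counts) * 2.0)   # Wald's equation gives the same value
```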
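Several of the variance questions (Var(a + bX), Var(kY), Var(2X + 3), and the definition Var(X) = E[(X − μ)²]) rest on the same rule: adding a constant does not change the variance, and scaling by b multiplies it by b². A quick numerical check, using an arbitrary normal sample purely as an assumption, is shown below.

```python
# Numerical check of Var(X) = E[(X - mu)^2] and Var(a + bX) = b^2 Var(X).
# The normal sample is an arbitrary choice; the identities hold for any X with finite variance.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=5.0, scale=3.0, size=200_000)

var_x = np.mean((x - x.mean()) ** 2)          # E[(X - mu)^2], about 9 here
print("Var(X)      ≈", var_x)
print("Var(2X + 3) ≈", np.var(2 * x + 3), "  vs 4·Var(X)  =", 4 * var_x)
print("Var(7X)     ≈", np.var(7 * x),     "  vs 49·Var(X) =", 49 * var_x)
```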
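For the consistency question, the sample mean is the usual example: Var(X̄) = σ²/n, which goes to zero as n → ∞. The distribution, σ², and sample sizes below are assumptions made only for this sketch.

```python
# Sketch of consistency: the variance of the sample mean shrinks as the sample size grows.
import numpy as np

rng = np.random.default_rng(3)
sigma2 = 4.0
for n in (10, 100, 1_000, 10_000):
    means = [rng.normal(0.0, np.sqrt(sigma2), size=n).mean() for _ in range(2_000)]
    print(f"n = {n:6d}   Var(sample mean) ≈ {np.var(means):.5f}   sigma^2 / n = {sigma2 / n:.5f}")
```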