
Statistical Inference Welcome Tutorial :-)
Tutorial 1
Lecturer: Ming Gao, DaSE @ ECNU
May 5, 2018
mgao@dase.ecnu.edu.cn
The Foundations of Logic and Proofs
Tutorial 1

1. Prove each of the following statements:
   a. If P(B) = 1, then P(A | B) = P(A) for any A;
   b. If A ⊂ B, then P(B | A) = 1 and P(A | B) = P(A)/P(B);
   c. If A and B are mutually exclusive, then P(A | A ∪ B) = P(A)/(P(A) + P(B));
   d. P(A ∩ B ∩ C) = P(A | B ∩ C) P(B | C) P(C).

2. A certain river floods every year. Suppose that the low-water mark is set at 1 and the high-water mark Y has distribution function
   F_Y(y) = P(Y ≤ y) = 1 − 1/y², 1 ≤ y < ∞.
   a. Verify that F_Y(y) is a cdf;
   b. Find f_Y(y), the pdf of Y.
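A quick numeric sanity check for Problem 2 (the check itself, and the derivative f_Y(y) = 2/y³, are my own additions, not part of the problem sheet): F_Y should start at 0, increase to 1, and its derivative should integrate to 1.

```python
# Sanity check for Problem 2: F_Y(y) = 1 - 1/y^2 on [1, inf) and its
# derivative f_Y(y) = 2/y^3 (derived by hand, not given on the slide).

def F(y):
    return 1.0 - 1.0 / y**2

def f(y):
    return 2.0 / y**3

# F is a cdf: F(1) = 0 and F(y) -> 1 as y -> infinity.
assert F(1.0) == 0.0
assert F(1e6) > 1.0 - 1e-9

# Midpoint-rule integral of f over [1, 1000]; the mass beyond 1000
# is only 1/1000^2 = 1e-6, so the result should be very close to 1.
n, a, b = 200_000, 1.0, 1000.0
h = (b - a) / n
total = sum(f(a + (i + 0.5) * h) for i in range(n)) * h
print(round(total, 4))  # ≈ 1.0
```
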
Tutorial 1 Cont'd

3. Let X have the pdf
   f(x) = (1/2)(1 + x), −1 < x < 1.
   a. Find the pdf of Y = X²;
   b. Find E(Y) and Var(Y);
   c. Find E(aY + b) and Var(aY + b), where a and b are constants.

4. In each of the following, find the pdf of Y. Show that the pdf integrates to 1.
   a. Y = X³ and f_X(x) = 42x⁵(1 − x), 0 < x < 1;
   b. Y = 4X + 3 and f_X(x) = 7e^(−7x), 0 < x < ∞;
   c. Y = X² and f_X(x) = 30x²(1 − x)², 0 < x < 1;
   d. Y = X² and f_X(x) = 1, 0 < x < 1.
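As an illustration of the "integrates to 1" check asked for in Problem 4, here is a numeric sketch for part a (the closed form f_Y below is my own change-of-variables derivation, not given on the slide): with Y = X³, x = y^(1/3) and dx/dy = (1/3)y^(−2/3) give f_Y(y) = 14y(1 − y^(1/3)) on (0, 1).

```python
# Numeric check for Problem 4a: f_Y(y) = 14*y*(1 - y^(1/3)) on (0, 1)
# (hand-derived via change of variables) should integrate to 1.

def f_Y(y):
    return 14.0 * y * (1.0 - y ** (1.0 / 3.0))

# Midpoint-rule integral over (0, 1).
n = 100_000
h = 1.0 / n
total = sum(f_Y((i + 0.5) * h) for i in range(n)) * h
print(round(total, 4))  # → 1.0
```

The same grid check works for parts b–d once the corresponding pdfs are derived.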
Tutorial 1 Cont'd

5. For each of the following families, verify that it is an exponential family:
   a. N(µ, σ²);
   b. N(θ, aθ), a known;
   c. f(x | θ) = C · exp(−(x − θ)⁴), C a normalizing constant.

6. Calculate P(|X − µ_X| ≥ kσ_X) for X ~ uniform(0, 1) and X ~ exponential(λ), and compare your answers to the bounds from Chebychev's inequality and the Chernoff bound.

7. A pdf is defined by
   f(x, y) = C(x + 2y), if 0 < y < 1 and 0 < x < 2; 0, otherwise.
   a. Find the value of C;
   b. Find the marginal distribution of X;
   c. Find the joint cdf of X and Y;
   d. Find the pdf of the r.v. Z = 9/(X + 1)².
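For the uniform(0, 1) half of Problem 6, the exact tail probability and the Chebychev bound can be tabulated directly (a hedged worked example, not the full solution: the closed form 1 − 2kσ for kσ ≤ 1/2 is my own computation, and the exponential and Chernoff cases are left to the exercise).

```python
import math

# Problem 6, uniform(0,1) case: mu = 1/2, sigma = 1/sqrt(12).
# For k*sigma <= 1/2 the exact tail is P(|X - 1/2| >= k*sigma)
# = 1 - 2*k*sigma, while Chebychev only guarantees <= 1/k^2.

sigma = 1.0 / math.sqrt(12.0)
for k in (1.0, 1.5, math.sqrt(3.0)):
    exact = max(0.0, 1.0 - 2.0 * k * sigma)
    chebychev = 1.0 / k**2
    assert exact <= chebychev   # the bound always holds
    print(f"k={k:.3f}  exact={exact:.4f}  Chebychev bound={chebychev:.4f}")
```

Note how loose the bound is: at k = √3 the exact probability is already 0 while Chebychev still allows 1/3.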
Tutorial 1 Cont'd

8. a. Find P(X > √Y) if X and Y are jointly distributed with pdf f(x, y) = x + y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1.
   b. Find P(X² < Y < X) if X and Y are jointly distributed with pdf f(x, y) = 2x, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1.

9. Prove that if the joint cdf of X and Y satisfies F_{X,Y}(x, y) = F_X(x) F_Y(y), then for any pair of intervals (a, b) and (c, d),
   P(a ≤ X ≤ b, c ≤ Y ≤ d) = P(a ≤ X ≤ b) P(c ≤ Y ≤ d).

10. Let X ~ N(µ, σ²) and Y ~ N(γ, σ²), and suppose X and Y are independent. Define U = X + Y and V = X − Y. Show that U and V are independent normal r.v.s, and find the distribution of each.
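A numeric cross-check for Problem 8a (this assumes my own analytic answer P(X > √Y) = 7/20; the slides only pose the question). The event X > √Y is the region y < x² of the unit square, so a midpoint double sum of f(x, y) = x + y over that region should approach 7/20.

```python
# Grid integration of f(x, y) = x + y over {y < x^2} in the unit square.
n = 1000
h = 1.0 / n
prob = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if y < x * x:            # X > sqrt(Y)  <=>  Y < X^2
            prob += (x + y) * h * h
print(round(prob, 2))  # → 0.35, i.e. 7/20
```
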
Tutorial 1 Cont'd

11. Let X and Y be independent r.v.s with means µ_X, µ_Y and variances σ_X², σ_Y². Find an expression for the correlation of XY and Y in terms of these means and variances.

12. Let X₁, X₂, and X₃ be uncorrelated r.v.s, each with mean µ and variance σ². Find Cov(X₁ + X₂, X₂ + X₃) and Cov(X₁ + X₂, X₁ − X₂).

13. Let X₁, …, Xₙ be i.i.d. r.v.s with continuous cdf F_X, and suppose E(X_i) = µ. Define the r.v.s
   Y_i = 1, if X_i > µ; 0, otherwise.
   Find E(Y_i), Var(Y_i), and the distribution of ∑_{i=1}^n Y_i.
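Problem 13 can be made concrete with a simulation (my choice of example distribution, not part of the problem): with X_i ~ exponential(1), µ = 1 and p = P(X_i > µ) = e⁻¹, so each Y_i is Bernoulli(p) and ∑Y_i is Binomial(n, p) with mean np.

```python
import math
import random

# Monte Carlo illustration of Problem 13 with X_i ~ exponential(1).
random.seed(0)
n, reps = 20, 50_000
p = math.exp(-1.0)          # P(X_i > 1) for the exponential(1) cdf

counts = []
for _ in range(reps):
    s = sum(1 for _ in range(n) if random.expovariate(1.0) > 1.0)
    counts.append(s)

mean_s = sum(counts) / reps
print(round(mean_s, 2), round(n * p, 2))  # empirical mean vs. n*p ≈ 7.36
```
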
Tutorial 1 Cont'd

14. Establish the following recursion relations for means and variances. Let X̄_n and S_n² be the mean and variance, respectively, of X₁, …, Xₙ, and suppose another observation X_{n+1} becomes available. Show that
   a. X̄_{n+1} = (X_{n+1} + n X̄_n)/(n + 1);
   b. n S²_{n+1} = (n − 1) S_n² + (n/(n + 1))(X_{n+1} − X̄_n)².

15. Let X₁, …, Xₙ be a random sample from a population with pdf
   f_X(x) = 1/θ, if 0 < x < θ; 0, otherwise.
   Let X_(1) < ⋯ < X_(n) be the order statistics. Show that X_(1)/X_(n) and X_(n) are independent r.v.s.

16. If X̄₁ and X̄₂ are the means of two independent samples of size n from a population with variance σ², find a value for n so that
   P(|X̄₁ − X̄₂| < σ/5) ≈ 0.99.
   Justify your calculations.
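The two recursions in Problem 14 are easy to cross-check on random data before proving them (a sanity sketch, not a proof):

```python
import random

# Check the Problem 14 recursions on one random sample.  xbar and
# svar are the sample mean and the (n-1)-denominator sample variance.
random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(11)]
n = 10
head, x_next = xs[:n], xs[n]

def mean(v):
    return sum(v) / len(v)

def svar(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / (len(v) - 1)

xbar_n = mean(head)

# a. xbar_{n+1} = (x_{n+1} + n * xbar_n) / (n + 1)
lhs_a = mean(xs)
rhs_a = (x_next + n * xbar_n) / (n + 1)
assert abs(lhs_a - rhs_a) < 1e-9

# b. n * S_{n+1}^2 = (n - 1) * S_n^2 + (n/(n+1)) * (x_{n+1} - xbar_n)^2
lhs_b = n * svar(xs)
rhs_b = (n - 1) * svar(head) + (n / (n + 1)) * (x_next - xbar_n) ** 2
assert abs(lhs_b - rhs_b) < 1e-9

print("recursions hold on this sample")
```
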
Tutorial 1 Cont'd

17. Let X₁, …, Xₙ be a random sample from a population with mean µ and variance σ². Show that
   E(√n(X̄_n − µ)/σ) = 0 and Var(√n(X̄_n − µ)/σ) = 1.
   Thus, the normalization of X̄_n in the central limit theorem gives r.v.s that have the same mean and variance as the limiting N(0, 1) distribution.

18. Let X₁, …, Xₙ be a random sample from the pdf
   f(x | µ, σ) = (1/σ) e^(−(x−µ)/σ), µ < x < ∞, 0 < σ < ∞.
   Find a two-dimensional sufficient statistic for (µ, σ).

19. For each of the following distributions, let X₁, …, Xₙ be a random sample. Find a minimal sufficient statistic for θ:
   a. f(x | θ) = (1/√(2π)) e^(−(x−θ)²/2), x ∈ ℝ, θ ∈ ℝ;
   b. f(x | θ) = e^(−(x−θ)), θ < x < ∞, θ ∈ ℝ.
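The point of Problem 17 can be seen by simulation even at tiny n (my choice of exponential(1) data, where µ = σ = 1): the standardized mean has mean ≈ 0 and variance ≈ 1 for every n, well before the normal limit kicks in.

```python
import math
import random

# Simulate the standardized mean sqrt(n)*(Xbar - mu)/sigma for
# exponential(1) data (mu = sigma = 1) at n = 5.
random.seed(3)
n, reps = 5, 100_000
zs = []
for _ in range(reps):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    zs.append(math.sqrt(n) * (xbar - 1.0) / 1.0)

m = sum(zs) / reps
v = sum((z - m) ** 2 for z in zs) / (reps - 1)
print(round(m, 2), round(v, 2))  # ~0.0 and ~1.0
```

Only the mean and variance match at finite n; the shape of the distribution is still visibly skewed here, which is exactly the distinction the problem draws.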
Tutorial 1 Cont'd

20. One observation, X, is taken from a N(0, σ²) population.
   a. Find an unbiased estimator of σ², and justify your answer;
   b. Find the MLE of σ.

21. Let X₁, …, Xₙ be a random sample from a population with pmf
   P_θ(X = x) = θ^x (1 − θ)^(1−x), x = 0 or 1, 0 ≤ θ ≤ 1/2.
   a. Find the method of moments estimator and the MLE of θ;
   b. Find the mean squared error of each estimator;
   c. Which estimator is preferred? Justify your choice.

22. Let X₁, …, Xₙ be i.i.d. Poisson(λ), and let λ have a gamma(α, β) distribution, the conjugate family for the Poisson.
   a. Find the posterior distribution of λ;
   b. Calculate the posterior mean and variance.
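A grid check of the Problem 22 conjugacy (this assumes the scale parametrization of gamma(α, β), with mean αβ; the slides do not fix the parametrization, and the posterior gamma(α + ∑x, β/(nβ + 1)) is my own derivation): computing the posterior mean by brute-force integration of prior × likelihood should reproduce the closed form.

```python
import math

# Brute-force check of the Poisson-gamma posterior mean against the
# conjugate formula (alpha + sum(x)) * beta / (n*beta + 1).
alpha, beta = 2.0, 3.0
data = [1, 0, 2, 4, 1]                  # hypothetical Poisson counts
n, s = len(data), sum(data)

def unnorm_post(lam):                   # prior * likelihood, up to constants
    return lam ** (alpha + s - 1) * math.exp(-lam * (n + 1.0 / beta))

# Posterior mean by midpoint-rule integration on [0, 40] (the
# posterior mass beyond 40 is negligible for these values).
m, hi = 200_000, 40.0
h = hi / m
z = sum(unnorm_post((i + 0.5) * h) for i in range(m)) * h
mean_grid = sum((i + 0.5) * h * unnorm_post((i + 0.5) * h)
                for i in range(m)) * h / z

mean_formula = (alpha + s) * beta / (n * beta + 1.0)
print(round(mean_grid, 4), round(mean_formula, 4))  # both → 1.875
```
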
Tutorial 1 Cont'd

23. Let X₁, …, Xₙ be i.i.d. Bernoulli(p). Show that the variance of X̄ attains the Cramér–Rao lower bound, and hence X̄ is the best unbiased estimator of p.

24. Let X₁, …, Xₙ be a random sample from a population with mean µ and variance σ².
   a. Show that the estimator ∑_{i=1}^n a_i X_i is an unbiased estimator of µ if ∑_{i=1}^n a_i = 1;
   b. Among all unbiased estimators of this form, find the one with minimum variance, and calculate that variance.
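A Monte Carlo illustration of Problem 23 (a sketch under my own statement of the bound, not a substitute for the proof): the Fisher information of a single Bernoulli(p) observation is I(p) = 1/(p(1−p)), so the Cramér–Rao bound for n observations is p(1−p)/n, and the simulated variance of X̄ should sit right at it.

```python
import random

# Simulated Var(Xbar) for Bernoulli(p) vs. the CRLB p(1-p)/n.
random.seed(2)
p, n, reps = 0.3, 25, 40_000
bound = p * (1.0 - p) / n               # CRLB = p(1-p)/n

means = []
for _ in range(reps):
    xbar = sum(1 for _ in range(n) if random.random() < p) / n
    means.append(xbar)

grand = sum(means) / reps
var_hat = sum((m - grand) ** 2 for m in means) / (reps - 1)
print(round(var_hat, 4), round(bound, 4))  # both close to 0.0084
```
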