Consider independent random variables \(X_1\), \(X_2\), and \(X_3\) with

- \(\text{E}[X_1] = 1\), \(\text{Var}[X_1] = 4\)
- \(\text{E}[X_2] = 2\), \(\text{SD}[X_2] = 3\)
- \(\text{E}[X_3] = 3\), \(\text{SD}[X_3] = 5\)

**(a)** Calculate \(\text{E}[5 X_1 + 2]\).

**(b)** Calculate \(\text{E}[4 X_1 + 2 X_2 - 6 X_3]\).

**(c)** Calculate \(\text{Var}[5 X_1 + 2]\).

**(d)** Calculate \(\text{Var}[4 X_1 + 2 X_2 - 6 X_3]\).
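Answers to parts (a)–(d) can be sanity-checked by Monte Carlo. The exercise fixes only moments, so the normal distributions below are an arbitrary stand-in; the linearity rules being verified hold for any distributions with these moments.

```python
import random

random.seed(0)
N = 200_000

# Normals are a stand-in assumption: only the means and SDs matter here.
x1 = [random.gauss(1, 2) for _ in range(N)]   # E = 1, Var = 4, so SD = 2
x2 = [random.gauss(2, 3) for _ in range(N)]   # E = 2, SD = 3
x3 = [random.gauss(3, 5) for _ in range(N)]   # E = 3, SD = 5

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((u - m) ** 2 for u in v) / len(v)

# (a): E[5 X1 + 2] = 5 E[X1] + 2.
ea = mean([5 * u + 2 for u in x1])
# (d): for independent terms, variances add with squared coefficients.
vd = var([4 * u + 2 * v - 6 * w for u, v, w in zip(x1, x2, x3)])
print(round(ea, 2), round(vd, 1))
```

The simulated values should sit close to the algebraic answers, with Monte Carlo noise of a fraction of a percent at this sample size.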

Consider random variables \(H\) and \(Q\) with

- \(\text{E}[H] = 3\), \(\text{Var}[H] = 16\)
- \(\text{SD}[Q] = 4\), \(\text{E}\left[\frac{Q^2}{5}\right] = 3.2\)

**(a)** Calculate \(\text{E}[5H^2 - 10]\).

**(b)** Calculate \(\text{E}[Q]\).
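Both parts turn on the identity \(\text{E}[X^2] = \text{Var}[X] + (\text{E}[X])^2\). A quick simulation check of the identity itself (the normal distribution is an arbitrary stand-in; mean 3 and SD 4 are chosen to match \(H\)):

```python
import random

random.seed(1)
# Draws with mean 3 and SD 4, matching the stated moments of H.
h = [random.gauss(3, 4) for _ in range(200_000)]
m = sum(h) / len(h)
ex2 = sum(x * x for x in h) / len(h)
print(round(ex2, 1))  # should land near Var + mean^2 = 16 + 9 = 25
```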

Consider a random variable \(S\) with probability density function

\[ f(s) = \frac{1}{9000}(2s + 10), \ \ 40 \leq s \leq 100. \]

**(a)** Calculate \(\text{E}[S]\).

**(b)** Calculate \(\text{SD}[S]\).
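The integrals here can be verified numerically. A midpoint-rule sketch in plain Python (purely a check on hand computation; it also confirms the density integrates to 1):

```python
# Midpoint-rule integration; accurate to many digits for this polynomial.
def integrate(g, a, b, steps=60_000):
    h = (b - a) / steps
    return h * sum(g(a + (i + 0.5) * h) for i in range(steps))

def f(s):
    return (2 * s + 10) / 9000          # the stated density on [40, 100]

total = integrate(f, 40, 100)            # should be 1 for a valid pdf
ES = integrate(lambda s: s * f(s), 40, 100)
ES2 = integrate(lambda s: s * s * f(s), 40, 100)
SD = (ES2 - ES ** 2) ** 0.5
print(round(total, 6), round(ES, 2), round(SD, 2))
```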

Consider independent random variables \(X\) and \(Y\) with

- \(X \sim N(\mu_X = 2, \sigma^2_X = 9)\)
- \(Y \sim N(\mu_Y = 5, \sigma^2_Y = 4)\)

**(a)** Calculate \(P[X > 5]\).

**(b)** Calculate \(P[X + 2Y > 5]\).
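Both probabilities reduce to standardizing and evaluating the standard normal CDF, which the standard library exposes through `math.erf`. A sketch:

```python
import math

# Standard normal CDF via the error function.
def Phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# (a) X ~ N(2, 9): standardize with sigma = 3.
p_a = 1 - Phi((5 - 2) / 3)

# (b) X + 2Y is normal with mean 2 + 2*5 and, by independence,
#     variance 9 + 2^2 * 4 (variances add with squared coefficients).
mu, var = 2 + 2 * 5, 9 + 2 ** 2 * 4
p_b = 1 - Phi((5 - mu) / math.sqrt(var))
print(round(p_a, 4), round(p_b, 4))
```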

Consider random variables \(Y_1\), \(Y_2\), and \(Y_3\) with

- \(\text{E}[Y_1] = 1\), \(\text{E}[Y_2] = -2\), \(\text{E}[Y_3] = 3\)
- \(\text{Var}[Y_1] = 4\), \(\text{Var}[Y_2] = 6\), \(\text{Var}[Y_3] = 8\)
- \(\text{Cov}[Y_1, Y_2] = 1\), \(\text{Cov}[Y_1, Y_3] = -1\), \(\text{Cov}[Y_2, Y_3] = 0\)

**(a)** Calculate \(\text{Var}[3Y_1 - 2Y_2]\).

**(b)** Calculate \(\text{Var}[3Y_1 - 4Y_2 + 2Y_3]\).
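Since all the moments are given, both variances are quadratic forms \(\text{Var}[\mathbf{a}^\top \mathbf{Y}] = \sum_{i,j} a_i a_j \text{Cov}[Y_i, Y_j]\) in the covariance matrix, which can be evaluated directly as a check:

```python
# Covariance matrix assembled from the stated variances and covariances.
Sigma = [[ 4, 1, -1],
         [ 1, 6,  0],
         [-1, 0,  8]]

def quad_form(a, S):
    # sum_i sum_j a_i a_j S[i][j]
    return sum(a[i] * a[j] * S[i][j] for i in range(3) for j in range(3))

# (a) coefficients (3, -2, 0); (b) coefficients (3, -4, 2).
print(quad_form([3, -2, 0], Sigma), quad_form([3, -4, 2], Sigma))  # → 48 128
```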

Consider using \(\hat{\xi}\) to estimate \(\xi\).

**(a)** If \(\text{Bias}[\hat{\xi}] = 5\) and \(\text{Var}[\hat{\xi}] = 4\), calculate \(\text{MSE}[\hat{\xi}]\).

**(b)** If \(\hat{\xi}\) is unbiased, \(\xi = 6\), and \(\text{MSE}[\hat{\xi}] = 30\), calculate \(\text{E}\left[\hat{\xi}^2\right]\).
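Both parts are direct applications of \(\text{MSE} = \text{Var} + \text{Bias}^2\) and \(\text{E}[X^2] = \text{Var}[X] + (\text{E}[X])^2\); a quick arithmetic transcription to check hand work:

```python
# (a): MSE = Var + Bias^2.
bias, var = 5, 4
mse_a = var + bias ** 2

# (b): unbiased means Bias = 0, so Var = MSE and E[xi-hat] = xi;
#      then E[xi-hat^2] = Var + (E[xi-hat])^2.
xi, mse = 6, 30
e_sq_b = mse + xi ** 2
print(mse_a, e_sq_b)  # → 29 66
```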

Using the identity

\[ (\hat{\theta}-\theta) = \left(\hat{\theta}-\text{E}[\hat{\theta}]\right) + \left(\text{E}[\hat{\theta}] - \theta\right) = \left(\hat{\theta} - \text{E}[\hat{\theta}]\right) + \text{Bias}[\hat{\theta}] \]

show that

\[ \text{MSE}[\hat{\theta}] = \text{E}\left[(\hat{\theta} - \theta)^2\right] = \text{Var}[\hat{\theta}] + \left(\text{Bias}[\hat{\theta}]\right)^2 \]
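The identity can also be illustrated numerically (this is not the requested algebraic proof, just a check): simulate a deliberately biased estimator and compare the empirical mean squared error against variance plus squared bias.

```python
import random

random.seed(2)
theta = 10.0
# Hypothetical estimator for illustration: sample mean of 5 draws,
# shifted by a constant 1.0 so that it has bias about 1.
draws = [sum(random.gauss(theta, 3) for _ in range(5)) / 5 + 1.0
         for _ in range(100_000)]
m = sum(draws) / len(draws)
var = sum((d - m) ** 2 for d in draws) / len(draws)
bias = m - theta
mse = sum((d - theta) ** 2 for d in draws) / len(draws)
print(round(mse, 3), round(var + bias ** 2, 3))  # the two agree
```

The sample version of the decomposition holds exactly, since it is the same algebraic expansion applied to the empirical distribution.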

Let \(X_1, X_2, \ldots, X_n\) denote a random sample from a population with mean \(\mu\) and variance \(\sigma^2\).

Consider three estimators of \(\mu\):

\[ \hat{\mu}_1 = \frac{X_1 + X_2 + X_3}{3}, ~~~\hat{\mu}_2 = \frac{X_1}{4} + \frac{X_2 + \cdots + X_{n - 1}}{2(n - 2)} + \frac{X_n}{4}, ~~~\hat{\mu}_3 = \bar{X}. \]

Calculate the mean squared error for each estimator. (It will be useful to first calculate their biases and variances.)
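Algebraic answers can be sanity-checked by simulation. The choices of \(n\), \(\mu\), and \(\sigma\) below are arbitrary assumptions (the exercise leaves them general), and the normal distribution is a stand-in:

```python
import random

random.seed(3)
# Illustrative choices only -- plug your formulas in with these values.
n, mu, sigma = 10, 5.0, 2.0
reps = 50_000
se1 = se2 = se3 = 0.0
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    m1 = (x[0] + x[1] + x[2]) / 3
    m2 = x[0] / 4 + sum(x[1:n - 1]) / (2 * (n - 2)) + x[n - 1] / 4
    m3 = sum(x) / n
    se1 += (m1 - mu) ** 2
    se2 += (m2 - mu) ** 2
    se3 += (m3 - mu) ** 2
print(round(se1 / reps, 3), round(se2 / reps, 3), round(se3 / reps, 3))
```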

Let \(X_1, X_2, \ldots, X_n\) denote a random sample from a distribution with density

\[ f(x) = \frac{1}{\theta}e^{-x/\theta}, ~~x > 0, \theta > 0 \]

Consider five estimators of \(\theta\):

\[ \hat{\theta}_1 = X_1, ~~~\hat{\theta}_2 = \frac{X_1 + X_2}{2}, ~~~\hat{\theta}_3 = \frac{X_1 + 2X_2}{3}, ~~~\hat{\theta}_4 = \bar{X}, ~~~\hat{\theta}_5 = 5 \]

Calculate the mean squared error for each estimator. (It will be useful to first calculate their biases and variances.)
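As above, a simulation can check the algebra. The values of \(\theta\) and \(n\) are arbitrary illustrative choices; the stated density is exponential with mean \(\theta\), which `random.expovariate(1/theta)` draws from:

```python
import random

random.seed(4)
# theta and n are assumptions for illustration only.
theta, n, reps = 2.0, 8, 50_000
mse = [0.0] * 5
for _ in range(reps):
    x = [random.expovariate(1 / theta) for _ in range(n)]
    ests = [x[0],                      # theta-hat 1
            (x[0] + x[1]) / 2,         # theta-hat 2
            (x[0] + 2 * x[1]) / 3,     # theta-hat 3
            sum(x) / n,                # theta-hat 4
            5.0]                       # theta-hat 5 (a constant)
    for i, t in enumerate(ests):
        mse[i] += (t - theta) ** 2
print([round(m / reps, 3) for m in mse])
```

Note that the constant estimator's MSE is pure squared bias and depends entirely on how far 5 is from the chosen \(\theta\).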

Suppose that \(\text{E}\left[\hat{\theta}_1\right] = \text{E}\left[\hat{\theta}_2\right] = \theta\), \(\text{Var}\left[\hat{\theta}_1\right] = \sigma_1^2\), \(\text{Var}\left[\hat{\theta}_2\right] = \sigma_2^2\), and \(\text{Cov}\left[\hat{\theta}_1, \hat{\theta}_2\right] = \sigma_{12}\). Consider the unbiased estimator

\[ \hat{\theta}_3 = a\hat{\theta}_1 + (1-a)\hat{\theta}_2. \]

If \(\hat{\theta}_1\) and \(\hat{\theta}_2\) are independent, what value should be chosen for the constant \(a\) in order to minimize the variance, and thus the mean squared error, of \(\hat{\theta}_3\) as an estimator of \(\theta\)?
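A grid search can confirm a candidate answer. The variance values below are arbitrary assumptions; under independence, \(\text{Var}[\hat{\theta}_3] = a^2\sigma_1^2 + (1-a)^2\sigma_2^2\):

```python
# Illustrative sigma_1^2 and sigma_2^2 -- not given in the exercise.
s1, s2 = 9.0, 4.0

def var3(a):
    # Var of the combination under independence (covariance term is zero).
    return a * a * s1 + (1 - a) ** 2 * s2

a_grid = [i / 10000 for i in range(10001)]
a_best = min(a_grid, key=var3)
print(round(a_best, 4))
```

Whatever closed form you derive for \(a\) should match the grid minimizer for any positive \(\sigma_1^2\), \(\sigma_2^2\) you substitute.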

Let \(Y\) have a binomial distribution with parameters \(n\) and \(p\). Consider two estimators for \(p\):

\[ \hat{p}_1 = \frac{Y}{n} \]

and

\[ \hat{p}_2 = \frac{Y + 1}{n + 2} \]

For what values of \(p\) does \(\hat{p}_2\) achieve a lower mean squared error than \(\hat{p}_1\)?
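The comparison can be explored numerically for a fixed \(n\) (the exercise leaves \(n\) general; \(n = 10\) below is an assumption). Both MSEs have closed forms: \(\hat{p}_1\) is unbiased with variance \(p(1-p)/n\), while \(\hat{p}_2\) has bias \((1-2p)/(n+2)\) and variance \(np(1-p)/(n+2)^2\).

```python
n = 10  # illustrative choice; the crossover points depend on n

def mse1(p):
    return p * (1 - p) / n

def mse2(p):
    # Var + Bias^2 for p2-hat = (Y + 1)/(n + 2).
    return (n * p * (1 - p) + (1 - 2 * p) ** 2) / (n + 2) ** 2

crossover = [round(k / 1000, 3) for k in range(1001)
             if mse2(k / 1000) < mse1(k / 1000)]
print(crossover[0], crossover[-1])  # p-range where p2-hat wins, for this n
```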

Let \(X_1, X_2, \ldots, X_n\) denote a random sample from a population with mean \(\mu\) and variance \(\sigma^2\).

Create an unbiased estimator for \(\mu^2\). Hint: Start with \(\bar{X}^2\).
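Following the hint, a simulation shows why \(\bar{X}^2\) alone does not work: its expectation overshoots \(\mu^2\). The values of \(\mu\), \(\sigma\), and \(n\) are arbitrary illustrative choices.

```python
import random

random.seed(5)
# Illustrative parameters; the bias pattern holds for any population.
mu, sigma, n, reps = 3.0, 2.0, 5, 100_000
acc = 0.0
for _ in range(reps):
    xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    acc += xbar ** 2
print(round(acc / reps, 2))  # near mu^2 + sigma^2/n, not mu^2
```

The size of the overshoot suggests what correction term an unbiased estimator needs.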

Let \(X_1, X_2, X_3, \ldots, X_n\) be iid random variables from \(\text{U}(\theta, \theta + 2)\). (That is, a uniform distribution with a minimum of \(\theta\) and a maximum of \(\theta + 2\).)

Consider the estimator

\[ \hat{\theta} = \frac{1}{n}\sum_{i = 1}^{n}X_i = \bar{X} \]

**(a)** Calculate the **bias** of \(\hat{\theta}\) when estimating \(\theta\).

**(b)** Calculate the **variance** of \(\hat{\theta}\).

**(c)** Calculate the **mean squared error** of \(\hat{\theta}\) when estimating \(\theta\).
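A simulation check for all three parts; \(\theta\) and \(n\) below are arbitrary assumptions (your algebraic answers should depend on them only through \(n\), if at all):

```python
import random

random.seed(6)
# Illustrative theta and n.
theta, n, reps = 4.0, 12, 50_000
ests = [sum(random.uniform(theta, theta + 2) for _ in range(n)) / n
        for _ in range(reps)]
m = sum(ests) / reps
bias = m - theta                                    # part (a)
var = sum((e - m) ** 2 for e in ests) / reps        # part (b)
mse = sum((e - theta) ** 2 for e in ests) / reps    # part (c)
print(round(bias, 3), round(var, 4), round(mse, 3))
```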