\magnification=\magstep1
\def\hb{{\widehat{\beta}}}
\centerline{Mathematics 376 -- Probability and Statistics II}
\centerline{Problem Set 8}
\centerline{{\it due:} Friday, April 12, 2006}
\bigskip
\noindent
A)  Assume we have the linear statistical model
$$Y = \beta_0 + \beta_1 x + \epsilon,$$
where $\epsilon$ is {\it normally distributed} with mean $\mu = 0$ and
variance $\sigma^2$.  
\item{1)} Given observations $(x_1,y_1),\ldots,(x_n,y_n)$ of 
$Y$, explain why the likelihood of these observations (as a function
of $\beta_0$ and $\beta_1$) is given by 
$$L(\beta_0,\beta_1) = \left({1\over \sqrt{2\pi\sigma^2}}\right)^n 
\exp\left(-{1\over 2\sigma^2}\sum_{i=1}^n (y_i - \beta_0 - \beta_1x_i)^2\right)$$
\item{2)} Find the maximum likelihood estimators for $\beta_0$
and $\beta_1$ by computing the appropriate partial derivatives
of $\ln(L)$, setting them equal to zero, and solving for $\beta_0$ and $\beta_1$.
(You should get the same formulas as we obtained for the 
least squares estimators.)
\item{3)} Show that your answer in part 2 is really a maximum of 
$L$ or $\ln(L)$ using the Second Derivative Test for functions of
two variables.
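(If you want to sanity-check your answer to part 2 numerically, the following
sketch may help.  It is not part of the assignment, and the small data set in
it is made up purely for illustration.  Since maximizing $\ln(L)$ for normal
errors is the same as minimizing the sum of squared residuals, the
least-squares formulas should make both partial derivatives vanish.)

```python
# Sanity check (illustrative data): for normally distributed errors,
# maximizing ln(L) amounts to minimizing the sum of squared residuals,
# so the least-squares estimates should be a critical point of ln(L).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(xs)

Sx = sum(xs)
Sy = sum(ys)
Sxx = sum(x * x for x in xs)
Sxy = sum(x * y for x, y in zip(xs, ys))

# Least-squares (= maximum-likelihood) estimates from the normal equations
b1 = (n * Sxy - Sx * Sy) / (n * Sxx - Sx ** 2)
b0 = (Sy * Sxx - Sx * Sxy) / (n * Sxx - Sx ** 2)

# Up to the constant factor -1/(2 sigma^2), the partials of ln(L) are the
# partials of the sum of squared residuals:
#   d/d b0 : -2 * sum(y_i - b0 - b1*x_i)
#   d/d b1 : -2 * sum(x_i * (y_i - b0 - b1*x_i))
g0 = -2 * sum(y - b0 - b1 * x for x, y in zip(xs, ys))
g1 = -2 * sum(x * (y - b0 - b1 * x) for x, y in zip(xs, ys))

# Both g0 and g1 should be (numerically) zero at (b0, b1).
```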
\bigskip
\noindent
B)  In class, by solving the normal equations using
Cramer's Rule, we obtained the following formulas
(all summations extend from $i = 1$ to $i = n$, so we
omit limits of summation for simplicity):
$$\hb_0 = 
{\left(\Sigma y_i\right)\left(\Sigma x_i^2\right) - 
\left(\Sigma x_i\right)\left(\Sigma x_iy_i\right)\over
n\left(\Sigma x_i^2\right) - \left(\Sigma x_i\right)^2}\leqno(1)$$
and 
$$\hb_1 = 
{n\left(\Sigma x_iy_i\right) - 
\left(\Sigma x_i\right)\left(\Sigma y_i\right)\over
n\left(\Sigma x_i^2\right) - \left(\Sigma x_i\right)^2}\leqno(2)$$
Recall that in class we also introduced the quantities
$$S_{xx} = \sum (x_i - \overline{x})^2\qquad S_{xy} = \sum (x_i - \overline{x})(y_i - \overline{y})$$
Show that (2) is equivalent to 
$$\hb_1 = {S_{xy}\over S_{xx}}$$
and then that (1) is equivalent to 
$$\hb_0 = \overline{y} - \hb_1 \overline{x}$$
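(Before doing the algebra, you may find it convincing to check the claimed
equivalences numerically.  The sketch below, with made-up illustrative data,
computes $\hb_0$ and $\hb_1$ both from formulas (1) and (2) and from the
$S_{xx}$, $S_{xy}$ forms; the two ways should agree on any data set, though
of course this is not a proof.)

```python
# Numerical check (not a proof): the Cramer's-rule formulas (1) and (2)
# should agree with S_xy/S_xx and ybar - b1*xbar on any data set.
xs = [0.5, 1.5, 2.0, 3.5, 4.0]
ys = [1.2, 2.8, 3.1, 5.9, 6.4]
n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n

Sx, Sy = sum(xs), sum(ys)
Sxx_raw = sum(x * x for x in xs)
Sxy_raw = sum(x * y for x, y in zip(xs, ys))

# Formulas (1) and (2), straight from the normal equations
b1_cramer = (n * Sxy_raw - Sx * Sy) / (n * Sxx_raw - Sx ** 2)
b0_cramer = (Sy * Sxx_raw - Sx * Sxy_raw) / (n * Sxx_raw - Sx ** 2)

# Centered sums of squares S_xx and S_xy
S_xx = sum((x - xbar) ** 2 for x in xs)
S_xy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))

b1_centered = S_xy / S_xx
b0_centered = ybar - b1_centered * xbar

# b1_cramer == b1_centered and b0_cramer == b0_centered (up to roundoff)
```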
\bigskip
\noindent
{\it From the text}:  Chapter 11/4, 8, 9, 20, 22, 23, 57.
(Note: parts of problems that call for graphing can be done 
conveniently using Maple -- see the handout from class).
\bye