\documentclass[11pt]{article}
\def\Z{\mathbb{Z}}
\def\Q{\mathbb{Q}}
\def\R{\mathbb{R}}
\def\C{\mathbb{C}}
\usepackage{latexsym,amsmath,amssymb}
\usepackage[all]{xy}
\setlength{\textwidth}{6.5in}
\setlength{\oddsidemargin}{0in}
\setlength{\textheight}{8.0in}
\begin{document}
\centerline{Mathematics 375 -- Probability and Statistics 1}
\centerline{Solutions for Problem Set 5}
\centerline{October 22, 2009}
\vskip 10pt
\noindent
3.39.  (a)  $Y = $ the number of components
that last longer than 1000 hours has a binomial
distribution.   $P(Y = 2) = \binom{4}{2}(.8)^2(.2)^2 \doteq .1536$.

(b)  The subsystem will operate if $2,3$, or $4$ of the 
components last longer than 1000 hours.  So this 
probability is 
$P(Y\ge 2) = \binom{4}{2}(.8)^2(.2)^2 + \binom{4}{3}(.8)^3(.2) + \binom{4}{4}(.8)^4 \doteq .973$.  
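As a quick sanity check (not part of the handout's required work), both binomial values can be reproduced with a few lines of Python using only the standard library:

```python
from math import comb

# Binomial pmf: P(Y = y) for n independent trials with success probability p.
def binom_pmf(y, n, p):
    return comb(n, y) * p**y * (1 - p)**(n - y)

p_a = binom_pmf(2, 4, 0.8)                            # part (a): P(Y = 2)
p_b = sum(binom_pmf(y, 4, 0.8) for y in range(2, 5))  # part (b): P(Y >= 2)
print(round(p_a, 4), round(p_b, 4))  # 0.1536 0.9728
```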
\vskip 10pt 
\noindent
3.40.  These are all binomial probabilities. Let $Y$ be the number
who recover out of the 20.
(a)  $P(Y = 14) = \binom{20}{14} (.8)^{14}(.2)^6 \doteq .11$.

(b)  $P(Y \ge 10) = \displaystyle{\sum_{y = 10}^{20} \binom{20}{y}(.8)^y(.2)^{20-y}}\doteq 
.9994$.  

(c)  $P(14 \le Y \le 18) = 
\displaystyle{\sum_{y=14}^{18} \binom{20}{y}(.8)^y(.2)^{20-y}}\doteq .844$.

(d) $P(0 \le Y \le 16) = 
\displaystyle{\sum_{y=0}^{16} \binom{20}{y}(.8)^y(.2)^{20-y}}\doteq .589$.
\vskip 10pt
\noindent
3.43.  Let $Y$ be the number who qualify for the favorable rates (out of the 5),
a binomial random variable. (a) $P(Y = 5) = (.7)^5 \doteq .168$.

(b)  $P(Y \ge 4) = 
\displaystyle{\sum_{y=4}^5 \binom{5}{y} (.7)^y (.3)^{5-y}} \doteq .528$.  
\vskip 10pt
\noindent
3.44.  Let $Y$ be the number of successful operations (binomial).
(a)  $P(Y = 5) = (.8)^5 \doteq .328$.

(b)  $P(Y = 4) = \binom{5}{4}(.6)^4(.4)\doteq .259.$

(c)  $P(Y < 2) = 
\displaystyle{\sum_{y=0}^1 \binom{5}{y} (.3)^y (.7)^{5-y}} \doteq .528$.
(Note that this is the same as the answer for 3.43 (b).  Do you see why?  For
a formal statement of the pattern here, see Exercise 3.54.)
\vskip 10pt
\noindent
3.53.  $Y = $ the number of children (out of 3) that develop 
Tay-Sachs has a binomial distribution.  (a)  $P(Y = 3) = (.25)^3 \doteq .0156$.

(b)  $P(Y = 1) = \binom{3}{1}(.25)(.75)^2 \doteq .422$.

(c)  This conditional probability is $\frac{(.75)^2(.25)}{(.75)^2} = .25$.
(This follows because of the independence.)  
\vskip 10pt
\noindent
3.59.  Let $Y$ be the number of good motors, so $10 - Y$
is the number of defectives.  The problem is asking for the 
expected value of $G = 1000 - 200(10 - Y) = 200Y - 1000$.  
Since $Y$ is binomial with $n = 10$ and $p = .92$, we have
$E(G) = E(200Y - 1000) = 200E(Y) - 1000 = (200)(9.2) - 1000 = 840$
(dollars).
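As a check on the linearity computation (a sketch, not part of the solution itself), one can average $G$ directly over the binomial pmf in Python:

```python
from math import comb

# E(G) for G = 200*Y - 1000 with Y ~ Binomial(n = 10, p = .92),
# computed directly from the pmf rather than via linearity.
n, p = 10, 0.92
EG = sum((200 * y - 1000) * comb(n, y) * p**y * (1 - p)**(n - y)
         for y in range(n + 1))
print(round(EG, 6))  # 840.0
```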
\vskip 10pt
\noindent
3.62.  (a)  To justify multiplying $p_1p_2p_3$ to get the probability
of detecting a wing crack, you need to assume the individual events
are independent.

(b)  $p = (.9)(.8)(.5) = .36$.  Then the probability desired is
$P(Y \ge 1)= 
\displaystyle{\sum_{y=1}^3 \binom{3}{y} (.36)^y (.64)^{3-y}} \doteq .738$.
\vskip 10pt
\noindent
3.67.  This is geometric with $p = .3$.  So $P(Y = 5) = (.7)^4(.3) \doteq .072$.
\vskip 10pt
\noindent
3.69.  $Y$ is a geometric random variable with $p = .59$, so 
$P(Y = y) = (.41)^{y-1} (.59)$.  
\vskip 10pt
\noindent
3.71. (a)  For a geometric random variable $Y$, 
$$P(Y > a) = \sum_{y=a+1}^\infty q^{y-1} p = \frac{pq^a}{1 - q} = q^a$$
(where we used the geometric series sum formula -- this is a 
geometric series with first term $q^ap$ and ratio $q$).  

(b) Using the definition of conditional probability
and part a),
$$P(Y> a + b| Y > a) = \frac{P(Y>a+b)}{P(Y > a)} = \frac{q^{a+b}}{q^a} = q^b.$$
(Note $Y>a+b$ implies $Y > a$, so the event $(Y > a+b) \cap (Y > a)$
is the same as $Y > a + b$.)

(c)  This is the same as $P(Y > b)$.  In other words,
for geometric random variables, the probability that
$Y > a + b$ given that $Y > a$ (i.e. the probability that 
the first success occurs after trial $a + b$, given that
the first success occurs after trial $a$) is the same
as the probability that the first success occurs after trial $b$.
Geometric random variables thus are ``memoryless'' in 
the sense that starting from the $a$-th trial (with no previous
successes), the probability
that the first success occurs more than $b$ trials later is the same 
as if we started from the beginning with the $0$th trial.  
The previous non-successes don't affect the probability 
of later successes -- no ``memory''.
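The memoryless property can also be checked numerically. The Python sketch below uses illustrative values $p = .3$, $a = 4$, $b = 3$ (these particular numbers are not from the problem) and compares the conditional probability to $q^b$:

```python
# Memorylessness check for a geometric random variable with p = 0.3.
p = 0.3
q = 1 - p
a, b = 4, 3

def tail(k, terms=2000):
    # P(Y > k), summed directly from the pmf q^(y-1) * p (truncated series;
    # the tail beyond 'terms' is negligible for q = 0.7).
    return sum(q**(y - 1) * p for y in range(k + 1, terms))

cond = tail(a + b) / tail(a)           # P(Y > a+b | Y > a)
print(round(cond, 6), round(q**b, 6))  # both 0.343
```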
\vskip 10pt
\noindent
%3.104.  We are given that the four packets selected first 
%do contain cocaine.
%So after those are removed, the remaining 16 packets
%consist of $11$ with cocaine and $5$ without.  The probability that
%the two sold to the buyer do not contain cocaine is
%$\frac{5}{16}\frac{4}{15} = \frac{1}{12}$.  Note:  This can also be seen as
%a hypergeometric probability with $r = 5$, $y = 2$, $N = 16$, $n = 2$:
%$$
%\frac{\binom{r}{y} \binom{N-r}{n-y}}{\binom{N}{n}} = 
%\frac{\binom{5}{2} \binom{11}{0}}{\binom{16}{2}} = 
%\frac{10}{120} = \frac{1}{12}.
%$$
%\noindent
3.105.  (a)  $Y$ is hypergeometric because the teachers
are selected \emph{without replacement} from the finite pool of $8$ candidates.

(b) The number of internship candidates chosen is hypergeometric with
$N = 8$, $r = 5$ (the internship candidates), and $n = 3$ (the chosen ones).
So the desired probability is 
$$P(Y = 2) + P(Y = 3) = \frac{\binom{5}{2}\binom{3}{1}}{\binom{8}{3}} + 
\frac{\binom{5}{3}\binom{3}{0}}{\binom{8}{3}} \doteq .536 + .179 \doteq .714.$$
(c) Using the formulas from Theorem 3.10,
$\mu = \frac{nr}{N} = \frac{15}{8} = 1.875$, and 
$$
\sigma^2 = n \cdot \frac{r}{N}\cdot\frac{N-r}{N}\cdot\frac{N-n}{N-1} 
= 3 \cdot \frac{5}{8}\cdot\frac{3}{8}\cdot\frac{5}{7} = \frac{225}{448}.
$$
So the standard deviation is 
$\sigma = \sqrt{\frac{225}{448}} \doteq .709$.
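These hypergeometric values can be double-checked with a short Python sketch that computes the pmf, mean, and variance directly from their definitions:

```python
from math import comb, sqrt

# Hypergeometric check: N = 8 candidates, r = 5 internship candidates,
# n = 3 chosen without replacement.
N, r, n = 8, 5, 3

def hyper_pmf(y):
    return comb(r, y) * comb(N - r, n - y) / comb(N, n)

prob = hyper_pmf(2) + hyper_pmf(3)                         # part (b)
mu = sum(y * hyper_pmf(y) for y in range(n + 1))           # part (c): mean
var = sum((y - mu)**2 * hyper_pmf(y) for y in range(n + 1))
print(round(prob, 3), round(mu, 3), round(sqrt(var), 3))   # 0.714 1.875 0.709
```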
\vskip 10pt
\noindent
3.122. (a)  
$P(Y \le 3) = \displaystyle{\sum_{y=0}^3 \frac{7^y e^{-7}}{y!}} \doteq .0818$.  

(b)  $P(Y \ge 2) = 1 - P(Y = 0) - P(Y = 1) \doteq .993$.

(c)  $P(Y = 5) = \frac{7^5 e^{-7}}{5!} \doteq .128$.
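The three Poisson answers can be verified numerically; a minimal Python sketch:

```python
from math import exp, factorial

# Poisson pmf with rate lam = 7.
lam = 7
def pois_pmf(y):
    return lam**y * exp(-lam) / factorial(y)

p_le3 = sum(pois_pmf(y) for y in range(4))   # part (a): P(Y <= 3)
p_ge2 = 1 - pois_pmf(0) - pois_pmf(1)        # part (b): P(Y >= 2)
p_eq5 = pois_pmf(5)                          # part (c): P(Y = 5)
print(round(p_le3, 4), round(p_ge2, 3), round(p_eq5, 3))  # 0.0818 0.993 0.128
```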
\vskip 10pt
\noindent
3.123.  For a Poisson random variable, $P(Y=y) = \frac{\lambda^y e^{-\lambda}}{y!}$.
So if $P(Y = 0) = P(Y = 1)$, then we have the equation
$$
e^{-\lambda} = \lambda e^{-\lambda}.
$$
This implies $\lambda = 1$.  Hence $P(Y = 2) = \frac{e^{-1}}{2} \doteq .184$. 
\vskip 10pt
\noindent
3.125.  If $Y$ is the number of customers who 
arrive, the total service time is $T = 10Y$.  Then $E(T) = E(10Y) = 10E(Y) 
= 10\cdot 7 = 70$ and $V(T) = V(10Y) = 100 V(Y) = 100 \cdot 7 = 700$.
To answer the last part of the question, note that $\sigma = \sqrt{700} 
\doteq 26.46$.  $2.5$ hours is $150$ minutes, which is more than $3$ standard
deviations above $\mu$.  So it is not likely that the total service
time will exceed $2.5$ hours.
\vskip 10pt
\noindent
3.126. (a) The arrivals over the two-hour period are a Poisson
process with $\lambda = 14$.  So the probability that there are
exactly 2 arrivals is $\displaystyle{\frac{14^2 e^{-14}}{2!}}$.
 

(b) {\it Method 1 -- ``direct approach'':} Let $I$ be the number of customers arriving between 1 and 2 and
$II$ the number between 3 and 4.  Let $T = I + II$ be the total number.  We can 
get $T = 2$ in three different ways ($I = 2$ and $II = 0$, or
$I = 1$ and $II = 1$, or $I = 0$ and $II = 2$).  Then 
using independence:
\begin{align*}
P(T = 2) &= \sum_{k=0}^2 P(I = k) P(II = 2-k)\\
&= \sum_{k=0}^2 \frac{7^k e^{-7}}{k!}\cdot\frac{7^{2-k}e^{-7}}{(2-k)!}\\
&= e^{-14}\left( \frac{7^2}{2} + 7^2 + \frac{7^2}{2}\right)\\
&= e^{-14} \frac{4\cdot 7^2}{2!}\\
&= \frac{14^2 e^{-14}}{2!}
\end{align*}
(same as in part (a)!)

\vskip 10pt
\noindent
{\it Method 2 -- ``clever'':}  The fact that there are two noncontiguous
hours is really irrelevant for the total number $T$ of customers arriving if the 
numbers from the two hours are independent.  It should be the same 
as a situation where you have a Poisson distribution 
with $\lambda = 7 + 7 = 14$ customers arriving per hour.
Then $P(T = 2) = \frac{14^2 e^{-14}}{2!}$.  (Same as above!)

\vskip 10pt
\noindent
{\it Comment:}  In fact, this problem illustrates a \emph{general} fact
about Poisson random variables.  If you have $X$ Poisson with 
parameter $\lambda$ and $Y$ Poisson with parameter $\mu$ and 
$X,Y$ are independent, then $Z = X+Y$ also has a Poisson distribution
with parameter $\lambda + \mu$.  One proof comes from an argument 
like the ``direct approach'' above:
\begin{align*}
P(Z = z) &= \sum_{x=0}^z P(X = x)P(Y = z - x)\\
&= \sum_{x=0}^z \frac{\lambda^x e^{-\lambda}}{x!} \cdot
\frac{\mu^{z-x} e^{-\mu}}{(z-x)!}\\
&= e^{-(\lambda+\mu)} \sum_{x=0}^z \frac{1}{x! (z-x)!} \lambda^x \mu^{z-x}\\
&= \frac{e^{-(\lambda+\mu)}}{z!} \sum_{x=0}^z \binom{z}{x} \lambda^x \mu^{z-x}\\
&=\frac{e^{-(\lambda+\mu)}}{z!} (\lambda + \mu)^z
\end{align*}
(using the binomial theorem).  This is exactly the 
right probability mass function for a Poisson random variable
with parameter $\lambda + \mu$.  

There is another proof via moment generating functions.  If $X,Y$ are independent
and $Z = X+Y$, then 
$$m_Z(t) = E(e^{tX + tY}) = E(e^{tX})E(e^{tY}) = 
e^{\lambda(e^t-1)} e^{\mu(e^t-1)} = e^{(\lambda + \mu)(e^t - 1)}.$$
By the uniqueness theorem, $Z$ has a Poisson distribution with 
parameter $\lambda + \mu$.  
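The additivity fact can also be checked numerically: the Python sketch below convolves two Poisson pmfs (exactly as in the direct approach) and compares the result against the Poisson($\lambda + \mu$) pmf:

```python
from math import exp, factorial

# Convolution check: if X ~ Poisson(lam) and Y ~ Poisson(mu) are independent,
# then P(X + Y = z) should equal the Poisson(lam + mu) pmf at z.
def pois_pmf(k, rate):
    return rate**k * exp(-rate) / factorial(k)

lam, mu, z = 7, 7, 2
conv = sum(pois_pmf(x, lam) * pois_pmf(z - x, mu) for x in range(z + 1))
print(round(conv, 8), round(pois_pmf(z, lam + mu), 8))  # the two values agree
```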
\vskip 10pt
\noindent
3.139. By the linearity of the expected value, 
we have $E(X) = E(50 - 2Y - Y^2) = 50 - 2E(Y) - E(Y^2)$.
By Theorem 3.11, $E(Y) = \lambda = 2$.  Also, $E(Y^2) = V(Y) + E(Y)^2
= 2 + 4 = 6$.  So $E(X) = 50 - 4 - 6 = 40$.  
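As a final check, $E(X)$ can be computed by direct (truncated) summation over the Poisson pmf:

```python
from math import exp, factorial

# E(50 - 2Y - Y^2) for Y ~ Poisson(lam = 2), by truncated direct summation
# (terms beyond y = 100 are negligible).
lam = 2
EX = sum((50 - 2 * y - y * y) * lam**y * exp(-lam) / factorial(y)
         for y in range(100))
print(round(EX, 6))  # 40.0
```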
\end{document}
