% 256f19Assignment8.tex Expectation, variance and covariance
\documentclass[11pt]{article}
%\usepackage{amsbsy} % for \boldsymbol and \pmb
%\usepackage{graphicx} % To include pdf files!
\usepackage{amsmath}
\usepackage{amsbsy}
\usepackage{amsfonts}
\usepackage[colorlinks=true, pdfstartview=FitV, linkcolor=blue, citecolor=blue, urlcolor=blue]{hyperref} % For links
\usepackage{fullpage}
% \pagestyle{empty} % No page numbers

\begin{document}
%\enlargethispage*{1000 pt}
\begin{center}
{\Large \textbf{\href{http://www.utstat.toronto.edu/~brunner/oldclass/256f19}{STA 256f19} Assignment Eight}}\footnote{Copyright information is at the end of the last page.}
\vspace{1 mm}
\end{center}

\noindent
Please read Sections 3.1--3.3 in the text. Also, look over your lecture notes. The following homework problems are not to be handed in. They are preparation for Term Test 3 and the final exam. Use the formula sheet. For some of these questions, it may not be clear whether you are supposed to use linear properties of expected value, or whether you are supposed to go back to the definition and use integration or summation. The rule is that if integration or summation is not explicitly mentioned, just use expected value signs.

%\vspace{5mm}

\begin{enumerate}

%%%%%%%% Expectation

\item Do Exercise 3.1.1 part (a) only.

\item Show that if $P(X \geq 0)=1$, then $E(X)\geq 0$. Treat the discrete and continuous cases separately.

\item Let $a$ be a constant, and let $X$ be a continuous random variable, so you will integrate rather than add. Show $E(aX) = a \, E(X)$.

\item Let $X$ and $Y$ be discrete random variables, so you will add rather than integrate.
% Using the change of variables formula from the formula sheet,
Show that $E(X+Y) = E(X) + E(Y)$. You are proving a special case of $E\left(\sum_{i=1}^n a_iX_i \right) = \sum_{i=1}^n a_iE(X_i)$ from the formula sheet, so you can't use that. If you assume independence, you get a zero.

\item Let $X$ and $Y$ be discrete random variables, so you will add rather than integrate. Prove the double expectation formula: $E(Y) = E[E(Y|X)]$.

\item Let $X$ be a continuous random variable with $P(X \geq 0) = 1$. Prove $E(X) = \int_0^\infty P(X>t) \, dt$. Hint: Write the probability as an integral, sketch the region of integration, and switch the order of integration using Fubini's Theorem.

%%%%%%%% Variance

\item Show $Var(a+X) = Var(X)$.

\item Show $Var(bX) = b^2Var(X)$.

\item Show $Var(X) = E(X^2)-[E(X)]^2$.

\item Let $X$ have a beta distribution with parameters $\alpha$ and $\beta$.
\begin{enumerate}
\item \label{betaEXk} Calculate $E(X^k)$.
\item The Uniform(0,1) distribution is a special case of the beta distribution. What are the parameters $\alpha$ and $\beta$?
\item Use your answer to Question \ref{betaEXk} to show that the variance of the Uniform(0,1) distribution is 1/12.
\end{enumerate}

\item To play this casino game, you must pay an admission fee. You toss a fair coin, and wait until the first head appears. Your payoff is $2^z$ pennies, where $z$ is the number of tails that occur \emph{before} the first head. What should the admission fee be in order to make this a ``fair'' game --- that is, a game with expected value zero? A sketch of the setup appears below.
% St. Petersburg paradox on page 133.
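% Setup sketch added for clarity; one possible way to begin, not a full solution.
A sketch of the setup (one way to begin, not a complete solution): if $Z$ is the number of tails before the first head, then $P(Z=z) = \left(\frac{1}{2}\right)^{z+1}$ for $z = 0, 1, 2, \ldots$, so by the change of variables formula from the formula sheet, the expected payoff in pennies is
\begin{displaymath}
E\left(2^Z\right) = \sum_{z=0}^\infty 2^z \, P(Z=z)
                 = \sum_{z=0}^\infty 2^z \left(\frac{1}{2}\right)^{z+1}.
\end{displaymath}
Whether this series converges is the heart of the question.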
\pagebreak

\item Derive the expected value formulas for the following distributions. Most of this is in the text or lecture.
\begin{enumerate}
\item Bernoulli ($\theta$): $E(X) = \theta$
\item Binomial ($n,\theta$): $E(X) = n\theta$
\item Geometric ($\theta$): $E(X) = \frac{1-\theta}{\theta}$
\item Poisson ($\lambda$): $E(X) = \lambda$
\item Gamma ($\alpha,\lambda$): $E(X) = \alpha/\lambda$
\item Beta ($\alpha,\beta$): $E(X) = \frac{\alpha}{\alpha+\beta}$
\end{enumerate}

\item \label{prodex} Let $X$ and $Y$ be independent (discrete) random variables, so you will add rather than integrate. Show $E(XY)=E(X)E(Y)$. This formula also holds for continuous random variables.

\item Do Exercises 3.1.5, 3.1.7 and 3.1.9.

\item Do Exercise 3.1.11 the easy way, using independence.

\item Do Exercise 3.1.13. Use double expectation.

\item Do Exercise 3.2.1 part (c) only.

\item Do Exercise 3.2.3 parts (a) and (f) only.

\item Do Exercise 3.2.11 the easy way. Is there any need to assume that people get married at random?

% \pagebreak

%%%%%%%% Covariance

\item Prove the following facts about covariance.
\begin{enumerate}
\item $Cov(X,Y) = E(XY)-E(X)E(Y)$
\item If $X$ and $Y$ are independent, $Cov(X,Y)= 0$
\item $Cov(a+X,b+Y) = Cov(X,Y)$
\item $Cov(aX,bY) = abCov(X,Y)$
\item $Cov(X,Y+Z) = Cov(X,Y) + Cov(X,Z)$
\item $Var(aX+bY) = a^2Var(X)+b^2Var(Y) + 2abCov(X,Y)$
\end{enumerate}

\item \label{contxy} The continuous random variables $X$ and $Y$ have joint density function $f_{xy}(x,y)=1$ for $0