\documentclass[12pt]{article}
\pagestyle{empty} % No page numbers

\begin{document}

\begin{center}
{\LARGE Mixed Distributions}
\vspace{3 mm}
\end{center}

Our text discusses random variables that are either discrete or continuous. We will go further and consider \emph{mixed} random variables that have a discrete part and a continuous part. To justify this, consider observing a lightbulb until it fails. What if there is a positive probability that the failure time is zero (the bulb never goes on)?

Let $X_1$ be a discrete random variable, and let $X_2$ be (absolutely) continuous. You may think of a mixed random variable $Y$ as arising from a two-step statistical experiment, like this. First, toss a coin with probability of a Head equal to $\alpha$. If the coin shows Heads, $Y=X_1$; if it is Tails, $Y=X_2$. Denoting by $C$ the outcome of the coin toss, we can write the distribution function of $Y$ as
\begin{eqnarray*}
F_Y(y) & = & P(Y \leq y) = P(Y \leq y|C=h)\,P(C=h) + P(Y \leq y|C=t)\,P(C=t) \\
& = & \alpha P(X_1 \leq y) + (1-\alpha) P(X_2 \leq y) \\
& = & \alpha F_{X_1}(y) + (1-\alpha) F_{X_2}(y)
\end{eqnarray*}

Let $g(\cdot)$ be a function for which all the relevant expectations exist. By the double expectation formula (which is actually part of the \emph{definition} of conditional expectation in more advanced courses), we have
\begin{eqnarray*}
E[g(Y)] & = & E[E[g(Y)|C]] = E[g(Y)|C=h]\,P(C=h) + E[g(Y)|C=t]\,P(C=t) \\
& = & \alpha E[g(X_1)] + (1-\alpha) E[g(X_2)] \\
& = & \alpha \sum_x g(x)\,f_{X_1}(x) + (1-\alpha) \int_{-\infty}^\infty g(x)\,f_{X_2}(x)\,dx
\end{eqnarray*}

\enlargethispage*{1000 pt}

This formula completely determines the distribution of $Y$, since the function $g$ could be an indicator for any set of interest. We can even use it to \emph{define} some notation that might otherwise be confusing. Let us write
\begin{eqnarray*}
E[g(Y)] & = & \int g(y)\,dF_Y(y) = \int g(y)\,dP_Y(y) \\
& = & \alpha \sum_x g(x)\,f_{X_1}(x) + (1-\alpha) \int_{-\infty}^\infty g(x)\,f_{X_2}(x)\,dx
\end{eqnarray*}
You will prove in homework that this ``integral'' enjoys all the usual properties of sums and integrals. If you later learn that it is a special case of a Lebesgue integral, no difficulty will arise. In the meantime, you will have a concrete meaning for a notation that is frequently used without much explanation.
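
As a concrete illustration, return to the lightbulb. Suppose the bulb never goes on with probability $\alpha$, so $X_1$ is the constant zero with $f_{X_1}(0)=1$, and suppose that when it does go on, its failure time is exponential with rate $\lambda$, so $f_{X_2}(x) = \lambda e^{-\lambda x}$ for $x>0$. (The exponential lifetime and the symbol $\lambda$ are chosen here only for illustration; any continuous lifetime distribution would do.) For $y \geq 0$, the formula for the distribution function gives
\begin{eqnarray*}
F_Y(y) & = & \alpha F_{X_1}(y) + (1-\alpha) F_{X_2}(y) = \alpha + (1-\alpha)\left(1 - e^{-\lambda y}\right) \\
& = & 1 - (1-\alpha)\,e^{-\lambda y},
\end{eqnarray*}
while $F_Y(y) = 0$ for $y < 0$. So $F_Y$ jumps by $\alpha$ at zero and is continuous everywhere else: a genuinely mixed distribution. Taking $g(x)=x$ in the expectation formula,
\begin{eqnarray*}
E[Y] & = & \alpha \sum_x x\,f_{X_1}(x) + (1-\alpha) \int_{-\infty}^\infty x\,f_{X_2}(x)\,dx \\
& = & \alpha \cdot 0 + (1-\alpha) \int_0^\infty x\,\lambda e^{-\lambda x}\,dx = \frac{1-\alpha}{\lambda}.
\end{eqnarray*}

\end{document}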