% 302f14Assignment3.tex
\documentclass[11pt]{article}
%\usepackage{amsbsy} % for \boldsymbol and \pmb
\usepackage{graphicx} % To include pdf files!
\usepackage{amsmath}
\usepackage{amsbsy}
\usepackage{amsfonts}
\usepackage[colorlinks=true, pdfstartview=FitV, linkcolor=blue, citecolor=blue, urlcolor=blue]{hyperref} % For links
\usepackage{fullpage}
%\pagestyle{empty} % No page numbers

\begin{document}
%\enlargethispage*{1000 pt}

\begin{center}
{\Large \textbf{STA 302f14 Assignment Three}}\footnote{Copyright information is at the end of the last page.}
\vspace{1 mm}
\end{center}

\noindent
For this assignment, Chapter 2 in the text covers matrix algebra and Chapter 3 covers random vectors. You are responsible for what is in this assignment, not for everything in the text. Questions~\ref{book1} and~\ref{book2} are to be done with R. \emph{Please print the two sets of R output on separate pieces of paper. You may be asked to hand one of them in, but not the other.} Except for the R parts, these problems are preparation for the quiz in tutorial on Friday October 3rd, and are not to be handed in.

\begin{enumerate}

\item \label{book1} Using R, do textbook question 2.14, just parts (c), (g), (h) and (m). Also do textbook question 2.15, just part (e). Label the parts of your output with comment statements. Bring the printout to the quiz.

\item \label{book2} Using R, do textbook question 2.77. Bring the printout to the quiz.

\item Recall the definition of linear independence. Let $\mathbf{X}$ be an $n \times p$ matrix. The columns of $\mathbf{X}$ are said to be \emph{linearly dependent} if there exists a $p \times 1$ vector $\mathbf{v} \neq \mathbf{0}$ with $\mathbf{Xv} = \mathbf{0}$. We will say that the columns of $\mathbf{X}$ are linearly \emph{independent} if $\mathbf{Xv} = \mathbf{0}$ implies $\mathbf{v} = \mathbf{0}$. Let $\mathbf{A}$ be a square matrix. Show that if the columns of $\mathbf{A}$ are linearly dependent, $\mathbf{A}^{-1}$ cannot exist.
Hint: $\mathbf{v}$ cannot be both zero and not zero at the same time.

\item Do problem 2.23 in the text.

\item \label{ivt} Let $\mathbf{A}$ be a non-singular square matrix (meaning that its inverse exists). Prove $(\mathbf{A}^{-1})^\prime=(\mathbf{A}^\prime)^{-1}$.

\item Using Question~\ref{ivt}, prove that if the inverse of a symmetric matrix exists, it is also symmetric.

\item \label{ss} Let $\mathbf{a}$ be an $n \times 1$ matrix of real constants. How do you know $\mathbf{a}^\prime\mathbf{a}\geq 0$?

\item Recall the \emph{spectral decomposition} of a square symmetric matrix (for example, a variance-covariance matrix). Any such matrix $\boldsymbol{\Sigma}$ can be written as $\boldsymbol{\Sigma} = \mathbf{CD}\mathbf{C}^\prime$, where $\mathbf{C}$ is a matrix whose columns are the (orthonormal) eigenvectors of $\boldsymbol{\Sigma}$, $\mathbf{D}$ is a diagonal matrix of the corresponding eigenvalues, and $\mathbf{C}^\prime\mathbf{C} = \mathbf{C}\mathbf{C}^\prime = \mathbf{I}$.
\begin{enumerate}
\item Let $\boldsymbol{\Sigma}$ be a square symmetric matrix with eigenvalues that are all strictly positive.
\begin{enumerate}
\item What is $\mathbf{D}^{-1}$?
\item Show $\boldsymbol{\Sigma}^{-1} = \mathbf{C}\mathbf{D}^{-1}\mathbf{C}^\prime$.
\end{enumerate}
\item Let $\boldsymbol{\Sigma}$ be a square symmetric matrix, and this time some of the eigenvalues might be zero.
\begin{enumerate}
\item What do you think $\mathbf{D}^{1/2}$ might be?
\item Define $\boldsymbol{\Sigma}^{1/2}$ as $\mathbf{CD}^{1/2}\mathbf{C}^\prime$. Show that $\boldsymbol{\Sigma}^{1/2}$ is symmetric.
\item Show $\boldsymbol{\Sigma}^{1/2}\boldsymbol{\Sigma}^{1/2} = \boldsymbol{\Sigma}$.
\end{enumerate}
\item Now return to the situation where the eigenvalues of the square symmetric matrix $\boldsymbol{\Sigma}$ are all strictly positive.
Define $\boldsymbol{\Sigma}^{-1/2}$ as $\mathbf{CD}^{-1/2}\mathbf{C}^\prime$, where the elements of the diagonal matrix $\mathbf{D}^{-1/2}$ are the reciprocals of the corresponding elements of $\mathbf{D}^{1/2}$.
\begin{enumerate}
\item Show that the inverse of $\boldsymbol{\Sigma}^{1/2}$ is $\boldsymbol{\Sigma}^{-1/2}$, justifying the notation.
\item Show $\boldsymbol{\Sigma}^{-1/2}\boldsymbol{\Sigma}^{-1/2} = \boldsymbol{\Sigma}^{-1}$.
\end{enumerate}
\item The (square) matrix $\boldsymbol{\Sigma}$ is said to be \emph{positive definite} if $\mathbf{v}^\prime \boldsymbol{\Sigma} \mathbf{v} > 0$ for all vectors $\mathbf{v} \neq \mathbf{0}$. Show that the eigenvalues of a positive definite matrix are all strictly positive. Hint: start with the definition of an eigenvalue and its corresponding eigenvector: $\boldsymbol{\Sigma}\mathbf{v} = \lambda \mathbf{v}$ with $\mathbf{v} \neq \mathbf{0}$. This is \emph{much} cleaner than the way I did it in lecture.
\item Let $\boldsymbol{\Sigma}$ be a symmetric, positive definite matrix. Putting together a couple of results you have proved above, establish that $\boldsymbol{\Sigma}^{-1}$ exists.
\end{enumerate}

\item Do problem 2.7 in the text.

\vspace{3mm} \hrule

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Random vectors
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\item Do problem 3.9 in the text.
%Let $\mathbf{X}$ and $\mathbf{Y}$ be random matrices of the same dimensions. Show
%$E(\mathbf{X} + \mathbf{Y})=E(\mathbf{X})+E(\mathbf{Y})$. Recall the definition
%$E(\mathbf{Z})=[E(Z_{i,j})]$.

\item Let $\mathbf{X}$ be a random matrix, and $\mathbf{B}$ be a matrix of constants. Show $E(\mathbf{XB})=E(\mathbf{X})\mathbf{B}$. Recall the definition $\mathbf{AB}=[\sum_{k}a_{i,k}b_{k,j}]$.
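As a study aid (not part of the assignment), the spectral-decomposition identities from the questions above can be checked numerically. Here is a minimal sketch in Python with NumPy, using a small made-up symmetric positive definite matrix; the proofs you are asked for are algebraic, and this only illustrates them.

```python
# Sketch: numerically checking the spectral-decomposition identities
# on a small, made-up symmetric positive definite matrix Sigma.
import numpy as np

Sigma = np.array([[2.0, 1.0],
                  [1.0, 2.0]])        # symmetric; eigenvalues are 1 and 3

evals, C = np.linalg.eigh(Sigma)      # columns of C are orthonormal eigenvectors
D = np.diag(evals)                    # diagonal matrix of eigenvalues

# Sigma = C D C'
assert np.allclose(C @ D @ C.T, Sigma)

# Sigma^{-1} = C D^{-1} C'
assert np.allclose(C @ np.diag(1 / evals) @ C.T, np.linalg.inv(Sigma))

# Sigma^{1/2} = C D^{1/2} C' is symmetric and squares to Sigma
root = C @ np.diag(np.sqrt(evals)) @ C.T
assert np.allclose(root, root.T)
assert np.allclose(root @ root, Sigma)

# Sigma^{-1/2} Sigma^{-1/2} = Sigma^{-1}
invroot = C @ np.diag(1 / np.sqrt(evals)) @ C.T
assert np.allclose(invroot @ invroot, np.linalg.inv(Sigma))
```

`numpy.linalg.eigh` is used rather than `eig` because it is the routine intended for symmetric matrices and returns real eigenvalues with orthonormal eigenvectors.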
\item If the $p \times 1$ random vector $\mathbf{X}$ has variance-covariance matrix $\boldsymbol{\Sigma}$ and $\mathbf{A}$ is an $m \times p$ matrix of constants, prove that the variance-covariance matrix of $\mathbf{AX}$ is $\mathbf{A}\boldsymbol{\Sigma}\mathbf{A}^\prime$. Start with the definition of a variance-covariance matrix:
\begin{displaymath}
cov(\mathbf{Z})=E\left[(\mathbf{Z}-\boldsymbol{\mu}_z)(\mathbf{Z}-\boldsymbol{\mu}_z)^\prime\right].
\end{displaymath}

\item Do problem 3.10 in the text.
% If the $p \times 1$ random vector $\mathbf{X}$ has mean $\boldsymbol{\mu}$ and variance-covariance matrix $\boldsymbol{\Sigma}$, show $\boldsymbol{\Sigma} = E(\mathbf{XX}^\prime) - \boldsymbol{\mu}\boldsymbol{\mu}^\prime$.

\item Let the $p \times 1$ random vector $\mathbf{X}$ have mean $\boldsymbol{\mu}$ and variance-covariance matrix $\boldsymbol{\Sigma}$, and let $\mathbf{c}$ be a $p \times 1$ vector of constants. Find $cov(\mathbf{X}+\mathbf{c})$. Show your work.

\item Let $\mathbf{X}$ be a $p \times 1$ random vector with mean $\boldsymbol{\mu}_x$ and variance-covariance matrix $\boldsymbol{\Sigma}_x$, and let $\mathbf{Y}$ be a $q \times 1$ random vector with mean $\boldsymbol{\mu}_y$ and variance-covariance matrix $\boldsymbol{\Sigma}_y$. Recall that $C(\mathbf{X},\mathbf{Y})$ is the $p \times q$ matrix $C(\mathbf{X},\mathbf{Y}) = E\left((\mathbf{X}-\boldsymbol{\mu}_x)(\mathbf{Y}-\boldsymbol{\mu}_y)^\prime\right)$.
\begin{enumerate}
\item What is the $(i,j)$ element of $C(\mathbf{X},\mathbf{Y})$?
\item Find an expression for $cov(\mathbf{X}+\mathbf{Y})$ in terms of $\boldsymbol{\Sigma}_x$, $\boldsymbol{\Sigma}_y$ and $C(\mathbf{X},\mathbf{Y})$. Show your work.
\item Simplify further for the special case where $cov(X_i,Y_j)=0$ for all $i$ and $j$.
\item Let $\mathbf{c}$ be a $p \times 1$ vector of constants and $\mathbf{d}$ be a $q \times 1$ vector of constants. Find $C(\mathbf{X}+\mathbf{c}, \mathbf{Y}+\mathbf{d})$. Show your work.
\end{enumerate}

\item Do problem 3.20 in the text. The answer is in the back of the book.
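Again as a study aid (not part of the assignment), the variance-covariance rule $cov(\mathbf{AX}) = \mathbf{A}\boldsymbol{\Sigma}\mathbf{A}^\prime$ from the questions above can be checked by simulation. This sketch in Python with NumPy draws from a made-up normal distribution and compares the sample covariance of $\mathbf{AX}$ with the algebraic answer; the proof itself must come from the definition of a variance-covariance matrix.

```python
# Sketch: Monte Carlo check that cov(AX) = A Sigma A'
# (illustration only; the problem asks for an algebraic proof).
import numpy as np

rng = np.random.default_rng(302)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])            # made-up 2x2 covariance matrix of X
A = np.array([[1.0, 1.0],
              [1.0, -1.0],
              [2.0, 0.0]])                # 3x2 matrix of constants

# Each row of X is one draw of the random vector X'
X = rng.multivariate_normal([0.0, 0.0], Sigma, size=200_000)

# Sample covariance of AX (rows of X @ A.T are draws of (AX)')
sample_cov = np.cov(X @ A.T, rowvar=False)

# Should be close to the theoretical A Sigma A' up to sampling error
assert np.allclose(sample_cov, A @ Sigma @ A.T, atol=0.1)
```

With 200{,}000 draws the entries of the sample covariance matrix typically agree with $\mathbf{A}\boldsymbol{\Sigma}\mathbf{A}^\prime$ to within a few hundredths, so the loose tolerance of 0.1 is safely satisfied.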
\item Do problem 3.21 in the text. The answer is in the back of the book.

\end{enumerate}

% \vspace{5mm}

\noindent
\begin{center}\begin{tabular}{l} \hspace{6in} \\ \hline \end{tabular}\end{center}
This assignment was prepared by \href{http://www.utstat.toronto.edu/~brunner}{Jerry Brunner}, Department of Statistical Sciences, University of Toronto. It is licensed under a \href{http://creativecommons.org/licenses/by-sa/3.0/deed.en_US}{Creative Commons Attribution - ShareAlike 3.0 Unported License}. Use any part of it as you like and share the result freely. The \LaTeX~source code is available from the course website: \href{http://www.utstat.toronto.edu/~brunner/oldclass/302f14}{\small\texttt{http://www.utstat.toronto.edu/$^\sim$brunner/oldclass/302f14}}

\end{document}