% Sample Question document for STA312
\documentclass[12pt]{article}
%\usepackage{amsbsy}   % for \boldsymbol and \pmb
%\usepackage{graphicx} % To include pdf files!
\usepackage{amsmath}
\usepackage{amsbsy}
\usepackage{amsfonts}
\usepackage[colorlinks=true, pdfstartview=FitV, linkcolor=blue,
            citecolor=blue, urlcolor=blue]{hyperref} % For links
\usepackage{fullpage}
%\pagestyle{empty} % No page numbers

\begin{document}
%\enlargethispage*{1000 pt}
\begin{center}
{\Large \textbf{Sample Questions: Maximum Likelihood Part 2}}%\footnote{}
\vspace{1 mm}

STA312 Spring 2019. Copyright information is at the end of the last page.
\end{center}
\vspace{5mm}

\begin{enumerate}

\item Let $X_1, \ldots, X_n$ be independent $N(\mu,\sigma^2)$ random variables.
\begin{enumerate}

\item Derive formulas for the maximum likelihood estimates of $\mu$ and
$\sigma^2$. We will establish later that this is indeed a maximum. Show your
work and \textbf{circle your final answer}.

\pagebreak
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\item Calculate the Hessian of the minus log likelihood function:
$\mathbf{H} = \left[\frac{\partial^2 (-\ell)}
                         {\partial\theta_i\partial\theta_j}\right]$.
Show your work.

\pagebreak
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\item Give $\widehat{\mathbf{V}}_n$, the estimated asymptotic
variance-covariance matrix of the MLE. Show some work.

\pagebreak
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\item Consider a large-sample $Z$-test of $H_0:\mu=\mu_0$. Give an explicit
formula for the test statistic. This is something you would be able to compute
with a calculator, given $\widehat{\mu}$ and $\widehat{\sigma}^2$.

\vspace{100mm}

\item Consider a large-sample $Z$-test of $H_0:\sigma^2=\sigma^2_0$. Give an
explicit formula for the test statistic. This is something you would be able
to compute with a calculator.

\pagebreak
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\item Consider the large-sample likelihood ratio test of $H_0: \mu=\mu_0$.
Derive an explicit formula for the test statistic $G^2$. Show your work and
\emph{keep simplifying!}

\end{enumerate} % End of the first set of normal questions

\pagebreak
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% I had lifted the dead pixels problem: Q7 of 2101f18 A5.
% But it took us too far afield. Stay with normal.

\item The file
\href{http://www.utstat.toronto.edu/~brunner/data/legal/normal.data.txt}
{\texttt{http://www.utstat.toronto.edu/$\sim$brunner/data/legal/normal.data.txt}}
contains a random sample from a normal distribution.
\begin{enumerate}

\item Find the maximum likelihood estimates $\widehat{\mu}$ and
$\widehat{\sigma}^2$ numerically, and compare the answer to your closed-form
solution. (A short software sketch appears later in this question, for
reference.)

\item Show that the minus log likelihood is indeed minimized at
$(\widehat{\mu}, \widehat{\sigma}^2)$ for this data set.

\item Calculate the estimated asymptotic covariance matrix of the MLEs.

\item Give a ``better'' estimated asymptotic covariance matrix based on your
closed-form solution.

\item Calculate a large-sample 95\% confidence interval for $\sigma^2$.

\item Test $H_0: \mu = 103$ with a
\begin{enumerate}
\item $Z$-test.
\item Likelihood ratio chi-squared test. Compare it with the closed-form version.
\item Wald chi-squared test.
\end{enumerate}
Give the test statistic and the $p$-value for each test.
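\vspace{5mm}
\noindent \emph{Software note:} this handout does not name the software for
the numerical parts of this question. The fragment below is a minimal sketch
in Python with NumPy/SciPy, not the official course code. The names
\texttt{mll}, \texttt{num\_hessian}, \texttt{start} and \texttt{fit} are
illustrative only, and the sketch assumes the data file is a plain column of
numbers (adjust the reading step if the file has a header line). It minimizes
the minus log likelihood numerically and inverts a crude numerical Hessian to
obtain an estimated asymptotic covariance matrix.

\begin{verbatim}
# Minimal sketch, not the official course solution.
import numpy as np
from scipy.optimize import minimize
from urllib.request import urlopen

url = "http://www.utstat.toronto.edu/~brunner/data/legal/normal.data.txt"
x = np.loadtxt(urlopen(url))   # assumes a plain column of numbers

def mll(theta):
    """Minus log likelihood of an i.i.d. N(mu, sigma2) sample."""
    mu, sigma2 = theta
    if sigma2 <= 0:            # keep the search inside the parameter space
        return np.inf
    n = len(x)
    return 0.5*n*np.log(2.0*np.pi*sigma2) + np.sum((x - mu)**2)/(2.0*sigma2)

start = np.array([np.median(x), np.var(x, ddof=1)])  # any sensible start values
fit = minimize(mll, start, method="Nelder-Mead")     # derivative-free search
print(fit.x, fit.fun)   # numerical MLEs, minimized minus log likelihood

def num_hessian(f, t, h=1e-3):
    """Crude central-difference Hessian of f at the point t."""
    k = len(t)
    H = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            ei = np.zeros(k); ei[i] = h
            ej = np.zeros(k); ej[j] = h
            H[i, j] = (f(t + ei + ej) - f(t + ei - ej)
                       - f(t - ei + ej) + f(t - ei - ej)) / (4.0*h*h)
    return H

H = num_hessian(mll, fit.x)   # Hessian of the minus log likelihood at the MLEs
print(np.linalg.inv(H))       # estimated asymptotic covariance matrix
\end{verbatim}

\noindent The numerical Hessian is only a check; the ``better'' covariance
matrix asked for above comes from the closed-form Hessian evaluated at the
estimates. If the course software is R, the analogous steps are
\texttt{optim(..., hessian=TRUE)} followed by \texttt{solve()}.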
\item The coefficient of variation (used in sample surveys and business
statistics) is the standard deviation divided by the mean.
\begin{enumerate}

\item Show that multiplication by a positive constant does not affect the
coefficient of variation. This is a paper and pencil calculation.

\item Give a numerical point estimate of the coefficient of variation for the
normal data of this question. It is actually the maximum likelihood estimate,
because \emph{the invariance principle of maximum likelihood estimation says
that the MLE of a function of the parameter is that function of the MLE}.

\item Using the delta method, give a 95\% confidence interval for the
coefficient of variation. Start with a paper and pencil calculation of
$\dot{g}(\boldsymbol{\theta}) = \left( \frac{\partial g}{\partial\theta_1},
\ldots, \frac{\partial g}{\partial\theta_k} \right)$.

\end{enumerate}

\end{enumerate}

\end{enumerate} % End of all the questions

\vspace{30mm}

\noindent
\begin{center}\begin{tabular}{l} \hspace{6in} \\ \hline \end{tabular}\end{center}
This assignment was prepared by
\href{http://www.utstat.toronto.edu/~brunner}{Jerry Brunner}, Department of
Mathematical and Computational Sciences, University of Toronto. It is licensed
under a
\href{http://creativecommons.org/licenses/by-sa/3.0/deed.en_US}
{Creative Commons Attribution - ShareAlike 3.0 Unported License}. Use any part
of it as you like and share the result freely. The \LaTeX~source code is
available from the course website:
\begin{center}
\href{http://www.utstat.toronto.edu/~brunner/oldclass/312s19}
{\small\texttt{http://www.utstat.toronto.edu/$\sim$brunner/oldclass/312s19}}
\end{center}

\end{document}