
By Theorem B, Chapter 6 of "Mathematical Statistics and Data Analysis", 2nd edition, John Rice, page 181, the distribution of \(\frac {\left ( n-1\right ) S^{2}}{\sigma ^{2}}\) is a chi-square distribution with \(n-1\) degrees of freedom.

Hence

\[ Var\left ( \frac {\left ( n-1\right ) S^{2}}{\sigma ^{2}}\right ) =Var\left ( \chi ^{2}_{n-1}\right ) \]

Since \(\frac {n-1}{\sigma ^{2}}\) is not random, we apply the property that \(Var\left ( c\ X\right ) =c^{2}Var\left ( X\right ) \) for non-random \(c\), here with \(c=\frac {n-1}{\sigma ^{2}}\), and rearrange to obtain

\[ Var\left ( S^{2}\right ) =\frac {\sigma ^{4}}{\left ( n-1\right ) ^{2}}Var\left ( \chi ^{2}_{n-1}\right ) \]

However, \(Var\left ( \chi ^{2}_{n-1}\right ) =2(n-1)\), hence

\[ Var\left ( S^{2}\right ) =2\frac {\sigma ^{4}}{n-1}\]
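This result can be checked by simulation. The sketch below (the sample size \(n\), the value of \(\sigma \), and the number of trials are arbitrary illustrative choices) estimates \(Var\left ( S^{2}\right ) \) from repeated normal samples and compares it with \(2\frac {\sigma ^{4}}{n-1}\):

```python
import numpy as np

# Monte Carlo check of Var(S^2) = 2*sigma^4/(n-1) for normal samples.
# n, sigma and the trial count are illustrative choices.
rng = np.random.default_rng(0)
n, sigma, trials = 10, 2.0, 200_000

samples = rng.normal(0.0, sigma, size=(trials, n))
s2 = samples.var(axis=1, ddof=1)     # sample variance S^2 of each trial

empirical = s2.var()                 # Monte Carlo estimate of Var(S^2)
theoretical = 2 * sigma**4 / (n - 1)
print(empirical, theoretical)
```

With this many trials the two values agree to well under a percent.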

Let

\[ S_{n}=\frac {1}{n}{\displaystyle \sum \limits _{i=1}^{n}} \left \vert X_{i}\right \vert \]

Find moment generation function

\begin{align*} M_{S_{n}}\left ( t\right ) & =E(e^{\frac {t}{n}{\displaystyle \sum \limits _{i=1}^{n}} \left \vert X_{i}\right \vert })\\ & =E\left ( \Pi e^{\frac {t}{n}\left \vert X_{i}\right \vert }\right ) \\ & =\Pi E\left ( e^{\frac {t}{n}\left \vert X_{i}\right \vert }\right ) \\ & =\left [ M_{\left \vert X\right \vert }\left ( \frac {t}{n}\right ) \right ] ^{n}\end{align*}

To find \(M_{\left \vert X\right \vert }\left ( \frac {t}{n}\right ) \), and noting that \(\mu =0\) and \(\sigma =1\), we obtain

\[ M_{\left \vert X\right \vert }\left ( \frac {t}{n}\right ) =\frac {1}{\sqrt {2\pi }}\int _{-\infty }^{\infty }e^{\frac {t}{n}\left \vert x\right \vert }e^{\frac {-x^{2}}{2}}dx \]

Due to the symmetry of the normal distribution, and since \(\left \vert x\right \vert \) is always positive, the above can be written as

\begin{align*} M_{\left \vert X\right \vert }\left ( \frac {t}{n}\right ) & =\frac {2}{\sqrt {2\pi }}\int _{0}^{\infty }e^{\frac {tx}{n}-\frac {x^{2}}{2}}dx\\ & =e^{\frac {t^{2}}{2n^{2}}}\left ( 1+\operatorname {erf}\left ( \frac {t}{\sqrt {2}n}\right ) \right ) \end{align*}
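This closed form can be verified numerically. Writing \(s=\frac {t}{n}\), the sketch below compares direct quadrature of the integral with \(e^{\frac {s^{2}}{2}}\left ( 1+\operatorname {erf}\left ( \frac {s}{\sqrt {2}}\right ) \right ) \); the test point \(s=0.7\) is an arbitrary choice:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

# Check M_|X|(s) = (2/sqrt(2*pi)) * int_0^inf exp(s*x - x^2/2) dx
# against the closed form exp(s^2/2)*(1 + erf(s/sqrt(2))).
# s = 0.7 is an arbitrary test point.
s = 0.7
numeric, _ = quad(lambda x: 2*np.exp(s*x - x**2/2)/np.sqrt(2*np.pi),
                  0, np.inf)
closed = np.exp(s**2/2) * (1 + erf(s/np.sqrt(2)))
print(numeric, closed)
```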

Hence

\[ M_{S_{n}}\left ( t\right ) =\left [ e^{\frac {t^{2}}{2n^{2}}}\left ( 1+\operatorname {erf}\left ( \frac {t}{\sqrt {2}n}\right ) \right ) \right ] ^{n}\]

The limit of the above as \(n\rightarrow \infty \) is \(e^{\sqrt {\frac {2}{\pi }}t}\). Therefore

\[ \lim _{n\rightarrow \infty }M_{S_{n}}\left ( t\right ) =e^{\sqrt {\frac {2}{\pi }}t}\]
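The convergence can be seen numerically: the sketch below evaluates \(\left [ e^{\frac {t^{2}}{2n^{2}}}\left ( 1+\operatorname {erf}\left ( \frac {t}{\sqrt {2}n}\right ) \right ) \right ] ^{n}\) for increasing \(n\) and compares it with \(e^{\sqrt {\frac {2}{\pi }}t}\); the test point \(t=1.5\) is arbitrary:

```python
import numpy as np
from scipy.special import erf

# Numerical check that [exp(t^2/(2 n^2)) * (1 + erf(t/(sqrt(2) n)))]^n
# approaches exp(sqrt(2/pi)*t) as n grows; t = 1.5 is an arbitrary point.
t = 1.5
limit = np.exp(np.sqrt(2/np.pi) * t)
for n in (10, 100, 10_000):
    mgf = (np.exp(t**2/(2*n**2)) * (1 + erf(t/(np.sqrt(2)*n))))**n
    print(n, mgf, limit)
```

The gap shrinks like \(1/n\), consistent with the expansion \(\operatorname {erf}(x)\approx \frac {2x}{\sqrt {\pi }}\) for small \(x\).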

We see now that \(E\left ( S_{n}\right ) =M^{\prime }\left ( 0\right ) =\sqrt {\frac {2}{\pi }}\) and \(E\left ( S_{n}^{2}\right ) =M^{\prime \prime }\left ( 0\right ) =\frac {2}{\pi }\), therefore \(Var\left ( S_{n}\right ) =\frac {2}{\pi }-\left ( \sqrt {\frac {2}{\pi }}\right ) ^{2}=0.\) (This is not a mistake: the limiting MGF is that of a constant, so for large \(n\) the sum concentrates at the single value \(E\left \vert X\right \vert =\sqrt {\frac {2}{\pi }}\), exactly as the law of large numbers predicts.) Hence

\[ S_{n}\overset {\text {in distribution}}{\rightarrow }N\left ( \sqrt {\frac {2}{\pi }},0\right ) \]
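The degenerate limit can also be seen by direct simulation: the mean of \(S_{n}\) stays near \(\sqrt {\frac {2}{\pi }}\) while its variance shrinks like \(\frac {1}{n}\). A sketch (sample sizes and trial counts are illustrative):

```python
import numpy as np

# Simulate S_n = (1/n) * sum |X_i| for X_i ~ N(0,1): the mean stays near
# sqrt(2/pi) while the variance shrinks, so the limit is degenerate.
rng = np.random.default_rng(1)
target = np.sqrt(2/np.pi)
for n in (10, 100, 1000):
    sn = np.abs(rng.normal(size=(20_000, n))).mean(axis=1)
    print(n, sn.mean(), sn.var())
```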

For the pivotal quantity use \(\frac {\left ( n-1\right ) s^{2}}{\sigma ^{2}} \sim \chi _{\left ( n-1\right ) }^{2}\), where \(s^{2}\) is the sample variance and \(\sigma ^{2}\) is the population variance; hence we write (following the class notes of 10/29/07) the confidence interval as

\[ P\left [ -z_{p}<\theta <z_{p}\right ] =1-\alpha \]

where, from table A7, \(z_{p}=1.96\) for a normal r.v. at \(95\%\), and where \(\theta =\frac {\frac {\left ( n-1\right ) s_{2}^{2}}{\sigma _{2}^{2}}}{\frac {\left ( n-1\right ) s_{1}^{2}}{\sigma _{1}^{2}}}\)

Hence the C.I. becomes

\begin{align*} P\left [ -1.96<\frac {\sigma _{1}^{2}s_{2}^{2}}{\sigma _{2}^{2}s_{1}^{2}}<1.96\right ] & =1-\alpha \\ P\left [ -1.96\frac {s_{1}^{2}}{s_{2}^{2}}<\frac {\sigma _{1}^{2}}{\sigma _{2}^{2}}<1.96\frac {s_{1}^{2}}{s_{2}^{2}}\right ] & =1-\alpha \end{align*}

where the sample variances are \(s_{2}^{2}=\frac {1}{n-1}\sum _{i=1}^{n}\left ( X_{i}-\bar {X}\right ) ^{2}\) and \(s_{1}^{2}=\frac {1}{n-1}\sum _{i=1}^{n}\left ( Y_{i}-\bar {Y}\right ) ^{2}\)

For \(95\%\) confidence, \(\alpha =0.05\). Hence the final answer for the C.I. is

\[ P\left [ -1.96\frac {s_{1}^{2}}{s_{2}^{2}}<\frac {\sigma _{1}^{2}} {\sigma _{2}^{2}}<1.96\frac {s_{1}^{2}}{s_{2}^{2}}\right ] =0.95 \]

Not sure what more I can do with the above so I think I will stop here.

First find the joint density of \(X,Y\). Since \(X,Y\) are independent, the joint density is \(f_{X,Y}\left ( x,y\right ) =f_{X}\left ( x\right ) f_{Y}\left ( y\right ) \) over \(-\infty <x<\infty \) and \(y>0\)

But \(f_{X}\left ( x\right ) =\frac {1}{\sqrt {2\pi }\sigma }e^{\frac {-\left ( x-\mu \right ) ^{2}}{2\sigma ^{2}}}\) and \(f_{Y}\left ( y\right ) =\lambda e^{-\lambda y}\), hence the joint density (after substituting \(\mu =0,\sigma ^{2}=2,\lambda =1\)) is

\[ \fbox {$f_{X,Y}\left ( x,y\right ) =\frac {1}{2\sqrt {\pi }}e^{\frac {-x^{2}}{4}}e^{-y}\ \ \ \ \ \ \ \ \ \ \ -\infty <x<\infty ,\ y>0$}\]
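As a sanity check, the boxed joint density should integrate to 1 over its support; a quadrature sketch:

```python
import numpy as np
from scipy.integrate import dblquad

# Check that f(x,y) = (1/(2*sqrt(pi))) * exp(-x^2/4) * exp(-y) integrates
# to 1 over -inf < x < inf, y > 0.  dblquad integrates func(y, x) with
# y as the inner variable.
f = lambda y, x: np.exp(-x**2/4) * np.exp(-y) / (2*np.sqrt(np.pi))
total, _ = dblquad(f, -np.inf, np.inf, 0, np.inf)
print(total)
```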

Now Let \(Z=\frac {X}{\sqrt {Y}}\), and let \(U=Y\)

Hence

\begin{equation} f_{Z,U}\left ( z,u\right ) =\left \vert J^{-1}\right \vert f_{X,Y}\left ( x\left ( z,u\right ) ,y\left ( z,u\right ) \right ) \tag {1}\end{equation}

Where

\begin{align*} J & =\det \begin {bmatrix} \frac {\partial Z}{\partial X} & \frac {\partial Z}{\partial Y}\\ \frac {\partial U}{\partial X} & \frac {\partial U}{\partial Y}\end {bmatrix} =\det \begin {bmatrix} \frac {1}{\sqrt {Y}} & \frac {-X}{2Y^{3/2}}\\ 0 & 1 \end {bmatrix} \\ & =\frac {1}{\sqrt {Y}}\end{align*}

so

\[ \left \vert J^{-1}\right \vert =\sqrt {Y}=\sqrt {U}\]

Hence, from (1), substituting \(X=Z\sqrt {U}\) and \(Y=U\), we obtain

\[ f_{Z,U}\left ( z,u\right ) =\sqrt {u}\frac {1}{2\sqrt {\pi }}e^{\frac {-z^{2}u}{4}}e^{-u}\ \]

Hence the marginal density

\[ \fbox {$f_{Z}\left ( z\right ) =\int _{0}^{\infty }f_{Z,U}\left ( z,u\right ) \ du$}\]

Then

\begin{align*} f_{Z}\left ( z\right ) & =\frac {1}{2\sqrt {\pi }}\int _{0}^{\infty }u^{\frac {1}{2}}e^{\frac {-z^{2}u-4u}{4}}\ du\\ & =\frac {1}{2\sqrt {\pi }}\int _{0}^{\infty }u^{\frac {1}{2}}e^{-\left ( 1+\frac {z^{2}}{4}\right ) u}\ du \end{align*}

Now the Gamma density is \(f\left ( w\right ) =\frac {\lambda ^{\alpha }}{\Gamma \left ( \alpha \right ) }w^{\alpha -1}e^{-\lambda w}\); hence, setting \(\lambda =1+\frac {z^{2}}{4}\) and \(\alpha =\frac {3}{2}\), we have

\begin{align*} f_{Z}\left ( z\right ) & =\frac {1}{2\sqrt {\pi }}\frac {\Gamma \left ( \alpha \right ) }{\lambda ^{\alpha }}\overset {=1}{\overbrace {\int _{0}^{\infty }\frac {\lambda ^{\alpha }}{\Gamma \left ( \alpha \right ) }w^{\alpha -1}e^{-\lambda w}\ dw}}\\ & =\frac {1}{2\sqrt {\pi }}\frac {\Gamma \left ( \alpha \right ) }{\lambda ^{\alpha }}\end{align*}
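The Gamma-normalization step amounts to \(\int _{0}^{\infty }u^{\frac {1}{2}}e^{-\lambda u}\,du=\frac {\Gamma \left ( \frac {3}{2}\right ) }{\lambda ^{\frac {3}{2}}}\), which can be checked numerically for a sample value of \(z\) (here \(z=1\), so \(\lambda =1.25\), an arbitrary choice):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

# Check int_0^inf u^(1/2) * exp(-lam*u) du = Gamma(3/2) / lam^(3/2)
# for lam = 1 + z^2/4 with the arbitrary test point z = 1.
lam = 1 + 1.0**2/4
numeric, _ = quad(lambda u: np.sqrt(u) * np.exp(-lam*u), 0, np.inf)
closed = gamma(1.5) / lam**1.5
print(numeric, closed)
```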

To simplify further,

\[ f_{Z}\left ( z\right ) =\frac {1}{2\sqrt {\pi }}\frac {\Gamma \left ( \frac {3}{2}\right ) }{\left ( 1+\frac {z^{2}}{4}\right ) ^{\frac {3}{2}}}\]

But \(\Gamma \left ( \frac {3}{2}\right ) =\frac {\sqrt {\pi }}{2}\), hence

\[ \fbox {$f_{Z}\left ( z\right ) =\frac {1}{4}\left ( 1+\frac {z^{2}}{4}\right ) ^{-\frac {3}{2}}$}\]

Hence the pdf of \(\frac {X}{\sqrt {Y}}\) is

\[ f_{Z}\left ( z\right ) =\frac {1}{4}\left ( 1+\frac {z^{2}}{4}\right ) ^{-\frac {3}{2}}\]

To verify this is a pdf, I integrate it from \(-\infty \) to \(+\infty \) to see if I get 1:
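One way to carry out this check is by numerical quadrature:

```python
import numpy as np
from scipy.integrate import quad

# Integrate f_Z(z) = (1/4)*(1 + z^2/4)^(-3/2) over the whole real line;
# for a valid pdf the result should be 1.
total, _ = quad(lambda z: 0.25 * (1 + z**2/4)**-1.5, -np.inf, np.inf)
print(total)
```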

Here is a plot of the distribution

Another attempt at problem (2)

\begin{align} \bar {X}_{n} & =\frac {1}{n}{\displaystyle \sum \limits _{i=1}^{n}} X_{i}\nonumber \\ \left \vert \bar {X}_{n}\right \vert & =\left \vert \frac {1}{n}{\displaystyle \sum \limits _{i=1}^{n}} X_{i}\right \vert \nonumber \\ & \leq \frac {1}{n}{\displaystyle \sum \limits _{i=1}^{n}} \left \vert X_{i}\right \vert \tag {1}\end{align}

By definition, the CDF of \(\left \vert \bar {X}_{n}\right \vert \) evaluated at \(c\) is

\begin{align*} F_{\left \vert \bar {X}_{n}\right \vert }\left ( c\right ) & =P\left ( \left \vert \bar {X}_{n}\right \vert <c\right ) \\ & =P\left ( -c<\bar {X}_{n}<c\right ) \\ & =P\left ( \frac {-c-\mu }{\sigma /\sqrt {n}}<\frac {\bar {X}_{n}-\mu }{\sigma /\sqrt {n}}<\frac {c-\mu }{\sigma /\sqrt {n}}\right ) \end{align*}

Since \(\mu =0,\sigma =1\) we obtain

\begin{equation} F_{\left \vert \bar {X}_{n}\right \vert }\left ( c\right ) =P\left ( \frac {-c}{1/\sqrt {n}}<\frac {\bar {X}_{n}}{1/\sqrt {n}}<\frac {c}{1/\sqrt {n}}\right ) \tag {2}\end{equation}

Now I need to combine (1) and (2). I am not sure how.

But the central limit theorem tells us that as \(n\) gets large, the distribution of the sample mean \(\bar {X}_{n}\) approaches a normal distribution with mean \(\mu \) and variance \(\frac {\sigma ^{2}}{n}\); hence \(\bar {X}_{n}\overset {\text {in distribution}}{\rightarrow }N\left ( \mu ,\frac {\sigma ^{2}}{n}\right ) \), and the above becomes

\[ F_{\left \vert \bar {X}_{n}\right \vert }\left ( c\right ) \simeq \Phi \left ( c\sqrt {n}\right ) -\Phi \left ( -c\sqrt {n}\right ) \]
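This approximation can be compared with direct simulation; the values \(n=100\) and \(c=0.1\) below are arbitrary illustrative choices:

```python
import numpy as np
from scipy.stats import norm

# Compare the normal approximation Phi(c*sqrt(n)) - Phi(-c*sqrt(n))
# with a direct simulation of P(|Xbar_n| < c) for standard normal X_i.
rng = np.random.default_rng(2)
n, c = 100, 0.1
xbar = rng.normal(size=(100_000, n)).mean(axis=1)
simulated = np.mean(np.abs(xbar) < c)
approx = norm.cdf(c*np.sqrt(n)) - norm.cdf(-c*np.sqrt(n))
print(simulated, approx)
```

For i.i.d. standard normal data the approximation is in fact exact, so the two numbers differ only by Monte Carlo noise.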