By Theorem B, Chapter 6 of "Mathematical Statistics and Data Analysis", 2nd edition, John Rice,
page 181, the distribution of \(\frac {\left ( n-1\right ) S^{2}}{\sigma ^{2}}\) is a chi-square distribution with \(n-1\) degrees of
freedom.
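As a quick numerical sanity check on this theorem, here is a Monte Carlo sketch; the sample size, \(\sigma \), and trial count are illustrative choices, not values from the problem:

```python
import random
import statistics

# Monte Carlo check of Theorem B: for normal samples,
# (n-1)S^2/sigma^2 should behave like a chi-square with n-1 df,
# i.e. have mean n-1 and variance 2(n-1).
random.seed(0)
n, sigma, trials = 10, 2.0, 20000
pivots = []
for _ in range(trials):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    s2 = statistics.variance(xs)          # unbiased sample variance S^2
    pivots.append((n - 1) * s2 / sigma**2)

print(statistics.mean(pivots))      # should be close to n-1 = 9
print(statistics.variance(pivots))  # should be close to 2(n-1) = 18
```

Scaling back, this same fact gives \(Var\left ( S^{2}\right ) =\frac {2\sigma ^{4}}{n-1}\), which is the identity used next.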
Since \(\frac {n-1}{\sigma ^{2}}\) is not random, we apply the property that \(Var\left ( c\ X\right ) =c^{2}Var\left ( X\right ) \) for nonrandom \(c\) to the above, with
\(c=\frac {\left ( n-1\right ) }{\sigma ^{2}}\); rearranging, we obtain
We see now that \(E\left ( S_{n}\right ) =M^{\prime }\left ( 0\right ) =\sqrt {\frac {2}{\pi }}\) and \(E\left ( S_{n}^{2}\right ) =M^{\prime \prime }\left ( 0\right ) =\frac {2}{\pi }\), therefore \(Var\left ( S_{n}\right ) =\frac {2}{\pi }-\left ( \sqrt {\frac {2}{\pi }}\right ) ^{2}=0\). (This means all the sums approach the same value for large \(n\); did I make a
mistake? I did not expect this.) Hence
For the pivotal quantity use \(\frac {\left ( n-1\right ) s^{2}}{\sigma ^{2}} \sim \chi _{\left ( n-1\right ) }^{2}\), where \(s^{2}\) is the sample variance and \(\sigma ^{2}\) is the population variance; hence we write
(following class notes on 10/29/07) the confidence interval as
where, from table A7, \(z_{p}=1.96\) for a normal r.v. at \(95\%\), and where \(\theta =\frac {\frac {\left ( n-1\right ) s_{2}^{2}}{\sigma _{2}^{2}}}{\frac {\left ( n-1\right ) s_{1}^{2}}{\sigma _{1}^{2}}}\)
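The quantity \(\theta \) above is a ratio of two independent \(\chi _{\left ( n-1\right ) }^{2}\) variables with equal degrees of freedom (the \(n-1\) factors cancel), so it follows an \(F_{\left ( n-1,n-1\right ) }\) distribution. A Monte Carlo sketch; the sample size, population SDs, and trial count are illustrative assumptions:

```python
import random
import statistics

# Check: theta = (s2^2/sigma2^2) / (s1^2/sigma1^2) for two independent
# normal samples of size n should follow F(n-1, n-1), whose mean is
# d2/(d2-2) = (n-1)/(n-3).
random.seed(2)
n, trials = 10, 20000
sigma1, sigma2 = 1.5, 0.5  # arbitrary; theta is pivotal, so they cancel

def sample_var(sd):
    xs = [random.gauss(0.0, sd) for _ in range(n)]
    return statistics.variance(xs)

thetas = [(sample_var(sigma2) / sigma2**2) / (sample_var(sigma1) / sigma1**2)
          for _ in range(trials)]

print(statistics.mean(thetas))  # should be close to 9/7, about 1.286
```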
I am not sure what more I can do with the above, so I will stop here.
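A note on the earlier surprise that \(Var\left ( S_{n}\right ) \rightarrow 0\): assuming \(S_{n}\) is the sample mean of \(\left \vert X_{i}\right \vert \) for standard normal \(X_{i}\) (an assumption consistent with \(E\left ( S_{n}\right ) =\sqrt {\frac {2}{\pi }}\) from the MGF computation), a vanishing variance is exactly what the law of large numbers predicts: \(S_{n}\) concentrates at the constant \(\sqrt {\frac {2}{\pi }}\), so it is not a mistake. A simulation sketch (\(n\) and the trial count are illustrative):

```python
import math
import random
import statistics

# Hedged sketch: assuming S_n is the average of |X_i| for standard normal
# X_i, check that S_n concentrates at sqrt(2/pi) ~ 0.798 and that its
# variance, Var(S_n) = (1 - 2/pi)/n, shrinks toward 0.
random.seed(1)
n, trials = 5000, 200
target = math.sqrt(2 / math.pi)
sn_values = [sum(abs(random.gauss(0.0, 1.0)) for _ in range(n)) / n
             for _ in range(trials)]

print(statistics.mean(sn_values))      # close to sqrt(2/pi) ~ 0.798
print(statistics.variance(sn_values))  # tiny, consistent with Var -> 0
```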
First find the joint density of \(X,Y\). Since \(X,Y\) are independent, the joint density is \(f_{X,Y}\left ( x,y\right ) =f_{X}\left ( x\right ) f_{Y}\left ( y\right ) \) over \(-\infty <x<\infty \) and
\(y>0\).
But \(f_{X}\left ( x\right ) =\frac {1}{\sqrt {2\pi }\sigma }e^{\frac {-\left ( x-\mu \right ) ^{2}}{2\sigma ^{2}}}\) and \(f_{Y}\left ( y\right ) =\lambda e^{-\lambda y}\); hence the joint density, after substituting \(\mu =0,\sigma ^{2}=2,\lambda =1\), is
Now the Gamma density is \(f\left ( w\right ) =\frac {\lambda ^{\alpha }}{\Gamma \left ( \alpha \right ) }w^{\alpha -1}e^{-\lambda w}\); hence if we set \(\lambda =1+\frac {z^{2}}{4}\) and \(\alpha =\frac {3}{2},\) then we have
Now I need to combine (1) and (2). I am not sure how.
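One possible route for combining them, sketched here on the assumption that the remaining step is to integrate out the inner variable \(w>0\): the Gamma-kernel identity \(\int _{0}^{\infty }w^{\alpha -1}e^{-\lambda w}\,dw=\frac {\Gamma \left ( \alpha \right ) }{\lambda ^{\alpha }}\) with \(\alpha =\frac {3}{2}\) and \(\lambda =1+\frac {z^{2}}{4}\) evaluates the integral in closed form:

\[
\int _{0}^{\infty }w^{1/2}e^{-\left ( 1+\frac {z^{2}}{4}\right ) w}\,dw
=\frac {\Gamma \left ( \frac {3}{2}\right ) }{\left ( 1+\frac {z^{2}}{4}\right ) ^{3/2}}
=\frac {\sqrt {\pi }}{2\left ( 1+\frac {z^{2}}{4}\right ) ^{3/2}},
\]

so under that assumption the \(z\)-marginal is proportional to \(\left ( 1+\frac {z^{2}}{4}\right ) ^{-3/2}\).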
But the central limit theorem tells us that as \(n\) gets large, the distribution of the sample
mean \(\bar {X}_{n}\) approaches a normal distribution with mean \(\mu \) and variance \(\frac {\sigma ^{2}}{n}\); that is, \(\bar {X}_{n}\overset {\text {in distribution}}{\rightarrow }N\left ( \mu ,\frac {\sigma ^{2}}{n}\right ) \). Hence the above
becomes