definition: A sequence of numbers \(x_{n}\) converges to the limit \(x^{\ast }\) if for every positive number \(\epsilon \) there exists a number \(r\) such that \(\left \vert x_{n}-x^{\ast }\right \vert <\epsilon \) for all \(n>r\)
note: Hence to show a sequence converges to \(x^{\ast }\), all we have to do is, for any given \(\epsilon \), find an \(r\) (which may depend on \(\epsilon \)) such that \(\left \vert x_{n}-x^{\ast }\right \vert <\epsilon \) whenever \(n>r\)
Example: to show that \(x_{n}=\frac {n+1}{n}\) converges to \(1\), we need to show that for any given \(\epsilon \) there exists a number \(r\) such that \(\left \vert \frac {n+1}{n}-1\right \vert <\epsilon \) whenever \(n>r\). Rewriting, we have \(\left \vert \frac {1}{n}\right \vert <\epsilon \), hence we see that \(r=\epsilon ^{-1}\) works, because whenever \(n>r\), then \(\left \vert \frac {1}{n}\right \vert <\epsilon \). For example, assume \(\epsilon =0.1\); then \(r=10\), and whenever \(n>10\), we have \(\frac {1}{n}<0.1\)
It is clear that \(\lim _{n\rightarrow \infty }\frac {n+1}{n}=1\).
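Another quick check (my own added example): \(x_{n}=\frac {1}{2^{n}}\) converges to \(0\), since \(\left \vert \frac {1}{2^{n}}-0\right \vert <\epsilon \) whenever \(2^{n}>\frac {1}{\epsilon }\), i.e. whenever \(n>\log _{2}\frac {1}{\epsilon }\). Hence \(r=\log _{2}\frac {1}{\epsilon }\) works; note that \(r\) is allowed to depend on \(\epsilon \).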
The order of convergence of a sequence is the largest number \(q\) s.t. \(\lim _{n\rightarrow \infty }\frac {\left \vert x_{n+1}-x^{\ast }\right \vert }{\left \vert x_{n}-x^{\ast }\right \vert ^{q}}\) exists (and is finite)
This is the same as writing \(\lim _{n\rightarrow \infty }\frac {\left \vert e_{n+1}\right \vert }{\left \vert e_{n}\right \vert ^{q}}\), where \(e_{n}=x_{n}-x^{\ast }\) is the error at \(x_{n}\)
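As a check of this definition (my own added example), take \(x_{n}=x^{\ast }+2^{-n}\), so that \(e_{n}=2^{-n}\). Then \[ \frac {\left \vert e_{n+1}\right \vert }{\left \vert e_{n}\right \vert ^{q}}=\frac {2^{-\left ( n+1\right ) }}{2^{-nq}}=\frac {1}{2}\,2^{n\left ( q-1\right ) } \] For \(q=1\) the limit is \(\frac {1}{2}\), while for any \(q>1\) the ratio blows up. Hence the order of convergence of this sequence is \(q=1\).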
The rate of convergence is linear if we can find a constant \(c<1\) and an integer \(N\) s.t. \[ \left \vert e_{n+1}\right \vert \leq c\ \left \vert e_{n}\right \vert \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ n\geq N \]
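For example, \(e_{n}=2^{-n}\) satisfies \(\left \vert e_{n+1}\right \vert =\frac {1}{2}\left \vert e_{n}\right \vert \), so the convergence is linear with \(c=\frac {1}{2}\).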
The rate of convergence is superlinear if we can find a sequence \(\varepsilon _{n}\) tending to zero and an integer \(N\) s.t.
\[ \left \vert e_{n+1}\right \vert \leq \varepsilon _{n}\ \left \vert e_{n}\right \vert \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ n\geq N \]
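A quick check of this definition (my own added example): if \(e_{n}=\frac {1}{n!}\), then \(\left \vert e_{n+1}\right \vert =\frac {1}{n+1}\left \vert e_{n}\right \vert \), so \(\varepsilon _{n}=\frac {1}{n+1}\rightarrow 0\) and the convergence is superlinear.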
question: I do not understand this definition. Can I state it like the linear case, but with an exponent \(1<\alpha <2\), and write
The rate of convergence is superlinear if we can find a constant \(c<1\) and an integer \(N\) s.t. \[ \left \vert e_{n+1}\right \vert \leq c\ \left \vert e_{n}\right \vert ^{\alpha }\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ n\geq N \]
The rate of convergence is quadratic if we can find a constant \(c\), NOT necessarily less than 1, and an integer \(N\) s.t. \[ \left \vert e_{n+1}\right \vert \leq c\ \left \vert e_{n}\right \vert ^{2}\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ n\geq N \]
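For example (my own added check), take \(e_{n}=2^{-2^{n}}\). Then \(\left \vert e_{n+1}\right \vert =2^{-2^{n+1}}=\left ( 2^{-2^{n}}\right ) ^{2}=\left \vert e_{n}\right \vert ^{2}\), so the inequality holds with \(c=1\) and the convergence is quadratic.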
idea: To show that a sequence converges quadratically to some limit, start with the expression for \(e_{n+1}\) and manipulate it to show that it is \(\leq \) some constant \(\times \left \vert e_{n}\right \vert ^{2}\)
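As an illustration of this idea (using Newton's method for \(\sqrt {a}\), an example I am adding here): with \(x_{n+1}=\frac {1}{2}\left ( x_{n}+\frac {a}{x_{n}}\right ) \) and \(x^{\ast }=\sqrt {a}\), \[ e_{n+1}=x_{n+1}-\sqrt {a}=\frac {x_{n}^{2}-2x_{n}\sqrt {a}+a}{2x_{n}}=\frac {\left ( x_{n}-\sqrt {a}\right ) ^{2}}{2x_{n}}=\frac {e_{n}^{2}}{2x_{n}} \] Hence for \(x_{n}\) staying near \(\sqrt {a}\) we get \(\left \vert e_{n+1}\right \vert \leq c\left \vert e_{n}\right \vert ^{2}\) with \(c\approx \frac {1}{2\sqrt {a}}\), which is quadratic convergence.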
Big \(O\) and little \(o\) are means by which to compare two sequences to each other.
def: big \(O\): one sequence \(x_{n}\) is bounded by a linearly scaled version of a second sequence \(\alpha _{n}\)
we say that \(x_{n}=O\left (\alpha _{n}\right ) \) if there is a constant \(C\) and an integer \(N\), s.t. \(\left \vert x_{n}\right \vert \leq C\left \vert \alpha _{n}\right \vert \) for all \(n>N\)
How to find if \(x_{n}=O\left (\alpha _{n}\right ) \)? Start with the expression for \(x_{n}\) and manipulate it so that it has only terms that contain \(\alpha _{n}\) with some multipliers (the constant).
Or, easier, just look to see if \(x_{n}\) is bigger or smaller than \(\alpha _{n}\): if it is bigger, then it goes to zero AFTER \(\alpha _{n}\), hence use BIG O; if it is smaller, then it goes to zero BEFORE \(\alpha _{n}\), hence use little o.
def: little o: we say \(x_{n}=o\left (\alpha _{n}\right ) \) if \(\lim _{n\rightarrow \infty }\frac {x_{n}}{\alpha _{n}}=0\)
To test the above, given \(x_{n}=\frac {n+1}{n^{2}}\) and \(\alpha _{n}=\frac {1}{n}\), what is the relation between them? We see that \(x_{n}=\frac {1}{n}+\frac {1}{n^{2}}\), so it is BIGGER than \(\alpha _{n}\), so it goes to zero AFTER. Hence \(x_{n}=O\left (\alpha _{n}\right ) \)
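This can also be double-checked with the limit from the little \(o\) definition: \(\lim _{n\rightarrow \infty }\frac {x_{n}}{\alpha _{n}}=\lim _{n\rightarrow \infty }\frac {\left ( n+1\right ) /n^{2}}{1/n}=\lim _{n\rightarrow \infty }\frac {n+1}{n}=1\), which is finite but not zero, so \(x_{n}=O\left ( \alpha _{n}\right ) \) but \(x_{n}\neq o\left ( \alpha _{n}\right ) \).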
given \(x_{n}=\frac {1}{n\ln n}\) and the same \(\alpha _{n}=\frac {1}{n}\), we see that this is SMALLER than \(\alpha _{n}\), hence it will go to zero BEFORE \(\alpha _{n}\), hence \(x_{n}=o\left ( \alpha _{n}\right ) \)
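Checking this one with the limit as well: \(\lim _{n\rightarrow \infty }\frac {x_{n}}{\alpha _{n}}=\lim _{n\rightarrow \infty }\frac {1/\left ( n\ln n\right ) }{1/n}=\lim _{n\rightarrow \infty }\frac {1}{\ln n}=0\), which agrees with \(x_{n}=o\left ( \alpha _{n}\right ) \).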
question: verify that I can use this reasoning all the time.