5.5. Properties of Convergent Sequences
The first question to study is whether convergence has any implications for monotonicity and boundedness. The results, however, are meagre.
As we find monotone and non-monotone sequences among the convergent sequences, and among the non-convergent ones as well, there is no connexion between monotonicity and convergence.
A slight improvement is available with boundedness: though there are divergent bounded sequences, take e.g. $((-1)^{n})$, we can prove that convergence always implies boundedness.
Proposition:
Each convergent sequence is bounded. 
[5.5.1] 
Proof: If $a_n \to g$ then, according to [5.4.2], we find in particular for $\epsilon = 1$ a number $n_0 \in \mathbb{N}^{\ast}$ such that
$g - 1 < a_n < g + 1 \text{ for all } n \ge n_0$.
From that we get for all $n \in \mathbb{N}^{\ast}$:
$\min\{g - 1, a_1, \dots, a_{n_0 - 1}\} \le a_n \le \max\{g + 1, a_1, \dots, a_{n_0 - 1}\}$.
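The bound constructed in this proof can be traced numerically. The following Python sketch (not part of the text; the sample sequence $a_n = 1 + 1/n \to 1$ and the index $n_0 = 2$ are illustrative choices) computes the min/max bound and checks it:

```python
# Sketch of the bound from [5.5.1] for the sample sequence a_n = 1 + 1/n -> g = 1.
# For eps = 1 all terms with n >= n0 lie in ]g - 1, g + 1[; the finitely many
# earlier terms are absorbed into the min/max bound.

def a(n):
    return 1 + 1 / n

g = 1
n0 = 2  # for n >= 2 we have |a_n - g| = 1/n < 1

initial = [a(k) for k in range(1, n0)]  # the terms a_1, ..., a_{n0 - 1}
lower = min([g - 1] + initial)
upper = max([g + 1] + initial)

# every term (checked here up to n = 1000) obeys the bound
assert all(lower <= a(n) <= upper for n in range(1, 1001))
```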

One conclusion is a new criterion for divergence:
Unbounded sequences are always divergent.
Thus the divergence of e.g. $(n)$ is also due to its unboundedness.
There are certain connexions between the position of the sequence's members and the position of the limit.
More precisely we have:
Proposition: Let ${a}_{n}\to g$. Then we have for any $c\in \mathbb{R}$:


$g > c \Rightarrow a_n > c$ for all $n$ from an appropriate $n_0$ onwards
$g < c \Rightarrow a_n < c$ for all $n$ from an appropriate $n_0$ onwards

[5.5.2] 



$a_n \ge c \text{ for all } n \Rightarrow g \ge c$
$a_n \le c \text{ for all } n \Rightarrow g \le c$

[5.5.3] 
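A quick numerical illustration of 2. (a sketch with an illustrative sequence, not part of the text): $a_n = 1/n$ satisfies $a_n > 0$ for all $n$, yet its limit only satisfies $g \ge 0$:

```python
# Illustration of [5.5.3]: a_n = 1/n satisfies a_n >= 0 (even a_n > 0)
# for all n, and the limit g = 0 satisfies g >= 0 -- but not g > 0,
# so strict inequalities need not survive the passage to the limit.

terms = [1 / n for n in range(1, 1001)]
g = 0

assert all(t > 0 for t in terms)  # strict inequality for every member
assert g >= 0                     # but only the weak inequality for g
```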
Proof: We only show the first variant in each case.
1. ►

Let $g > c$, i.e. $r := g - c > 0$. As $a_n \to g$, the special $\epsilon$-neighbourhood $]g - r, g + r[$ contains all sequence members from a certain $n_0$ onwards. In particular this means:
$a_n > g - r = g - (g - c) = c \text{ for all } n \ge n_0\text{.}$

2. ►

Suppose $g<c$. According to 1. we may conclude ${a}_{n}<c$ for nearly all n. But this contradicts the premise ${a}_{n}\ge c$ for all n.
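The construction in part 1. can be replayed numerically. This sketch uses the illustrative sequence $a_n = 2 + (-1)^n/n$ with $g = 2$ and $c = 1.5$ (all chosen for this example only):

```python
# Following the proof of [5.5.2]: r = g - c is the radius of the special
# epsilon-neighbourhood ]g - r, g + r[, and any n0 with 1/n0 < r works.

def a(n):
    return 2 + (-1) ** n / n

g, c = 2, 1.5
r = g - c   # r = 0.5
n0 = 3      # |a_n - g| = 1/n < r for all n >= 3

# from n0 onwards every member lies above c
assert all(a(n) > c for n in range(n0, 1001))
```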


Consider:
1. is often used, especially in the case $c = 0$, in the following way:
$g \ne c \Rightarrow a_n \ne c$ for all $n$ from an appropriate $n_0$ onwards

[5.5.4] 
The example $0 < \frac{1}{n} \to 0$ shows that 2. cannot be tightened by replacing e.g. $\ge$ with $>$.
As $a_n \to g \iff a_{n+k} \to g$, we may however weaken the premise in 2. to 'for nearly all $n$'.
Obviously both statements in 2. can be combined to:
$a_n \in [a,b] \text{ for all } n \Rightarrow g \in [a,b]$

[5.5.5] 
When testing a sequence for convergence to zero we may confine ourselves to the non-negative sequence $(|a_n|)$, which often provides some technical convenience.
Proposition: For any sequence $({a}_{n})$ we have:
$a_n \to 0 \iff |a_n| \to 0$

[5.5.6] 
Proof: From the equality $|a_n - 0| = |a_n| = \bigl||a_n| - 0\bigr|$ we see that for each $\epsilon > 0$ the following equivalence holds:
$|a_n - 0| < \epsilon \iff \bigl||a_n| - 0\bigr| < \epsilon$.
And from that our assertion is immediate.

Let's take $\bigl(\frac{(-1)^n}{n}\bigr)$ as an example. From $\bigl|\frac{(-1)^n}{n}\bigr| = \frac{1}{n} \to 0$ we get:
$\frac{(-1)^n}{n} \to 0$
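Numerically the criterion can be traced as follows (a sketch; the tolerance $\epsilon$ and the index bounds are chosen for illustration):

```python
# Criterion [5.5.6] on the example a_n = (-1)^n / n:
# |a_n| = 1/n falls below any eps > 0 eventually, hence a_n -> 0 as well.

def a(n):
    return (-1) ** n / n

eps = 1e-3
n0 = 1001  # 1/n < eps for all n >= 1001

assert all(abs(a(n)) < eps for n in range(n0, 5000))
```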
Approaching the limit of a convergent sequence has implications for the sequence members among themselves:
their distance from each other will also decrease arbitrarily. The following property of a convergent sequence is called the Cauchy condition.
Proposition: If ${a}_{n}\to g$ there is an ${n}_{0}\in {\mathbb{N}}^{\ast}$ for each $\epsilon >0$ such that
$|a_n - a_m| < \epsilon \text{ for all } n, m \ge n_0$

[5.5.7] 
Proof: At first we get an $n_0 \in \mathbb{N}^{\ast}$ such that
$|a_n - g| < \frac{\epsilon}{2} \text{ for all } n \ge n_0\text{.}$
Thus, using the triangle inequality, the following estimate holds for all $n, m \ge n_0\text{:}$
$|a_n - a_m| = |a_n - g + g - a_m| \le |a_n - g| + |a_m - g| < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$
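The $\epsilon/2$-argument of this proof can be replayed numerically; a minimal sketch for the sample sequence $a_n = 1/n$ with $\epsilon = 0.01$ (both illustrative choices):

```python
# Cauchy condition [5.5.7] for a_n = 1/n -> 0: choose n0 with
# |a_n - 0| < eps/2 for n >= n0; then |a_n - a_m| < eps for all n, m >= n0.

def a(n):
    return 1 / n

eps = 0.01
n0 = 201  # 1/n < eps/2 = 0.005 for all n >= 201

assert all(abs(a(n) - a(m)) < eps
           for n in range(n0, 400)
           for m in range(n0, 400))
```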

It is interesting to ask if the Cauchy condition alone implies convergence. In general the answer is 'no': there is a divergent sequence in $\mathbb{Q}$ satisfying the Cauchy condition! In $\mathbb{R}$, however, circumstances are special, and real Cauchy sequences are indeed always convergent, as we will prove in chapter 8.
Finally we note the nesting theorem, a criterion that often copes successfully with 'difficult' sequences.
Proposition (nesting theorem): Let $({a}_{n})$, $({b}_{n})$ and $({c}_{n})$ be three sequences such that
$a_n \le b_n \le c_n \text{ for all } n \in \mathbb{N}^{\ast}$.
If $(a_n)$ and $(c_n)$ converge to the same limit $g$, i.e. $a_n \to g$ and $c_n \to g$, then $(b_n)$ also converges to $g$: $b_n \to g$.
Proof: We use the criterion [5.4.2]. Let $\epsilon >0$ be arbitrary. As ${a}_{n}\to g$, we find an ${n}_{1}\in {\mathbb{N}}^{\ast}$ such that
$g - \epsilon < a_n < g + \epsilon \text{ for all } n \ge n_1\text{.}$
Similarly we get an $n_2 \in \mathbb{N}^{\ast}$ with
$g - \epsilon < c_n < g + \epsilon \text{ for all } n \ge n_2\text{.}$
Thus for all $n \ge n_0 := \max\{n_1, n_2\}$ the following estimates hold:
$g - \epsilon < a_n \le b_n \le c_n < g + \epsilon$
and in particular
$g - \epsilon < b_n < g + \epsilon\text{.}$
The latter estimate however proves the convergence ${b}_{n}\to g$.

Example:
$\frac{\sin n}{n} \to 0$

[5.5.9] 
Proof: All sine values lie between −1 and 1, so the following estimate holds for all $n$:
$-\frac{1}{n} \le \frac{\sin n}{n} \le \frac{1}{n}\text{.}$
As $-\frac{1}{n} \to 0$ and $\frac{1}{n} \to 0$, we get our result from the nesting theorem.
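The estimate can be verified numerically over a range of indices (a sketch, not a proof; the index bounds are illustrative):

```python
import math

# Check of the nesting -1/n <= sin(n)/n <= 1/n used in example [5.5.9].
# Both outer sequences tend to 0, so sin(n)/n is squeezed towards 0.

assert all(-1 / n <= math.sin(n) / n <= 1 / n for n in range(1, 10001))

# the terms indeed become small, e.g. below 1e-3 from n = 1000 onwards
assert all(abs(math.sin(n) / n) < 1e-3 for n in range(1000, 10001))
```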

This example is easily generalized. We only needed the boundedness of $(\mathrm{sin}n)$ and the fact that $(\frac{1}{n})$ is a zero sequence.
Proposition: If $(a_n)$ is bounded and $(b_n)$ convergent to zero, we have:
$a_n \cdot b_n \to 0$
Proof: Due to [5.5.6] we only need to show $|a_n \cdot b_n| \to 0$. As $(a_n)$ is bounded we get an $s \in \mathbb{R}^{>0}$ such that $|a_n| \le s$ for all $n$. So we have:
$0 \le |a_n \cdot b_n| = |a_n| \cdot |b_n| \le s \cdot |b_n|$
With [5.5.6] we see that both $(b_n)$ and $(|b_n|)$ are zero sequences, as is $(s \cdot |b_n|)$ according to [5.6.5].
Furthermore we obviously have $0\to 0$ so that the assertion now holds by the nesting theorem.
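A numerical sketch of this proposition, with the illustrative choices $a_n = \sin n$ (bounded by $s = 1$) and $b_n = 1/n$:

```python
import math

# Bounded times zero sequence: 0 <= |a_n * b_n| <= s * |b_n| -> 0,
# here with a_n = sin(n), s = 1 and b_n = 1/n.

s = 1.0
eps = 1e-3
n0 = 1001  # s * |b_n| = 1/n < eps for all n >= 1001

# the estimate from the proof holds term by term ...
assert all(abs(math.sin(n) * (1 / n)) <= s * (1 / n) for n in range(1, 2000))
# ... and the product sequence falls below eps from n0 onwards
assert all(abs(math.sin(n) * (1 / n)) < eps for n in range(n0, 5000))
```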

