Limits of Sequences
A sequence of real numbers is a function \(X:\N\rightarrow\real\). Informally, the sequence \(X\) can be written as an infinite list of real numbers as \(X=(x_1,x_2,x_3,\ldots)\), where \(x_n=X(n)\). Other notations for sequences are \((x_n)\) or \(\{x_n\}_{n=1}^\infty\); we will use \((x_n)\). Some sequences can be written explicitly with a formula such as \(x_n = \frac{1}{n}\), \(x_n = \frac{1}{2^n}\), or
\[
x_n = (-1)^n\cos(n^2+1),
\]
or we could be given the first few terms of the sequence, such as
\[
X = (3, 3.1, 3.14, 3.141, 3.1415, \ldots).
\]

Some sequences may be given recursively. For example,
\[
x_1 = 1, \qquad x_{n+1} = \frac{x_n}{n+1}, \quad n\geq 1.
\]
Using the definition of \(x_{n+1}\) and the initial value \(x_1\) we can in principle find all the terms:
\[
x_2 = \frac{1}{2}, \quad x_3 = \frac{1/2}{3}, \quad x_4 = \frac{1/6}{4},\; \ldots
\]
A famous sequence given recursively is the Fibonacci sequence, which is defined as \(x_1=1\), \(x_2=1\), and
\[
x_{n+1} = x_{n-1} + x_{n}, \quad n\geq 2.
\]
Then
\[
x_3 = 2, \quad x_4 = 3, \quad x_5 = 5, \ldots
\]

The range of a sequence \((x_n)\) is the set
\[
\{x_n\;|\; n\in\N\},
\]
that is, the usual range of a function. However, the range of a sequence is not the actual sequence (the range is a set and a sequence is a function). For example, if \(X=(1,2,3, 1,2,3, \ldots)\) then the range of \(X\) is \(\{1,2,3\}\). If \(x_n = \sin(\frac{n\pi}{2})\) then the range of \((x_n)\) is \(\{1, 0, -1\}\).

Many concepts in analysis can be described using the long-term or limiting behavior of sequences. In calculus, you undoubtedly developed techniques to compute the limit of basic sequences (and hence show convergence), but you might have omitted the rigorous definition of the convergence of a sequence. Perhaps you were told that a given sequence \((x_n)\) converges to \(L\) if, as \(n\rightarrow\infty\), the values \(x_n\) get closer to \(L\). Although this is intuitively sound, we need a more precise way to describe the meaning of the convergence of a sequence. Before we give the precise definition, we will consider an example.
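Readers who like to experiment may find the following brief Python sketch helpful; it is an illustration only and not part of the formal development, and the helper names `recursive_terms` and `fibonacci_terms` are invented for this example. It generates the first few terms of the two recursively defined sequences above.

```python
# Illustrative sketch: generate the first terms of two recursively defined sequences.

def recursive_terms(num_terms):
    """Terms of x_1 = 1, x_{n+1} = x_n / (n + 1)."""
    terms = [1.0]                      # x_1 = 1
    for n in range(1, num_terms):      # n = 1, 2, ..., num_terms - 1
        terms.append(terms[-1] / (n + 1))
    return terms

def fibonacci_terms(num_terms):
    """Terms of x_1 = x_2 = 1, x_{n+1} = x_{n-1} + x_n."""
    terms = [1, 1]
    while len(terms) < num_terms:
        terms.append(terms[-2] + terms[-1])
    return terms[:num_terms]

print(recursive_terms(4))   # [1.0, 0.5, 0.16666..., 0.041666...]
print(fibonacci_terms(6))   # [1, 1, 2, 3, 5, 8]
```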
Consider the sequence \((x_n)\) whose \(n\)th term is given by \(x_n = \frac{3n+2}{n+1}\). The values of \((x_n)\) for several values of \(n\) are displayed in Table 3.1.
The above data suggests that the values of the sequence \((x_n)\) become closer and closer to the number \(L=3\). For example, suppose that \(\eps = 0.01\) and consider the \(\eps\)-neighborhood of \(L=3\), that is, the interval \((3-\eps, 3+\eps) = (2.99, 3.01)\). Not all the terms of the sequence \((x_n)\) are in the \(\eps\)-neighborhood; however, it seems that all the terms of the sequence from \(x_{101}\) onward are inside the \(\eps\)-neighborhood. In other words, if \(n \geq 101\) then \(3-\eps \lt x_n \lt 3+\eps\), or equivalently \(|x_n - 3| \lt \eps\). Suppose now that \(\eps=0.00001\) and thus the new \(\eps\)-neighborhood is \((3-\eps,3+\eps)=(2.99999,3.00001)\). Then it is no longer true that \(|x_n-3| \lt \eps\) for all \(n\geq 101\). However, it seems that all the terms of the sequence from \(x_{1,000,000}\) onward are inside the smaller \(\eps\)-neighborhood, in other words, \(|x_n - 3| \lt \eps\) for all \(n\geq 1,000,000\). We can extrapolate these findings and make the following hypothesis: For any given \(\eps \gt 0\) there exists a natural number \(K\in \N\) such that if \(n\geq K\) then \(|x_n - L| \lt \eps\).
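As a quick sanity check of this extrapolation, the following hypothetical numerical experiment (the helper name `first_index` is invented for this sketch) searches for the smallest index \(K\) with \(|x_n - 3| \lt \eps\) for all \(n\geq K\). The indices \(101\) and \(1{,}000{,}000\) used above also work; they are simply not the smallest possible choices.

```python
# Hypothetical numerical experiment: find the first index K with |x_n - 3| < eps,
# where x_n = (3n + 2)/(n + 1), so |x_n - 3| = 1/(n + 1).
# Since 1/(n + 1) is decreasing, the first such index works for every later n as well.

def first_index(eps):
    n = 1
    while abs((3 * n + 2) / (n + 1) - 3) >= eps:
        n += 1
    return n

for eps in (0.01, 0.00001):
    K = first_index(eps)
    print(f"eps = {eps}: |x_n - 3| < eps for all n >= {K}")
```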
The above example and our analysis motivate the following definition.
\(n\) | \(x_n\) |
---|---|
1 | 2.50000000 |
2 | 2.66666667 |
3 | 2.75000000 |
4 | 2.80000000 |
5 | 2.83333333 |
50 | 2.98039216 |
101 | 2.99019608 |
10,000 | 2.99990001 |
1,000,000 | 2.99999900 |
2,000,000 | 2.99999950 |
The sequence \((x_n)\) is said to converge if there exists a number \(L\in\real\) such that for any given \(\varepsilon \gt 0\) there exists \(K\in\N\) such that \(|x_n-L| \lt \varepsilon\) for all \(n\geq K\). In this case, we say that \((x_n)\) has limit \(L\) and we write
\[
\lim_{n\rightarrow\infty} x_n =L.
\]
If \((x_n)\) is not convergent then we say that it is divergent.
Hence, \(x_n\) converges to \(L\) if for any given \(\varepsilon \gt 0\) (no matter how small), there exists a point in the sequence \(x_K\) such that \(|x_K-L| \lt \varepsilon\), \(|x_{K+1}-L| \lt \varepsilon\), \(|x_{K+2}-L| \lt \varepsilon, \ldots\), that is, \(|x_n - L| \lt \eps\) for all \(n\geq K\). We will sometimes write \(\displaystyle\limi x_n = L\) simply as \(\lim x_n = L\) or \((x_n)\rightarrow L\).
Using the definition of the limit of a sequence, prove that \(\limi \frac{1}{n} = 0\).
Let \(\varepsilon \gt 0\) be arbitrary but fixed. By the Archimedean property of \(\real\), there exists \(K\in\N\) such that \(\frac{1}{K} \lt \varepsilon\). Then, if \(n\geq K\) then \(\frac{1}{n}\leq \frac{1}{K} \lt \eps\). Therefore, if \(n\geq K\) then
\begin{align*}
\left|x_n - 0\right| &= \left|\tfrac{1}{n}- 0\right|\\
&= \frac{1}{n}\\
& \leq \frac{1}{K}\\
& \lt \eps.
\end{align*}
This proves, by definition, that \(\limi \frac{1}{n} = 0\).
Using the definition of the limit of a sequence, prove that \(\limi \frac{3n+2}{n+1} = 3\).
Given an arbitrary \(\varepsilon \gt 0\), we want to prove that there exists \(K\in\N\) such that
\[
\left|\frac{3n+2}{n+1} - 3\right| \lt \varepsilon, \forall\; n\geq K.
\]
Start by analyzing \(|x_n-L|\):
\begin{align*}
|x_n-L| &=\left|\frac{3n+2}{n+1} - 3\right|\\
& = \left|\frac{-1}{n+1}\right|\\
& = \frac{1}{n+1}.
\end{align*}
Now, the condition that
\[
\left|\frac{3n+2}{n+1} - 3\right| = \frac{1}{n+1} \lt \varepsilon
\]
is equivalent to
\[
\frac{1}{\varepsilon} - 1 \lt n.
\]
Now let's write the formal proof.
Let \(\varepsilon \gt 0\) be arbitrary and let \(K\in\N\) be such that \(\frac{1}{\varepsilon} - 1 \lt K\). Then, \(\frac{1}{K+1} \lt \varepsilon\). Now if \(n\geq K\) then \(\frac{1}{n+1} \leq \frac{1}{K+1}\) and thus if \(n\geq K\) then
\begin{align*}
\left|\frac{3n+2}{n+1} - 3\right| &= \left|\frac{-1}{n+1}\right|\\
&= \frac{1}{n+1}\\
& \leq \frac{1}{K+1}\\
& \lt \varepsilon.
\end{align*}
By definition, this proves that \((x_n) \rightarrow 3\).
Using the definition of the limit of a sequence, prove that \(\limi \frac{4n^3+3n}{n^3+6} = 4\).
Let \(x_n = \frac{4n^3+3n}{n^3+6} \). We want to show that for any given \(\varepsilon \gt 0\), there exists \(K\in\N\) such that if \(n\geq K\) then
\[
|x_n-4|=\left|\frac{4n^3+3n}{n^3+6} - 4\right| \lt \varepsilon.
\]
Start by analyzing \(|x_n-4|\):
\begin{align*}
|x_n-4|&=\left|\frac{4n^3+3n}{n^3+6} - 4\right| \\
&= \left|\frac{3n-24}{n^3+6}\right|
\end{align*}
In this case, it is difficult to explicitly isolate for \(n\) in terms of \(\varepsilon\). Instead we take a different approach; we find an upper bound for \(|x_n-4|=\left|\frac{3n-24}{n^3+6}\right|\):
\begin{align*}
\left|\frac{3n-24}{n^3+6}\right| &\leq \frac{3n+24}{n^3+6}\\[2ex]
& \leq \frac{27n}{n^3 + 6}\\[2ex]
& \lt \frac{27n}{n^3} \\
&= \frac{27}{n^2}.
\end{align*}
Hence, if \(\frac{27}{n^2} \lt \varepsilon\) then also \(|x_n-4| \lt \varepsilon\) by the transitivity property of inequalities. The inequality \(\frac{27}{n^2} \lt \varepsilon\) holds true if and only if \(\sqrt{\frac{27}{\varepsilon}} \lt n\). Now that we have done a detailed preliminary analysis, we can proceed with the proof.
Suppose that \(\varepsilon \gt 0\) is given and let \(K\in\N\) be such that \(\sqrt{\frac{27}{\varepsilon}} \lt K \). Then \(\frac{27}{\varepsilon} \lt K^2\), and thus \(\frac{27}{K^2} \lt \varepsilon\). Then, if \(n\geq K\) then \(\frac{27}{n^2}\leq \frac{27}{K^2}\) and therefore
\begin{align*}
|x_n -4| &= \left|\frac{3n-24}{n^3+6}\right|\\[2ex]
& \leq \frac{3n+24}{n^3+6}\\
& \leq \frac{27n}{n^3 + 6}\\
& \lt \frac{27n}{n^3}\\
& = \frac{27}{n^2}\\
&\leq \frac{27}{K^2}\\
& \lt \varepsilon.
\end{align*}
This proves that \(\limi x_n = 4\).
Prove that for any irrational number \(\zeta\) there exists a sequence of rational numbers \((x_n)\) converging to \(\zeta\).
Let \((\delta_n)\) be any sequence of positive numbers converging to zero, for example, \(\delta_n = \frac{1}{n}\). Now since \(\zeta-\delta_n \lt \zeta+\delta_n\) for each \(n\in\N\), then by the Density theorem there exists a rational number \(x_n\) such that \(\zeta-\delta_n \lt x_n \lt \zeta+\delta_n\). In other words, \(|x_n-\zeta| \lt \delta_n\). Now let \(\eps \gt 0\) be arbitrary. Since \((\delta_n)\) converges to zero, there exists \(K\in\N\) such that \(|\delta_n - 0| \lt \eps\) for all \(n\geq K\), or since \(\delta_n \gt 0\), then \(\delta_n \lt \eps\) for all \(n\geq K\). Therefore, if \(n\geq K\) then \(|x_n-\zeta| \lt \delta_n \lt \eps\). Thus, for arbitrary \(\eps \gt 0\) there exists \(K\in\N\) such that if \(n\geq K\) then \(|x_n-\zeta| \lt \eps\). This proves that \((x_n)\) converges to \(\zeta\).
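For a concrete illustration of this argument (an illustration only; here \(\zeta\) is represented by a floating-point approximation, and this choice of \(x_n\) is one of many possible), take \(x_n\) to be \(\zeta\) rounded to \(n\) decimal places. Then \(x_n\) is rational and \(|x_n-\zeta|\leq 10^{-n}\), so \(10^{-n}\) plays the role of \(\delta_n\).

```python
# Illustrative sketch: rational numbers converging to an irrational number.
# Rounding zeta to n decimal places gives a rational x_n with |x_n - zeta| <= 10^(-n).
from fractions import Fraction
import math

zeta = math.sqrt(2)   # an irrational number (here only a float approximation of it)

for n in range(1, 6):
    x_n = Fraction(round(zeta * 10**n), 10**n)   # a rational within 10^(-n) of zeta
    print(n, x_n, abs(float(x_n) - zeta))
```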
Let \(x_n = \frac{\cos(n)}{n^2-1}\) where \(n\geq 2\). Prove that \(\limi x_n = 0\).
We want to prove given any \(\varepsilon \gt 0\) there exists \(K\in\N\) such that
\[
\left|\frac{\cos(n)}{n^2-1}\right| \lt \varepsilon, \qquad \forall\; n\geq K.
\]
Now, since \(|\cos(x)|\leq 1\) for all \(x\in\real\) we have that
\begin{align*}
\left|\frac{\cos(n)}{n^2-1}\right| &= \frac{|\cos(n)|}{n^2-1}\\
& \leq \frac{1}{n^2-1}\\
& \leq \frac{1}{n^2-\frac{1}{2}n^2}\\
& = \frac{2}{n^2}.
\end{align*}
Thus, if \(\frac{2}{n^2} \lt \varepsilon\) then \(\left|\frac{\cos(n)}{n^2-1}\right| \lt \varepsilon\).
Let \(\varepsilon \gt 0\) be arbitrary. Let \(K\in\N\) be such that \(\sqrt{\frac{2}{\varepsilon}} \lt K\). Then \(\frac{2}{K^2} \lt \varepsilon\). Therefore, if \(n\geq K\) then \(\frac{2}{n^2}\leq \frac{2}{K^2}\) and therefore
\begin{align*}
\left|\frac{\cos(n)}{n^2-1}\right| &= \frac{|\cos(n)|}{n^2-1}\\
& \leq \frac{1}{n^2-1}\\
&\leq \frac{1}{n^2-\frac{1}{2}n^2}\\
& = \frac{2}{n^2} \leq \frac{2}{K^2}\\
& \lt \varepsilon
\end{align*}
This proves that \(\limi x_n = 0\).
Does the sequence \((x_n)\) defined by \(x_n = \frac{(-1)^n n}{n+1}\) converge?
A useful tool for proving convergence is the following.
Let \((x_n)\) be a sequence and let \(L\in\real\). Let \((a_n)\) be a sequence of positive numbers such that \(\displaystyle\limi a_n = 0\). Suppose that there exists \(M\in\N\) such that
\[
|x_n -L|\leq a_n,\qquad\forall\; n\geq M.
\]
Then \(\displaystyle\limi x_n = L\).
Let \(\varepsilon \gt 0\) be arbitrary. Since \(a_n\rightarrow 0\), there exists \(K_1\in\N\) such that \(a_n \lt \varepsilon\) for all \(n\geq K_1\). Let \(K=\max\{M, K_1\}\). Then, if \(n\geq K\) then \(a_n \lt \varepsilon\) and \(|x_n-L|\leq a_n\). Thus, if \(n\geq K\) then
\[
|x_n - L|\leq a_n \lt \varepsilon.
\]
Suppose that \(0 \lt r \lt 1\). Prove that \(\limi r^n = 0\).
We first note that
\[
r = \frac{1}{\tfrac{1}{r}} = \frac{1}{1 + x}
\]
where \(x=\frac{1}{r}-1\) and since \(r \lt 1\) then \(x \gt 0\). Now, by Bernoulli's inequality (Example 1.2.5) it holds that \((1+x)^n \geq 1 + x n \) for all \(n\in \N\) and therefore
\begin{align*}
r^n &= \frac{1}{\left(1 + x\right)^n}\\
& \leq \frac{1}{1+nx}\\
& \lt \frac{1}{n x}.
\end{align*}
Now, since \(\lim_{n\rightarrow\infty} \frac{1}{n x} = 0\), it follows by Theorem 3.1.9 that
\[
\limi r^n = 0.
\]
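A brief numerical illustration of the bound \(r^n \lt \frac{1}{nx}\) obtained above, for the hypothetical choice \(r = 0.8\) (so that \(x = \frac{1}{r}-1 = 0.25\)):

```python
# Illustration of the bound r^n < 1/(n*x) coming from Bernoulli's inequality,
# where x = 1/r - 1.  Both columns tend to 0, but 1/(n*x) dominates r^n.
r = 0.8                      # hypothetical choice with 0 < r < 1
x = 1 / r - 1                # here x = 0.25

for n in (1, 10, 50, 100):
    print(n, r**n, 1 / (n * x))
```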
Consider the sequence \(x_n=\frac{n^2-1}{2n^2+3}\). Prove that \(\limi x_n = \frac{1}{2}\).
We have that
\begin{align*}
\left| \frac{n^2-1}{2n^2+3}-\frac{1}{2} \right| &= \frac{5}{2}\frac{1}{(2n^2+3)}\\
& \lt \frac{5/2}{2n^2}.
\end{align*}
Using the definition of the limit of a sequence, one can show that \(\limi \frac{5}{4n^2} = 0\), and therefore \(\limi x_n = \frac{1}{2}\) by Theorem 3.1.9.
Notice that in the definition of the limit of a sequence, we wrote there exists a number \(L\). Could there be more than one number \(L\) satisfying the definition of convergence of a sequence? Before we go any further, we prove that if a sequence converges then it has a unique limit.
A convergent sequence can have at most one limit.
Suppose that \((x_n)\rightarrow L_1\) and that \((x_n)\rightarrow L_2\). Let \(\varepsilon \gt 0\) be arbitrary. Then there exists \(K_1\) such that \(|x_n-L_1| \lt \varepsilon/2\) for all \(n\geq K_1\) and there exists \(K_2\) such that \(|x_n-L_2| \lt \varepsilon/2\) for all \(n\geq K_2\). Let \(K=\max\{K_1,K_2\}\). Then for \(n\geq K\) it holds that \(|x_n-L_1| \lt \eps/2\) and also \(|x_n-L_2| \lt \eps/2\) and therefore
\begin{align*}
|L_1-L_2| &= |L_1-x_n+x_n-L_2|\\
& \leq |x_n-L_1| + |x_n-L_2|\\
& \lt \varepsilon/2+\varepsilon/2\\
& = \varepsilon.
\end{align*}
Hence, \(|L_1-L_2| \lt \varepsilon\) for all \(\varepsilon \gt 0\), and therefore by Theorem 2.2.7 we conclude that \(|L_1-L_2|=0\), that is, \(L_1=L_2\).
The ultimate long-time behavior of a sequence will not change if we discard a finite number of terms of the sequence. To be precise, suppose that \(X=(x_1,x_2,x_3,\ldots)\) is a sequence and let \(Y=(x_{m+1},x_{m+2},x_{m+3},\ldots)\), that is, \(Y\) is the sequence obtained from \(X\) by discarding the first \(m\) terms of \(X\). In this case, we will call \(Y\) the \(m\)-tail of \(X\). The next theorem states, not surprisingly, that the convergence properties of \(X\) and \(Y\) are the same.
Let \(X:\N\rightarrow\real\) be a sequence and let \(Y:\N\rightarrow\real\) be the sequence obtained from \(X\) by discarding the first \(m\in\N\) terms of \(X\), in other words, \(Y(n) = X(m+n)\). Then \(X\) converges to \(L\) if and only if \(Y\) converges to \(L\).
Exercises
Write the first three terms of the recursively defined sequence \(x_1=1\), \(x_{n+1}=\tfrac{1}{2}(x_n+\tfrac{2}{x_n})\) for \(n\geq 1\).
Use the definition of the limit of a sequence to establish the following limits:
- \(\displaystyle\lim_{n\rightarrow\infty} \frac{n+1}{3n} = \frac{1}{3}\)
- \(\displaystyle \lim_{n\rightarrow\infty} \frac{3n^2+2}{4n^2+1} = \frac{3}{4}\)
- \(\displaystyle \lim_{n\rightarrow\infty} \frac{(-1)^nn}{n^2+1} = 0\)
- Prove that \(\lim_{n\rightarrow\infty} |x_n|=0\) if and only if \(\lim_{n\rightarrow\infty} x_n = 0\).
- Combining the previous result and Example 3.1.10, prove that if \(-1 \lt r \lt 0\) then \(\limi r^n = 0\).
- Conclude that for any real number \(r\in \real\), if \(|r| \lt 1\) then \(\limi r^n = 0\).
Let \(m\in\N\) and assume that \(m \geq 2\).
- Prove that \(\frac{1}{m^n} \lt \frac{1}{n}\) for all \(n\in\N\).
- Use Theorem 3.1.9 to show that \(\lim_{n\rightarrow\infty} \frac{1}{m^n} = 0\).
Note: Do not use Example 3.1.10 to show that \(\lim_{n\rightarrow\infty} \frac{1}{m^n} = 0\).
Suppose that \(S\subset\real\) is non-empty and bounded above and let \(u=\sup S\). Show that there exists a sequence \((x_n)\) such that \(x_n\in S\) for all \(n\in\mathbb{N}\) and \(\lim_{n\rightarrow\infty} x_n = u\). Hint: If \(\varepsilon \gt 0\) then clearly \(u-\varepsilon \lt u\). Since \(u=\sup(S)\) there exists \(x\in S\) such that \(u-\varepsilon \lt x \lt u\). Example 3.1.6 is similar.
Let \((x_n)\) be the sequence defined as
\[
x_n = \begin{cases}
2n^2 + 1, & n \lt 50\\[2ex]
\frac{\sin(2n)}{n^2 + 1}, & n \geq 50.
\end{cases}
\]
Using the definition of the limit of a sequence, find \(\limi x_n\).
Limit Theorems
Proving that a particular number \(L\in\real\) is the limit of a given sequence is usually not easy because there is no systematic way to determine a candidate limit \(L\) for a given arbitrary sequence. Instead, we are frequently interested in just knowing if a given sequence converges or not, and not so much on finding the actual limit. The theorems in this section help us do just that. We begin with a definition.
A sequence \((x_n)\) is said to be bounded if there exists \(R\geq 0\) such that \(|x_n|\leq R\) for all \(n\in\N\).
Prove that \((x_n)\) is bounded if and only if there exist numbers \(R_1\) and \(R_2\) such that \(R_1\leq x_n \leq R_2\) for all \(n\in \N\).
A convergent sequence is bounded.
Suppose that \((x_n)\) converges to \(L\). Then there exists \(K\in\N\) such that \(|x_n-L| \lt 1\) for all \(n\geq K\). Let
\[
R = 1 + \max\{|x_1-L|,|x_2-L|,\ldots,|x_{K-1}-L|\},
\]
and we note that \(R\geq 1\). Then for all \(n\geq 1\) it holds that \(|x_n - L| \leq R\). Indeed, if \(n\geq K\) then \(|x_n-L| \lt 1\leq R\) and if \(1\leq n\leq K-1\) then
\[
|x_n - L|\leq \max\{|x_1-L|, \ldots,|x_{K-1}-L|\} \leq R.
\]
Thus, for all \(n\geq 1\) it holds that \(L-R \leq x_n \leq R+L\) and this proves that \((x_n)\) is bounded.
If \((x_n)\rightarrow L\) then \((|x_n|)\rightarrow |L|\).
Follows by the inequality \(||x_n|-|L||\leq |x_n-L|\) (see Corollary 2.3.6). Indeed, for any given \(\eps \gt 0\) there exists \(K\in\N\) such that \(|x_n-L| \lt \eps\) for all \(n\geq K\) and therefore \(||x_n|-|L||\leq |x_n - L| \lt \eps\) for all \(n\geq K\).
The following theorem describes how the basic operations of arithmetic preserve convergence.
Suppose that \((x_n)\rightarrow L\) and \((y_n)\rightarrow M\).
- Then \((x_n + y_n)\rightarrow L + M\) and \((x_n-y_n)\rightarrow L - M\).
- Then \((x_ny_n)\rightarrow LM\).
- If \(y_n\neq 0\) for all \(n\in\N\) and \(M\neq 0\) then \(\left(\frac{x_n}{y_n}\right)\rightarrow\frac{L}{M}\).
(i) By the triangle inequality
\begin{align*}
|x_n+y_n-(L+M)| &= |x_n-L + y_n-M|\\
& \leq |x_n-L| + |y_n-M|.
\end{align*}
Let \(\varepsilon \gt 0\). There exists \(K_1\) such that \(|x_n-L| \lt \varepsilon/2\) for \(n\geq K_1\) and there exists \(K_2\) such that \(|y_n-M| \lt \varepsilon/2\) for \(n\geq K_2\). Let \(K=\max\{K_1,K_2\}\). Then for \(n\geq K\)
\begin{align*}
|x_n+y_n-(L+M)| & \leq |x_n-L| + |y_n-M| \\
& \lt \varepsilon/2 + \varepsilon/2\\
&= \varepsilon.
\end{align*}
The proof for \((x_n-y_n)\rightarrow L-M\) is similar.
(ii) Let \(\varepsilon \gt 0\) be arbitrary. We have that
\begin{align*}
|x_ny_n-LM| &= |x_ny_n - y_nL + y_nL - LM|\\
&\leq |x_ny_n-y_nL| + |y_nL-LM|\\
&= |y_n||x_n-L| + |L||y_n-M|.
\end{align*}
Now, \((y_n)\) is bounded because it is convergent, and therefore there exists \(R \gt 0\) such that \(|y_n|\leq R\) for all \(n\in\N\). By convergence of \((x_n)\) and \((y_n)\), there exists \(K\in\N\) such that \(|x_n-L| \lt \frac{\varepsilon}{2R}\) and \(|y_n-M| \lt \frac{\varepsilon}{2(|L|+1)}\) for all \(n\geq K\). Therefore, if \(n\geq K\) then
\begin{align*}
|x_ny_n-LM| & \leq |y_n||x_n-L| + |L||y_n-M|\\[2ex]
& \leq R |x_n-L| + (|L|+1)|y_n-M|\\[2ex]
& \lt R\frac{\varepsilon}{2R} + (|L|+1)\frac{\varepsilon}{2(|L|+1)}\\[2ex]
&= \varepsilon.
\end{align*}

(iii) It is enough to prove that \(\left(\frac{1}{y_n}\right)\rightarrow\frac{1}{M}\) and then use (ii). We first claim that \(|y_n|\) is bounded below by some positive number \(R \gt 0\). Indeed, \((|y_n|)\rightarrow |M|\) and \(|M| \gt 0\), so there exists \(K_1\in\N\) such that \(|y_n| \gt \frac{|M|}{2}\) for all \(n\geq K_1\); since each \(|y_n| \gt 0\), the number \(R=\min\{|y_1|,\ldots,|y_{K_1-1}|,\tfrac{|M|}{2}\}\) is positive and \(|y_n|\geq R\) for all \(n\in\N\). Thus, \(\frac{1}{|y_n|} \leq \frac{1}{R}\) for all \(n\in\N\). Now,
\begin{align*}
\left|\frac{1}{y_n}-\frac{1}{M}\right| &= \frac{1}{|y_n||M|}|y_n-M|\\[2ex]
& \leq \frac{1}{R|M|} |y_n-M|.
\end{align*}
For \(\varepsilon \gt 0\), there exists \(K\in\N\) such that \(|y_n-M| \lt R|M|\varepsilon\) for all \(n\geq K\). Therefore, for \(n\geq K\) we have that
\begin{align*}
\left|\frac{1}{y_n}-\frac{1}{M}\right| & \leq \frac{1}{R|M|} |y_n-M|\\[2ex]
& \lt \frac{1}{R|M|} R|M|\varepsilon\\[2ex]
&= \varepsilon.
\end{align*}
Suppose that \((x_n)\rightarrow L\). Then \((x_n^k)\rightarrow L^k\) for any \(k\in\N\).
The next theorem states that the limit of a convergent sequence of non-negative terms is non-negative.
Suppose that \((x_n)\rightarrow L\). If \(x_n\geq 0\) for all \(n\in \N\) then \(L\geq 0\).
We prove the contrapositive, that is, we prove that if \(L \lt 0\) then there exists \(K\in\N\) such that \(x_K \lt 0\). Suppose then that \(L \lt 0\). Let \(\varepsilon \gt 0\) be such that \(L+\varepsilon \lt 0\). Since \((x_n)\rightarrow L\), there exists \(K\in\N\) such that \(x_K \lt L+\varepsilon\), and thus by transitivity we have \(x_K \lt 0\).
Suppose that \((x_n)\) and \((y_n)\) are convergent and suppose that there exists \(M\in\N\) such that \(x_n\leq y_n\) for all \(n\geq M\). Then \(\displaystyle\limi x_n \leq \limi y_n\).
Suppose for now that \(M=1\), that is, \(x_n\leq y_n\) for all \(n\in\N\). Consider the sequence \(z_n = y_n-x_n\). Then \(z_n\geq 0\) for all \(n\in \N\) and \((z_n)\) is convergent since it is the difference of convergent sequences. By Theorem 3.2.7, we conclude that \(\limi z_n \geq 0\). But
\begin{align*}
\limi z_n &= \limi(y_n-x_n)\\
& = \limi y_n -\limi x_n
\end{align*}
and therefore \(\limi y_n - \limi x_n \geq 0\), which is the same as \(\limi y_n \geq \limi x_n\). If \(M \gt 1\), then we can apply the theorem to the \(M\)-tails of \((x_n)\) and \((y_n)\) and the result follows.
Suppose that \(a\leq x_n \leq b\) and \(\displaystyle \limi x_n = L\). Then \(a\leq L\leq b\).
We have that \(0\leq x_n -a \leq b-a\). The sequence \(y_n = b-a\) is constant and converges to \(b-a\). The sequence \(z_n = x_n - a\) converges to \(L-a\). Therefore, by the previous theorem, \(0\leq L-a\leq b-a\), or \(a\leq L\leq b\).
Suppose that \(y_n \leq x_n \leq z_n\) for all \(n\in \N\). Assume that \((y_n)\rightarrow L\) and also \((z_n)\rightarrow L\). Then \((x_n)\) is convergent and \((x_n)\rightarrow L\).
Let \(\varepsilon \gt 0\) be arbitrary. There exists \(K_1\in\N\) such that \(L-\varepsilon \lt y_n \lt L+\varepsilon\) for all \(n\geq K_1\) and there exists \(K_2\in\N\) such that \(L-\varepsilon \lt z_n \lt L+\varepsilon\) for all \(n\geq K_2\). Let \(K=\max\{K_1,K_2\}\). Then if \(n\geq K\) then
\[
L-\varepsilon \lt y_n \leq x_n \leq z_n \lt L+\varepsilon.
\]
Therefore, for \(n\geq K\) we have that
\[
L-\varepsilon \lt x_n \lt L+\varepsilon
\]
and thus \(\limi x_n = L\).
Some people call the Squeeze Theorem the Sandwich Theorem; we are not those people.
Let \(0 \lt a \lt b\) and let \(x_n = (a^n + b^n)^{1/n}\). Prove that \(\displaystyle\lim_{n\rightarrow\infty} x_n = b\).
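A hypothetical numerical check (say with \(a=2\) and \(b=3\)) is consistent with the claim; one possible route to a proof is the squeeze \(b \leq (a^n+b^n)^{1/n} \leq 2^{1/n}\,b\), since \(b^n \leq a^n+b^n\leq 2b^n\).

```python
# Hypothetical numerical check: x_n = (a^n + b^n)^(1/n) approaches b when 0 < a < b.
a, b = 2.0, 3.0

for n in (1, 5, 10, 50, 100):
    x_n = (a**n + b**n) ** (1 / n)
    print(n, x_n)            # the values approach b = 3 from above
```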
Let \((x_n)\) be a sequence such that \(x_n \gt 0\) for all \(n\in\N\) and suppose that \(\displaystyle L = \limi \frac{x_{n+1}}{x_n}\) exists. If \(L \lt 1\) then \((x_n)\) converges and \(\displaystyle \limi x_n = 0\).
Let \(r\in\real\) be such that \(L \lt r \lt 1\) and set \(\varepsilon = r-L\). There exists \(K\in\N\) such that
\[
\frac{x_{n+1}}{x_n} \lt L+\varepsilon = r
\]
for all \(n\geq K\). Therefore, for all \(n\geq K\) we have that
\[
0 \lt x_{n+1} \lt r x_n .
\]
Thus, \(x_{K+1} \lt r x_K\), and therefore \(x_{K+2} \lt r x_{K+1} \lt r^2 x_K\), and inductively for \(m\geq 1\) it holds that
\[
x_{K+m} \lt r^m x_K.
\]
Hence, the tail of the sequence \((x_n)\) given by \((y_m) = (x_{K+1}, x_{K+2}, \ldots)\) satisfies
\[
0 \lt y_m \lt r^m x_K.
\]
Since \(0 \lt r \lt 1\) it follows that \(\lim_{m\rightarrow\infty} r^m = 0\) and therefore \(\lim_{m\rightarrow\infty}y_m = 0\) by the Squeeze theorem. This implies that \((x_n)\) converges to \(0\) also.
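For instance (a hypothetical example, not one worked out in the text), the sequence \(x_n=\frac{n}{2^n}\) satisfies \(\frac{x_{n+1}}{x_n}=\frac{n+1}{2n}\rightarrow\frac{1}{2} \lt 1\), so the theorem gives \(\limi \frac{n}{2^n}=0\). A short numerical sketch:

```python
# Sketch of the ratio criterion applied to x_n = n / 2^n.
def x(n):
    return n / 2**n

for n in (1, 5, 10, 20, 40):
    print(n, x(n), x(n + 1) / x(n))   # the ratios approach 1/2 and the terms approach 0
```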
Exercises
Use the Limit Theorems to prove that if \((x_n)\) converges and \((x_n+y_n)\) converges then \((y_n)\) converges. Give an example of two sequences \((x_n)\) and \((y_n)\) such that both \((x_n)\) and \((y_n)\) diverge but \((x_n+y_n)\) converges.
Is the sequence \(y_n=(-1)^n n^4\) convergent? Explain.
Let \((x_n)\) and \((y_n)\) be sequences in \(\real\). Suppose that \(\lim_{n\rightarrow\infty} x_n = 0\) and that \((y_n)\) is bounded. Prove that \(\lim_{n\rightarrow\infty} x_ny_n =0\).
Show that if \((x_n)\) and \((y_n)\) are sequences such that \((x_n)\) and \((x_n+y_n)\) are convergent, then \((y_n)\) is convergent.
Give examples of the following:
- Divergent sequences \((x_n)\) and \((y_n)\) such that \(z_n = x_ny_n\) converges.
- Divergent sequences \((x_n)\) and \((y_n)\) such that \(z_n = x_ny_n\) diverges.
- A divergent sequence \((x_n)\) and a convergent sequence \((y_n)\) such that \(z_n=x_ny_n\) converges.
- A divergent sequence \((x_n)\) and a convergent sequence \((y_n)\) such that \(z_n=x_ny_n\) diverges.
Let \((x_n)\) and \((y_n)\) be sequences and suppose that \((x_n)\) converges to \(L\). Assume that for every \(\varepsilon \gt 0\) there exists \(M\in\mathbb{N}\) such that \(|x_n-y_n| \lt \varepsilon\) for all \(n\geq M\). Prove that \((y_n)\) also converges to \(L\).
Let \((x_n)\) be a sequence and define a sequence \((y_n)\) as
\[
y_n = \frac{x_1+x_2+\cdots+x_n}{n}
\]
for \(n\in\mathbb{N}\). Show that if \(\lim_{n\rightarrow\infty} x_n = 0\) then \(\lim_{n\rightarrow\infty} y_n = 0\).
Let \((x_n)\) be a convergent sequence with limit \(L\). Let \(f(x) = a_0+a_1x+\cdots +a_kx^k\) be a polynomial. Use the Limit Theorems to prove that the sequence \((y_n)\) defined by \(y_n=f(x_n)\) is convergent and find the limit of \((y_n)\).
Apply the Limit Theorems to find the limits of the following sequences:
- \(\displaystyle x_n=\sqrt{\frac{2n^2+3}{n^2+1}}\)
- \(\displaystyle x_n=(2+1/n)^2\)
- \(\displaystyle x_n=\frac{n+1}{n\sqrt{n}}\)
- \(\displaystyle x_n=2^n/n!\)
Let \((x_n)\) be a sequence such that \(x_n\neq 0\) for all \(n\in\N\). Suppose that \(\lim_{n\rightarrow\infty} x_n = L\) and \(L \gt 0\). Let \(w=\inf\{|x_n|\;:\;n\in\mathbb{N}\}.\) Prove that \(w \gt 0\).
Let \((x_n)\) be a sequence of positive numbers such that \(\displaystyle \lim_{n\rightarrow\infty}\frac{x_{n+1}}{x_n} = L \gt 1\). Show that \((x_n)\) is not bounded and hence is not convergent.
Monotone Sequences
As we have seen, a convergent sequence is necessarily bounded, and it is straightforward to construct examples of sequences that are bounded but not convergent, for example, \((x_n) = (1,0,1,0,1,0,\ldots)\). In this section, we prove the Monotone Convergence Theorem which says that a bounded sequence whose terms increase (or decrease) must necessarily converge.
Let \((x_n)\) be a sequence.
- We say that \((x_n)\) is increasing if \(x_n\leq x_{n+1}\) for all \(n\in\mathbb{N}\).
- We say that \((x_n)\) is decreasing if \(x_{n+1}\leq x_n\) for all \(n\in\mathbb{N}\).
- We say that \((x_n)\) is monotone if \((x_n)\) is either increasing or decreasing.
Prove that if \((x_n)\) is increasing then \((x_n)\) is bounded below. Similarly, prove that if \((x_n)\) is decreasing then \((x_n)\) is bounded above.
If \((x_n)\) is bounded and monotone then \((x_n)\) is convergent. In particular:
- if \((x_n)\) is bounded above and increasing then \[ \lim_{n\rightarrow\infty} x_n = \sup\{x_n\;:\; n\in\N\}, \]
- if \((x_n)\) is bounded below and decreasing then \[ \lim_{n\rightarrow\infty} x_n = \inf\{x_n\;:\; n\in\N\}. \]
Suppose that \((x_n)\) is bounded above and increasing. Let \(u=\sup\{x_n\;|\;n\in\N\}\) and let \(\varepsilon \gt 0\) be arbitrary. Then by the properties of the supremum, there exists \(x_K\) such that \(u-\varepsilon \lt x_K\leq u\). Since \((x_n)\) is increasing, and \(u\) is an upper bound for the range of the sequence, it follows that \(x_K\leq x_n\leq u\) for all \(n\geq K\). Therefore, \(u-\varepsilon \lt x_n\leq u\) for all \(n\geq K\). Clearly, this implies that \(u-\varepsilon \lt x_n \lt u+\varepsilon\) for all \(n\geq K\). Since \(\varepsilon \gt 0\) was arbitrary, this proves that \((x_n)\) converges to \(u\).
Suppose now that \((x_n)\) is bounded below and decreasing. Let \(w=\inf\{x_n\;|\; n\in\N\}\) and let \(\varepsilon \gt 0\) be arbitrary. Then by the properties of the infimum, there exists \(x_K\) such that \(w\leq x_K \lt w+\varepsilon\). Since \((x_n)\) is decreasing, and \(w\) is a lower bound for the range of the sequence, it follows that \(w\leq x_n\leq x_K\) for all \(n\geq K\). Therefore, \(w\leq x_n \lt w+\varepsilon\) for all \(n\geq K\). Hence, \(w-\varepsilon \lt x_n \lt w+\varepsilon\) for all \(n\geq K\). Since \(\varepsilon \gt 0\) was arbitrary, this proves that \((x_n)\) converges to \(w\).
The Monotone Convergence Theorem (MCT) is an important tool in real analysis and we will use it frequently; notice that it is more-or-less a direct consequence of the Completeness Axiom. In fact, we could have taken as our starting axiom the MCT and then proved the Completeness property of \(\real\).
By the MCT, a bounded sequence that is also monotone is convergent. However, it is easy to construct a convergent sequence that is not monotone. Provide such an example.
The MCT can be used to show convergence of recursively defined sequences. To see how, suppose that \((x_n)\) is defined recursively as \(x_1=a\) and
\[
x_{n+1} = f(x_n)
\]
where \(f\) is some given function. For example, say \(x_1=2\) and
\[
x_{n+1} = 2 + \frac{1}{x_n}.
\]
Hence, in this case \(f(x) = 2 + \frac{1}{x}\). If \((x_{n})\) is bounded and increasing then by the MCT \((x_n)\) converges, but we do not know what the limit is. However, for example, if \(f\) is a polynomial/rational function of \(x\) then we can conclude that \(L=\limi x_n\) must satisfy the equation
\[
L = f( L ).
\]
Indeed, if \(f\) is a polynomial/rational function then by the Limit Laws we have
\[
\lim_{n\rightarrow\infty} f(x_n) = f(\lim_{n\rightarrow\infty} x_n) = f(L).
\]
But \(x_{n+1}=f(x_n)\) and therefore \(\lim_{n\rightarrow\infty} x_{n+1} = f(L)\), which is equivalent to \(\lim_{n\rightarrow\infty} x_n = f(L)\) since \((x_{n+1})\) is just the \(1\)-tail of the sequence \((x_n)\). Therefore, \(L=f(L)\) as claimed. From the equation \(L = f(L)\) we can solve for \(L\) if possible.
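As a hedged illustration of this fixed-point idea, iterating \(x_{n+1}=2+\frac{1}{x_n}\) from \(x_1=2\) (the example mentioned above) produces values near the positive solution of \(L=2+\frac{1}{L}\), namely \(L=1+\sqrt{2}\). Keep in mind that the convergence of this particular sequence still has to be justified separately; the computation below only illustrates the relation \(L=f(L)\).

```python
# Illustration of the fixed-point relation L = f(L) for x_{n+1} = 2 + 1/x_n with x_1 = 2.
import math

x = 2.0
for _ in range(20):          # iterate the recursion twenty times
    x = 2 + 1 / x

print(x, 1 + math.sqrt(2))   # the iterate is very close to the positive root of L = 2 + 1/L
```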
Consider the sequence \((x_n)\) defined recursively as \(x_1=1\) and
\[
x_{n+1} = \tfrac{1}{2} x_n + \frac{1}{4},\qquad\forall\;\; n\geq 1.
\]
Prove that \((x_n)\) converges and find the limit.
We prove by induction that \(\frac{1}{2}\leq x_n\) for all \(n\in\N\), that is, \((x_n)\) is bounded below by \(\frac{1}{2}\). First of all, it is clear that \(\frac{1}{2}\leq x_1\). Now assume that \(\frac{1}{2}\leq x_n\) for some \(n\in\N\). Then
\begin{align*}
x_{n+1} &=\tfrac{1}{2}x_n + \frac{1}{4}\\
& \geq \frac{1}{2}\cdot\frac{1}{2} + \frac{1}{4}\\
& = \frac{1}{2}.
\end{align*}
Hence, \((x_n)\) is bounded below by \(\tfrac{1}{2}\). We now prove that \((x_n)\) is decreasing. We compute that \(x_2 = \frac{1}{2}+\frac{1}{4} = \frac{3}{4}\), and thus \(x_2 \lt x_1\). Assume now that \(x_n \lt x_{n-1}\) for some \(n\in\N\). Then
\[
x_{n+1} \lt \frac{1}{2} x_{n-1} + \frac{1}{4} = x_n.
\]
Hence, by induction we have shown that \((x_n)\) is decreasing. By the MCT, \((x_n)\) is convergent. Suppose that \((x_n)\rightarrow L\). Then also \((x_{n+1})\rightarrow L\) and the sequence \(y_n = \tfrac{1}{2}x_n + \frac{1}{4}\) converges to \(\tfrac{1}{2}L + \frac{1}{4}\). Therefore, \(L=\frac{1}{2} L + \frac{1}{4}\) and thus \(L = 1/2\).
Before we embark on the next example, we recall that
\[
1 + r + r^2 + \cdots + r^{n-1} = \frac{1-r^n}{1-r}
\]
and if \(0 \lt r \lt 1\) then \(0 \lt r^n \lt 1\) and therefore
\[
\frac{1-r^n}{1-r} \lt \frac{1}{1-r}.
\]
Consider the sequence
\[
x_n = 1 + \frac{1}{1!} + \frac{1}{2!} + \cdots + \frac{1}{n!}.
\]
Note that this can be defined recursively as \(x_1=2\) and \(x_{n+1} = x_n + \frac{1}{(n+1)!}\). Prove that \((x_n)\) converges.
We will prove by the MCT that \((x_n)\) converges. By induction, one can show that \(2^{n-1} \lt n!\) for all \(n\geq 3\). Therefore,
\begin{align*}
x_n & \lt 1 + \frac{1}{1}+\frac{1}{2} + \cdots + \frac{1}{2^{n-1}}\\
& = 1 + \frac{1-(1/2)^n}{1-(1/2)}\\
& \lt 1 + \frac{1}{1-(1/2)}\\
& = 3.
\end{align*}
Hence, \(x_n \lt 3\) and therefore \((x_n)\) is bounded. Now, since \(x_{n+1} = x_n + \frac{1}{(n+1)!}\) then clearly \(x_n \lt x_{n+1}\). Thus \((x_n)\) is increasing. By the MCT, \((x_n)\) converges. You might recognize that \(\limi x_n = e = 2.71828\ldots\).
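A short numerical sketch of these partial sums, which stay below \(3\) and increase toward \(e\approx 2.71828\) (an illustration only; the identification of the limit with \(e\) is not proved here):

```python
# Partial sums x_n = 1 + 1/1! + 1/2! + ... + 1/n!.
import math

x, factorial = 1.0, 1         # running sum and running value of n!
for n in range(1, 11):
    factorial *= n            # now factorial equals n!
    x += 1 / factorial        # x now equals x_n
    print(n, x)

print("e =", math.e)
```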
Let \(x_1=0\) and let \(x_{n+1} = \sqrt{2+x_n}\) for \(n\geq 1\). Prove that \((x_n)\) converges and find its limit.
Clearly, \(x_1 \lt x_2 = \sqrt{2}\). Assume by induction that \(x_k \gt x_{k-1}\) for some \(k\in\N\). Then
\begin{align*}
x_{k+1} &= \sqrt{2+x_k}\\
& \gt \sqrt{2+x_{k-1}}\\
& = x_k.
\end{align*}
Hence, \((x_n)\) is an increasing sequence. We now prove that \((x_n)\) is bounded above. Clearly, \(x_1 \lt 2\). Assume that \(x_k \lt 2\) for some \(k\in\N\). Then \(x_{k+1} = \sqrt{2+x_k} \lt \sqrt{2+2} = 2\). This proves that \((x_n)\) is bounded above. By the MCT, \((x_n)\) converges, say to \(L\). Moreover, since \(x_n \geq 0\) (as can be proved by induction), then \(L\geq 0\). Therefore, \(L = \sqrt{2+L}\) and then \(L^2 - L - 2 = 0\). Hence, \(L=\frac{1\pm\sqrt{1+8}}{2} = \frac{1\pm 3}{2}\). Since \(L\geq 0\) then \(L=2\).
Consider the sequence \(x_n = \left(1+\frac{1}{n}\right)^n\). We will show that \((x_n)\) is bounded and increasing, and therefore by the MCT \((x_n)\) is convergent. The limit of this sequence is the number \(e\). From the binomial theorem
\begin{align*}
x_n &= \sum_{k=0}^n\binom{n}{k}\frac{1}{n^k}\\
& = 1 + \frac{n}{n} + \frac{1}{2!} \frac{n(n-1)}{n^2} +\frac{1}{3!} \frac{n(n-1)(n-2)}{n^3} + \cdots \\
& \quad + \frac{1}{n!}\frac{n(n-1)(n-2)\cdots (n-(n-1))}{n^n} \\[2ex]
&= 1 + 1 + \frac{1}{2!} (1-\tfrac{1}{n}) + \frac{1}{3!}(1-\tfrac{1}{n})(1-\tfrac{2}{n}) + \cdots \\
& \quad + \frac{1}{n!}(1-\tfrac{1}{n})(1-\tfrac{2}{n})\cdots (1-\tfrac{n-1}{n})\\[2ex]
& \lt 1 + 1 + \frac{1}{2!} + \frac{1}{3!} + \cdots + \frac{1}{n!}\\[2ex]
& \lt 1 + 1 + \frac{1}{2} + \frac{1}{2^2} + \cdots + \frac{1}{2^{n-1}}\\[2ex]
&= 1 + \frac{1-(1/2)^{n}}{1-1/2}\\[2ex]
& \lt 3
\end{align*}
where we used that \(2^{n-1} \lt n!\) for all \(n\geq 3\). This shows that \((x_n)\) is bounded. Now, for each \(2\leq k\leq n\), we have that
\begin{align*}
\binom{n}{k}\frac{1}{n^k} &= \frac{1}{k!}\cdot\frac{n(n-1)(n-2)\cdots (n-(k-1))}{n^k} \\
&= \frac{1}{k!}(1-\tfrac{1}{n})(1-\tfrac{2}{n})\cdots (1-\tfrac{k-1}{n}).
\end{align*}
And similarly,
\begin{align*}
\binom{n+1}{k}\frac{1}{(n+1)^k} &= \frac{1}{k!}(1-\tfrac{1}{n+1})(1-\tfrac{2}{n+1})\cdots (1-\tfrac{k-1}{n+1}).
\end{align*}
It is clear that \((1-\tfrac{j}{n}) \lt (1-\tfrac{j}{n+1})\) for all \(1\leq j\leq n\). Hence, \(\binom{n}{k}\frac{1}{n^k} \lt \binom{n+1}{k}\frac{1}{(n+1)^k}\). Therefore, \(x_n \lt x_{n+1}\), that is, \((x_n)\) is increasing. By the MCT, \((x_n)\) converges to \(\sup\{x_n\,:\, n\in\N\}\).
Exercises
Let \((x_n)\) be an increasing sequence, let \((y_n)\) be a decreasing sequence, and assume that \(x_n\leq y_n\) for all \(n\in\mathbb{N}\). Prove that \(\displaystyle \lim_{n\rightarrow\infty}x_n\) and \(\displaystyle \lim_{n\rightarrow\infty} y_n\) exist, and that \(\displaystyle\lim_{n\rightarrow\infty} x_n\leq \lim_{n\rightarrow\infty} y_n\). Note: Recall that a sequence \((x_n)\) is bounded if there exist constants \(R_1\) and \(R_2\) (independent of \(n\)) such that \(R_1\leq x_n \leq R_2\) for all \(n\in\N\).
Let \(x_1=8\) and let \(x_{n+1}=\tfrac{1}{2}x_n+2\) for \(n\geq 1\). Prove that \((x_n)\) is bounded and monotone. Find the limit of \((x_n)\).
Let \(x_1=1\) and let \(\displaystyle x_{n+1}=\frac{n}{n+1}x_n^2\) for \(n\geq 1\). Prove that \((x_n)\) is bounded and monotone. Find the limit of \((x_n)\). Hint: Using induction to prove that \((x_n)\) is monotone will not work with this sequence. Instead, work with \(x_{n+1}\) directly to prove that \((x_n)\) is monotone.
True or false, a convergent sequence is necessarily monotone? If it is true, prove it. If it is false, give an example.
Bolzano-Weierstrass Theorem
We can gather information about a sequence by studying its subsequences. Loosely speaking, a subsequence of \((x_n)\) is a new sequence \((y_k)\) such that each term \(y_k\) is from the original sequence \((x_n)\) and the term \(y_{k+1}\) appears to the right of the term \(y_k\) in the original sequence \((x_n)\). Let us be precise about what we mean by to the right.
Let \((x_n)\) be a sequence. A subsequence of \((x_n)\) is a sequence of the form \((x_{n_1},x_{n_2},x_{n_3},\ldots )\) where \(n_1 \lt n_2 \lt n_3 \lt \cdots\) is a sequence of strictly increasing natural numbers. A subsequence of \((x_n)\) will be denoted by \((x_{n_k})\).
The notation \((x_{n_k})\) of a subsequence indicates that the indexing variable is \(k\in\N\). The selection of the elements of \((x_n)\) to form a subsequence \((x_{n_k})\) does not need to follow any particular well-defined pattern but only that \(n_1 \lt n_2 \lt n_3 \lt \cdots\). Notice that for any increasing sequence \(n_1 \lt n_2 \lt n_3 \lt \cdots\) of natural numbers, we have
\[
k\leq n_k
\]
for all \(k\geq 1\).
An example of a subsequence of \(x_n=\frac{1}{n}\) is the sequence \((y_k)=(1,\tfrac{1}{3},\tfrac{1}{5},\tfrac{1}{7},\ldots)\). Here we have chosen the odd terms of the sequence \((x_n)\) to create \((y_k)\). In other words, if we write that \(y_k = x_{n_k}\) then \(n_k=2k-1\). Another example of a subsequence of \((x_n)\) is obtained by taking the even terms to get the subsequence \((\tfrac{1}{2},\tfrac{1}{4},\tfrac{1}{6},\ldots)\), so that here \(n_k = 2k\). In general, we can take any increasing selection \(n_1 \lt n_2 \lt n_3 \lt \cdots\), such as
\[
(\tfrac{1}{11}, \tfrac{1}{303}, \tfrac{1}{2000}, \ldots)
\]
to form a subsequence of \((x_n)\).
Two subsequences of
\[
(x_n)=(1,-1,\tfrac{1}{2},-1,\tfrac{1}{3},-1,\tfrac{1}{4},-1,\ldots)
\]
are \((-1,-1,-1,\ldots)\) and \((1,\tfrac{1}{2},\tfrac{1}{3},\ldots)\). These two subsequences converge to distinct limits.
We proved that \(\rat\) is countable and hence there is a bijection \(f:\N\rightarrow\rat\). The bijection \(f\) defines a sequence
\[
(x_n) = (x_1,x_2,x_3,x_4,\ldots)
\]
where \(x_n = f(n)\). Let \(L\in\real\) be arbitrary. By the density of \(\rat\) in \(\real\), there exists \(x_{n_1}\in\rat\) such that \(x_{n_1}\in (L-1, L+1)\). Now consider the interval \((L-\tfrac{1}{2}, L+\tfrac{1}{2})\). It has infinitely many distinct rational numbers (by the Density Theorem). Therefore, there exists \(n_2 \gt n_1\) such that \(x_{n_2}\in (L-\tfrac{1}{2}, L+\tfrac{1}{2})\). Consider now the interval \((L-\tfrac{1}{3}, L+\tfrac{1}{3})\). It has infinitely many rational numbers, and therefore there exists \(n_3 \gt n_2\) such that \(x_{n_3} \in (L-\tfrac{1}{3}, L+\tfrac{1}{3})\). By induction, there exists a subsequence \((x_{n_k})\) of \((x_n)\) such that \(|x_{n_k} - L| \lt \frac{1}{k}\) for all \(k\geq 1\). Therefore, \(\lim_{k\rightarrow\infty} x_{n_k} = L\). We proved the following: For any real number \(L\) there exists a sequence of rational numbers that converges to \(L\).
The following theorem gives a necessary condition for convergence.
If \((x_n)\rightarrow L\) then every subsequence of \((x_n)\) converges to \(L\).
Let \(\veps \gt 0\). Then there exists \(K\in\N\) such that \(|x_n-L| \lt \veps\) for all \(n\geq K\). Since \(n_K \geq K\) and \(n_{K} \lt n_{K+1} \lt \ldots\), then \(|x_{n_k}-L| \lt \veps\) for all \(k\geq K\).
The contrapositive of the previous theorem is worth stating.
Let \((x_n)\) be a sequence.
- If \((x_n)\) has two subsequences converging to distinct limits then \((x_n)\) is divergent.
- If \((x_n)\) has a subsequence that diverges then \((x_n)\) diverges.
The following is a very neat result that will supply us with a very short proof of the main result of this section, namely, the Bolzano-Weierstrass Theorem.
Every sequence has a monotone subsequence.
Let \((x_n)\) be an arbitrary sequence. We will say that the term \(x_m\) is a peak if \(x_m\geq x_n\) for all \(n\geq m\). In other words, \(x_m\) is a peak if it is an upper bound of all the terms coming after it. There are two possible cases for \((x_n)\), either it has an infinite number of peaks or it has a finite number of peaks. Suppose that it has an infinite number of peaks, say \(x_{m_1}, x_{m_2},\ldots\), and we may assume that \(m_1 \lt m_2 \lt m_3 \lt \cdots\). Then, \(x_{m_1}\geq x_{m_2}\geq x_{m_3}\geq \cdots\), and therefore \((x_{m_k})\) is a decreasing sequence. Now suppose that there are only a finite number of peaks and that \(x_m\) is the last peak. Then \(x_{n_1}=x_{m+1}\) is not a peak and therefore there exists \(n_2 \gt n_1\) such that \(x_{n_2}\geq x_{n_1}\). Similarly, \(x_{n_2}\) is not a peak and therefore there exists \(n_3 \gt n_2\) such that \(x_{n_3}\geq x_{n_2}\). Hence, by induction, there exists a subsequence \((x_{n_k})\) that is increasing.
Every bounded sequence contains a convergent subsequence.
Let \((x_n)\) be an arbitrary bounded sequence. By Theorem 3.4.7, \((x_n)\) has a monotone subsequence \((x_{n_k})\). Since \((x_n)\) is bounded then so is \((x_{n_k})\). By the MCT applied to \((x_{n_k})\) we conclude that \((x_{n_k})\) is convergent.
We will give a second proof of the Bolzano-Weierstrass Theorem that is more hands-on.
If \((x_n)\) is a bounded sequence, then there exist \(a_1,b_1\in\real\) such that \(a_1\leq x_n\leq b_1\) for all \(n\in \N\). We will apply a recursive bisection algorithm to hunt down a converging subsequence of \((x_n)\). Let \(m_1 = \frac{(a_1+b_1)}{2}\) be the mid-point of the interval \([a_1,b_1]\). Then at least one of the subsets \(I_1=\{n\in\N\;:\; a_1\leq x_n \leq m_1\}\) or \(J_1=\{n\in\N\;:\; m_1\leq x_n \leq b_1\}\) is infinite; if it is \(I_1\) then choose some \(x_{n_1} \in [a_1,m_1]\) and let \(a_2=a_1\) and \(b_2= m_1\); otherwise choose some \(x_{n_1} \in [m_1,b_1]\) and let \(a_2=m_1\), \(b_2=b_1\). In any case, it is clear that \((b_2-a_2) = \frac{(b_1-a_1)}{2}\), that \(a_1\leq a_2\) and that \(b_2\leq b_1\).

Now let \(m_2=\frac{(a_2+b_2)}{2}\) be the mid-point of the interval \([a_2,b_2]\) and let \(I_2=\{n\in\N\;:\; a_2\leq x_n \leq m_2\}\) and let \(J_2=\{n\in\N\;:\; m_2\leq x_n \leq b_2\}\). If \(I_2\) is infinite then choose some \(x_{n_2}\in [a_2,m_2]\) with \(n_2 \gt n_1\) (this is possible because the chosen index set is infinite) and let \(a_3=a_2\), \(b_3=m_2\); otherwise choose some \(x_{n_2}\in [m_2,b_2]\) with \(n_2 \gt n_1\) and let \(a_3=m_2\) and \(b_3=b_2\). In any case, it is clear that \((b_3-a_3) = \frac{(b_1-a_1)}{2^2}\), that \(a_2\leq a_3\) and \(b_3\leq b_2\).

By induction, there exist sequences \((a_k)\), \((b_k)\), and a subsequence \((x_{n_k})\) such that \(a_k \leq x_{n_k}\leq b_k\) and \((b_k-a_k) = \frac{(b_1-a_1)}{2^{k-1}}\), where \((a_k)\) is increasing and \((b_k)\) is decreasing. It is clear that \(a_k \leq b_1\) and \(a_1 \leq b_k\) for all \(k\in \N\). Hence, by the MCT, \((a_k)\) and \((b_k)\) are convergent. Moreover, since \((b_k-a_k) = \frac{(b_1-a_1)}{2^{k-1}}\) then \(\lim (b_k-a_k) = 0\) and consequently \(\lim a_k = \lim b_k\); call this common limit \(L\). By the Squeeze theorem we conclude that \(\lim_{k\rightarrow\infty} x_{n_k} = L\).
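The bisection argument above is essentially an algorithm, and the following hedged Python sketch mimics it on a finite truncation of the (hypothetical, illustrative) bounded sequence \(x_n=(-1)^n\left(1+\frac{1}{n}\right)\). With only finitely many terms available, "contains infinitely many terms" is replaced by "contains more of the remaining terms", so this is an illustration of the idea rather than a proof.

```python
# Sketch of the bisection construction from the second proof of Bolzano-Weierstrass,
# applied to the first N terms of x_n = (-1)^n * (1 + 1/n), which lie in [-2, 2].
# At each step we keep the half-interval holding more of the remaining terms and pick
# from it one term whose index exceeds the index chosen at the previous step.

def x(n):
    return (-1)**n * (1 + 1 / n)

N = 10_000                    # truncation length (illustration only)
a, b = -2.0, 2.0              # a <= x_n <= b for all n
last = 0                      # index of the previously chosen term
chosen = []

for _ in range(12):
    m = (a + b) / 2
    left = [n for n in range(last + 1, N + 1) if a <= x(n) <= m]
    right = [n for n in range(last + 1, N + 1) if m <= x(n) <= b]
    a, b, picks = (a, m, left) if len(left) >= len(right) else (m, b, right)
    last = picks[0]           # smallest admissible index in the kept half
    chosen.append(x(last))

print(chosen)                 # the chosen subsequence approaches -1
```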
Notice that the proofs of the Bolzano-Weierstrass Theorem rely on the Monotone Convergence Theorem and the latter relies on the Completeness Axiom. We therefore have the following chain of implications:
\[
\text{Completeness \(\Longrightarrow\) MCT \(\Longrightarrow\) Bol-Wei}
\]
It turns out that if we had taken as our starting axiom the Bolzano-Weierstrass theorem then we could prove the Completeness property and then of course the MCT. In other words, all three statements are equivalent:
\[
\text{Completeness \(\Longleftrightarrow\) MCT \(\Longleftrightarrow\) Bol-Wei}
\]
Exercises
Prove that the following sequences are divergent.
- \(x_n=1-(-1)^n+1/n\)
- \(x_n=\sin(n\pi/4)\)
(Hint: Theorem 3.4.6)
Suppose that \(x_n\geq 0\) for all \(n\in\mathbb{N}\) and suppose that \(\displaystyle\lim_{n\rightarrow\infty} (-1)^nx_n=L\) exists. Prove that \(L=0\) and that also \(\displaystyle\lim_{n\rightarrow\infty} x_n=L\). (Hint: Consider subsequences of \((-1)^n x_n\).)
Let \((x_n)\) be a sequence.
- Suppose that \((x_n)\) is increasing. Prove that if \((x_n)\) has a subsequence \((x_{n_k})\) that is bounded above then \((x_n)\) is also bounded above.
- Suppose that \((x_n)\) is decreasing. Prove that if \((x_n)\) has a subsequence \((x_{n_k})\) that is bounded below then \((x_n)\) is also bounded below.
True or false: If \((x_n)\) is bounded and diverges then \((x_n)\) has two subsequences that converge to distinct limits. Explain.
Give an example of a sequence \((x_n)\) with the following property: For each number \(L\in\left\{1, \tfrac{1}{2}, \tfrac{1}{3}, \tfrac{1}{4}, \ldots \right\}\) there exists a subsequence \((x_{n_k})\) such that \(x_{n_k}\rightarrow L\). Hint: If you are spending a lot of time on this question then you have not been reading this textbook carefully.
Suppose that \(x_1\) and \(y_1\) satisfy \(0 \lt x_1 \lt y_1\) and define
\[
x_{n+1} = \sqrt{x_ny_n},\qquad y_{n+1} = \frac{x_n+y_n}{2}
\]
for \(n\geq 1\). Prove that \((x_n)\) and \((y_n)\) are convergent and that \(\lim x_n = \lim y_n\). (Hint: First show that \(\sqrt{xy}\leq (x+y)/2\) for any \(x,y \gt 0\).)
Let \((x_n)\) be a sequence and define the sequence \((y_n)\) as
\[
y_n = \frac{x_1+x_2+\cdots+x_n}{n}
\]
Prove that if \((x_n)\rightarrow L\) then \((y_n)\rightarrow L\).
limsup and liminf
The behavior of a convergent sequence is easy to understand. Indeed, if \((x_n)\rightarrow L\) then eventually the terms of \((x_n)\) will be arbitrarily close to \(L\) for \(n\) sufficiently large. What else is there to say? In this section, we focus on bounded sequences that do not necessarily converge. The idea is that we would like to develop a limit concept for these sequences, and in particular, a limiting upper bound. Let \((x_n)\) be an arbitrary sequence and introduce the set \(S\) defined as the set of all the limits of convergent subsequences of \((x_n)\), that is,
\[
S=\left\{L\in\real\;|\; (x_{n_k})\rightarrow L \text{ for some subsequence } (x_{n_k}) \text{ of } (x_n)\right\}.
\]
We will call \(S\) the subsequences limit set of \((x_n)\).
If \((x_n)\) is bounded and \(S\) is the subsequences limit set of \((x_n)\), explain why \(S\) is non-empty.
Here are six examples of sequences and the corresponding subsequences limit set.
Notice that in the cases where \((x_n)\) is bounded, the set \(S\) is also bounded, which is as expected since if \(a\leq x_n\leq b\) then for any convergent subsequence \((x_{n_k})\) of \((x_n)\) we necessarily have \(a\leq x_{n_k}\leq b\) and therefore \(a\leq \lim_{k\rightarrow\infty} x_{n_k}\leq b\).
We have seen that, in general, \(\sup(S)\) and \(\inf(S)\) are not necessarily elements of \(S\). This, however, is not the case for the subsequences limit set.
\((x_n)\) | \(S\) |
---|---|
\((1,\tfrac{1}{2},\tfrac{1}{3},\ldots)\) | \(\{0\}\) |
\((1,-1,1,-1,\ldots)\) | \(\{1,-1\}\) |
\((1,2,3,4,\ldots)\) | \(\emptyset\) |
\((1,\tfrac{3}{2}, \tfrac{1}{3}, \tfrac{5}{4},\tfrac{1}{5}, \tfrac{7}{6},\ldots)\) | \(\{0,1\}\) |
\((0, 1, \tfrac{1}{2}, \tfrac{1}{4},\tfrac{3}{4},\tfrac{1}{8},\ldots,\tfrac{7}{8},\ldots)\) | \([0,1]\) |
\((r_n)\) enumeration of \(\rat\) | \(\real\) |
Let \((x_n)\) be a bounded sequence and let \(S\) be its subsequences limit set. Then \(\sup(S)\in S\) and \(\inf(S)\in S\). In other words, there exists a subsequence \((x_{n_k})\) of \((x_n)\) such that \(\displaystyle\lim_{k\rightarrow\infty} x_{n_k} = \sup(S)\) and similarly there exists a subsequence \((y_{n_k})\) of \((x_n)\) such that \(\displaystyle\lim_{k\rightarrow\infty} y_{n_k} =\inf(S)\).
Let \(u=\sup(S)\). If \(\veps \gt 0\) then there exists \(s\in S\) such that \(u-\veps \lt s \leq u\). Since \(s\in S\), there exists a subsequence \((x_{n_k})\) of \((x_n)\) that converges to \(s\). Therefore, there exists \(K\in\N\) such that \(u-\veps \lt x_{n_k} \lt u+\veps\) for all \(k\geq K\). Hence, for each \(\veps\), the inequality \(u-\veps \lt x_n \lt u+\veps\) holds for infinitely many \(n\). Consider \(\veps_1=1\). Then there exists \(x_{n_1}\) such that \(u-\veps_1 \lt x_{n_1} \lt u+\veps_1\). Now take \(\veps_2 =\frac{1}{2}\). Since \(u-\veps_2 \lt x_n \lt u+\veps_2\) holds for infinitely many \(n\), there exists \(x_{n_2}\) such that \(u-\veps_2 \lt x_{n_2} \lt u+\veps_2\) and \(n_2 \gt n_1\). By induction, for \(\eps_k = \frac{1}{k}\), there exists \(x_{n_k}\) such that \(u-\eps_k \lt x_{n_k} \lt u+\eps_k\) and \(n_k \gt n_{k-1}\). Hence, the subsequence \((x_{n_k})\) satisfies \(|x_{n_k} - u| \lt \frac{1}{k}\) and therefore \((x_{n_k})\rightarrow u\). Therefore, \(u = \sup(S) \in S\). The proof that \(\inf(S)\in S\) is similar.
By Lemma 3.5.3, if \((x_n)\) is a bounded sequence then there exists a convergent subsequence of \((x_n)\) whose limit is larger than any other limit of a convergent subsequence of \((x_n)\). This leads to the following definition.
Let \((x_n)\) be a bounded sequence and let \(S\) be its subsequences limit set. We define the limit superior of \((x_n)\) as
\[
\limsup x_n = \sup S
\]
and the limit inferior of \((x_n)\) as
\[
\liminf x_n = \inf S.
\]
By Lemma 3.5.3, \(\limsup x_n\) is simply the largest limit of all convergent subsequences of \((x_n)\) while \(\liminf x_n\) is the smallest limit of all convergent subsequences of \((x_n)\). Notice that by definition it is clear that \(\liminf x_n \leq \limsup x_n\). The next theorem gives an alternative characterization of \(\limsup x_n\) and \(\liminf x_n\). The idea is that \(\limsup x_n\) is a sort of limiting supremum and \(\liminf x_n\) is a sort of limiting infimum of a bounded sequence \((x_n)\).
Let \((x_n)\) be a bounded sequence and let \(L^*\in \real\). The following are equivalent:
- \(L^*=\limsup x_n\)
- If \(\varepsilon \gt 0\) then there are at most finitely many \(x_n\) such that \(L^*+\veps \lt x_n\) and infinitely many \(x_n\) such that \(L^*-\veps \lt x_n\).
- Let \(u_m=\sup\{x_n\;:\;n\geq m\}\). Then \(\displaystyle L^*=\lim_{m\rightarrow\infty} u_m = \inf\{u_m\;:\; m\in\N\}\).
(i)\(\rightarrow\)(ii) Let \(S\) denote the subsequences limit set of \((x_n)\) and let \(\varepsilon \gt 0\) be given. By definition, \(L^*=\limsup x_n = \sup(S)\) and by Lemma 3.5.3 we have that \(L^* \in S\). Hence, there exists a subsequence of \((x_n)\) converging to \(L^*\) and thus \(L^*-\varepsilon \lt x_{n} \lt L^*+\varepsilon\) holds for infinitely many \(n\). In particular, \(L^*-\varepsilon \lt x_{n}\) holds for infinitely many \(n\). Suppose that \(L^*+\varepsilon \lt x_n\) holds infinitely often. Now \(x_n\leq M\) for all \(n\) and some \(M \gt 0\). Since the inequality \(L^*+\varepsilon \lt x_n\) holds infinitely often, there exists a sequence \(n_1 \lt n_2 \lt \cdots\) such that \(L^*+\varepsilon \lt x_{n_k}\leq M\) for all \(k\in\N\). We can assume that \((x_{n_k})\) is convergent (because it is bounded, and so we can pass to a further subsequence by the Bolzano-Weierstrass Theorem) and thus \(\displaystyle L^*+\varepsilon\leq \limi x_{n_k}\leq M\). Hence we have produced a subsequence of \((x_n)\) converging to a number greater than \(L^*\), which contradicts the definition of \(L^*=\sup(S)\).
(ii)\(\rightarrow\)(iii) Let \(\varepsilon \gt 0\). Since \(L^*+\varepsilon/2 \lt x_n\) holds for at most finitely many \(n\), there exists \(M\in\N\) such that \(x_n\leq L^*+\varepsilon/2\) for all \(n\geq M\). Hence, \(L^*+\varepsilon/2\) is an upper bound of \(\{x_n\;|\; n\geq M\}\) and thus \(u_M \lt L^*+\varepsilon\). Since \((u_m)\) is decreasing, we have that \(u_m \lt L^*+\varepsilon\) for all \(m\geq M\). Now, \(L^*-\varepsilon/2 \lt x_n\) holds infinitely often and thus \(L^*-\varepsilon \lt u_m\) for all \(m\in\N\). Hence, \(L^*-\varepsilon \lt u_m \lt L^*+\varepsilon\) for all \(m\geq M\). This proves that \(\lim_{m\rightarrow\infty} u_m = L^*\), and since \((u_m)\) is decreasing and bounded, this limit equals \(\inf\{u_m\;:\;m\in\N\}\) by the MCT.
(iii)\(\rightarrow\)(i) Let \((x_{n_k})\) be a convergent subsequence. Since \(n_k\geq k\), by definition of \(u_k\), we have that \(x_{n_k}\leq u_k\). Therefore, \(\lim x_{n_k} \leq \lim u_k = L^*\). Hence, \(L^*\) is an upper bound of \(S\). By definition of \(u_k\), there exists \(x_{n_1}\) such that \(u_1-1 \lt x_{n_1}\leq u_1\). By induction, there exists a subsequence \((x_{n_k})\) such that \(u_k-\frac{1}{k} \lt x_{n_k}\leq u_k\). Hence, by the Squeeze Theorem, \(L^*=\lim x_{n_k}\). Hence, \(L^*\in S\) and thus \(L^*=\sup S=\limsup x_n\).
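To connect characterization (iii) with something computable, here is a hedged numerical sketch. For the (hypothetical, illustrative) sequence \(x_n=(-1)^n\left(1+\frac{1}{n}\right)\), the tail suprema \(u_m=\sup\{x_n\;:\;n\geq m\}\) decrease toward \(\limsup x_n = 1\); the code approximates each \(u_m\) by a maximum over a finite truncation, which for this particular sequence agrees with the true supremum.

```python
# Sketch of characterization (iii): u_m = sup{ x_n : n >= m } for x_n = (-1)^n * (1 + 1/n).
# The supremum of each tail is attained at the smallest even index >= m, so a maximum
# over a long finite truncation reproduces it exactly here.
def x(n):
    return (-1)**n * (1 + 1 / n)

N = 100_000                   # truncation length used to evaluate the tail supremum
for m in (1, 2, 10, 100, 1000):
    u_m = max(x(n) for n in range(m, N + 1))
    print(m, u_m)             # the u_m decrease toward limsup x_n = 1
```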
Let \(p_n\) denote the \(n\)th prime number, that is \(p_1=2\), \(p_2=3\), \(p_3=5\), and so on. The numbers \(p_n\) and \(p_{n+1}\) are called twin primes if \(p_{n+1}-p_n=2\). The Twin Prime Conjecture is that
\[
\liminf (p_{n+1}-p_n) = 2
\]
In other words, the Twin Prime Conjecture is that there are infinitely many pairs of twin primes.
We end this section with the following interesting theorem that says that if the subsequences limit set of a bounded sequence \((x_n)\) consists of a single number \(L\) then the sequence \((x_n)\) also converges to \(L\).
Let \((x_n)\) be a bounded sequence and let \(L\in\real\). If every convergent subsequence of \((x_n)\) converges to \(L\) then \((x_n)\) converges to \(L\).
Suppose that \((x_n)\) does not converge to \(L\). Then, there exists \(\veps \gt 0\) such that for every \(K\in\N\) there exists \(n\geq K\) such that \(|x_n-L|\geq \veps\). Let \(K_1\in\mathbb{N}\). Then there exists \(n_1\geq K_1\) such that \(|x_{n_1}-L|\geq \veps\). Then there exists \(n_2 \gt n_1+1\) such that \(|x_{n_2}-L|\geq\veps\). By induction, there exists a subsequence \((x_{n_k})\) of \((x_n)\) such that \(|x_{n_k}-L|\geq \veps\) for all \(k\in\N\). Now \((x_{n_k})\) is bounded and therefore by Bolzano-Weierstrass has a convergent subsequence, say \((z_k)\), which is also a subsequence of \((x_n)\). By assumption, \((z_k)\) converges to \(L\), which contradicts that \(|x_{n_k}-L|\geq \veps\) for all \(k\in\N\).
Another way to say Theorem 3.5.7 is that if \((x_n)\) is bounded and \(L=\limsup x_n = \liminf x_n\) then \((x_n)\rightarrow L\). The converse, by the way, has already been proved: if \((x_n)\rightarrow L\) then every subsequence of \((x_n)\) converges to \(L\) and therefore \(L=\limsup x_n = \liminf x_n\).
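For instance, a quick check of this remark: if \(x_n = \frac{(-1)^n}{n}\), then every convergent subsequence has limit \(0\), and indeed
\[
\limsup_{n\rightarrow\infty} \frac{(-1)^n}{n} = \liminf_{n\rightarrow\infty} \frac{(-1)^n}{n} = 0 = \lim_{n\rightarrow\infty} \frac{(-1)^n}{n}.
\]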
Exercises
Determine the \(\displaystyle\limsup_{n\rightarrow\infty} x_n\) and \(\displaystyle\liminf_{n\rightarrow\infty} x_n\) for each case:
- \(x_n=3+(-1)^n(1+1/n)\)
- \(x_n=1+\sin(n\pi/2)\)
- \(x_n=(2-1/n)(-1)^n\)
Let \((x_n)\) and \((y_n)\) be bounded sequences. Let \((z_n)\) be the sequence \(z_n=x_n+y_n\). Show that
\[
\limsup_{n\rightarrow\infty} z_n\leq \limsup_{n\rightarrow\infty} x_n + \limsup_{n\rightarrow\infty} y_n
\]
In other words, prove that
\[
\limsup_{n\rightarrow\infty} (x_n+y_n)\leq \limsup_{n\rightarrow\infty} x_n + \limsup_{n\rightarrow\infty} y_n
\]
Let \((x_n)\) and \((y_n)\) be bounded sequences. Show that if \(x_n\leq y_n\) for all \(n\) then
\[
\limsup_{n\rightarrow\infty} x_n \leq \limsup_{n\rightarrow\infty} y_n.
\]
Cauchy Sequences
Up until now, the Monotone Convergence Theorem has been our main tool for determining that a sequence converges without actually knowing what the limit is; it is a general sufficient condition for convergence. In this section, we prove another general sufficient condition for convergence known as the Cauchy criterion. Roughly speaking, the idea is that if the terms of a sequence \((x_n)\) become closer and closer to one another as \(n\rightarrow\infty\) then the sequence ought to converge. A sequence whose terms become closer and closer to one another is called a Cauchy sequence.
A sequence \((x_n)\) is said to be a Cauchy sequence if for every \(\varepsilon \gt 0\) there exists a natural number \(K\) such that if \(n,m\geq K\) then \(|x_n-x_m| \lt \varepsilon\).
In other words, \((x_n)\) is a Cauchy sequence if the difference \(|x_n-x_m|\) is arbitrarily small provided that both \(n\) and \(m\) are sufficiently large.
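For example, the sequence \(x_n = \frac{1}{2^n}\) satisfies the definition: given \(\varepsilon \gt 0\), choose \(K\in\N\) with \(K \gt 1/\varepsilon\); then for all \(n,m\geq K\) (say \(m\geq n\), the other case being symmetric),
\[
|x_n - x_m| = \frac{1}{2^n} - \frac{1}{2^m} \leq \frac{1}{2^n} \leq \frac{1}{2^K} \leq \frac{1}{K} \lt \varepsilon,
\]
where we used that \(2^K\geq K\) for all \(K\in\N\).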
Prove directly using the definition of a Cauchy sequence that if \((x_n)\) and \((y_n)\) are Cauchy sequences then the sequence \(z_n = |x_n - y_n|\) is a Cauchy sequence.
Not surprisingly, a convergent sequence is indeed a Cauchy sequence.
If \((x_n)\) is convergent then it is a Cauchy sequence.
Suppose that \((x_n)\rightarrow L\). Let \(\varepsilon \gt 0\) and let \(K\) be sufficiently large so that \(|x_n-L| \lt \varepsilon/2\) for all \(n\geq K\). If \(n,m\geq K\) then by the triangle inequality,
\begin{align*}
|x_n-x_m| &= |x_n-L+L-x_m|\\
&\leq |x_n-L| + |x_m-L|\\
& \lt \varepsilon/2 + \varepsilon/2\\
&= \varepsilon.
\end{align*}
This proves that \((x_n)\) is Cauchy.
A Cauchy sequence is bounded.
If \((x_n)\) is a Cauchy sequence then \((x_n)\) is bounded.
The proof is similar to the proof that a convergent sequence is bounded.
The sequence \((x_n)\) is convergent if and only if \((x_n)\) is a Cauchy sequence.
In Lemma 3.6.3 we already showed that if \((x_n)\) converges then it is a Cauchy sequence. To prove the converse, suppose that \((x_n)\) is a Cauchy sequence. By Lemma 3.6.4, \((x_n)\) is bounded. Therefore, by the Bolzano-Weierstrass theorem there is a subsequence \((x_{n_k})\) of \((x_n)\) that converges, say it converges to \(L\). We will prove that \((x_n)\) also converges to \(L\). Let \(\eps \gt 0\) be arbitrary. Since \((x_n)\) is Cauchy there exists \(K\in\N\) such that if \(n,m\geq K\) then \(|x_n-x_m| \lt \eps/2\). On the other hand, since \((x_{n_k})\rightarrow L\) there exists \(n_M \geq K\) such that \(|x_{n_M}-L| \lt \eps/2\). Therefore, if \(n\geq K\) then
\begin{align*}
|x_n - L| &= |x_n - x_{n_M} + x_{n_M} - L|\\[2ex]
&\leq |x_n - x_{n_M}| + |x_{n_M} - L| \\[2ex]
& \lt \eps/2 + \eps/2\\
&= \eps.
\end{align*}
This proves that \((x_n)\) converges to \(L\).
Let \(0 \lt r \lt 1\) and suppose that \(|x_n - x_{n+1}| \leq r^n\) for all \(n\in \N\). Using the fact that \(1+r+r^2+\cdots+ r^k \lt \frac{1}{1-r}\) for all \(k\in\N\) prove that if \(m \gt n\) then \(|x_n - x_m| \leq \frac{r^n}{1-r}\). Deduce that \((x_n)\) is a Cauchy sequence.
When the MCT is not applicable, the Cauchy criterion is another possible tool to show convergence of a sequence.
Consider the sequence \((x_n)\) defined by \(x_1=1\), \(x_2=2\), and
\[
x_n=\tfrac{1}{2}(x_{n-2}+x_{n-1})
\]
for \(n\geq 3\). One can show that \((x_n)\) is not monotone and therefore the MCT is not applicable.
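Indeed, the first few terms, computed directly from the recursion, already display the oscillation:
\[
x_3 = \tfrac{3}{2}, \quad x_4 = \tfrac{7}{4}, \quad x_5 = \tfrac{13}{8}, \quad x_6 = \tfrac{27}{16}, \;\ldots
\]
so the terms alternately increase and decrease. The following parts outline a proof, using the Cauchy criterion, that \((x_n)\) nevertheless converges.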
- Prove that \(1\leq x_n \leq 2\) for all \(n\in\N\).
- Prove that \(|x_n-x_{n+1}| = \frac{1}{2^{n-1}}\) for all \(n\in\N\).
- Prove that if \(m \gt n\) then \[ |x_n-x_m| \lt \frac{1}{2^{n-2}} \] Hint: Use part (b) and the Triangle inequality.
- Deduce that \((x_n)\) is a Cauchy sequence and thus convergent.
- Show by induction that \(x_{2n+1}=1+\tfrac{2}{3}\left(1-\tfrac{1}{4^n}\right)\) and deduce that \(\lim x_n = \tfrac{5}{3}\).
Notice that the main result used in the proof of the Cauchy Criterion is the Bolzano-Weierstrass (B-W) theorem. We therefore have the following chain of implications:
\[
\text{Completeness \(\Longrightarrow\) MCT \(\Longrightarrow\) B-W \(\Longrightarrow\) Cauchy}
\]
A close inspection of the Cauchy Criterion reveals that it is really a statement about the real numbers not having any gaps or holes. In fact, the same can be said about the MCT and the Bolzano-Weierstrass theorem. Regarding the Cauchy Criterion, if \((x_n)\) is a Cauchy sequence then the terms of \((x_n)\) are clustering around a number, and that number must be in \(\real\) if \(\real\) has no holes. It is natural to ask, then, whether we could have used the Cauchy Criterion as our starting axiom (instead of the Completeness Axiom) and then proved the Completeness property, the MCT, and the Bolzano-Weierstrass theorem. The Cauchy Criterion alone is not enough: we must also take the Archimedean Property as an axiom.
Suppose that every Cauchy sequence in \(\real\) converges to a number in \(\real\) and the Archimedean Property holds in \(\real\). Then \(\real\) satisfies the Completeness property, that is, every non-empty bounded above subset of \(\real\) has a least upper bound in \(\real\).
Let \(S\subset\real\) be a non-empty set that is bounded above. If \(u\) is an upper bound of \(S\) and \(u\in S\) then \(u\) is the least upper bound of \(S\) and since \(u\in \real\) there is nothing to prove. Suppose then that no upper bound of \(S\) is an element of \(S\). Let \(a_1\in S\) be arbitrary and let \(b_1\in\real\) be an upper bound of \(S\). Then \(a_1 \lt b_1\), and we set \(M = b_1 - a_1 \gt 0\). Consider the mid-point \(m_1=\tfrac{a_1+b_1}{2}\) of the interval \([a_1,b_1]\). If \(m_1\) is an upper bound of \(S\) then set \(b_2 = m_1\) and set \(a_2 = a_1\), otherwise set \(b_2 = b_1\) and \(a_2 = m_1\). In any case, we have \(|a_2 - a_1| \leq \frac{M}{2}\), \(|b_2-b_1|\leq \frac{M}{2}\), and \(|b_2-a_2| = \frac{M}{2}\). Now consider the mid-point \(m_2 = \tfrac{a_2+b_2}{2}\) of the interval \([a_2,b_2]\). If \(m_2\) is an upper bound of \(S\) then set \(b_3 = m_2\) and \(a_3=a_2\), otherwise set \(b_3 = b_2\) and \(a_3 = m_2\). In any case, we have \(|a_3-a_2| \leq \frac{M}{2^2}\), \(|b_3-b_2| \leq \frac{M}{2^2}\), and \(|b_3-a_3| = \frac{M}{2^2}\). By induction, there exists a sequence \((a_n)\) such that \(a_n\) is not an upper bound of \(S\) and a sequence \((b_n)\) such that \(b_n\) is an upper bound of \(S\), and \(|a_n - a_{n+1}| \leq \frac{M}{2^n}\), \(|b_n - b_{n+1}| \leq \frac{M}{2^n}\), and \(|b_n - a_n| = \frac{M}{2^{n-1}}\). By Exercise 3.6.6, and using the fact that \(\limi r^n = 0\) if \(0 \lt r \lt 1\) (this is where the Archimedean property is needed), it follows that \((a_n)\) and \((b_n)\) are Cauchy sequences and therefore by assumption both \((a_n)\) and \((b_n)\) are convergent. Since \(|b_n - a_n| = \frac{M}{2^{n-1}}\) it follows that \(u = \lim a_n = \lim b_n\). We claim that \(u\) is the least upper bound of \(S\). First of all, for fixed \(x\in S\) we have that \(x \lt b_n\) for all \(n\in \N\) and therefore \(x\leq \lim b_n\), that is, \(u\) is an upper bound of \(S\). Since \(a_n\) is not an upper bound of \(S\), there exists \(x_n \in S\) such that \(a_n \lt x_n \lt b_n\) and therefore by the Squeeze theorem we have \(u = \lim x_n\). Given an arbitrary \(\eps \gt 0\) then there exists \(K\in \N\) such that \(u - \eps \lt x_K\) and thus \(u-\eps\) is not an upper bound of \(S\). This proves that \(u\) is the least upper bound of \(S\).
Exercises
Show that if \((x_n)\) is a Cauchy sequence then \((x_n)\) is bounded. (Note: Do not use the fact that a Cauchy sequence converges but show directly that if \((x_n)\) is Cauchy then \((x_n)\) is bounded.)
Show that if \((x_n)\) converges then it is a Cauchy sequence.
Show by definition that \(x_n=\frac{n+1}{n}\) is a Cauchy sequence.
Show by definition that \(x_n=\frac{\cos(n^2+1)}{n}\) is a Cauchy sequence.
Suppose that \((x_n)\) and \((y_n)\) are sequences such that \(|x_m-x_n|\leq |y_m-y_n|\) for all \(n,m\in\mathbb{N}\). Show that if the sequence \((y_n)\) is convergent then so is the sequence \((x_n)\).
Suppose that \(0 \lt r \lt 1\). Show that if the sequence \((x_n)\) satisfies \(|x_{n}-x_{n-1}| \lt r^{n-1}\) for all \(n\geq 2\) then \((x_n)\) is a Cauchy sequence and therefore convergent. Hint: If \(m \gt n\) then
\[
|x_m - x_n| = |x_m - x_{m-1} + x_{m-1} - x_{m-2} + x_{m-2} - \cdots + x_{n+1} - x_n|
\]
Also, if \(0 \lt r \lt 1\) then \(\sum_{j=0}^k r^j = \frac{1-r^{k+1}}{1-r} \lt \frac{1}{1-r}\).
Infinite Series
Informally speaking, a series is an infinite sum: \[ x_1 + x_2 + x_3 + \cdots \] Using summation notation: \[ \sum_{n=1}^\infty x_n = x_1 + x_2 + x_3 + \cdots \] The series \(\sum_{n=1}^\infty x_n\) can be thought of as the sum of the sequence \((x_n) = (x_1,x_2,x_3,\ldots)\). It is of course not possible to actually sum an infinite number of terms and so we need a precise way to talk about what it means for a series to have a finite value. Take for instance \[ \sumo (\tfrac{2}{3})^{n-1} = 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 + (\tfrac{2}{3})^3 + \cdots \] so that the sequence being summed is \(x_n = (\tfrac{2}{3})^{n-1}\). Let's compute the first 10 terms of the sequence of partial sums \(\{s_n\}_{n=1}^\infty=(s_1,s_2,s_3,s_4,s_5,\ldots)\) defined as follows: \begin{align*} s_1 &= 1\\ s_2 &= 1 + (\tfrac{2}{3}) = 1.6666\\ s_3 &= 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 = 2.1111\\ s_4 &= 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 + (\tfrac{2}{3})^3 = 2.4074\\ s_5 &= 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 + (\tfrac{2}{3})^3+(\tfrac{2}{3})^4 = 2.6049\\ s_6 &= 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 + (\tfrac{2}{3})^3+(\tfrac{2}{3})^4+(\tfrac{2}{3})^5 = 2.7366\\ s_7 &= 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 + (\tfrac{2}{3})^3+(\tfrac{2}{3})^4+(\tfrac{2}{3})^5+(\tfrac{2}{3})^6 = 2.8244\\ s_8 &= 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 + (\tfrac{2}{3})^3+(\tfrac{2}{3})^4+(\tfrac{2}{3})^5+(\tfrac{2}{3})^6+(\tfrac{2}{3})^7 = 2.8829\\ s_9 &= 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 + (\tfrac{2}{3})^3+(\tfrac{2}{3})^4+(\tfrac{2}{3})^5+(\tfrac{2}{3})^6+(\tfrac{2}{3})^7+(\tfrac{2}{3})^8 = 2.9219\\ s_{10} &= 1 + (\tfrac{2}{3}) + (\tfrac{2}{3})^2 + (\tfrac{2}{3})^3+(\tfrac{2}{3})^4+(\tfrac{2}{3})^5+(\tfrac{2}{3})^6+(\tfrac{2}{3})^7+(\tfrac{2}{3})^8+(\tfrac{2}{3})^9 =2.9479 \end{align*} With the help of a computer, one can compute \begin{align*} s_{20} &= \sum_{k=1}^{20} (\tfrac{2}{3})^{k-1} = 2.999097813\ldots\\[2ex] s_{50} &= \sum_{k=1}^{50} (\tfrac{2}{3})^{k-1} = 2.999999994\ldots\\[2ex] s_{100} &= \sum_{k=1}^{100} (\tfrac{2}{3})^{k-1} = 2.999999998\ldots. \end{align*} It seems as though the sequence \((s_n)\) is converging to \(L=3\), that is, \(\limi s_n = 3\). It is then reasonable to say that the infinite series sums or converges to \[ \sumo (\tfrac{2}{3})^{n-1} = 3 = \limi s_n. \] We now introduce some definitions to formalize our example.
Let \((x_n)\) be a sequence. The infinite series generated by \((x_n)\) is the sequence \((s_n)\) defined by
\[
s_n = x_1 + x_2 + \cdots + x_n
\]
or recursively,
\begin{align*}
s_1 &= x_1 \\
s_{n+1} &= s_n + x_{n+1}, \; n\geq 1.
\end{align*}
The sequence \((s_n)\) is also called the sequence of partial sums generated by \((x_n)\). The \(n\)th term of the sequence of partial sums \((s_n)\) can instead be written using summation notation:
\[
s_n = x_1+x_2+\cdots + x_n = \sum_{k=1}^n x_k.
\]
Let \((x_n)\) be the sequence \(x_n = \frac{3}{n}\). The first few terms of the sequence of partial sums \((s_n)\) are
\begin{align*}
s_1 &= x_1 = 3\\
s_2 &= x_1 + x_2 = 3 + \tfrac{3}{2} = \tfrac{9}{2}\\
s_3 &= x_1+x_2+x_3 = \tfrac{9}{2} + 1 = \tfrac{11}{2}\\
s_4 &= s_3 + x_4 = \tfrac{11}{2} + \tfrac{3}{4} = \tfrac{25}{4}
\end{align*}
In both examples above, we make the following important observation: if \((x_n)\) is a sequence of non-negative terms then the sequence of partial sums \((s_n)\) is increasing. Indeed, if \(x_n\geq 0\) for all \(n\) then
\[
s_{n+1} = s_n + x_{n+1} \geq s_n.
\]
Find the sequence of partial sums generated by \(x_n = (-1)^n\).
We compute:
\begin{align*}
s_1 &= x_1 = -1\\
s_2 &= x_1 + x_2 = -1 + 1 = 0\\
s_3 &= s_2 + x_3 = 0 - 1 = -1
\end{align*}
Hence, \((s_n) = (-1, 0 , -1, 0, -1, 0, \ldots)\).
The limit of the sequence \((s_n)\), if it exists, makes precise what it means for an infinite series
\[
\sumo x_n = x_1 + x_2 + x_3 + \cdots
\]
to converge.
Let \((x_n)\) be a sequence and let \((s_n)\) be the sequence of partial sums generated by \((x_n)\). If \(\limi s_n\) exists and equals \(L\) then we say that the series generated by \((x_n)\) converges to \(L\) and we write that
\[
\sumo x_n = L = \limi s_n = \limi\sum_{k=1}^n x_k.
\]
The notation \(\sumo x_n\) is therefore a compact way of writing
\[
\limi \sum_{k=1}^n x_k,
\]
and the question of whether the series \(\sumo x_n\) converges is really about whether the limit \(\limi s_n\) exists. Often we will write a series such as \(\sumo x_n\) simply as \(\sum x_n\) when either the initial value of \(n\) is understood or is unimportant. Sometimes, the initial \(n\) value may be \(n=0\), \(n=2\), or some other \(n=n_0\).
The geometric series is perhaps the most important series we will encounter. Let \(x_n = r^n\) where \(r\in\real\) is a constant. The generated series is
\[
\sumz x_n = \sumz r^n = 1 + r + r^2 + r^3 + \cdots
\]
and is called the geometric series. The \(n\)th term of the sequence of partial sums is
\[
s_n = 1 + r + r^2 + \cdots + r^n.
\]
If \(r=1\) then \(\limi s_n\) does not exist (why?), so suppose that \(r \neq 1\). Using the fact that \((1-r)(1+r+r^2+\cdots+r^n) = 1-r^{n+1}\) we can write
\[
s_n = \frac{1-r^{n+1}}{1-r}.
\]
Now if \(|r| \lt 1\) then \(\limi r^n = 0\), while if \(|r| \geq 1\) and \(r\neq 1\) then \(\limi r^n\) does not exist. Therefore, if \(|r| \lt 1\) then
\[
\limi s_n = \limi \frac{1-r^{n+1}}{1-r} = \frac{1}{1-r}.
\]
Therefore,
\[
\sumz r^n = \limi s_n = \frac{1}{1-r}.
\]
In summary: The series \(\sumz r^n\) is called the geometric series and converges if and only if \(|r| \lt 1\) and in this case converges to \(\frac{1}{1-r}\).
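In particular, with \(r=\tfrac{2}{3}\) this confirms the value suggested numerically at the beginning of this section:
\[
\sumz \left(\tfrac{2}{3}\right)^n = \frac{1}{1-\tfrac{2}{3}} = 3.
\]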
Consider the series \(\sumz \frac{(-1)^n2^n}{3^n}\). The series can be written as \(\sumz \left(\frac{-2}{3}\right)^n\) and thus it is a geometric series with \(r=-\frac{2}{3}\). Since \(|r|=|-\tfrac{2}{3}| \lt 1\), the series converges and it converges to
\[
\sumz \frac{(-1)^n 2^n}{3^n} = \frac{1}{1-(-2/3)} = \frac{3}{5}
\]
Use the geometric series to show that
\[
0.999999\ldots = 1.
\]
We can write
\begin{align*}
0.999999\ldots &= 0.9 + 0.09 + 0.009 + \cdots \\
&= \frac{9}{10} + \frac{9}{100} + \frac{9}{1000} + \cdots\\[2ex]
&= \frac{9}{10} + \frac{9}{10^2} + \frac{9}{10^3} + \cdots\\[2ex]
&= \frac{9}{10} \left(1+\frac{1}{10}+\frac{1}{10^2} + \cdots\right)\\[2ex]
&= \frac{9}{10} \sumz \frac{1}{10^n}\\[2ex]
&= \frac{9}{10} \left(\frac{1}{1-\tfrac{1}{10}}\right)\\[2ex]
&= 1.
\end{align*}
Consider \(\sumo \frac{1}{n(n+1)} = \frac{1}{1\cdot 2} + \frac{1}{2\cdot 3} + \frac{1}{3\cdot 4} + \cdots\). Using partial fraction decomposition
\[
\frac{1}{n(n+1)} = \frac{1}{n} - \frac{1}{n+1}.
\]
Therefore, the \(n\)th term of the sequence of partial sums \((s_n)\) is
\[
s_n = \sum_{k=1}^n \frac{1}{k(k+1)} = \sum_{k=1}^n \left[\frac{1}{k} - \frac{1}{k+1}\right]
\]
This is a telescoping sum because all terms in the middle cancel and only the first and last remain. For example:
\begin{align*}
s_1 &= 1 - \tfrac{1}{2}\\
s_2 &= 1-\tfrac{1}{2} + \tfrac{1}{2} - \tfrac{1}{3} = 1 - \tfrac{1}{3}\\
s_3 &= 1-\tfrac{1}{2} + \tfrac{1}{2} - \tfrac{1}{3} + \tfrac{1}{3} - \tfrac{1}{4} = 1 - \tfrac{1}{4}.
\end{align*}
By induction one can show that \(s_n = 1 - \frac{1}{n+1}\) and therefore \(\limi s_n =1\). Therefore, the given series converges and it converges to \(\sumo \frac{1}{n(n+1)} = \limi s_n = 1\).
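For the record, the induction step is immediate: if \(s_n = 1-\frac{1}{n+1}\) then
\[
s_{n+1} = s_n + \frac{1}{(n+1)(n+2)} = 1 - \frac{1}{n+1} + \frac{1}{n+1} - \frac{1}{n+2} = 1 - \frac{1}{n+2}.
\]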
Consider the series \(\sumo \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots\). We are going to analyze a subsequence \((s_{n_k})\) of the sequence of partial sums \((s_n)\). We will show that \((s_{n_k})\) is unbounded and thus \((s_n)\) is unbounded and therefore divergent. Consequently, the series \(\sumo \frac{1}{n}\) is divergent. Consider
\[
s_4 = 1 + \tfrac{1}{2} + \left(\tfrac{1}{3} + \tfrac{1}{4}\right) \gt 1 + \tfrac{1}{2} + \tfrac{1}{2} = 1 + 2\cdot\tfrac{1}{2}.
\]
Now consider
\[
s_8 = s_4 + \tfrac{1}{5}+ \tfrac{1}{6} + \tfrac{1}{7} + \tfrac{1}{8} \gt 1 + 2\cdot\tfrac{1}{2} + 4\cdot\tfrac{1}{8} = 1 + 3\cdot\tfrac{1}{2}.
\]
Lastly consider,
\[
s_{16} = s_8 + \tfrac{1}{9}+ \tfrac{1}{10} + \cdots + \tfrac{1}{16} \gt 1 + 3\cdot\tfrac{1}{2} + 8\cdot\tfrac{1}{16} = 1 + 4\cdot\tfrac{1}{2}.
\]
In general, one can show by induction that for \(k\geq 2\) we have
\[
s_{2^k} \gt 1 + \frac{k}{2}.
\]
Therefore, the subsequence \((s_4, s_8, s_{16}, \ldots)\) is unbounded and thus the series \(\sumo \frac{1}{n}\) is divergent.
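The induction step behind this estimate is the same grouping idea used above: each of the \(2^k\) terms \(\tfrac{1}{2^k+1}, \ldots, \tfrac{1}{2^{k+1}}\) is at least \(\tfrac{1}{2^{k+1}}\), and therefore
\[
s_{2^{k+1}} = s_{2^k} + \tfrac{1}{2^k+1} + \cdots + \tfrac{1}{2^{k+1}} \gt \left(1 + \tfrac{k}{2}\right) + 2^k\cdot\tfrac{1}{2^{k+1}} = 1 + \tfrac{k+1}{2}.
\]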
We now present some basic theorems on the convergence of series; most of them are direct consequences of the limit theorems for sequences. The first theorem we present can be used to show that a series diverges.
If \(\sum x_n\) converges then \(\displaystyle\limi x_n = 0\). Equivalently, if \(\displaystyle\limi x_n \neq 0\) then \(\sum x_n\) diverges.
By definition, if \(\sum x_n\) converges then the sequence of partial sums \((s_n)\) is convergent. Suppose then that \(L = \limi s_n = \sum x_n\). Recall that \((s_n)\) has the recursive definition \(s_{n+1} = s_n + x_{n+1}\). The sequences \((s_{n+1})\) and \((s_n)\) both converge to \(L\) and thus
\begin{align*}
\limi x_{n+1} &= \limi (s_{n+1} - s_n)\\
& = L - L\\
& = 0.
\end{align*}
The series \(\sumo \frac{3n+1}{2n+5}\) is divergent because \(\limi\frac{3n+1}{2n+5} = \frac{3}{2} \neq 0\).
The Series Divergence Test can only be used to show that a series diverges. For example, consider the harmonic series \(\sumo \frac{1}{n}\). Clearly \(\limi \frac{1}{n}=0\). However, we already know that \(\sumo \frac{1}{n}\) is a divergent series. Hence, in general, the condition \(\limi x_n = 0\) is not sufficient to establish convergence of the series \(\sumo x_n\).
A certain series \(\sum_{k=1}^\infty x_k\) has sequence of partial sums \((s_n)\) whose general \(n\)th term is \(s_n = \frac{2n}{n+1}\).
- What is \(\sum_{k=1}^{10} x_k\)?
- Does the series \(\sum_{k=1}^\infty x_k\) converge? If yes, what does it converge to? Explain.
- Does the sequence \((x_n)\) converge? If yes, what does it converge to? Explain.
The next theorem is just an application of the Cauchy criterion for convergence of sequences to the sequence of partial sums \((s_n)\).
The series \(\sum x_n\) converges if and only if for every \(\veps \gt 0\) there exists \(K\in\N\) such that if \(m \gt n\geq K\) then
\[
|s_m - s_n| = |x_{n+1}+x_{n+2}+\cdots+x_m| \lt \veps.
\]
The following theorem is very useful and is a direct application of the Monotone convergence theorem.
Suppose that \((x_n)\) is a sequence of non-negative terms, that is, \(x_n\geq 0\) for all \(n\in\N\). Then \(\sum x_n\) converges if and only if \((s_n)\) is bounded. In this case,
\[
\sum_{n=1}^\infty x_n = \sup\{ s_n \;|\; n\geq 1\}.
\]
Clearly, if \(\sum x_n = \limi s_n\) exists then \((s_n)\) is bounded. Now suppose that \((s_n)\) is bounded. Since \(x_n\geq 0\) for all \(n\in\N\) then \(s_{n+1} = s_n + x_{n+1} \geq s_n\) and thus \(s_{n+1}\geq s_n\) shows that \((s_n)\) is an increasing sequence. By the Monotone convergence theorem, \((s_n)\) converges and \(\limi s_n = \sup\{s_n\;|\; n\geq 1\}\).
Consider the series \(\sumo \frac{1}{n^2}\). Since \(\frac{1}{n^2} \gt 0\), to prove that the series converges it is enough to show that the sequence of partial sums \((s_n)\) is bounded. We will consider a subsequence of \((s_n)\), namely, \((s_{n_k})\) where \(n_k = 2^k - 1\) for \(k\geq 2\). We have
\begin{align*}
s_3 &= 1 + \frac{1}{2^2} + \frac{1}{3^2}\\
& \lt 1 + \frac{1}{2^2} + \frac{1}{2^2}\\
& = 1 + \frac{1}{2}
\end{align*}
and
\begin{align*}
s_7 &= s_3 + \frac{1}{4^2} + \frac{1}{5^2} + \frac{1}{6^2} + \frac{1}{7^2}\\
& \lt 1 + \frac{1}{2} + \frac{4}{4^2}\\
& = 1+\frac{1}{2}+\frac{1}{2^2}.
\end{align*}
By induction, one can show that
\[
s_{n_k} \lt 1 + \frac{1}{2} + \frac{1}{2^2} + \cdots + \frac{1}{2^{k-1}}
\]
and therefore using the geometric series with \(r=1/2\) we have
\begin{align*}
s_{n_k} & \lt 1 + \frac{1}{2} + \frac{1}{2^2} + \cdots + \frac{1}{2^{k-1}}\\
& \lt \sumz \frac{1}{2^n} = 2.
\end{align*}
This shows that the subsequence \((s_{n_k})\) is bounded. In general, the existence of a bounded subsequence does not imply that the original sequence is bounded, but it does if the original sequence is increasing: given any \(n\), choose \(k\) so that \(n_k\geq n\); then \(s_n\leq s_{n_k}\), and so any bound for the subsequence is a bound for the whole sequence. In this case, \((s_n)\) is indeed increasing, and since \((s_{n_k})\) is a bounded subsequence, \((s_n)\) is also bounded. Therefore, \((s_n)\) converges, that is, \(\sumo \frac{1}{n^2}\) is a convergent series.
Suppose that \(\sum x_n\) and \(\sum y_n\) are convergent series.
- Then \(\sum (x_n + y_n)\) and \(\sum (x_n - y_n)\) are also convergent and \[ \sum (x_n\pm y_n) = \sum x_n \pm \sum y_n. \]
- For any constant \(c\in\real\), \(\sum cx_n\) is also convergent and \(\sum cx_n = c\sum x_n\).
Using the limit laws:
\begin{align*}
\limi \sum_{k=1}^n (x_k \pm y_k) &= \limi \left(\sum_{k=1}^n x_k \pm \sum_{k=1}^n y_k\right) \\[2ex]
&= \limi \sum_{k=1}^n x_k \pm \limi \sum_{k=1}^n y_k\\[2ex]
&= \sumo x_n \pm \sumo y_n.
\end{align*}
Hence,
\[
\sumo (x_n \pm y_n) = \sumo x_n \pm \sumo y_n.
\]
If \(c\) is a constant then by the Limit Laws,
\[
\limi \sum_{k=1}^n c x_k = c \limi \sum_{k=1}^n x_k = c \sumo x_n.
\]
Therefore,
\[
\sumo c x_n = c \sumo x_n.
\]
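For instance, combining two geometric series term by term:
\[
\sumz \left(\frac{1}{2^n} + \frac{1}{3^n}\right) = \sumz \frac{1}{2^n} + \sumz \frac{1}{3^n} = \frac{1}{1-\tfrac{1}{2}} + \frac{1}{1-\tfrac{1}{3}} = 2 + \frac{3}{2} = \frac{7}{2}.
\]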
Having established the convergence or divergence of some basic series, we can use comparison tests to determine the convergence or divergence of new series.
Let \((x_n)\) and \((y_n)\) be non-negative sequences and suppose that \(x_n \leq y_n\) for all \(n\geq 1\).
- If \(\sum y_n\) converges then \(\sum x_n\) converges.
- If \(\sum x_n\) diverges then \(\sum y_n\) diverges.
Let \(t_n = \sum_{k=1}^n x_k\) and \(s_n = \sum_{k=1}^n y_k\) be the sequences of partial sums. Since \(x_n\leq y_n\) for all \(n\), we have \(t_n \leq s_n\). To prove (a), if \(\sum y_n\) converges then \((s_n)\) is bounded. Thus, \((t_n)\) is also bounded. Since \((t_n)\) is increasing and bounded, it is convergent by the MCT. To prove (b), if \(\sum x_n\) diverges then \((t_n)\) is necessarily unbounded, and thus \((s_n)\) is also unbounded and therefore divergent.
Let \(p\geq 2\) be an integer and let \((d_n)\) be a sequence of integers such that \(0\leq d_n \leq p-1\). Use the comparison test to show that the series \(\sum_{n=1}^\infty \frac{d_n}{p^n}\) converges and converges to a point in \([0, 1]\).
Determine whether the given series converge.
- \(\sumo \frac{n^2}{3n^2+n}\): First compute \(\limi\frac{n^2}{3n^2 + n} = 1/3\). Therefore, by the divergence test, the series diverges.
- \(\sumo \frac{1}{n^p}\) where \(p \geq 2\): We know that \(\sumo \frac{1}{n^2}\) is convergent. Now, if \(p\geq 2\) then \(\frac{1}{n^p}\leq \frac{1}{n^2}\). Therefore by the Comparison test, \(\sumo \frac{1}{n^p}\) is also convergent. This is called a \(p\)-series and actually converges for any \(p \gt 1\).
- \(\sumo \frac{n+7}{n^3+3}\): We will use the comparison test. We have \begin{align*} \frac{n+7}{n^3+3} &\leq \frac{n+7n}{n^3+3}\\ & \lt \frac{8n}{n^3}\\ & = \frac{8}{n^2}. \end{align*} The series \(\sumo \frac{8}{n^2}\) converges and thus the original series \(\sumo \frac{n+7}{n^3+3}\) also converges.
Suppose that \(0\leq x_n \leq 1\) for all \(n\in\N\). If \(\sum x_n\) converges prove that \(\sum (x_n)^2\) also converges. Does the claim hold if we only assume that \(x_n \gt 0\)?
Since \(0\leq x_n \leq 1\), we have \(0\leq x_n^2 \leq x_n\). Since \(\sum x_n\) converges, by the Comparison test \(\sum (x_n)^2\) also converges. More generally, suppose that \(x_n \gt 0\) and \(\sum x_n\) converges. Then \((x_n)\) converges to zero and thus there exists \(K\in\mathbb{N}\) such that \(0 \lt x_n \lt 1\) for all \(n\geq K\). Since the series \(\sum_{n=1}^\infty x_{n+K}\) converges, it follows that \(\sum_{n=1}^\infty x_{n+K}^2\) converges, and consequently \(\sum_{n=1}^\infty x_n^2\) converges.
The following two tests can be used for series whose terms are not necessarily non-negative.
If the series \(\sum |x_n|\) converges then \(\sum x_n\) converges.
From \(-|x_n| \leq x_n \leq |x_n|\) we obtain that
\[
0 \leq x_n + |x_n| \leq 2|x_n|.
\]
If \(\sum |x_n|\) converges then so does \(\sum 2|x_n|\). By the Comparison test 3.7.18, the series \(\sum (x_n + |x_n|)\) converges also. Then the following series is a difference of two converging series and therefore converges:
\[
\sum (x_n + |x_n|) - \sum |x_n| = \sum x_n
\]
and the proof is complete.
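As a quick application, the series \(\sum_{n=1}^\infty \frac{(-1)^n}{n^2}\) converges, since
\[
\sum_{n=1}^\infty \left|\frac{(-1)^n}{n^2}\right| = \sum_{n=1}^\infty \frac{1}{n^2}
\]
converges, as was shown earlier in this section.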
In the case that \(\sum |x_n|\) converges we say that \(\sum x_n\) converges absolutely. We end the section with the Ratio test for series.
Consider the series \(\sum x_n\) and let \(L = \lim_{n\rightarrow\infty} \frac{|x_{n+1}|}{|x_n|}\). If \(L \lt 1\) then the series \(\sum x_n\) converges absolutely and if \(L \gt 1\) then the series diverges. If \(L=1\) or the limit does not exist then the test is inconclusive.
Suppose that \(L \lt 1\). Let \(\eps \gt 0\) be such that \(r = L+\eps \lt 1\). There exists \(K\in \N\) such that \(\frac{|x_{n+1}|}{|x_n|} \lt L+\eps = r\) for all \(n\geq K\), and thus \(|x_{n+1}| \lt |x_n| r\) for all \(n\geq K\). By induction, it follows that \(|x_{K+m}| \lt |x_K| r^m\) for all \(m\geq 1\). Since \(\sum_{m=1}^\infty |x_K| r^m\) is a constant multiple of a geometric series with \(r \lt 1\), the Comparison test shows that the series \(\sum_{m=1}^\infty |x_{K+m}|\) converges. Therefore, the series \(\sum |x_n|\) converges, that is, \(\sum x_n\) converges absolutely, and by the Absolute convergence criterion \(\sum x_n\) itself converges. If \(L \gt 1\) then a similar argument shows that \(\sum x_n\) diverges. The case \(L=1\) is inconclusive because there are series with \(L=1\) that converge and others that diverge.
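For example, consider the series \(\sum_{n=1}^\infty \frac{n}{2^n}\). Here
\[
\lim_{n\rightarrow\infty} \frac{|x_{n+1}|}{|x_n|} = \lim_{n\rightarrow\infty} \frac{(n+1)/2^{n+1}}{n/2^n} = \lim_{n\rightarrow\infty} \frac{n+1}{2n} = \frac{1}{2} \lt 1,
\]
and therefore the series converges absolutely by the Ratio test.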
Exercises
Suppose that \(\sum x_n\) is a convergent series. Is it true that if \(\sum y_n\) is divergent then \(\sum (x_n+y_n)\) is divergent? If it is true, prove it, otherwise give an example to show that it is not true.
Suppose that \(x_n \geq 0\) for all \(n\in\mathbb{N}\). Prove that if \(\sum_{n=1}^\infty x_n\) converges then \(\sum_{n=1}^\infty \frac{x_n}{n}\) also converges. Is the converse true? That is, if \(\sum_{n=1}^\infty \frac{x_n}{n}\) converges then does it necessarily follow that \(\sum_{n=1}^\infty x_n\) converges?
Using only the tests derived in this section, determine whether the given series converge or diverge:
- \(\displaystyle \sum_{n=1}^\infty \frac{2n^2+3}{\sqrt{n^2+3n+2}}\)
- \(\displaystyle \sum_{n=1}^\infty \frac{\cos(n\pi) 3^n}{2^n}\)
- \(\displaystyle\sum_{n=1}^\infty \frac{n-3}{n^3+1}\)
Suppose that \((x_n)\) and \((y_n)\) are non-negative sequences. Prove that if \(\sum x_n\) and \(\sum y_n\) are convergent then \(\sum x_n y_n\) is convergent. (Hint: Recall that if \(z_n\geq 0\) then \(\sum z_n\) converges iff \((s_n)\) is bounded, where \(s_n = \sum_{k=1}^n z_k\) is the sequence of partial sums. Alternatively, use the identity \((x+y)^2=x^2+2xy+y^2\).)
Suppose that \(x_n \gt 0\) for all \(n\in\mathbb{N}\) and \(\sum_{n=1}^\infty x_n\) converges. You will prove that \(\sum_{n=1}^\infty \sqrt{x_nx_{n+1}}\) converges.
- Let \(y_n = x_n + x_{n+1}\). Prove that \(\sum_{n=1}^\infty y_n \) converges.
- Using the fact that \((a+b)^2 = a^2+2ab+b^2\), prove that \(\sqrt{ab}\leq a+b\) if \(a, b \gt 0\). Deduce that
\[
\sqrt{x_nx_{n+1}} \leq x_n + x_{n+1} = y_n.
\]
- Deduce that \(\sum_{n=1}^\infty \sqrt{x_nx_{n+1}}\) converges.
Any number of the form \(x=0.a_1a_2a_3a_4\ldots\) can be written as
\[
x= \frac{a_1}{10} + \frac{a_2}{10^2} + \frac{a_3}{10^3} + \frac{a_4}{10^4} + \cdots
\]
Using this fact, and a geometric series, prove that
\[
0.2555555555555\ldots = \frac{23}{90}.
\]
Show that the following series converge and find their sum.
- \(\displaystyle\sum_{n=0}^\infty \frac{(-1)^n}{e^{2n}}\)
- \(\displaystyle\sum_{j=2}^\infty \frac{3}{2^j}\)
- \(\displaystyle\sum_{k=-2}^\infty \frac{1}{3^k}\)
Using the fact that \(2^{k-1} \lt k!\) for all \(k\geq 3\), prove that the series \(\sum_{n=0}^\infty \frac{1}{n!}\) converges.
Let \(X:\mathbb{N}\rightarrow\real\) be a decreasing sequence of non-negative terms. Let \((s_n)\) be the sequence of partial sums of the series \(\sum_{n=1}^\infty X(n)\), let \(n_k = 2^k - 1\) for \(k\in\mathbb{N}\), and consider the subsequence \((s_{n_k})\).
- Show by induction that
\[
s_{n_k} \leq \sum_{n=0}^{k-1} 2^n X(2^n)
\]
for all \(k\in\mathbb{N}\).
- Conclude that if \(\sum_{n=0}^\infty 2^n X(2^n)\) converges then \(\sum_{n=1}^\infty X(n)\) converges.
(Note: This is a generalization of Example 3.7.16.)