$\newcommand{\Rr}{\mathbb{R}}$ $\newcommand{\Zz}{\mathbb{Z}}$ $\newcommand{\Nn}{\mathbb{N}}$ $\newcommand{\Qq}{\mathbb{Q}}$ $\newcommand{\ve}{\varepsilon}$ $\newcommand{\dp}[2]{\langle #1, #2\rangle}$ $\newcommand{\va}{\mathbf{a}}$ $\newcommand{\vb}{\mathbf{b}}$ $\newcommand{\vc}{\mathbf{c}}$ $\newcommand{\vx}{\mathbf{x}}$ $\newcommand{\vy}{\mathbf{y}}$ $\newcommand{\vz}{\mathbf{z}}$ $\newcommand{\norm}[1]{\left\| #1 \right\|}$ $\newcommand{\ds}{\displaystyle}$

Series¶

Read Section 2.5. Check your understanding against the following questions:

  • What is a series?
  • What does it mean for a series to be convergent?
  • What does it mean for a series to be Cauchy?
  • What does it mean for a series to converge absolutely?
  • What are the comparison test, $p$-test and ratio test for convergence of series?

Proposition 1. Suppose $\sum x_n$ converges. Then $x_n \to 0$.

Proof. Since $\sum x_n$ converges, its sequence of partial sums $(s_k)$ is convergent and hence Cauchy. In particular, given $\ve > 0$, $|x_n| = |s_{n} -s_{n-1}| < \ve$ for all $n \gg 0$. Thus, $(x_n)$ is a null sequence.

The converse is not true, e.g. the Harmonic series $\sum \frac{1}{n}$ diverges (see Example 2.5.11) but $\frac{1}{n} \to 0$.
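A quick numerical sketch of this contrast (an illustration only, not a proof): the terms $1/n$ shrink to zero, yet the partial sums grow without bound; the comparison $s_k \ge \ln(k+1)$, obtained by comparing the sum with the area under $1/x$, is used below as a checkable lower bound.

```python
import math

def harmonic_partial_sum(k):
    """Return s_k = 1 + 1/2 + ... + 1/k."""
    return sum(1.0 / n for n in range(1, k + 1))

# The terms 1/n tend to 0, but s_k >= ln(k+1), so the partial sums
# are unbounded even though they grow very slowly.
for k in (10, 1000, 100000):
    print(k, harmonic_partial_sum(k), math.log(k + 1))
```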

Proposition 2. The series $\ds \sum_{n=0}^\infty r^n$ converges to $\dfrac{1}{1-r}$ for $|r| < 1$ and diverges for $|r| \ge 1$.
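For $|r| < 1$ this is visible from the closed form $s_k = \frac{1-r^{k+1}}{1-r}$ of the partial sums, since $r^{k+1} \to 0$ exactly when $|r| < 1$. A small numerical sketch:

```python
def geometric_partial_sum(r, k):
    """Return s_k = 1 + r + r^2 + ... + r^k."""
    return sum(r**n for n in range(k + 1))

# For |r| < 1 the partial sums approach 1/(1-r).
for r in (0.5, -0.9):
    print(r, geometric_partial_sum(r, 200), 1 / (1 - r))
```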

Many results about sequences have their counterparts for series (when applied to the sequence of partial sums).

For example, linearity of summation (Proposition 2.5.12). Another example, as a consequence of the monotonic convergence theorem, is that:

Proposition 3 (Convergence criteria for series with non-negative terms).

A series $\sum x_n$ with non-negative terms (i.e. $x_n \ge 0$) converges if and only if its sequence of partial sums is bounded above.

Proof. Under the assumption that $x_n \ge 0$ ($n \in \Nn$), the sequence of partial sums $(s_k)$ of $\sum x_n$ is monotonic increasing. So $(s_k)$ is convergent if and only if it is bounded above (Propositions 2 and 3 of Week 05).

Proposition 4. Absolute convergence implies convergence (i.e. if $\sum |x_n|$ converges, then $\sum x_n$ converges).

Proof. Let $(u_k)$ and $(s_k)$ be the sequences of partial sums of $\sum |x_n|$ and $\sum x_n$, respectively. For any $m \ge n$,

$$ |s_m-s_n| = |x_{n+1} + x_{n+2} + \cdots + x_m| \le |x_{n+1}| + |x_{n+2}| + \cdots + |x_m| = u_m - u_n. $$

Since $(u_k)$ is convergent and hence Cauchy, $|u_m-u_n|$ and hence $|s_m-s_n|$ can be made arbitrarily small for all $m, n \gg 0$. Therefore, $(s_k)$ is also Cauchy and hence convergent.

The converse of this proposition is not true: e.g. $\sum (-1)^n/n$ converges, but $\sum |(-1)^n|/n = \sum 1/n$ is the Harmonic series, which diverges.

A series converges conditionally if it converges but does not converge absolutely.
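A numerical sketch of this dichotomy (illustration only): the partial sums of the alternating harmonic series settle towards a limit (which happens to be $\ln 2$), while the partial sums of the absolute values form the Harmonic series and keep growing.

```python
import math

# Compare partial sums of sum (-1)^(n+1)/n with those of sum 1/n.
s_alt = s_abs = 0.0
for n in range(1, 10001):
    s_alt += (-1) ** (n + 1) / n
    s_abs += 1.0 / n

print(s_alt, math.log(2))  # close together
print(s_abs)               # still growing with the number of terms
```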

Proposition 5 (Comparison Test 2.5.16). Suppose $0 \le x_n \le y_n$ for all $n$. Then $\sum x_n$ converges if $\sum y_n$ does.

Proof. Let $(s_k)$ and $(t_k)$ be the sequences of partial sums of $\sum x_n$ and $\sum y_n$, respectively. The conditions $0 \le x_n \le y_n$ for all $n$ imply $0 \le s_k \le t_k$ for all $k$. So if $\sum y_n$ converges, then $(s_k)$ is bounded above by $\sum y_n$, and hence $\sum x_n$ converges (Prop. 3).

Proposition 6. (p-test 2.5.17) For $p \in \Rr$, the series $\ds \sum \frac{1}{n^p}$ converges if and only if $p > 1$.

Idea. For $p \le 1$ and $n > 1$, $\dfrac{1}{n^p} \ge \dfrac{1}{n} > 0$. So, it follows from the comparison test and the fact that the Harmonic series diverges that $\ds \sum \dfrac{1}{n^p}$ diverges.

For $p > 1$ and $k \ge 2$,

$$\sum_{n=1}^k \dfrac{1}{n^p} = 1+ \sum_{n=2}^k \frac{1}{n^p} < 1 + \int_{1}^{k} \frac{1}{x^p} dx \le 1+\frac{1}{p-1}.$$

Therefore, $\ds \sum \frac{1}{n^p}$ converges.

Here we use integration which we have not even defined rigorously. The advantage is that this argument is more geometric (compare the series with the area under a curve). For a proof without using an integral see the textbook.
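As a numerical sanity check (not a proof), the partial sums of the $p$-series for $p = 2$ indeed stay below the bound $1 + \frac{1}{p-1} = 2$ from the estimate above:

```python
def p_series_partial_sum(p, k):
    """Return the k-th partial sum of sum 1/n^p."""
    return sum(1.0 / n**p for n in range(1, k + 1))

p = 2
bound = 1 + 1 / (p - 1)  # the bound 1 + 1/(p-1) from the integral estimate
print(p_series_partial_sum(p, 10**6), bound)
```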

The following variation of the comparison test is often easier to apply in practice.

Proposition 7 (Limit Comparison Test).

Suppose $0 < x_n, y_n$ for all $n$ and $\ds \lim_{n \to \infty} \dfrac{x_n}{y_n} = L > 0$. Then $\sum x_n$ converges if and only if $\sum y_n$ converges.

If $L = 0$, then the convergence of $\sum y_n$ implies the convergence of $\sum x_n$.

Proof. Since $\ds \lim_{n\to \infty} \dfrac{x_n}{y_n} = L$ and $L > 0$, for all sufficiently large $n$, $$ \frac{L}{2} < \frac{x_n}{y_n} < \frac{3L}{2}. $$ That is (since $y_n >0$), $\ds 0< \frac{L}{2}y_n < x_n < \frac{3L}{2}y_n$. So by the Comparison Test, if $\sum x_n$ converges then so does $\ds \sum \frac{L}{2}y_n$, and consequently $\ds \frac{2}{L}\sum\frac{L}{2}y_n = \sum y_n$ converges. Conversely, if $\sum y_n$ converges, then so does $\ds \sum \frac{3L}{2}y_n$, and hence $\sum x_n$ converges as well by the Comparison Test.

If $L = 0$, then for all sufficiently large $n$, $\dfrac{x_n}{y_n} < 1$ and so $0 < x_n < y_n$. Thus, the convergence of $\sum x_n$ follows from the convergence of $\sum y_n$ by the comparison test.

Example. Let us show that $\ds \sum \frac{1}{n^3 -1}$ (summing over $n \ge 2$) is convergent. Since $\ds \frac{1}{n^3-1}$ behaves essentially like $\dfrac{1}{n^3}$, one would expect the series to converge by the p-test. However, since $\dfrac{1}{n^3} < \dfrac{1}{n^3-1}$ ($n \ge 2$), the comparison test does not apply directly. That said, since $$ \dfrac{\dfrac{1}{n^3}}{\dfrac{1}{n^3-1}} = \dfrac{n^3-1}{n^3} = 1 - \dfrac{1}{n^3} \to 1, $$ the series $\ds \sum \frac{1}{n^3-1}$ converges (since $\ds \sum \frac{1}{n^3}$ does) by the limit comparison test.

Another way of showing the convergence of $\ds \sum \frac{1}{n^3 -1}$ is as follows: note that $\ds \sum \frac{1}{n^2}$ converges (p-test) and $$ \dfrac{\dfrac{1}{n^3-1}}{\dfrac{1}{n^2}} = \dfrac{n^2}{n^3-1} = \dfrac{1}{n-1/n^2} \to 0, $$ so the convergence of $\ds \sum \frac{1}{n^3 -1}$ follows from the limit comparison test ($L=0$ case) as well.
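A small numerical sketch of the example (illustration only): the ratio of terms tends to $1$, and the partial sums of $\sum 1/(n^3-1)$, taken over $n \ge 2$ where the terms are defined, stay bounded, consistent with convergence.

```python
# Ratio (1/n^3) / (1/(n^3-1)) = 1 - 1/n^3 tends to 1.
for n in (10, 100, 1000):
    print(n, (1.0 / n**3) / (1.0 / (n**3 - 1)))

# Partial sum of sum 1/(n^3-1) over n >= 2 stays bounded.
s = sum(1.0 / (n**3 - 1) for n in range(2, 10**5))
print(s)
```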

Proposition 8 (Ratio test for series). Suppose $\ds L:=\lim_{n} \frac{|x_{n+1}|}{|x_n|}$ exists. Then

  1. $\sum x_n$ converges absolutely if $L < 1$.
  2. $\sum x_n$ diverges if $L > 1$.

Proof. If $L > 1$, then $|x_{n+1}|/|x_n| > (1+L)/2 > 1$ for all sufficiently large $n$. That means $(|x_n|)$ is eventually a strictly increasing sequence of positive numbers, and hence $(x_n)$ cannot be null. Therefore, $\sum x_n$ diverges (Prop. 1).

If $L < 1$, then for all $n$ sufficiently large, $\ds \frac{|x_{n+1}|}{|x_n|} < r:=\frac{1+L}{2} < 1$. Since the convergence of a series is not affected by changing finitely many terms, we may assume $|x_{n+1}| < r|x_n|$ for all $n \ge 1$. Consequently,

$$ \sum |x_{n}| < |x_1|\sum r^{n-1}.$$

The geometric series on the right converges since $r < 1$ and so does $\sum |x_n|$ by the Comparison test.

If $L=1$, then further investigation is necessary to decide convergence. E.g. consider $\sum 1/n$ and $\sum 1/n^2$. Both of them have $1$ as the limit of the ratio of consecutive terms, but the Harmonic series diverges while $\sum 1/n^2$ converges (absolutely) by the p-test.
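A concrete instance of the ratio test (a sketch, with $\sum n/2^n$ chosen for illustration): the ratios of consecutive terms are $\frac{n+1}{2n} \to \frac{1}{2} < 1$, so the series converges absolutely; its sum happens to be $2$, although the test alone does not tell us that.

```python
def term(n):
    """The n-th term of sum n/2^n."""
    return n / 2.0**n

print(term(101) / term(100))                # (n+1)/(2n) at n = 100, near 1/2
print(sum(term(n) for n in range(1, 200)))  # partial sum, close to 2
```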

Proposition 9. (Root test for series)

Suppose $L:=\limsup |x_n|^{1/n}$ exists. Then

  1. $\sum x_n$ converges absolutely if $L < 1$.
  2. $\sum x_n$ diverges if $L > 1$.

Proof. If $L < 1$, then for some $N \ge 0$,

$$ \sup_{n \ge N} |x_n|^{1/n} < r:=\frac{1+L}{2} < 1$$

That means $|x_n| < r^n$ for all $n \ge N$. Thus, $\sum |x_n|$ converges by the comparison test.

If $L > 1$, then for every $N$,

$$ \sup_{n \ge N} |x_n|^{1/n} > r:=\frac{1+L}{2} > 1$$

In particular $|x_n| > r^n > 1$ for infinitely many $n$. Therefore, $(x_n)$ cannot be null and hence $\sum x_n$ diverges.

Just like the Ratio test, the Root test is also inconclusive when $L =1$. Again, for $x_n = 1/n$ and $y_n= 1/n^2$,

$$\lim|x_n|^{1/n} =1 = \lim |y_n|^{1/n}$$

But $\sum x_n$ diverges while $\sum y_n$ converges.
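A numerical sketch of this inconclusive case: for both $x_n = 1/n$ and $y_n = 1/n^2$, the quantity $|x_n|^{1/n}$ creeps up to $1$, even though one series diverges and the other converges.

```python
# |x_n|^(1/n) for x_n = 1/n and y_n = 1/n^2: both tend to 1.
for n in (10, 1000, 100000):
    print(n, (1.0 / n) ** (1.0 / n), (1.0 / n**2) ** (1.0 / n))
```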

The Root test is strictly stronger than the Ratio test.

We will discuss that in MAT 403.

Proposition 10. (Alternating series test) Suppose $(x_n)$ is a sequence satisfying

  1. $x_n \ge 0$,
  2. $x_n \ge x_{n+1}$ (i.e. $(x_n)$ is decreasing) and
  3. $x_n \to 0$.

(In other words, $(x_n)$ is a decreasing null sequence of non-negative numbers.) Then the series

$$ \sum (-1)^{n+1} x_n = x_1 - x_2 + x_3 - x_4 + \cdots $$ converges.

Proof. From the assumptions $x_n \ge 0$ and $x_n \ge x_{n+1}$, it follows that the partial sums $s_k$'s of the series $\sum (-1)^{n+1}x_n$ satisfy

$$ s_1 \ge s_3 \ge s_5 \ge \cdots \ge s_6 \ge s_4 \ge s_2 \tag{*}$$

Note that for all $m \ge n$,

$$ |s_m - s_n| \le |s_{n+1} -s_n| = x_{n+1}$$

Since $(x_n)$ is null, $(s_k)$ is Cauchy and hence convergent.

Remarks

  1. Under the same assumptions the series $-x_1 + x_2 - x_3 + x_4 - \cdots$ is convergent as well.

  2. The sum $L$ of the alternating series $\sum (-1)^{n+1} x_n$ is the limit of both $(s_{2k})$ and $(s_{2k-1})$. So, by ($*$), $L$ always lies between two consecutive partial sums. That means for any $n \ge 1$,

$$| L -s_n| \le |s_{n+1} -s_n| = x_{n+1} $$
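A numerical check of this error bound (illustration only) on the alternating harmonic series $\sum (-1)^{n+1}/n$, whose sum is $L = \ln 2$:

```python
import math

# Verify |L - s_n| <= x_{n+1} = 1/(n+1) for the alternating harmonic series.
L = math.log(2)
s = 0.0
for n in range(1, 1001):
    s += (-1) ** (n + 1) / n
    assert abs(L - s) <= 1.0 / (n + 1)  # the bound from the remark above
print("error bound holds for n = 1, ..., 1000")
```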

Further Reading¶

Here we have only touched upon tests of convergence. For more information, read this wiki page.