$\newcommand{\ds}{\displaystyle}$ $\newcommand{\Rr}{\mathbb{R}}$
Power Series
Read Section 2.6.5
A power series (in $x$) about $x_0 \in \Rr$ (or centered at $x_0$) is a series of the form
$$\sum_{n=0}^\infty a_n(x-x_0)^n.$$
Clearly, the series above always converges at $x=x_0$.
A convergent power series is a power series that converges at some $x$ other than its center $x_0$.
In other words, a power series is divergent if it only converges at its center.
Note the behavior of the three power series in Example 2.6.7, 2.6.8 and 2.6.9.
The general statement about the convergence of power series is:
Theorem 1. (Proposition 2.6.10, 2.6.11) The power series $\sum_{n=0}^\infty a_n(x-x_0)^n$
converges absolutely on $(x_0-\rho, x_0+\rho)$
diverges on $(-\infty, x_0-\rho) \cup (x_0+\rho, \infty)$
where $\rho = \ds \frac{1}{\limsup_n |a_n|^{1/n}}$ with the convention that $1/\infty = 0$ and $1/0 = \infty$.
Proof. Since $\limsup|a_n(x-x_0)^n|^{1/n} = \limsup |a_n|^{1/n}|x-x_0| = |x-x_0|\limsup|a_n|^{1/n}$, the assertions follow from the Root test.
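To see the formula in action, here is a minimal Python sketch (not part of the original notes; plain Python, though it would also run in Sage) that estimates $\limsup_n |a_n|^{1/n}$ by evaluating $|a_n|^{1/n}$ for a few large $n$.

```python
# Estimate limsup |a_n|^(1/n) by sampling |a_n|^(1/n) at large n.
# For a_n = 2^n the values equal 2, so rho = 1/2;
# for a_n = 1/n they tend (slowly) to 1, so rho = 1.
sequences = {
    "a_n = 2^n": lambda n: 2.0 ** n,
    "a_n = 1/n": lambda n: 1.0 / n,
}
for name, a in sequences.items():
    print(name, [abs(a(n)) ** (1.0 / n) for n in (10, 100, 1000)])
```

Of course this only suggests the value of the limsup; computing it exactly is a pencil-and-paper task.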
Remarks.
The quantity $\rho$ ($0 \le \rho \le \infty$) in the theorem is called the radius of convergence of the power series. Note that a power series is convergent if and only if $\rho > 0$.
If $\rho = \infty$, then the power series converges everywhere.
It follows from the theorem that a power series converges on an interval. The convergence at the boundary points of this interval, i.e. at $x_0 \pm \rho$, needs to be investigated on a case-by-case basis.
By a simple change of coordinates (in fact, just a horizontal translation), we can assume $x_0 = 0$. This simplifies the statement of several results.
Exercise. Find examples that demonstrate item 3. in the remarks.
Read Proposition 2.6.12 for the algebraic properties of power series.
Morally, a power series can be regarded as a nice function on the interior of its interval of convergence, i.e. on $(x_0 - \rho, x_0 + \rho)$.
Theorem 2. Suppose $f(x) = \sum a_n x^n$ has radius of convergence $\rho > 0$. Then $f(x)$ is both differentiable and has antiderivatives on $(-\rho, \rho)$. Moreover,
$\ds f'(x) = \sum_{n=1}^{\infty} na_nx^{n-1}$.
$\ds \int_0^x f(t) dt = \sum_{n=0}^{\infty} \frac{a_n}{n+1}x^{n+1}$
and these power series all have radius of convergence $\rho$.
We will see a proof of this after developing the theory of series of functions. At this point, let us justify that the radius of convergence of these series is still $\rho$.
Proof. First note that, for any $x \neq 0$, the series $\sum_{n=1}^{\infty} na_nx^{n-1}$ converges if and only if the series $\sum_{n=0}^{\infty} na_nx^n$ converges (the two differ by a factor of $x$). Thus, as power series, they have the same radius of convergence. Since $n^{1/n} \to 1$ (see 3. in Limits of some sequences in this set of notes), we have
$$ \limsup|na_n|^{1/n} = \lim n^{1/n} \limsup |a_n|^{1/n} = \limsup |a_n|^{1/n}$$
Thus, we conclude that the radius of convergence of $\sum_{n=0}^{\infty} na_n x^{n}$, and hence of $\sum_{n=1}^{\infty} na_n x^{n-1}$, is also $\rho$.
Likewise, since $(n+1)^{1/n} \to 1$, we have $$ \limsup \frac{|a_n|^{1/n}}{(n+1)^{1/n}} = \limsup |a_n|^{1/n},$$ hence the radius of convergence of
$$\sum_{n=0}^{\infty} \frac{a_n}{n+1}x^{n+1}$$ is also $\rho$.
Now we give an application of Theorem 2.
Example. The series $\ds \sum \frac{n}{2^n}$ converges by the Ratio test. In fact, we can compute its sum. For $|x| < 1$,
$$ \frac{1}{1-x} = \sum_{n=0}^{\infty} x^n $$
So by Theorem 2, for $|x| \lt 1$,
$$ \frac{1}{(1-x)^2} = \frac{d}{dx}\frac{1}{1-x} = \sum_{n=0}^{\infty} \frac{d}{dx} x^n = \sum_{n=1}^{\infty} nx^{n-1} $$
and so $\ds \frac{x}{(1-x)^2} = \sum_{n=0}^{\infty} n x^n$. Thus, evaluating at $x = 1/2$, we get
$$ \sum_{n=0}^{\infty} \frac{n}{2^n} = \frac{1/2}{(1-1/2)^2} = 2. $$
This technique can be generalized to compute the sum of any series of the form $\ds \sum_{n=0}^{\infty} p(n)/b^n$ where $p(x)$ is a polynomial in $x$ and $b \gt 1$. See Exercise 6 for hints.
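As a quick numeric sanity check of the example (a sketch, assuming nothing beyond plain Python), the partial sums of $\sum n x^n$ at $x = 1/2$ indeed approach the closed-form value $x/(1-x)^2$:

```python
# Partial sums of sum n x^n at x = 1/2 versus the closed form x/(1-x)^2.
x = 0.5
closed_form = x / (1 - x) ** 2                 # = 2.0
partial_sum = sum(n * x ** n for n in range(200))
print(partial_sum, closed_form)                # both print 2.0 up to rounding
```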
The $C^n$ and $D^n$ hierarchy
Let $I$ be an open interval. Let
$D^n(I)$ be the set of $n$ times differentiable functions on $I$ and
$C^n(I)$ be the set of functions on $I$ with continuous $n$-th derivative.
So $C^0(I)$ is the set of continuous functions on $I$. Since differentiable functions are continuous, we have $C^{n-1}(I) \supseteq D^{n}(I) \supseteq C^{n}(I)$ for each $n \ge 1$; that is,
$$ C^0 \supseteq D^{1} \supseteq C^{1} \supseteq D^{2} \supseteq C^{2} \supseteq \cdots$$
Let $C^{\infty}(I) = \bigcap_n C^n(I)$. Note that $C^{\infty}(I) = \bigcap_n D^n(I)$ as well.
We demonstrate, by examples, that each inclusion in the hierarchy above is strict:
The function ($n \ge 1$)
$$ f_n(x) = \begin{cases} x^n & x \ge 0 \\ -x^n & x < 0\end{cases} $$
belongs to $C^{n-1}(\Rr) \setminus D^{n}(\Rr)$. Note that $f_1(x) = |x|$ and $f_{k+1}(x) = (k+1)\int_0^x f_k(t)\, dt$.
The function ($n \ge 1$)
$$ g_n(x) = \begin{cases} x^{2n}\sin(1/x) & x \neq 0 \\ 0 & x = 0 \end{cases} $$
is in $D^{n}(\Rr) \setminus C^{n}(\Rr)$.
Exercise. Compute and graph a few derivatives of $f_n$ and $g_n$ (say for $n = 4$). Convince yourself that the assertions above are true.
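As a starting point for the exercise, here is a small sketch using sympy (an assumption on my part; the same computations can be done with Sage's `diff` and `plot`).

```python
import sympy as sp

x = sp.symbols('x', real=True)
n = 4

# g_4(x) = x^8 sin(1/x) for x != 0; differentiate 4 times away from 0.
g = x ** (2 * n) * sp.sin(1 / x)
g4 = sp.expand(sp.diff(g, x, n))
print(g4)
# The expansion contains a sin(1/x) term with constant coefficient, so
# g_4^{(4)} oscillates between fixed bounds near 0: it exists at 0 (where
# it equals 0) but is not continuous there.

# f_4: differentiate each branch (x^4 for x >= 0, -x^4 for x < 0) 3 times.
print(sp.diff(x ** n, x, n - 1), sp.diff(-x ** n, x, n - 1))
# Output: 24*x and -24*x, i.e. f_4^{(3)}(x) = 24|x|, continuous but not
# differentiable at 0.
```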
Analytic Functions
A function $f(x)$ is (real) analytic at $x_0$ if there is a power series $\sum a_n(x-x_0)^n$ that converges to $f(x)$ on an open neighborhood of $x_0$.
$f(x)$ is analytic on an open subset $D$ of $\Rr$ if it is analytic at every point in $D$.
Most functions appearing in a calculus course are analytic (or at least piecewise analytic).
If a function $f(x)$ is analytic at $x_0$, then $f(x)$ agrees with some power series $\sum a_n(x-x_0)^n$ on an open interval $I$ containing $x_0$. So by Theorem 2, $f$ is $C^{\infty}$ (i.e. has derivatives of all orders) on $I$ and that
\begin{align*} a_0 = f(x_0), \quad a_1 = f'(x_0), \quad 2a_2 = f''(x_0), \quad (3)(2)\,a_3 = f^{(3)}(x_0), \quad \ldots \end{align*}
In general, $\ds (n!)a_n = f^{(n)}(x_0)$. Thus, if $f(x)$ is analytic at $x_0$, then it must be represented by its Taylor series at $x_0$: $$ \sum_{n=0}^{\infty} \frac{f^{(n)}(x_0)}{n!}(x-x_0)^n $$
Consequently, a function analytic at $x_0$ is represented by a unique power series (centered at $x_0$) around $x_0$.
Example/Exercise. A $C^{\infty}$ function on $\Rr$ need not be analytic everywhere. The following function is an example: $$ f(x) = \begin{cases} e^{-1/x} & x > 0;\\ 0 & x\le 0. \end{cases} $$
Check that $f$ is $C^{\infty}$ with $f^{(n)}(0) = 0$ for all $n$. Hence the Taylor series of $f$ at $0$ is identically $0$, yet $f(x) > 0$ for all $x > 0$, so no power series centered at $0$ represents $f$ near $0$; that is, $f$ is not analytic at $0$.
See the page for more details.
# Sage: plot e^(-1/x) for x > 0 together with the zero branch for x <= 0
PP = plot(exp(-1/x), (x, 0, 10)); PN = plot(0, (x, -2, 0)); G = PP + PN; G.show()
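One can also check symbolically that the one-sided limits of the derivatives vanish at $0$; here is a sketch with sympy (again an assumption; Sage's `limit` works similarly):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-1 / x)            # the x > 0 branch of f
for n in range(5):
    # one-sided limit of f^{(n)}(x) as x -> 0+: all come out 0,
    # consistent with f^{(n)}(0) = 0
    print(n, sp.limit(sp.diff(f, x, n), x, 0, '+'))
```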
For $x \in (-1,1)$,
\begin{align*} 1+x+x^2 + \cdots + x^n = \frac{1-x^{n+1}}{1-x} \to \frac{1}{1-x} \qquad (n\to \infty). \end{align*}
Moreover, the series $\sum x^n$ diverges at both $\pm 1$. And so,
$$ \frac{1}{1-x} = \sum_{n=0}^{\infty} x^n \qquad x \in (-1,1).$$
We will define $e^x$ carefully and show that
$$ e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} \qquad x \in \Rr.$$
Exercise. Show that the power series $\sum x^n/n!$ converges everywhere on $\Rr$. (One can take the above equation as the definition of $e^x$.)
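As a numeric illustration (a sketch only, not a proof of convergence), the partial sums of $\sum x^n/n!$ match `math.exp` quickly for any fixed $x$:

```python
import math

def exp_series(x, terms=30):
    """Partial sum of sum x^n/n!, accumulating terms to avoid big factorials."""
    s, term = 0.0, 1.0
    for n in range(terms):
        s += term
        term *= x / (n + 1)
    return s

for x in (-2.0, 0.5, 3.0):
    print(x, exp_series(x), math.exp(x))
```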
Exercise. Using Theorem 2 and the two power series above, deduce the power series that represent the following functions about $x=0$, and for each of them find the interval on which the power series represents the function. (A numeric sanity check follows the list.)
$\arctan(x)$.
$\ln(1+x)$.
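Once a series is derived, it is easy to sanity-check numerically. For instance, for $\ln(1+x)$ (whose series also appears in the example at the end of this section), a minimal sketch:

```python
import math

# Partial sums of sum_{n>=1} (-1)^(n+1) x^n / n versus log(1+x) on (-1, 1).
def log1p_series(x, terms=500):
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, terms + 1))

for x in (-0.5, 0.3, 0.9):
    print(x, log1p_series(x), math.log(1 + x))
```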
Theorem 3. (Abel's Theorem on power series) If a power series $f(x)=\sum a_n x^n$ with radius of convergence $1$ converges at $x=1$, i.e. $L:=\sum a_n$ exists, then $\lim_{x \to 1^-} f(x) = L$.
Proof. Replacing $a_0$ by $a_0 - L$, we can assume $L = 0$. For $|x| \lt 1$, since $\sum x^n$ converges absolutely, by Mertens' Theorem
$$ \frac{f(x)}{1-x} = \left(\sum x^m\right)\left(\sum a_nx^n\right) = \sum_{k=0}^{\infty} s_k x^k $$ where $s_k = \sum_{n=0}^k a_n$ is the $k$-th partial sum of $\sum a_n$. So $f(x) = (1-x)\sum_{k=0}^{\infty} s_k x^k$.
Since $s_k \to 0$, for any $\varepsilon > 0$, there exists $N$, such that $|s_k| < \varepsilon$ whenever $k \ge N$. Thus, for $0 < x < 1$,
$$ \left|(1-x)\sum_{k=N}^{\infty} s_k x^k\right| < \varepsilon (1-x) \frac{x^N}{1- x} = \varepsilon x^N < \varepsilon $$
and
$$ \left|(1-x)\sum_{k=0}^{N-1} s_k x^k\right| \le B (1-x) \frac{1- x^{N}}{1-x} = B(1-x^N) $$ where $B = \max\{|s_0|, \ldots, |s_{N-1}|\}$. For $0< x< 1$ sufficiently close to $1$, we have $1-x^N < \varepsilon$, and so $|f(x)| < \varepsilon(B+1)$.
Since $\varepsilon >0$ is arbitrary, this shows that $\lim_{x\to 1^-} f(x) = 0$.
Remark.
- This can easily be generalized to the following result:
suppose $f(x) = \sum_{n=0}^{\infty} a_n(x-x_0)^n$ has radius of convergence $R > 0$ and the power series converges at $x_0+R$ (resp. $x_0-R$), then $\lim_{x \to (x_0+R)^-} f(x) = \sum a_n R^n$ (resp. $\lim_{x \to (x_0-R)^+} f(x) = \sum a_n(-R)^n$).
Example.
On $(-1,1)$
$$\ln(1+x) = \sum \frac{(-1)^{n+1}x^n}{n} = x - \frac{x^2}{2} + \frac{x^3}{3} - \cdots$$
and the radius of convergence of the series is $1$. Since $\sum \frac{(-1)^{n+1}}{n}$ converges (Alternating Series Test), by Abel's Theorem on power series and the fact that $\ln(1+x)$ is continuous at $x=1$,
$$1- \frac{1}{2} + \frac{1}{3} - \cdots = \lim_{x \to 1^-} \ln(1+x) = \ln(2).$$
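A quick numeric check of this conclusion (a sketch; note the convergence is slow, with error roughly $1/(2N)$ after $N$ terms):

```python
import math

# Partial sums of 1 - 1/2 + 1/3 - ... versus ln(2).
for N in (10, 100, 1000):
    s = sum((-1) ** (n + 1) / n for n in range(1, N + 1))
    print(N, s, abs(s - math.log(2)))
```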