$\newcommand{\ds}{\displaystyle}$ $\newcommand{\Rr}{\mathbb{R}}$
Taylor's Theorem¶
Read Section 4.3
For an $n$-times differentiable function defined near a point $x_0$, the $n$-th Taylor polynomial for $f$ at $x_0$ is
$$ P_n(x;x_0) := \sum_{k=0}^n \frac{f^{(k)}(x_0)}{k!}(x-x_0)^k. $$
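To make the definition concrete, here is a minimal sketch (assuming the sympy library is available; `taylor_poly` is a hypothetical helper, not part of these notes) that builds $P_n(x;x_0)$ directly from the formula above.

```python
# A minimal sketch: build the n-th Taylor polynomial P_n(x; x0)
# term by term from the definition (assumes sympy is installed).
import sympy as sp

x = sp.symbols('x')

def taylor_poly(f, x0, n):
    """Return P_n(x; x0) = sum_{k=0}^n f^(k)(x0)/k! * (x - x0)^k."""
    return sum(sp.diff(f, x, k).subs(x, x0) / sp.factorial(k) * (x - x0)**k
               for k in range(n + 1))

# Example: the 5th Taylor polynomial of sin at 0.
print(sp.expand(taylor_poly(sp.sin(x), 0, 5)))  # x - x**3/6 + x**5/120 (up to term order)
```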
Taylor's Theorem Suppose $f$ has derivatives up to order $n+1$ on some interval $(a,b)$ containing $x_0$. Then for any $x \in (a,b)$, there exists $c$ between $x_0$ and $x$ such that
$$ f(x) = P_n(x;x_0) + \frac{f^{(n+1)}(c)}{(n+1)!}(x-x_0)^{n+1}. $$
The term $R_n(x;x_0):=\dfrac{f^{(n+1)}(c)}{(n+1)!}(x-x_0)^{n+1}$ is the Lagrange form of the remainder.
There is also an integral form of the remainder term that we will meet later.
Remarks
- When $n=0$, i.e. when we simply assume $f$ is differentiable on $(a,b)$, Taylor's Theorem reduces to the Mean Value Theorem (MVT).
- For simplicity, we often suppress $x_0$ from appearing in $P_n(x;x_0)$ and $R_n(x;x_0)$.
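As a quick sanity check on the Lagrange form of the remainder, the sketch below (plain Python, hypothetical helper names) verifies numerically that $|f(x) - P_n(x;x_0)|$ stays within the bound $\max|f^{(n+1)}|\,|x-x_0|^{n+1}/(n+1)!$ for $f=\sin$, whose derivatives are all bounded by $1$.

```python
# Check the Lagrange remainder bound for f = sin, x0 = 0:
# |sin(x) - P_n(x; 0)| <= |x|^(n+1) / (n+1)!  since |sin^(k)| <= 1.
import math

def taylor_sin(xv, n):
    """P_n(x; 0) for sin: only odd powers appear."""
    return sum((-1)**k * xv**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(n // 2 + 1))

xv, n = 1.3, 7
actual = abs(math.sin(xv) - taylor_sin(xv, n))
bound = abs(xv)**(n + 1) / math.factorial(n + 1)
print(actual <= bound, actual, bound)  # True, actual well under the bound
```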
Let $f$ be a function having at least $n$ derivatives in an open interval $I$.
We say that $x_0 \in I$ is a zero of multiplicity $n$ of a function $f$ if
$$ f(x_0) = f'(x_0) = \cdots = f^{(n-1)}(x_0) = 0$$
but $f^{(n)}(x_0) \neq 0$.
N.B. a zero of $f$ of multiplicity 0 is simply not a zero of $f$.
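Following this definition, one can compute multiplicities mechanically by counting vanishing derivatives; here is a sketch (assuming sympy; `multiplicity` is a hypothetical name).

```python
# Sketch: the multiplicity of a zero x0 is the number of consecutive
# derivatives f, f', f'', ... that vanish at x0 (assumes sympy).
import sympy as sp

x = sp.symbols('x')

def multiplicity(f, x0):
    """Smallest n with f(x0) = ... = f^(n-1)(x0) = 0 but f^(n)(x0) != 0."""
    n = 0
    while sp.diff(f, x, n).subs(x, x0) == 0:
        n += 1
    return n

# (x - 1)^2 (x + 2) has a zero of multiplicity 2 at x0 = 1.
print(multiplicity(x**3 - 3*x + 2, 1))  # 2
```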
Lemma. Suppose $g \colon I \to \Rr$ has $n+1$ derivatives on $I$. Suppose
- $g(t)$ has a zero $x_0 \in I$ of multiplicity at least $n+1$, and
- $g(t)$ has a zero $x \in I$ other than $x_0$.
Then $g^{(n+1)}(t)$ has a zero somewhere strictly between $x$ and $x_0$.
Proof. By Rolle's Theorem, $g'(t_1) = 0$ for some $t_1$ strictly between $x$ and $x_0$.
Since $x_0$ is a zero of multiplicity at least $n$ for $g'$, by Rolle's Theorem again, $g''$ has a zero $t_2$ strictly between $t_1$ and $x_0$ (hence strictly between $x$ and $x_0$).
Note that $x_0$ is a zero of multiplicity at least $n-k+1$ for $g^{(k)}$ and $g^{(k)}$ is differentiable.
So this argument can be repeated until $k=n$, resulting in a zero $t_{n+1}$ of $g^{(n+1)}$ strictly between $x$ and $x_0$.
Proof of Taylor's Theorem. The theorem is trivial if $x_0=x$ (take $c=x=x_0$). So, assume $x \neq x_0$.
Let $K$ be the constant so that the function
$$g(t):=f(t)-P_n(t;x_0)-K(t-x_0)^{n+1}$$ vanishes at $t=x$. Such a $K$ exists because $x \neq x_0$; explicitly, $K = \dfrac{f(x)-P_n(x;x_0)}{(x-x_0)^{n+1}}$.
It is easy to check that $g(t)$ satisfies the assumptions in the Lemma: $g(x) = 0$ by the choice of $K$, and $g^{(k)}(x_0) = 0$ for $0 \le k \le n$ (the derivatives of $f$ and of $P_n(t;x_0)$ up to order $n$ agree at $x_0$, while those of $K(t-x_0)^{n+1}$ all vanish there). Hence there exists some $c$ strictly between $x$ and $x_0$ such that $g^{(n+1)}(c) = 0$.
Since $P_n(t;x_0)$ is a polynomial in $t$ of degree at most $n$, its $(n+1)$-th derivative vanishes. Therefore, $g^{(n+1)}(t) = f^{(n+1)}(t) - K(n+1)!$.
And so, $0 = g^{(n+1)}(c) = f^{(n+1)}(c)-K(n+1)!$. That is, $K = \dfrac{f^{(n+1)}(c)}{(n+1)!}$. This completes the proof.
Remark If $f$ is merely infinitely differentiable at $x_0$, there is no guarantee that its Taylor series at $x_0$ is convergent. And even if the series converges at $x$, there is no guarantee that it converges to $f(x)$.
We have seen an example of a function $f$ (in Notes 02) whose Taylor series at $0$ is identically zero. Since $f(x) > 0$ for all positive $x$, $f$ does not agree with its Taylor series on any neighborhood of $0$, and so $f$ is not analytic at $0$.
Example. (Taken from R. Boas and H. Boas' A Primer of Real Functions)
Consider the function $f$ defined by the integral
$$ f(x) = \int_0^{\infty} e^{-t}\cos(t^2x) dt.$$
Assuming one can differentiate under the integral sign (this will be justified later), it is easy to check that for even $n$,
$$f^{(n)}(0) = \pm \int_0^{\infty} t^{2n}e^{-t}dt = \pm(2n)!$$
and $f^{(n)}(0) =0$ for odd $n$.
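(The integral is just $\Gamma(2n+1) = (2n)!$.) For a numerical sanity check of these moments, assuming scipy is available:

```python
# Check numerically that the moments of e^(-t) are factorials:
# integral of t^(2n) e^(-t) over [0, inf) equals (2n)!.
import math
from scipy.integrate import quad

for n in range(4):
    val, _err = quad(lambda t, n=n: t**(2 * n) * math.exp(-t), 0, math.inf)
    print(n, val, math.factorial(2 * n))  # val matches (2n)! closely
```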
Exercise. Show that the Taylor series of the function $f$ in the example above diverges for every $x \neq 0$.
Hint. Use the comparison test and the root test. Note that since the coefficients of the odd powers of $x$ are all zero, the ratio test is not applicable.
In fact, something much more surprising called the Borel-Peano Theorem is true.
Every power series is the Taylor series of some infinitely differentiable function.
See this article for a proof.
If the remainder term $R_n(x;x_0) \to 0$ as $n \to \infty$, then Taylor's Theorem asserts that $f(x)$ agrees with its Taylor series at $x$. Here is a simple sufficient criterion for that to happen.
Corollary. If $f^{(n)}$ ($n \ge 1$) are uniformly (in $n$) bounded between $x$ and $x_0$, then $\ds \lim_n R_n(x;x_0) =0$.
Proof. The assumption means there is a $B > 0$ such that for any $n$ and for any $c$ between $x$ and $x_0$, $|f^{(n)}(c)| \le B$. Thus, as $n \to \infty$,
$$ |R_n(x;x_0)| \le \frac{B}{(n+1)!}|x-x_0|^{n+1} \to 0 $$
because $|x-x_0|^{n}/n! \to 0$ as $n \to \infty$. (See MAT401 notes here)
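(As a one-line numeric reminder of why factorials beat powers, even for a large base:)

```python
# x^n / n! rises at first but eventually collapses to 0; here x = 10.
import math
for n in (10, 20, 40):
    print(n, 10.0**n / math.factorial(n))  # ~2756, ~41, ~1.2e-8
```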
Example. As an example, let us show that the Taylor series of $e^x$ at $0$ converges to $e^{x_0}$ for every $x_0 \in \mathbb{R}$, i.e. $e^x$ is analytic on the whole real line.
Take any closed bounded interval $I$ that contains both $0$ and $x_0$. By continuity of $e^x$, it is bounded by some $B$ on $I$. But every derivative of $e^x$ is itself, so they are all bounded by $B$ on $I$, hence the assertion follows from the corollary above.
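Concretely, the partial sums can be watched converging numerically; a minimal sketch:

```python
# Partial sums of the Taylor series of e^x at 0 converge to exp(x); x = 3.
import math

xv, s, term = 3.0, 0.0, 1.0
for n in range(30):
    s += term              # after this line, s = P_n(x; 0)
    term *= xv / (n + 1)   # next term: x^(n+1) / (n+1)!
print(s, math.exp(xv))     # agree to machine precision
```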
Theorem (Binomial series) For $\alpha \in \Rr$,
$$(1+x)^{\alpha} = \sum_{n=0}^{\infty} \binom{\alpha}{n}x^n \qquad \text{for}\ |x| < 1$$
where $\ds \binom{\alpha}{n} = \frac{\alpha(\alpha-1) \cdots (\alpha-n+1)}{n!}.$
Sketch of Proof.
Show that $\lim_n \left| \binom{\alpha}{n}/\binom{\alpha}{n+1} \right|=1$. Deduce from this that the radius of convergence of the binomial series above is $1$.
For $n \ge 1$, check that $\ds n\binom{\alpha}{n}=\alpha\binom{\alpha-1}{n-1}$.
For $n \ge 1$, check that $\ds \binom{\alpha-1}{n} + \binom{\alpha-1}{n-1} = \binom{\alpha}{n}$.
Let $f(x)$ be the function defined by $\ds \sum_{n=0}^{\infty} \binom{\alpha}{n}x^n$ on $|x| < 1$. Show that $f(x)$ satisfies the differential equation $$ (1+x)f'(x) = \alpha f(x).$$
Deduce that $f(x) = (1+x)^{\alpha}$ for $|x| < 1$.
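A short numerical check of the conclusion (plain Python; `binom` is a hypothetical helper implementing the generalized binomial coefficient above):

```python
# Partial sums of the binomial series vs (1 + x)^alpha for alpha = 1/2.
def binom(alpha, n):
    """Generalized binomial coefficient alpha(alpha-1)...(alpha-n+1)/n!."""
    c = 1.0
    for k in range(n):
        c *= (alpha - k) / (k + 1)
    return c

alpha, xv = 0.5, 0.3
s = sum(binom(alpha, n) * xv**n for n in range(60))
print(s, (1 + xv)**alpha)  # both ~ 1.1401754...
```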
Remark.
The convergence of this power series at the endpoints $x = \pm 1$ depends on $\alpha$. (A proof can be found here)
It follows from the theorem above on the binomial series that the function $f(x)=\sqrt{1+x}$ ($\alpha =1/2$) is analytic at $0$. However, this cannot be deduced from the Corollary to Taylor's Theorem, as the derivatives of $f$ are not uniformly bounded on $(-1,1)$.
Raabe Test (optional)¶
Proposition (Raabe Test)
(Modified slightly from Bartle and Sherbert's Introduction to Real Analysis)
Let $(x_n)$ be a sequence of nonzero real numbers.
(a) If there exists a number $a > 1$ such that
$$ \left| \frac{x_{n+1}}{x_n}\right| \le 1- \frac{a}{n+1} \qquad \text{for}\quad n \gg 0, \tag{1} $$ then $\sum x_n$ is absolutely convergent.
(b) If there exists a number $a \le 1$ such that
$$ \left| \frac{x_{n+1}}{x_n}\right| \ge 1- \frac{a}{n+1} \qquad \text{for}\quad n \gg 0, \tag{2} $$ then $\sum x_n$ is not absolutely convergent.
Proof. Part (a). By throwing away finitely many terms from $(x_n)$, we can assume (1) holds for all $n \ge 1$ and hence,
$$ (k+1)|x_{k+1}| \le (k+1)|x_k| -a|x_k|$$
for $k \ge 1$.
On reorganizing the inequality, we have
$$ 0 < (a-1)|x_k| \le k|x_k| - (k+1)|x_{k+1}| \tag{3}$$
This shows that $(k|x_{k}|)$ is a decreasing sequence.
Adding (3) for $k=1, \ldots, n$ (note that it is a telescoping sum), we get
$$ 0< (a-1)(|x_1| + \cdots + |x_n|) \le |x_1| - (n+1)|x_{n+1}| \le |x_1|.$$
This shows that the partial sums of $\sum |x_n|$ are bounded above and hence $\sum x_n$ converges absolutely.
Part (b). Again, by throwing away finitely many terms, we can assume (2) holds for all $n \ge 1$. Thus,
$$ (n+1)|x_{n+1}| \ge (n+1)|x_n|-a|x_n| \ge n|x_n| \quad \forall n \ge 1, $$
where the last inequality uses $a \le 1$. That means the sequence $(n|x_n|)$ is non-decreasing and that
$$ |x_n| \ge |x_1|/n $$
for all $n$. Since $x_k \neq 0$ for each $k$, we have $|x_1| > 0$ (even after throwing finitely many terms away).
Since the harmonic series is divergent, we conclude that $\sum |x_n|$ is divergent by the comparison test.
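To see both parts of the test in action, take $x_n = 1/n^p$: then $|x_{n+1}/x_n| = (n/(n+1))^p = 1 - p/(n+1) + O(1/n^2)$, so the quantity $(n+1)(1-|x_{n+1}/x_n|)$ tends to $p$, matching the fact that $\sum 1/n^p$ converges iff $p > 1$. A quick numeric illustration:

```python
# Raabe's quantity (n+1)(1 - |x_{n+1}/x_n|) for x_n = 1/n^p tends to p.
def raabe(xn, xn1, n):
    return (n + 1) * (1 - abs(xn1 / xn))

for p in (0.5, 1.0, 2.0):
    n = 10**6
    print(p, raabe(n**-p, (n + 1)**-p, n))  # approximately p in each case
```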
Corollary (Limit form of Raabe's Test)
Suppose $(x_n)$ is a sequence of nonzero real numbers and that
$$ a:=\lim_n\ (n+1)\left(1-\left|\frac{x_{n+1}}{x_n}\right|\right) $$
exists. Then $\sum x_n$ converges absolutely if $a > 1$ and $\sum x_n$ does not converge absolutely if $a < 1$.
Proof. We just prove the case $a > 1$ and leave the case $a < 1$ as an exercise.
Since $a > 1$, we have $1< a_1 := (1+a)/2 < a$.
Thus, for $n \gg 0$,
$$ a_1 < (n+1)\left( 1 - \left|\frac{x_{n+1}}{x_n}\right|\right)$$
That is, for $n \gg 0$,
$$ \left|\frac{x_{n+1}}{x_n}\right| < 1- \frac{a_1}{n+1},$$
and hence $\sum x_n$ converges absolutely by Raabe's Test.
Remark. Note that
$$ n\left( 1 - \left|\frac{x_{n+1}}{x_n}\right|\right) = \frac{n}{n+1} (n+1)\left( 1 - \left|\frac{x_{n+1}}{x_n}\right|\right).$$
and $\lim n/(n+1) = 1$, so
$$\lim_n (n+1)\left( 1 - \left|\frac{x_{n+1}}{x_n}\right|\right) =\lim _n n\left( 1 - \left|\frac{x_{n+1}}{x_n}\right|\right)$$
if either limit exists.
A straightforward computation shows that for $n > \alpha$ (assume $\alpha$ is not a nonnegative integer, so the coefficients are nonzero; otherwise the series terminates),
\begin{align*} (n+1) \left( 1- \left|\binom{\alpha}{n+1}/\binom{\alpha}{n}\right|\right) &= (n+1)-|\alpha -n|\\ &= n+1-(n-\alpha) = 1+ \alpha. \end{align*}
When $\alpha > 0$, this limit is $1+\alpha > 1$. Thus it follows from Raabe's Test that for $\alpha > 0$,
the series $\sum \binom{\alpha}{n}$, that is, the binomial series for $(1+x)^{\alpha}$ at $x=1$, converges absolutely.
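The identity above can also be observed numerically (plain Python, hypothetical names):

```python
# For binomial coefficients, |binom(alpha, n+1) / binom(alpha, n)| equals
# |alpha - n| / (n + 1), so Raabe's quantity is exactly 1 + alpha once n > alpha.
for alpha in (0.5, 1.5, 3.2):
    n = 100
    ratio = abs(alpha - n) / (n + 1)
    print(alpha, (n + 1) * (1 - ratio))  # prints 1 + alpha
```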