Start with this video for an overview of the subject of Taylor series. The video introduces the subjects we discuss in this course and will hopefully motivate you to study more about series solutions to differential equations.
Every differentiable solution we determined in our study of differential equations thus far can be written as a Taylor series or power series
$$y(t) = \displaystyle\sum_{k=0}^{\infty} a_k\left(t - t_0\right)^k.$$
For example, a fundamental solution of the form $e^{rt}$ can be expressed using the Maclaurin series
$$e^{rt} = \displaystyle\sum_{k=0}^{\infty} \dfrac{(rt)^k}{k!} = \displaystyle\sum_{k=0}^{\infty} \dfrac{r^k}{k!}\left(t - 0\right)^k = \displaystyle\sum_{k=0}^{\infty} a_k\left(t - 0\right)^k.$$
A Maclaurin series is a Taylor series where $t_0=0$. In the series for $e^{rt}$, $a_k = \dfrac{r^k}{k!}$. It is common to refer to Taylor series simply as **power series** and to use the terms Taylor series and Maclaurin series only when it is necessary to distinguish between the two types of power series.
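To see this series in action numerically, here is a short Python sketch; the helper name `maclaurin_exp` and the 20-term cutoff are choices made here for illustration.

```python
import math

def maclaurin_exp(r, t, n_terms=20):
    """Partial sum sum_{k=0}^{n_terms-1} (r t)^k / k! of the Maclaurin series for e^(rt)."""
    return sum((r * t) ** k / math.factorial(k) for k in range(n_terms))

# Twenty terms already agree with math.exp to roughly machine precision here.
print(maclaurin_exp(0.5, 2.0), math.exp(1.0))
```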
We are going to use differential and integral calculus to study, solve, and compute solutions to our differential equations. Keep in mind that we will rely extensively on your previous mathematics courses. In this chapter we will need the definitions and theorems concerning Taylor series. This includes the Taylor Remainder Theorem, Taylor's Theorem for determining the coefficients $a_k$ of a smooth function, and the Ratio Test for determining where on the real line a power series converges. We might need the Comparison Tests, Integral Test, Root Test, or Weierstrass M-Test as well, so keep them handy. Finally, we will need the definitions of absolute convergence and conditional convergence. Let us start by reviewing series and these theorems.
A **sequence** is a list of numbers in a definite order
$$a_0, a_1, a_2, \dots , a_n, \dots$$
If the sequence ends after a finite number of elements
$$a_0, a_1, a_2, \dots , a_n$$
then it is a **finite sequence**. If the list has an element for every non-negative integer, then it is an **infinite sequence**.
Mathematicians get tired of writing lists with ellipses, so instead we write
$$\left\{a_k : k\in\mathbb{Z}_{\geq 0} \right\} = \left\{a_k : k=0,1,2,\dots\right\} = \left\{a_k\right\}_{k=0}^{\infty}.$$
Curly brackets can be tiresome as well so you may see articles and textbooks that denote sequences
$$\left(a_k\right)_{k=0}^{\infty} = \left(a_k\right).$$
The sequence defines for us a function $a$ from the non-negative integers $\mathbb{Z}_{\geq 0}$ to the real (or complex) numbers,
$$a:\mathbb{Z}_{\geq 0}\rightarrow\mathbb{R},$$
defined by the algebraic expression
$$a(k) = a_k,\text{ for every }k\in\mathbb{Z}_{\geq 0}.$$
Since there are an infinite number of elements in the list, we cannot "list" them all, so we usually use a function to describe all of the elements. For example, if the function is defined by
$$a(k) = \left(\dfrac{1}{2}\right)^k,$$
then the graph of our function $a$ has a **horizontal asymptote**, $L=0$. This is an example of a **geometric sequence**.
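Here is a quick Python sketch of the first few terms, which makes the approach to $L=0$ visible:

```python
# Terms a(k) = (1/2)^k for k = 0, 1, ..., 9: each term is half the previous one,
# so the values march down toward the horizontal asymptote L = 0.
terms = [(1 / 2) ** k for k in range(10)]
print(terms)
```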
We need more formal language for this phenomenon. A sequence is said to converge if there is a real number $L$, called the **limit**, so that for some finite number $N$, $|a_k-L|$ is small whenever $k \gt N$. It is common to use the Greek letters $\epsilon$ and $\delta$ to represent small numbers. Our definition for the limit of a sequence is
Definition of Convergent Sequence ¶
A sequence $\left\{a_k\right\}$ is said to converge to a real number $L$, called the **limit**, if and only if for every $\epsilon \gt 0$, there is a positive integer $N \gt 0$, so that $k \gt N$ implies that $\left|L-a_k\right| \lt \epsilon$. We write the limit of a sequence
$$\displaystyle\lim_{k\to\infty} a_k = L.$$
How do we show that the geometric sequence $\left\{a_k\right\} = \left\{\left(\dfrac{1}{2}\right)^k\right\}$ converges to $0$?
We work backward using reversible algebraic steps. We want
$$ \left|0 - a_k\right| = \left|a_k\right| = \left|\left(\dfrac{1}{2}\right)^k\right| = \left(\dfrac{1}{2}\right)^k \lt \epsilon,$$
so our job is to find a $k$ large enough so that $\left(\frac{1}{2}\right)^k \lt \epsilon$. Both sides of this inequality are positive so computing the logarithm of both sides results in
$$\log\left(\left(\dfrac{1}{2}\right)^k\right) = -k\log(2)\quad \lt \quad \log(\epsilon) = -\log\left(\dfrac{1}{\epsilon}\right),$$
where we use the fact that $\epsilon \gt 0$ is **small**, that is $0 \lt \epsilon \lt 1$. Dividing both sides by $-\log(2)$ (which reverses the inequality),
$$ 0 \lt -\dfrac{\log(\epsilon)}{\log(2)} \lt k.$$
Since $\epsilon$ is **small**, it follows that $\frac{1}{\epsilon}$ is **big**. Moreover, we know from the graph of the function $f(x)=\frac{1}{x}$ that as $\epsilon\rightarrow 0$, $\frac{1}{\epsilon}\rightarrow\infty$. Since $0 \lt \dfrac{\log\left(\frac{1}{\epsilon}\right)}{\log(2)} \lt k$, the smaller $\epsilon$ gets, the larger $k$ must be. As we never run out of positive integers, we follow our algebraic steps backward and get the following.
Prove that the geometric sequence $\left\{a_k\right\} = \left\{\left(\dfrac{1}{2}\right)^k\right\}$ converges to $0$.
For every $\epsilon \gt 0$, choose $N \gt -\dfrac{\log(\epsilon)}{\log(2)}$. Then $k \gt N$ implies that $k \gt -\dfrac{\log(\epsilon)}{\log(2)}$. Thus
$$\begin{align*}
-k\log(2) &\lt \log(\epsilon) \\
\\
\log\left(2^{-k}\right) &\lt \log(\epsilon) \\
\\
2^{-k} &\lt \epsilon \\
\\
\left|\ 0 - \left(\frac{1}{2}\right)^k\ \right| &= 2^{-k} \lt \epsilon.
\end{align*}$$
Since this inequality holds for every $\epsilon \gt 0$, **no matter how small**, the sequence
$$\left\{\left(\dfrac{1}{2}\right)^k\right\}_{k=0}^{\infty}$$
converges to the limit $L = 0$.
∎
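The proof's choice of $N$ can be turned into a tiny Python check (a sketch; the function name `N_for_epsilon` is ours): given $\epsilon$, take the smallest integer $N \gt -\dfrac{\log(\epsilon)}{\log(2)}$ and confirm that every later term is within $\epsilon$ of the limit $0$.

```python
import math

def N_for_epsilon(eps):
    """Smallest integer N with N > -log(eps)/log(2), as chosen in the proof."""
    return math.floor(-math.log(eps) / math.log(2)) + 1

eps = 1e-6
N = N_for_epsilon(eps)
# Every term with index k > N lies within eps of the limit 0.
assert all(abs(0 - (1 / 2) ** k) < eps for k in range(N + 1, N + 50))
print(N)
```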
If $r \gt 0$ is a real number, then the sequence $\left\{r^k\right\}_{k=0}^{\infty}$ is called a **geometric sequence**. If $r \gt 1$, then the sequence does not converge; in this case the sequence is said to **diverge**. One can see that the geometric sequence does not converge by looking at the graph of the geometric sequence for $r=2$.
The definition of a sequence that diverges to infinity is, in a sense, the opposite of the definition of a convergent sequence.
Definition of Divergent Sequence ¶
A sequence $\left\{a_k\right\}$ is said to **diverge to infinity** if for every positive number $M \gt 0$, there is a positive integer $N \gt 0$, so that $k \gt N$ implies that $\left|a_k\right| \gt M$. In this case we write the limit
$$\displaystyle\lim_{k\to\infty} a_k = \infty.$$
If $r=1$, then the sequence is a constant sequence and converges to the constant $1$, that is $\displaystyle\lim_{k\to\infty} 1^k = 1$.
Show that if $0 \lt r \lt 1$, then the geometric sequence converges to $L=0$.
For any $0 \lt \epsilon \lt 1$, choose $N \gt \dfrac{\log(\epsilon)}{\log(r)}$. Since $0 \lt r \lt 1$ and $0 \lt \epsilon \lt 1$, both $\log(\epsilon)$ and $\log(r)$ are negative, so this quotient is positive. Then $k \gt N$ implies that $k \gt \dfrac{\log(\epsilon)}{\log(r)}$. Multiplying both sides by the negative number $\log(r)$ reverses the inequality:
$$\begin{align*}
k\log(r) &\lt \log(\epsilon) \\
\\
\log\left(r^k\right) &\lt \log(\epsilon) \\
\\
r^k &\lt \epsilon \\
\\
\left|\ 0 - r^k\ \right| &= r^k \lt \epsilon.
\end{align*}$$
Hence the sequence converges to the limit $L = 0$. ∎
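The same choice of $N$ works numerically for any $0 \lt r \lt 1$; this Python spot-check (a sketch, with the helper name `N_for` chosen here) tries several values of $r$:

```python
import math

def N_for(r, eps):
    """Smallest integer N with N > log(eps)/log(r); both logs are negative here."""
    return math.floor(math.log(eps) / math.log(r)) + 1

eps = 1e-4
for r in (0.1, 0.5, 0.9):
    N = N_for(r, eps)
    # Past index N, the terms r^k have all dropped below eps.
    assert all(r ** k < eps for k in range(N + 1, N + 20))
```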
In Calculus I we learned that there are algebraic ways to combine the limits of functions to obtain the limits of algebraic combinations of functions. Since sequences are functions whose inputs are the non-negative integers, the limit laws for functions apply to sequences as well.
Limit Laws for Sequences ¶
If $\{a_n\}$ and $\{b_n\}$ are convergent sequences and $c$ is a constant, then
- The limit of a sum (of sequences) is the sum of the limits (of those sequences).
$$ \displaystyle\lim_{n\to\infty} \left(a_n + b_n\right) = \displaystyle\lim_{n\to\infty} a_n + \displaystyle\lim_{n\to\infty} b_n $$
- The limit of a difference (of sequences) is the difference of the limits (of those sequences).
$$ \displaystyle\lim_{n\to\infty} \left(a_n - b_n\right) = \displaystyle\lim_{n\to\infty} a_n - \displaystyle\lim_{n\to\infty} b_n $$
- The limit of a constant times a sequence is that constant times the limit of the sequence.
$$ \displaystyle\lim_{n\to\infty} \left(ca_n\right) = c\displaystyle\lim_{n\to\infty} a_n $$
- The limit of a constant (sequence) is that constant.
$$ \displaystyle\lim_{n\to\infty} c = c $$
- The limit of a product (of sequences) is the product of the limits (of those sequences).
$$ \displaystyle\lim_{n\to\infty} \left(a_n\cdot b_n\right) = \displaystyle\lim_{n\to\infty} a_n\cdot \displaystyle\lim_{n\to\infty} b_n $$
- The limit of a quotient (of sequences) is the quotient of the limits (of those sequences), if $\{b_n\}\nrightarrow 0$.
$$ \displaystyle\lim_{n\to\infty} \left(\dfrac{a_n}{b_n}\right) = \dfrac{\displaystyle\lim_{n\to\infty} a_n}{\displaystyle\lim_{n\to\infty} b_n},\ \ \text{ if } \displaystyle\lim_{n\to\infty} b_n\neq 0 $$
- The limit of an exponentiated sequence is the exponentiated limit of the sequence, if $p \gt 0$ and $\{a_n\}\rightarrow L \gt 0$.
$$ \displaystyle\lim_{n\to\infty} a_n^p = \left[\displaystyle\lim_{n\to\infty} a_n\right]^p,\ \ \text{if } p \gt 0\ \text{ and }\{a_n\}\rightarrow L \gt 0 $$
We also have a couple of important theorems.
The Squeeze Theorem ¶
If for some positive integer $N$, $n \gt N$ implies that $a_n \leq b_n \leq c_n$, and the sequences $\{a_n\}$ and $\{c_n\}$ converge to the same limit $L$, that is $\displaystyle\lim_{n\to\infty}a_n = L = \displaystyle\lim_{n\to\infty}c_n$, then the sequence between them, $\{b_n\}$, converges also and $\displaystyle\lim_{n\to\infty}b_n = L$.
The Absolute Convergence Theorem for Sequences ¶
If $\displaystyle\lim_{n\to\infty}\left|a_n\right| = 0$, then $\displaystyle\lim_{n\to\infty}a_n = 0$.
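For example, the alternating sequence $a_n=\left(-\frac{1}{2}\right)^n$ changes sign at every step, but $|a_n| = \left(\frac{1}{2}\right)^n \rightarrow 0$, so the theorem gives $a_n \rightarrow 0$. A quick Python sketch:

```python
# a_n = (-1/2)^n oscillates in sign, but |a_n| = (1/2)^n -> 0,
# so by the Absolute Convergence Theorem the sequence itself converges to 0.
terms = [(-1 / 2) ** n for n in range(30)]
signs_alternate = all(terms[n] * terms[n + 1] < 0 for n in range(29))
print(signs_alternate, abs(terms[-1]))
```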
Let us practice finding the limit of a few sequences.
Find $\displaystyle\lim_{n\to\infty} \dfrac{n}{n+1}$.
The method we will use here is to treat the sequence as a function (whose inputs are the positive integers) and apply the limit laws to this function.
$$\begin{align*}
\displaystyle\lim_{n\to\infty} \dfrac{n}{n+1} &= \displaystyle\lim_{n\to\infty} \dfrac{n}{n+1}\cdot\dfrac{\frac{1}{n}}{\frac{1}{n}} = \displaystyle\lim_{n\to\infty} \dfrac{1}{1+\frac{1}{n}} \\
\\
&= \dfrac{\displaystyle\lim_{n\to\infty} 1}{\displaystyle\lim_{n\to\infty} 1 + \displaystyle\lim_{n\to\infty} \frac{1}{n}} = \dfrac{1}{1 + 0} = 1.
\end{align*}$$
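The computation above can be spot-checked numerically with a short Python sketch:

```python
# Terms of n/(n+1) for growing n creep up toward the limit 1.
for n in (10, 100, 1000, 10_000):
    print(n, n / (n + 1))
# The distance to the limit is 1/(n+1), which shrinks to 0.
```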
Is the sequence $a_n = \dfrac{n}{\sqrt{10 + n}}$ convergent or divergent?
Calculate $\displaystyle\lim_{n\to\infty} \dfrac{\log n}{n}$.
Series are sequences whose elements are sums of the terms of another sequence. Typically we start with an infinite sequence $\{a_n\}$ and create a new sequence $\{s_n\}$ as follows:
$$\begin{align*}
s_0 &= a_0 \\
\\
s_1 &= a_0 + a_1 \\
\\
s_2 &= a_0 + a_1 + a_2 \\
\\
s_3 &= a_0 + a_1 + a_2 + a_3 \\
\\
s_4 &= a_0 + a_1 + a_2 + a_3 + a_4 \\
&\ddots \\
s_n &= a_0 + a_1 + a_2 +\ \cdots\ + a_{n-1} + a_n = \displaystyle\sum_{k=0}^n a_k
\end{align*}$$
Each of the elements of our new sequence $\left\{s_n\right\}$ is called a **partial sum**. The new **sequence of partial sums** $\{s_n\}$ may converge or diverge. We call this new sequence a **series**. Notice that the series is *not* a sum of an infinite number of terms, but rather the limit of a sequence of finite sums. Mathematically this is an important distinction. The notation for a series is confusing because it appears to imply we are computing an infinite sum; we are not. We are computing a limit of a sequence.
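The construction of $\{s_n\}$ from $\{a_n\}$ is exactly a running sum, which Python's `itertools.accumulate` computes; here is a minimal sketch using the geometric sequence $a_k = \left(\frac{1}{2}\right)^k$:

```python
from itertools import accumulate

# a_k = (1/2)^k for k = 0, ..., 4.
a = [(1 / 2) ** k for k in range(5)]
# s_n = a_0 + a_1 + ... + a_n: the sequence of partial sums.
s = list(accumulate(a))
print(s)  # [1.0, 1.5, 1.75, 1.875, 1.9375]
```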
Definition of Convergent Series ¶
If a sequence of partial sums $\{s_n\}$ is convergent and $\displaystyle\lim_{n\to\infty}s_n = S$ exists as a real number, then we say the series $\displaystyle\sum_{k=0}^{\infty} a_k$ is convergent and we write
$$S = \displaystyle\lim_{n\to\infty} s_n = \displaystyle\lim_{n\to\infty}\displaystyle\sum_{k=0}^n a_k = \displaystyle\sum_{k=0}^{\infty} a_k,$$
or $$a_0 + a_1 + a_2 +\ \cdots\ +\ a_n\ +\ \cdots\ = S.$$
The number $S$ is called the sum of the series. If the sequence $\{s_n\}$ is divergent, then the series is called divergent .
Consider the **geometric series**
$$\dfrac{1}{2} + \dfrac{1}{4} + \dfrac{1}{8} +\ \cdots\ + \dfrac{1}{2^n} +\ \cdots\ = \displaystyle\sum_{n=1}^{\infty} \dfrac{1}{2^n}.$$
Each of the elements of our sequence is a partial sum
$$\begin{align*}
s_1 &= \dfrac{1}{2} \\
\\
s_2 &= \dfrac{1}{2} + \dfrac{1}{4} \\
\\
s_3 &= \dfrac{1}{2} + \dfrac{1}{4} + \dfrac{1}{8} \\
&\ddots \\
s_n &= \dfrac{1}{2} + \dfrac{1}{4} + \dfrac{1}{8} +\ \cdots\ + \dfrac{1}{2^n}
\end{align*}$$
If we multiply the last equation on both sides by $\frac{1}{2}$, then we have
$$\dfrac{1}{2}s_n = \dfrac{1}{4} + \dfrac{1}{8} +\ \cdots\ + \dfrac{1}{2^{n+1}}.$$
If we take the difference of the two previous equations, then
$$s_n - \dfrac{1}{2}s_n = \dfrac{1}{2} - \dfrac{1}{2^{n+1}}.$$
If we multiply both sides by $2$ we have
$$s_n = 1 - \dfrac{1}{2^n}.$$
Therefore the **sum** of the geometric series, that is, the limit of the sequence of partial sums, is
$$S = \displaystyle\lim_{n\to\infty} 1 - \frac{1}{2^n} = \displaystyle\lim_{n\to\infty} 1 - \displaystyle\lim_{n\to\infty} \frac{1}{2^n} = 1 - 0 = 1.$$
That is
$$\displaystyle\sum_{n=1}^{\infty} \dfrac{1}{2^n} = 1.$$
It is important to remember that the infinite series on the left-hand side of the equation is *not* an "infinite sum," even though it is common to call the limit just that.
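The closed form $s_n = 1 - \dfrac{1}{2^n}$ derived above, and the resulting sum $S = 1$, can be verified numerically with a short Python sketch:

```python
def s(n):
    """Partial sum s_n = 1/2 + 1/4 + ... + 1/2^n."""
    return sum(1 / 2 ** k for k in range(1, n + 1))

# The partial sums match the closed form 1 - 1/2^n and approach the sum S = 1.
for n in (1, 5, 10, 20):
    print(n, s(n), 1 - 1 / 2 ** n)
```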
For what real values of $r$ is the sequence $\{r^n\}_{n=0}^{\infty}$ convergent?
If $|r| \lt 1$, what is the **sum** of the geometric series
$$a + ar + ar^2 + ar^3 +\ \cdots\ + ar^n +\ \cdots\ = \displaystyle\sum_{n=0}^{\infty} ar^n\,?$$
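A numerical experiment (a sketch; the helper name `geometric_partial_sum` is chosen here) suggests what the answer should be: for $|r| \lt 1$ the partial sums settle down to $\dfrac{a}{1-r}$, the standard closed form.

```python
def geometric_partial_sum(a, r, n):
    """s_n = a + a*r + a*r**2 + ... + a*r**n."""
    return sum(a * r ** k for k in range(n + 1))

# With a = 3 and r = 1/4 the partial sums approach a / (1 - r) = 4.
a, r = 3.0, 0.25
print(geometric_partial_sum(a, r, 60))
```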
There is a special type of convergence for series called absolute convergence . It is an important concept for applications, so here is its formal definition and some important tools associated with it.
Definition of Absolute Convergence
A series $\displaystyle\sum_{k=0}^{\infty}a_k$ is called absolutely convergent if and only if the series $\displaystyle\sum_{k=0}^{\infty}|a_k|$ is convergent.
Definition of Conditional Convergence
A series is called conditionally convergent if it converges but it is not absolutely convergent.
Theorem
If a series is absolutely convergent, then it is convergent.
The Ratio Test
(i) If $\displaystyle\lim_{k\to\infty}\left|\dfrac{a_{k+1}}{a_k}\right| = L < 1$, then the series $\displaystyle\sum_{k=0}^{\infty} a_k$ converges absolutely.
(ii) If $\displaystyle\lim_{k\to\infty}\left|\dfrac{a_{k+1}}{a_k}\right| = L > 1$, then the series $\displaystyle\sum_{k=0}^{\infty} a_k$ diverges.
(iii) If $\displaystyle\lim_{k\to\infty}\left|\dfrac{a_{k+1}}{a_k}\right| = L = 1$, then the Ratio Test is inconclusive: no conclusion can be drawn about the convergence or divergence of the series $\displaystyle\sum_{k=0}^{\infty} a_k$.
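As an illustration (a sketch, with names chosen here), apply the Ratio Test to the Maclaurin coefficients $a_k = \dfrac{r^k}{k!}$ from the series for $e^{rt}$: the ratio $\left|\dfrac{a_{k+1}}{a_k}\right| = \dfrac{|r|}{k+1}$ tends to $L = 0 \lt 1$, so that series converges absolutely for every $r$.

```python
import math

def ratio(k, r=2.0):
    """|a_{k+1} / a_k| for a_k = r^k / k!, which simplifies to |r| / (k + 1)."""
    a_k = r ** k / math.factorial(k)
    a_k1 = r ** (k + 1) / math.factorial(k + 1)
    return abs(a_k1 / a_k)

# The ratios shrink toward L = 0 < 1: absolute convergence by the Ratio Test.
print([round(ratio(k), 4) for k in (1, 5, 10, 50)])
```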
We will need absolute convergence of series in part two of our review of Taylor Series!
Creative Commons Attribution-NonCommercial-ShareAlike 4.0
Attribution
You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
Noncommercial
You may not use the material for commercial purposes.
Share Alike
You are free to share, copy and redistribute the material in any medium or format. If you adapt, remix, transform, or build upon the material, you must distribute your contributions under the **same license** as the original.