Math 242: Calculus I
1.5 Limits
1.5.1 Sequences¶
$$ \require{color} \definecolor{brightblue}{rgb}{.267, .298, .812} \definecolor{darkblue}{rgb}{0.0, 0.0, 1.0} \definecolor{palepink}{rgb}{1, .73, .8} \definecolor{softmagenta}{rgb}{.99,.34,.86} \definecolor{blueviolet}{rgb}{.537,.192,.937} \definecolor{jonquil}{rgb}{.949,.792,.098} \definecolor{shockingpink}{rgb}{1, 0, .741} \definecolor{royalblue}{rgb}{0, .341, .914} \definecolor{alien}{rgb}{.529,.914,.067} \definecolor{crimson}{rgb}{1, .094, .271} \def\ihat{\mathbf{\hat{\unicode{x0131}}}} \def\jhat{\mathbf{\hat{\unicode{x0237}}}} \def\khat{\mathbf{\hat{\unicode{x1d458}}}} \def\tombstone{\unicode{x220E}} \def\contradiction{\unicode{x2A33}} $$
Definition¶
A sequence is an indexed list of real numbers.
- $\ \left\{ 1, 2, 3, 4, 5 \right\}$ is a sequence of five numbers, the first five positive integers.
Definition¶
An infinite sequence is an infinite indexed list of real numbers.
- $\ \mathbb{Z}^+$ is an infinite sequence of numbers, the positive integers.
We often define a rule or function for evaluating an element of a sequence based on its index.
- $\ \left\{\,a_k\,:\,a_k=2k-1, k\in\mathbb{Z}^+\,\right\}$ is a sequence of positive odd integers.
$$ \left\{\,1,\ 3,\ 5,\ 7,\ 9,\ \dots\,\right\} $$
It works as follows:
$$ \begin{align*} a_{\color{red} 1} &= 2\cdot{\color{red} 1} - 1 = 1 \\ a_{\color{red} 2} &= 2\cdot{\color{red} 2} - 1 = 3 \\ a_{\color{red} 3} &= 2\cdot{\color{red} 3} - 1 = 5 \\ a_{\color{red} 4} &= 2\cdot{\color{red} 4} - 1 = 7 \\ a_{\color{red} 5} &= 2\cdot{\color{red} 5} - 1 = 9 \\ &\ddots \end{align*} $$
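If you want to check the rule numerically, here is a minimal Python sketch (assuming a Python environment) that evaluates $a_k = 2k-1$ at the first five indices:

```python
# Evaluate the rule a_k = 2k - 1 at the indices k = 1, 2, 3, 4, 5.
a = [2 * k - 1 for k in range(1, 6)]
print(a)  # [1, 3, 5, 7, 9]
```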
Exercise 1¶
Use set-builder notation and an appropriate rule to describe the sequence of even positive integers.
Check Your Work
$\left\{\,a_k\,|\,a_k = 2k,\ k\in\mathbb{Z}^+\,\right\}$
$$ \begin{align*} a_{\color{red} 1} &= 2\cdot{\color{red} 1} = 2 \\ a_{\color{red} 2} &= 2\cdot{\color{red} 2} = 4 \\ a_{\color{red} 3} &= 2\cdot{\color{red} 3} = 6 \\ a_{\color{red} 4} &= 2\cdot{\color{red} 4} = 8 \\ a_{\color{red} 5} &= 2\cdot{\color{red} 5} = 10 \\ &\ddots \end{align*} $$
1.5.2 Convergent Sequences¶
Intuition¶
If the elements of a sequence get closer and closer together, they can also get closer and closer to a single number called the limit of the sequence. When a sequence has a limit, we say that the sequence converges to the limit.
Example 1¶
Consider the sequence $A = \left\{ \frac{1}{n}\,:\,n\in\mathbb{N}\,\right\}$.
Every term of this sequence gets closer and closer to the real number zero. We say that the sequence converges to zero, or that the limit of the sequence is zero.
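A minimal Python sketch (purely illustrative) shows the terms $\frac{1}{n}$ shrinking toward $0$ as the index grows:

```python
# Print a_n = 1/n for a few increasingly large indices; the values approach 0.
for n in [1, 10, 100, 1000, 10000]:
    print(n, 1 / n)
```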
There were several attempts to create a clear and mathematically correct definition of a convergent sequence. This turned out to be MUCH more challenging than anyone ever thought.
Zeno of Elea created examples of sequences that he suggested lead to logical contradictions. These paradoxes were meant to suggest that our everyday experience of motion, space, and time leads to absurdity.
Let us consider one of these examples. Suppose Atalanta wishes to walk to the end of a path.
- Before she can get to the end of the path, she must get halfway there.
- Before she can get halfway there, she must get a quarter of the way there.
- Before traveling a quarter, she must travel one-eighth;
- before an eighth, one sixteenth; and so on.
This requires Atalanta to complete an infinite number of tasks, represented by the sequence
$$ \left\{1,\ \frac{1}{2},\ \frac{1}{4},\ \frac{1}{8},\ \frac{1}{16},\ \dots \right\} = \left\{\,\frac{1}{2^n}\,:\,n\in\mathbb{N}\cup\left\{0\right\}\,\right\} = \left\{\,\frac{1}{2^n}\,:\,n\in\mathbb{W}\,\right\} $$
This sequence converges to zero even faster than the one in Example 1! This is where calculus, while a useful mathematical model of the real world, does not mimic the real world. Zeno argues that it is impossible for Atalanta to complete an infinite number of tasks in any finite amount of time. However, in the real world we cannot keep making our steps shorter and shorter; there is a limit to how short a step we may take.
However, calculus CAN divide up the real line into ever smaller steps of length $\displaystyle\frac{1}{2^n}$.
Insofar as the propositions of mathematics refer to reality, they are uncertain; insofar as they are certain, they do not apply to reality.
-Albert Einstein
This is what Dr. Strogatz referred to as infinity power. Calculus indeed can tell us that the sequence converges to zero with absolute certainty. Unfortunately for Zeno, it does not mean that walking a path is an illusion that never really happens.
1.5.3 Weierstrass' Great Idea¶
Karl Weierstrass came up with a way to describe a limit that does NOT require us to complete an infinite number of tasks, and yet still lets us prove with mathematical certainty that Atalanta crosses the path: the limit of the size of each step is zero, which means that the limit of the remaining length of uncrossed path is also zero.
His idea requires two participants in a discussion about the limit of the sequence.
Example 2¶
Actor 1: I believe that the limit of the sequence $\left\{\,\frac{1}{2^n}\,\right\}$ is 0.
Actor 2: Really? Do the elements of the sequence eventually get closer to zero than $\frac{1}{2}$?
Actor 1: Yes, in fact every element of the sequence with index $n > 2$ satisfies $n-2 > 0$, so
$$ \frac{1}{2^n} = \frac{1}{2^{n-2}2^2} = \frac{1}{2^{n-2}}\cdot\frac{1}{4} <\frac{1}{4} < \frac{1}{2} $$
This is due to the fact that for $n > 2$
$$ \begin{align*} n &> 2 \\ \\ n-2 &> 0 \\ \\ 2^{n-2} &> 2^0 = 1 \\ \frac{1}{2^{n-2}} &< 1 \end{align*} $$
So ALL of the elements of the sequence with index greater than 2 are closer to zero than $\frac{1}{2}$.
Actor 2: Well, do the elements of the sequence eventually get closer to zero than $\frac{1}{100}$?
Actor 1: Yes, in fact every element of the sequence with index $n \ge 7$ is closer to zero than $\frac{1}{100}$ because
$$ \begin{align*} n &\ge 7 \\ \\ 2^n &\ge 2^7 = 128 \\ \frac{1}{2^n} &\le \frac{1}{128} < \frac{1}{100} \end{align*} $$
Actor 2: Okay, but do the elements of the sequence eventually get closer to zero than $\epsilon > 0$, no matter how small?
Actor 1: YES! In fact $\displaystyle\frac{1}{\epsilon}$ is just a LARGE real number. No matter how small $\epsilon>0$ may be, there will always be infinitely many real numbers BIGGER than $\displaystyle\frac{1}{\epsilon}$. For example, let $2^k$ be bigger than $\displaystyle\frac{1}{\epsilon}$. Then for $n > k$,
$$ \begin{align*} n &> k \\ \\ 2^n &> 2^k > \frac{1}{\epsilon} \\ \frac{1}{2^n} &< \epsilon \end{align*} $$
So for all of the indexes $n > k$, the elements of the sequence are closer to zero than $\epsilon$.
Actor 2: Wow! I guess you are right, the sequence converges to zero.
Definition¶
A sequence $\left\{\,a_n\,\right\}$ converges to limit $L$ if and only if for every $\epsilon > 0$, there exists an index $k$ so that $n>k$ implies that
$$ |a_n-L| < \epsilon $$
In this case we write
$$ \displaystyle\lim_{n\rightarrow\infty} a_n = L $$
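Here is a minimal Python sketch of Actor 1's strategy for the sequence $\left\{\frac{1}{2^n}\right\}$ with $L=0$; the helper name `find_index` is made up for this illustration. Given an $\epsilon$, it produces an index $k$ with $2^k > \frac{1}{\epsilon}$ and spot-checks that the terms beyond $k$ are within $\epsilon$ of the limit.

```python
import math

def find_index(eps):
    # Smallest k with 2**k > 1/eps, so that n > k forces 1/2**n < eps.
    return math.floor(math.log2(1 / eps)) + 1

for eps in [0.5, 0.01, 1e-6]:
    k = find_index(eps)
    # Spot-check a few indices beyond k: |a_n - 0| should be less than eps.
    ok = all(abs(1 / 2**n - 0) < eps for n in range(k + 1, k + 20))
    print(f"eps = {eps}: k = {k}, later terms within eps? {ok}")
```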
Example 3¶
Consider the sequence $\left\{\frac{n}{n+1}\right\}$. Show that $\displaystyle\lim_{n\rightarrow\infty}\frac{n}{n+1} = 1$.
We could create a table
$$ \begin{array}{rcrrrrrrrrrrr} \hline n & | & 1 & 2 & 3 & 4 & 5 & 6 & \dots & 98 & \dots & 499 & \dots \\ \hline \frac{n}{n+1} & | & \frac{1}{2} & \frac{2}{3} & \frac{3}{4} & \frac{4}{5} & \frac{5}{6} & \frac{6}{7} & \dots & \frac{98}{99} & \dots & \frac{499}{500} & \dots \\ \hline & | & 0.50000 & 0.66667 & 0.75000 & 0.80000 & 0.83333 & 0.85714 & \dots & 0.98990 & \dots & 0.99800 & \dots \\ \hline \end{array} $$
It certainly looks like the sequence is converging to 1. How can we use the definition to obtain a $k$ so that $n>k$ implies that
$$ \left| \frac{n}{n+1} - 1 \right| < \epsilon\ \ \ ? $$
First of all the absolute value signs are adding confusion. We know that
$$ \begin{align*} n &< n+1 \\ \\ \frac{n}{n+1} &< \frac{n+1}{n+1} = 1 \\ \\ \left| \frac{n}{n+1} - 1 \right| &= \left| 1 - \frac{n}{n+1} \right| = 1 - \frac{n}{n+1} \end{align*} $$
We can write our inequality
$$ 1 - \frac{n}{n+1} < \epsilon $$
This is the end of the path. To get to the end we need a place to start our walk, and a course to take. Can we take our steps backwards until we find ourselves in a familiar place?
$$ \begin{align*} 1 - \frac{n}{n+1} &< \epsilon \\ \frac{n+1}{n+1} - \frac{n}{n+1} &< \epsilon \\ \frac{(n+1)-n}{n+1} &< \epsilon \\ \frac{1}{n+1} &< \epsilon \\ n+1 &> \frac{1}{\epsilon} \\ n &> \frac{1}{\epsilon} - 1 \end{align*} $$
EUREKA! If I pick $k > \frac{1}{\epsilon} - 1$, then any integer $n > k$ satisfies $n > \frac{1}{\epsilon} - 1$, and
$$ \begin{align*} n &> \frac{1}{\epsilon} - 1 \\ n+1 &> \frac{1}{\epsilon} \\ \frac{1}{n+1} &< \epsilon \\ \frac{(n+1)-n}{n+1} &< \epsilon \\ \frac{n+1}{n+1} - \frac{n}{n+1} &< \epsilon \\ 1 - \frac{n}{n+1} &< \epsilon \\ \left| \frac{n}{n+1} - 1 \right| &< \epsilon \end{align*} $$
Voilà! For any $\epsilon > 0$, we choose $k > \frac{1}{\epsilon}-1$. Then $n > k$ implies $\left| \frac{n}{n+1} - 1 \right| < \epsilon$, and we have that the sequence $\left\{\,\frac{n}{n+1}\,\right\}$ converges to 1, or
$$ \displaystyle\lim_{n\rightarrow\infty} \frac{n}{n+1} = 1 $$
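A brief Python spot check of the choice $k > \frac{1}{\epsilon} - 1$ (an illustration, not a replacement for the proof; the helper name `index_for` is made up):

```python
import math

def index_for(eps):
    # Any integer k > 1/eps - 1 works; take the smallest such integer.
    return math.floor(1 / eps - 1) + 1

for eps in [0.1, 0.001]:
    k = index_for(eps)
    # Every sampled n > k should satisfy |n/(n+1) - 1| < eps.
    ok = all(abs(n / (n + 1) - 1) < eps for n in range(k + 1, k + 1000))
    print(f"eps = {eps}: k = {k}, |a_n - 1| < eps for sampled n > k? {ok}")
```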
1.5.4 Limits of Intervals of Real Numbers¶
This is a fine thing to understand about the elements of a sequence; however, we want to determine limits of intervals of real numbers in the domain and codomain of a function.
Example 4¶
Consider the open interval $(0,1)$. If we represent elements of our interval with the variable $x$, what is the limit as $x$ approaches the right end point?
We would write
$$ \displaystyle\lim_{x\rightarrow 1} x $$
This is not a trick question. The limit is obviously 1. Can we use the language of Weierstrass to describe the limit?
Given an $\epsilon > 0$, are there values in the interval $(0,1)$ closer to 1 than $\epsilon$? Can we find $x\in(0,1)$ so that
$$ \left|x-1\right| < \epsilon\ \ ? $$
Once again, values $x\in(0,1)$ are less than one, so we can rewrite our expression without the absolute value symbols
$$ \left| x - 1 \right| = \left| 1 - x \right| = 1 - x $$
So let's work backwards from the conclusion we want to make.
$$ \begin{align*} 0 < 1 - x &< \epsilon \\ 0 > x - 1 &> -\epsilon \\ 1 > x &> 1 - \epsilon \\ 1 - \epsilon < x &< 1 \end{align*} $$
We are looking at the open interval $(1-\epsilon, 1)$. For any $\epsilon > 0$, we choose any $x\in(1-\epsilon,1)$. Then we have
$$ \begin{align*} 1 - \epsilon < x &< 1 \\ 1 > x &> 1 - \epsilon \\ 0 > x - 1 &> -\epsilon \\ 0 < 1 - x &< \epsilon \\ \left| x - 1 \right| &< \epsilon \end{align*} $$
We just proved using Weierstrass' definition that the limit, as the points of an open interval approach one of its endpoints, is ... the endpoint. Not a ground-breaking conclusion, but we needed an easy problem first.
Exercise 2¶
What is $\displaystyle\lim_{x\rightarrow 0} x$ in the interval $[0, 1]$?
1.5.5 Limits of a Function¶
Geometrically we still want to solve the Tangent Problem. In order to solve this problem, we need to compute the limit of the output variable in the codomain, as the input variable approaches a point in the domain. Our textbook author gives several examples using tables in section 1.5.
Let us compute the limit of the function $y=x^2$ as $x$ approaches $2$. Now we have two limits going on here:
- $x$ is approaching $2$ in the domain
- the limit of $y$ is going to be $4$ in the codomain
We are going to need two small positive numbers and compute two inequalities.
Intuition¶
If $x$ is close to $a$ in the domain, then $f(x)$ is close to $L$ in the codomain. If this is true, then we write
$$ \displaystyle\lim_{x\rightarrow a} f(x) = L $$
Example 5¶
Consider the function $f(x)=x^2$. What is the limit of the function $f$ as $x$ approaches $2$? We can create a graph or a table an conclude that as $x$ gets very close to $2$, $f(x)$ must get very close to $4$.
We can create a table that will tell us the same. However, can we prove that the limit is 4?
1.5.6 Weierstrass Definition of the Limit of a Function¶
Definition¶
We say that the limit of a function is equal to $L$ as $x$ approaches $a$ if and only if, given an $\epsilon > 0$, we can find a small enough $\delta > 0$ so that $|x-a|<\delta$ implies that $|f(x)-L|<\epsilon$.
We will compute these limits using the Weierstrass definition only for a few simple functions. I will show you an easier(?) way to compute limits.
Example 5 (cont'd)¶
We must find the sequence of steps that leads us to conclude that whenever $|x-2|<\delta$, we have $|x^2 - 4|<\epsilon$. As before, we should start at the end of the path and work backward.
$$ \begin{align*} |x^2 - 4| &< \epsilon \\ -\epsilon &< x^2 - 4 < \epsilon \\ 4-\epsilon &< x^2 < 4+\epsilon \\ \sqrt{4-\epsilon} &< x < \sqrt{4 + \epsilon} \\ \sqrt{4-\epsilon} - 2 &< x - 2 < \sqrt{4 + \epsilon} - 2 \end{align*} $$
We now have the kind of inequality we are looking for. (Here we assumed $\epsilon < 4$ so that the square roots are real; a larger $\epsilon$ only makes the inequality easier to satisfy.) Given an $\epsilon > 0$, we choose
$$ 0 < \delta < \min\left\{2-\sqrt{4-\epsilon}, \sqrt{4+\epsilon}-2 \right\} $$
We chose $\delta$ this way because $|x-2|<\delta$ implies that
$$ \begin{align*} -\delta &< x-2 < \delta \\ \sqrt{4-\epsilon} - 2 &< x - 2 < \sqrt{4 + \epsilon} - 2 \\ \sqrt{4-\epsilon} &< x < \sqrt{4 + \epsilon} \\ 4-\epsilon &< x^2 < 4+\epsilon \\ -\epsilon &< x^2 - 4 < \epsilon \\ |x^2 - 4| &< \epsilon \end{align*} $$
We proved that if $|x-2|<\delta$, then $|f(x)-4|<\epsilon$. Thus
$$ \displaystyle\lim_{x\rightarrow 2} x^2 = 4 $$
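As a sanity check, here is a Python sketch that computes the $\delta$ chosen above for a couple of values of $\epsilon$ and tests random points with $|x-2|<\delta$ (illustrative only; the helper name `delta_for` is made up):

```python
import math, random

def delta_for(eps):
    # A delta strictly below min{2 - sqrt(4 - eps), sqrt(4 + eps) - 2}.
    # (Uses eps < 4 so that the square root is real, as in the derivation.)
    return 0.99 * min(2 - math.sqrt(4 - eps), math.sqrt(4 + eps) - 2)

random.seed(0)
for eps in [0.5, 0.01]:
    d = delta_for(eps)
    xs = [2 + random.uniform(-d, d) for _ in range(1000)]
    ok = all(abs(x**2 - 4) < eps for x in xs)
    print(f"eps = {eps}: delta = {d:.6f}, |x^2 - 4| < eps for sampled x? {ok}")
```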
Exercise 3¶
What is the limit of the function $g(x)=2x$ as $x$ approaches 3? Use the Weierstrass definition of the limit of a function.
Check Your Work
$\displaystyle\lim_{x\rightarrow 3} 2x = 6$
Follow Along
Given an $\epsilon > 0$ we need to find a $\delta > 0$ so that $|x-3|<\delta$ implies that $|2x-6|<\epsilon$.
$$ \begin{align*} |2x-6| &< \epsilon \\ -\epsilon &< 2x-6 < \epsilon \\ 6-\epsilon &< 2x < \epsilon + 6 \\ 3 - \frac{\epsilon}{2} &< x < \frac{\epsilon}{2} + 3 \\ -\frac{\epsilon}{2} &< x-3 < \frac{\epsilon}{2} \end{align*} $$
Given an $\epsilon > 0$, we choose $0 < \delta < \frac{\epsilon}{2}$, then $|x-3|<\delta$ implies
$$ \begin{align*} -\delta &< x-3 < \delta \\ -\frac{\epsilon}{2} &< x-3 < \frac{\epsilon}{2} \\ 3 - \frac{\epsilon}{2} &< x < \frac{\epsilon}{2} + 3 \\ 6-\epsilon &< 2x < \epsilon + 6 \\ -\epsilon &< 2x-6 < \epsilon \\ |2x-6| &< \epsilon \end{align*} $$
Using the Weierstrass definition of the limit, this shows that $\displaystyle\lim_{x\rightarrow 3} 2x = 6$.
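The same kind of numeric spot check works here; any $\delta$ with $0<\delta<\frac{\epsilon}{2}$ will do (a minimal sketch, illustrative only):

```python
eps = 0.001
delta = 0.49 * eps                                                # any delta with 0 < delta < eps/2
xs = [3 - delta + i * (2 * delta) / 100 for i in range(1, 100)]   # points with |x - 3| < delta
print(all(abs(2 * x - 6) < eps for x in xs))                      # True
```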
---
1.5.7 One-Sided Limits¶
Example 6¶
Now consider the Heaviside step function,
$$ H(x) = \left\{\begin{array}{lcr} 0 & \text{ if } & x < 0 \\ \\ 1 & \text{ if } & x \ge 0 \end{array} \right. $$
If we approach $x=0$ from the left in the domain, then we get a limit of zero. However, if we approach $x=0$ from the right in the domain, we get a limit of one. These two limits are called one-sided limits.
Definition¶
The left-hand limit of $f(x)$ as $x$ approaches $a$ is denoted by
$$ \displaystyle\lim_{x\rightarrow a^-} f(x) = L $$
if and only if for every $\epsilon > 0$, there is a $\delta > 0$ so that $0 < a-x<\delta$ implies that $|f(x)-L|<\epsilon$.
The right-hand limit of $f(x)$ as $x$ approaches $a$ is denoted by
$$ \displaystyle\lim_{x\rightarrow a^+} f(x) = M $$
if and only if for every $\epsilon > 0$, there is a $\delta > 0$ so that $0 < x-a<\delta$ implies that $|f(x)-M|<\epsilon$.
If $L = M$, then we know that the two-sided limit exists and
$$ \displaystyle\lim_{x\rightarrow a}f(x) = L $$
Example 6 (cont'd)¶
In the case of the Heaviside function
$$ \begin{align*} \displaystyle\lim_{x\rightarrow 0^-}H(x) &= 0 \\ \\ \displaystyle\lim_{x\rightarrow 0^+}H(x) &= 1 \\ \\ \displaystyle\lim_{x\rightarrow 0}H(x) &\ \ \ \text{DNE, that is, it Does Not Exist!} \end{align*} $$
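A minimal Python sketch of these one-sided limits: sampling $H$ at points just to the left and just to the right of $0$ shows the values locked at $0$ and $1$, respectively (illustrative only).

```python
def H(x):
    # Heaviside step function: 0 for x < 0, 1 for x >= 0.
    return 0 if x < 0 else 1

for h in [0.1, 0.001, 1e-9]:
    print(f"H({-h}) = {H(-h)},  H({h}) = {H(h)}")
# The left-hand values stay at 0 and the right-hand values stay at 1,
# so the one-sided limits disagree and the two-sided limit does not exist.
```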
Theorem 1.5.1¶
The two-sided limit exists according to the Weierstrass definition if and only if the left-hand limit and the right-hand limit both exist and are equal.
1.5.8 Infinite Limits¶
We have already seen that the left-hand limit and the right-hand limit may be different. When this happens, the two-sided limit does not exist. There are also functions which do not have even one-sided limits.
In our textbook the author discusses the topologist's sine curve.
Example 7¶
$$ y = \sin\left(\frac{\pi}{x}\right) $$
There is no limit of any kind at $x=0$. Brownian motion and the collapse of a wave function in quantum mechanics are examples of phenomena that do not fit our nice notion of limits.
While these situations do occur, it is more likely that a limit fails to exist because it is an infinite limit. In our previous algebra classes, infinite limits showed up as vertical asymptotes. For example:
Example 8¶
$$ \begin{align*} \lim_{x\rightarrow 0^-} \frac{1}{x^2} &= \infty \\ \\ \lim_{x\rightarrow 0^+} \frac{1}{x^2} &= \infty \\ \end{align*} $$
For this reason we say that the two-sided limit is an infinite limit
$$ \lim_{x\rightarrow 0} \frac{1}{x^2} = \infty $$
Example 9¶
$$ \begin{align*} \lim_{x\rightarrow 0^-} \frac{1}{x} &= -\infty \\ \\ \lim_{x\rightarrow 0^+} \frac{1}{x} &= \infty \\ \end{align*} $$
Since the left-hand and right-hand limits are different, the two-sided limit does not exist.
Definition¶
The limit of a function as $x$ approaches $a$ is $+\infty$ if and only if for every large number $M>0$, there is a $\delta>0$ so that $0<|x-a|<\delta$ implies that $f(x) > M$.
The limit of a function as $x$ approaches $a$ is $-\infty$ if and only if for every large number $M>0$, there is a $\delta>0$ so that $0<|x-a|<\delta$ implies that $-f(x) > M$.
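For example, for $f(x)=\frac{1}{x^2}$ at $a=0$ the choice $\delta=\frac{1}{\sqrt{M}}$ works: $0<|x|<\frac{1}{\sqrt{M}}$ gives $x^2<\frac{1}{M}$, and hence $\frac{1}{x^2}>M$. A minimal Python sketch of that bookkeeping (sample points are nonzero, since the function is undefined at $0$):

```python
import math

for M in [100.0, 1e6]:
    delta = 1 / math.sqrt(M)                          # 0 < |x| < delta forces 1/x**2 > M
    xs = [delta * t for t in (0.9, 0.5, -0.5, -0.9)]  # nonzero points with |x| < delta
    print(M, all(1 / x**2 > M for x in xs))           # True for each M
```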
Exercise 4¶
Show that $f(x) = \displaystyle\frac{x+3}{x^2-9}$ has an infinite right-hand limit at $x=3$.
Solution
Given an $M > 0$, we need to find a $\delta > 0$ so that $0 < x-3<\delta$ implies that $\frac{x+3}{x^2-9} > M$.
$$ \begin{align*} \frac{x+3}{x^2-9} &> M \\ \frac{x+3}{(x+3)(x-3)} &> M \\ \frac{1}{x-3} &> M \\ 0 < x-3 &< \frac{1}{M} \end{align*} $$
Given an $M > 0$, we choose $0 < \delta < \frac{1}{M}$; then $0 < x-3<\delta$ implies
$$ \begin{align*} 0 < x-3 &< \frac{1}{M} \\ \frac{1}{x-3} &> M \\ \frac{x+3}{(x+3)(x-3)} &> M \\ \frac{x+3}{x^2-9} &> M \end{align*} $$
Using the Weierstrass definition of a one-sided infinite limit, $\displaystyle\lim_{x\rightarrow 3^+} \frac{x+3}{x^2-9} = \infty$.
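A final Python spot check of the choice $0<\delta<\frac{1}{M}$: sample points with $0 < x-3 < \delta$ and confirm that $f$ exceeds $M$ there (illustrative only).

```python
def f(x):
    return (x + 3) / (x**2 - 9)

for M in [100.0, 1e5]:
    delta = 0.99 / M                                  # any delta with 0 < delta < 1/M
    xs = [3 + delta * t for t in (0.9, 0.5, 0.1)]     # points with 0 < x - 3 < delta
    print(M, all(f(x) > M for x in xs))               # True for each M
```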
Your use of this self-initiated mediated course material is subject to our Creative Commons License 4.0