In our study of first order differential equations we have studied linear, separable, and autonomous equations. These are all important classes of equations, and each has its own standard way of representing the differential equation. A separable equation can be written
$$ g(y)\dfrac{dy}{dt} = h(t).$$
The standard form of a linear equation is
$$ \dfrac{dy}{dt} + p(t)y = g(t).$$
An autonomous equation can be expressed as
$$ \dfrac{dy}{dt} = h(y).$$
All of these are special cases of the general form of a first order differential equation
$$ \dfrac{dy}{dt} = f(t, y).$$
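For readers who want to experiment, a computer algebra system can recognize these standard forms. Here is a minimal SymPy sketch (the sample equation $\frac{dy}{dt} = -2ty$ and the variable names are my own, not from the textbook):

```python
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')

# A sample first order equation, dy/dt = -2*t*y.
ode = sp.Eq(y(t).diff(t), -2*t*y(t))

# classify_ode lists which solution methods apply (separable, 1st_linear, Bernoulli, ...).
print(sp.classify_ode(ode, y(t)))

# dsolve returns a closed form solution when one of those methods succeeds.
print(sp.dsolve(ode, y(t)))   # y(t) = C1*exp(-t**2)
```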
In each case we want to integrate both sides of the equation to get a closed form solution of the differential equation. A closed form solution is a function $\phi$,
$$y = \phi(t),$$
where $\phi$ is made up of algebraic combinations of the functions we learned in all of our previous math classes. Functions like $\phi$, even if they look complex, are called elementary functions. Our textbook calls elementary functions simple functions, but simple functions have a different definition almost everywhere else. We will see in Section 2.8 that using Picard's theorem we can always attempt to find a series that converges to the solution even if the solution is not an elementary function. If our initial value problem satisfies the hypotheses of either Theorem 2.4.1 or Theorem 2.4.2, then we can expect Picard's method to work. However, not everyone wakes up in the morning thinking, "I'd like to find a series representation of the solution."
Some of the autonomous and separable differential equations we solved gave us elementary solutions, but not closed form solutions. That means we could not solve for $y$ as a function of $t$ using elementary functions; the solutions only defined the dependent variable $y$ implicitly. Implicit equations like these are built from elementary functions, but they cannot easily be solved for the dependent variable $y$ unless we break up the domain of the independent variable into pieces. Some implicit equations assign two or more values of $y$ to a single value of the independent variable, so their graphs fail the vertical line test.
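As a concrete, hypothetical illustration (not one of the textbook's examples), the circle $x^2 + y^2 = 1$ is an implicit equation built from elementary functions: each $x$ strictly between $-1$ and $1$ corresponds to two values of $y$, so the graph fails the vertical line test and we must split it into two branches to solve for $y$.

```python
import numpy as np
import matplotlib.pyplot as plt

# The implicit equation x**2 + y**2 = 1 has two y-values for each x in (-1, 1).
x = np.linspace(-1.0, 1.0, 400)
upper = np.sqrt(1 - x**2)     # branch with y >= 0
lower = -np.sqrt(1 - x**2)    # branch with y <= 0

plt.plot(x, upper, label=r'$y = +\sqrt{1-x^2}$')
plt.plot(x, lower, label=r'$y = -\sqrt{1-x^2}$')
plt.axvline(0.5, linestyle='--', color='gray')  # a vertical line that crosses the curve twice
plt.gca().set_aspect('equal')
plt.legend()
plt.show()
```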
How do implicit equations represent solutions to our differential equations?
To answer this we need to look again at our general form of a first order differential equation and revisit partial derivatives of a function with two inputs and one output.
$$ \dfrac{dy}{dx} = f(x,y) $$
On the right-hand side of the general form of a first order differential equation we have a multivariate function: one with two inputs and one output. Because it has two inputs, the domain of the function is some region of the plane, and the output can be imagined as a height above (or below) the domain. We can draw the graphs of these functions, and the graphs form a surface. The following image is a surface over the region $[0,1]\times[0,1]$, where $f:[0,1]\times[0,1]\rightarrow\mathbb{R}$ is given by $z = e^{-x^2 - y^2}$.
If I fix the output $z =$ constant, that creates a curve on the surface: the intersection of the surface with the plane $z =$ constant. While we may not have a well-defined function $y$ as a function of $x$, we do have a well-defined function $z$ as a function of $x$ and $y$. The curves we create are called level sets because they are parallel to the $xy$-plane (they are level).
If we look straight down the vertical axis at the surface and the $xy$-plane and plot only the level sets we get a contour plot of the surface.
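Here is a sketch of how one might reproduce both pictures with matplotlib (the grid resolution and styling are my own choices):

```python
import numpy as np
import matplotlib.pyplot as plt

# The surface z = exp(-x**2 - y**2) over the region [0, 1] x [0, 1].
x = np.linspace(0, 1, 100)
y = np.linspace(0, 1, 100)
X, Y = np.meshgrid(x, y)
Z = np.exp(-X**2 - Y**2)

fig = plt.figure(figsize=(10, 4))

# Left: the surface itself.
ax1 = fig.add_subplot(1, 2, 1, projection='3d')
ax1.plot_surface(X, Y, Z, cmap='viridis')

# Right: looking straight down the z-axis, the level sets form a contour plot.
ax2 = fig.add_subplot(1, 2, 2)
contours = ax2.contour(X, Y, Z, levels=10)
ax2.clabel(contours)
ax2.set_aspect('equal')

plt.show()
```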
Do these curves look familiar?
They are the isoclines of the direction field, the curves along which the slope $\frac{dy}{dx}$ is constant. This is because
$$ \dfrac{dy}{dx} = z = f(x,y).$$
Are there differential equations for which the isoclines are the integral curves of the differential equation?
I'm glad you asked!
In calculus we learned about the total derivative, or differential, of a real-valued function $y = f(x)$,
$$ df = f'(x)dx.$$
This differential form was the basis for linear approximations, the chain rule, and integration by the method of substitution. If $x$ is a function of an independent variable $t$, then we can differentiate both sides of $y = f(x)$ and get
$$ \dfrac{dy}{dt} = f'(x)\dfrac{dx}{dt} = \dfrac{df}{dx}\,\dfrac{dx}{dt}.$$
What does the total differential look like for a function with two independent variables?
The total differential has the form
$$ df = \dfrac{\partial f}{\partial x}dx + \dfrac{\partial f}{\partial y}dy.$$
If $f$ is a constant then it should be clear that the differential will be equal to zero. This gives us
$$ \dfrac{\partial f}{\partial x}dx + \dfrac{\partial f}{\partial y}dy = 0.$$
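We can check this relationship symbolically. The sketch below (using the surface $f(x,y) = e^{-x^2-y^2}$ from earlier as the example) confirms that along a level curve, where the total differential is zero, the slope is $\frac{dy}{dx} = -\dfrac{\partial f/\partial x}{\partial f/\partial y}$:

```python
import sympy as sp

x, y, c = sp.symbols('x y c')
f = sp.exp(-x**2 - y**2)           # the surface from the contour-plot discussion

fx = sp.diff(f, x)                 # partial derivative with respect to x
fy = sp.diff(f, y)                 # partial derivative with respect to y

# Along a level set f(x, y) = c the total differential is zero, so dy/dx = -fx/fy.
slope_from_differential = sp.simplify(-fx / fy)

# sympy.idiff computes the same implicit derivative directly from f(x, y) - c = 0.
slope_from_idiff = sp.simplify(sp.idiff(f - c, y, x))

print(slope_from_differential)     # -x/y
print(slope_from_idiff)            # -x/y
```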
The vector field of a differential equation in which the isoclines are also the integral curves is called a conservative vector field. A first order differential equation that has this form is called an exact differential equation.
A function with two inputs has two partial derivatives. How many second derivatives does it have?
Consider the function
$$ z = f(x,y) = x^2y + xy^2 $$
Its first partials are
$$ \begin{matrix} \dfrac{\partial f}{\partial x} = 2xy + y^2 &\quad& \dfrac{\partial f}{\partial y} = x^2 + 2xy \end{matrix}, $$
and its second partials are
$$ \begin{matrix} \dfrac{\partial^2 f}{\partial x^2} = \dfrac{\partial}{\partial x}\dfrac{\partial f}{\partial x} = \dfrac{\partial}{\partial x}\left(2xy + y^2\right) = 2y & \ \ &
\dfrac{\partial^2 f}{\partial y\partial x} = \dfrac{\partial}{\partial y}\dfrac{\partial f}{\partial x} = \dfrac{\partial}{\partial y}\left(2xy + y^2\right) = 2x + 2y \\
\\
\dfrac{\partial^2 f}{\partial x\partial y} = \dfrac{\partial}{\partial x}\dfrac{\partial f}{\partial y} = \dfrac{\partial}{\partial x}\left(x^2 + 2xy\right) = 2x + 2y & \ \ &
\dfrac{\partial^2 f}{\partial y^2} = \dfrac{\partial}{\partial y}\dfrac{\partial f}{\partial y} = \dfrac{\partial}{\partial y}\left(x^2 + 2xy\right) = 2x. \end{matrix} $$
There are four of them. Notice that the upper right and lower left second partials are equal.
Clairaut's Theorem
If the two second order derivatives $\dfrac{\partial^2 f}{\partial x\partial y}$ and $\dfrac{\partial^2 f}{\partial y\partial x}$ are continuous, then they are equal.
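A quick symbolic check of this example's four second partials (a SymPy sketch):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2*y + x*y**2

fxx = sp.diff(f, x, x)    # 2*y
fxy = sp.diff(f, x, y)    # differentiate in x, then in y: 2*x + 2*y
fyx = sp.diff(f, y, x)    # differentiate in y, then in x: 2*x + 2*y
fyy = sp.diff(f, y, y)    # 2*x

print(fxx, fxy, fyx, fyy)
print(sp.simplify(fxy - fyx) == 0)   # True: the mixed partials agree, as Clairaut's theorem promises
```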
How does that help us solve a differential equation?
If we rewrite our first order differential equation in the form
$$ M(x,y)dx + N(x,y)dy = 0 $$
and we suspect that it is an exact differential equation, then we can check by differentiating again: the equation is exact precisely when $\dfrac{\partial M}{\partial y} = \dfrac{\partial N}{\partial x}$.
Determine if the differential equation
$$2x + y^2 + 2xyy' = 0$$
is an exact differential equation. The author of our textbook prefers this notation but I think it hides the important information. Let us re-write it
$$2x + y^2 + 2xy\dfrac{dy}{dx} = 0.$$
Now I think we can recognize the differential form
$$(2x + y^2)\,dx + 2xy\,dy = 0.$$
We want to determine if this is an exact differential equation. That is, we want to know if there is a function $z = f(x,y)$ whose total differential matches the left-hand side, so that the equation becomes
$$ \dfrac{\partial f}{\partial x}\,dx + \dfrac{\partial f}{\partial y}\,dy = 0.$$
If there is such a potential function $f$, then
$$\dfrac{\partial f}{\partial x} = 2x + y^2$$
and
$$\dfrac{\partial f}{\partial y} = 2xy.$$
Calculating the two mixed second partial derivatives we get
$$\dfrac{\partial^2 f}{\partial y\partial x} = \dfrac{\partial}{\partial y}\left(2x + y^2\right) = 2y$$
and
$$\dfrac{\partial^2 f}{\partial x\partial y} = \dfrac{\partial}{\partial x}\left(2xy\right) = 2y.$$
Since they are equal we know that there is such an $f(x,y)$, and we know that the differential equation is exact.
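The same test is easy to automate. Here is a minimal sketch (the helper name `is_exact` is my own) that checks whether $\dfrac{\partial M}{\partial y} = \dfrac{\partial N}{\partial x}$ for this example:

```python
import sympy as sp

x, y = sp.symbols('x y')

def is_exact(M, N):
    """Return True when M dx + N dy = 0 passes the exactness test M_y = N_x."""
    return sp.simplify(sp.diff(M, y) - sp.diff(N, x)) == 0

# The equation (2x + y**2) dx + 2xy dy = 0 from this example.
print(is_exact(2*x + y**2, 2*x*y))   # True
```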
Determine if the differential equation
$$(3xy + y^2) + (x^2 + xy)y' = 0$$
is an exact differential equation. First let us re-write it so that it is clear what our next step is going to be.
$$(3xy + y^2)\,dx + (x^2 + xy)\,dy = 0$$
If this is an exact differential equation then there is a potential function $f$ such that
$$\begin{align*}
\dfrac{\partial f}{\partial x} &= 3xy + y^2 \\
\\
\dfrac{\partial f}{\partial y} &= x^2 + xy \\
\end{align*}$$
Differentiating we get
$$\begin{align*}
\dfrac{\partial^2 f}{\partial y\partial x} &= \dfrac{\partial}{\partial y}\left(3xy + y^2\right) = 3x + 2y \\
\\
\dfrac{\partial^2 f}{\partial x\partial y} &= \dfrac{\partial}{\partial x}\left(x^2 + xy\right) = 2x + y. \\
\end{align*}$$
The two mixed second order partial derivatives are not equal, so the differential equation is not an exact differential equation.
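Running the `is_exact` sketch from the previous example on this pair, `is_exact(3*x*y + y**2, x**2 + x*y)`, returns `False`, matching the hand computation.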
Determine if the differential equation
$$(y\cos(x) + 2xe^y) + (\sin x + x^2e^y - 1)y' = 0$$
is an exact differential equation.
To solve an exact differential equation we need to integrate twice because our potential function $f$ has two inputs. If we find a potential function $z = f(x,y)$, then along any level set of $f$ the total differential vanishes, and $df = 0$ is exactly our differential equation written in differential form. Since the direction field is conservative, the graphs of the solutions of the differential equation, the integral curves, are the level sets of $f$. Therefore our family of solutions will be described by the level sets, or
$$f(x,y) = \text{constant}$$
for every real constant.
Solve the differential equation
$$2x + y^2 + 2xyy' = 0.$$
We determined in
Example 2.6.1
that this first order differential equation is exact. Knowing this we can proceed to find the potential function and the family of solutions to the exact differential equation. If we write the differential equation in differential form it will be easier to see the steps in the method of solution.
$$(2x + y^2)\,dx + 2xy\,dy = 0.$$
Since we know that this differential form is the exact differential of a potential function $f$, we have that
$$\dfrac{\partial f}{\partial x} = 2x + y^2 $$
Using the Fundamental Theorem of Calculus gives us our potential function
$$f(x,y) = \displaystyle\int \dfrac{\partial f}{\partial x}\,dx = \displaystyle\int (2x + y^2)\,dx = x^2 + xy^2 + g(y).$$
Normally an indefinite integral would give us "$+\ c$", an arbitrary constant, but when we integrate a function of two variables with respect to one of them, the "constant" can be any function of the other variable. We can check by differentiating our anti-derivative with respect to $x$;
$$ \dfrac{\partial}{\partial x}\left(x^2 + xy^2 + g(y)\right) = 2x + y^2 + 0.$$
Since differentiating gives us the integrand we know from the fundamental theorem of calculus that we have the correct anti-derivative. Next we differentiate our new potential function $f$ with respect to $y$
$$ \dfrac{\partial f}{\partial y} = \frac{\partial}{\partial y}\left(x^2 + xy^2 + g(y)\right) = 0 + 2xy + g'(y).$$
At this point we have two expressions for $\frac{\partial f}{\partial y}$ so we set them equal to each other
$$ \dfrac{\partial f}{\partial y} = 2xy = 2xy + g'(y).$$
Subtracting the term $2xy$ from both sides leaves us with
$$g'(y) = 0.$$
Integrating both sides with respect to $y$ recovers the unknown function $g$ as
$$g(y) = \displaystyle\int g'(y)\,dy = \displaystyle\int 0\,dy = \text{ a constant.}$$
Substituting our expression for $g$ into our equation for $f$ gives us the potential function
$$ f(x,y) = x^2 + xy^2 + k$$
for every constant $k$. The level sets of this surface will be the solutions of our differential equation. Setting our potential function equal to an arbitrary constant and subtracting the constants from both sides gives us the family of solutions
$$x^2 + xy^2 = C.$$
Notice that trying to solve this equation for $y$ will lead to a problem: it requires a square root, a choice of sign, and a restricted domain. This implicit form of the solution is our best description of the family of solutions to this differential equation.
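The same bookkeeping can be carried out symbolically. A SymPy sketch (the intermediate names are my own) that reproduces the potential function $x^2 + xy^2$:

```python
import sympy as sp

x, y = sp.symbols('x y')
M = 2*x + y**2          # coefficient of dx
N = 2*x*y               # coefficient of dy

# Integrate M with respect to x; the "constant" of integration is a function g(y).
g = sp.Function('g')
f = sp.integrate(M, x) + g(y)          # x**2 + x*y**2 + g(y)

# Match the partial derivative of f with respect to y against N to find g'(y).
g_prime = sp.solve(sp.Eq(sp.diff(f, y), N), g(y).diff(y))[0]   # 0

# Integrate g'(y) and substitute back to obtain the potential function.
f = f.subs(g(y), sp.integrate(g_prime, y))

print(f)   # x**2 + x*y**2, so the family of solutions is x**2 + x*y**2 = C
```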
Solve the exact differential equation
$$(3 + 2xy) + (x^2 - 3y^2)y' = 0.$$
Recall that the first order differential equation
$$(3xy + y^2) + (x^2 + xy)y' = 0$$
is not an exact differential equation.
Is it possible to find a factor, like the integrating factor for linear equations, that will make the left-hand side of the differential equation exact?
As in the case of linear equations we will attempt to find an integrating factor $\mu$; however, our potential function $f$ has two inputs, so our integrating factor may need two inputs, $\mu(x,y)$. If we multiply both sides of our differential equation by our integrating factor $\mu$ we will get
$$\mu(x,y)\,(3xy + y^2)\,dx + \mu(x,y)\,(x^2 + xy)\,dy = 0.$$
If our integrating factor works then we can check the mixed second derivatives and they will be equal. Using the product rule we obtain
$$\dfrac{\partial^2 f}{\partial y\partial x} = \dfrac{\partial}{\partial y}\left(\mu\,(3xy + y^2)\right) = \dfrac{\partial \mu}{\partial y}\,\left(3xy + y^2\right) + \mu\,\left(3x + 2y\right),$$
and
$$\dfrac{\partial^2 f}{\partial x\partial y} = \dfrac{\partial}{\partial x}\left(\mu\,(x^2 + xy)\right) = \dfrac{\partial \mu}{\partial x}\,(x^2 + xy) + \mu\,(2x + y).$$
Since they must be equal we have
$$\dfrac{\partial \mu}{\partial y}\,\left(3xy + y^2\right) + \mu\,\left(3x + 2y\right) = \dfrac{\partial \mu}{\partial x}\,(x^2 + xy) + \mu\,(2x + y).$$
The issue we have with this equation is that $\mu(x,y)$ is a function of two variables and we only have one equation. This means there are infinitely many functions $\mu$ that would satisfy this equation. There are a lot of integrating factors that have been thought up to solve various nonlinear first order differential equations. What follows is usually a lot of guesswork and problem solving.
If we look closely at our equation it seems that the term $\frac{\partial \mu}{\partial y}\,\left(3xy + y^2\right)$ is the most complicated. It occurs to one, after deriving several integrating factors, that we might try limiting $\mu$ to be a function of $x$ only and not $y$; that is, $\mu = \mu(x)$. That way $\frac{\partial \mu}{\partial y} = 0$ and our complicated term vanishes. This leaves us with
$$\mu\,\left(3x + 2y\right) = \dfrac{\partial \mu}{\partial x}\,(x^2 + xy) + \mu\,(2x + y).$$
or
$$\dfrac{\partial \mu}{\partial x}\,x(x + y) = \mu\,\left(3x + 2y\right) - \mu\,(2x + y) = \mu\,(x + y).$$
We want to divide both sides by $(x + y)$, but first we must check what happens when $x + y = 0$, that is, along the line $y = -x$. Substituting $x = -y$ into the original differential equation, it becomes
$$(-3y^2 + y^2) + (y^2 - y^2)y' = -2y^2 = 0,$$
which forces $y = 0$. The constant function $y = 0$ is a solution to our differential equation, because then $\frac{dy}{dx} = 0$ and the differential equation becomes
$$(3x(0) + 0^2) + (x^2 + x\cdot 0)(0) = 0.{\huge\color{#307fe2}\checkmark}$$
So the only solution we risk losing by dividing by $(x + y)$ is this constant solution, which we record separately.
Dividing our equation for $\mu$ by the factor $(x + y)$ on both sides gives us
$$x\,\dfrac{\partial \mu}{\partial x} = \mu.$$
Just like in Section 2.1 we are going to solve this separable differential equation for $\mu$ to get the integrating factor we need to make our differential equation exact. Dividing both sides of our equation by $x\mu$ one obtains
$$\dfrac{1}{\mu}\,\dfrac{d\mu}{dx} = \dfrac{1}{x}.$$
Integrating both sides with respect to $x$ gives us
$$\log|\mu| = \log|x| + c.$$
or
$$|\mu| = e^{\log|\mu|} = e^{\log|x|+c} = C|x|.$$
Our arbitrary constant takes care of the absolute value signs, so
$$ \mu(x) = cx.$$
Any nonzero constant will do, and $c=1$ is the simplest, giving $\mu(x) = x$. Multiplying both sides of our original differential equation by our integrating factor results in
$$x\,(3xy + y^2)\,dx + x\,(x^2 + xy)\,dy = 0$$
or
$$(3x^2y + xy^2)\,dx + (x^3 + x^2y)\,dy = 0.$$
Let's check to make sure it is exact.
$$\begin{align*}
\dfrac{\partial^2 f}{\partial y\partial x} &= \dfrac{\partial}{\partial y}\left(3x^2y + xy^2\right) = 3x^2 + 2xy \\
\\
\dfrac{\partial^2 f}{\partial x\partial y} &= \dfrac{\partial}{\partial x}\left(x^3 + x^2y\right) = 3x^2 + 2xy. \\
\end{align*}$$
The new differential equation obtained by applying the integrating factor is exact, and we can use the method of solving an exact equation to find the family of solutions to our original differential equation.
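A quick symbolic confirmation (a sketch; the symbol names are my own) that $\mu(x) = x$ does the job:

```python
import sympy as sp

x, y = sp.symbols('x y')
mu = sp.Function('mu')

# Solve the separable equation x * mu'(x) = mu(x) for the integrating factor.
print(sp.dsolve(sp.Eq(x*mu(x).diff(x), mu(x)), mu(x)))     # mu(x) = C1*x

# After multiplying through by mu = x, the equation passes the exactness test M_y = N_x.
M, N = x*(3*x*y + y**2), x*(x**2 + x*y)
print(sp.simplify(sp.diff(M, y) - sp.diff(N, x)) == 0)     # True
```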
Solve the first order differential equation
$$(3xy + y^2) + (x^2 + xy)\,y' = 0.$$
In Section 2.6.6 we solved a separable differential equation for the integrating factor
$$\mu(x) = x.$$
Multiplying both sides of our differential equation by $\mu$ results in
$$x\,(3xy + y^2)\,dx + x\,(x^2 + xy)\,dy = 0,$$
or
$$(3x^2y + xy^2)\,dx + (x^3 + x^2y)\,dy = 0.$$
We already know this equation is exact and there must be a potential function $f$ so that
$$\dfrac{\partial f}{\partial x} = 3x^2y + xy^2.$$
Integrating both sides with respect to $x$ one obtains
$$f(x,y) = \displaystyle\int\dfrac{\partial f}{\partial x}\,dx = \displaystyle\int \left(3x^2y + xy^2\right)\,dx = x^3y + \dfrac{x^2y^2}{2} + g(y)$$
where $g(y)$ is our integration "constant" when integrating a function of two variables with respect to only one of them. Now we can compute the partial derivative of $f$ with respect to $y$. Differentiating the last equation gives us
$$\dfrac{\partial f}{\partial y} = \dfrac{\partial}{\partial y}\left(x^3y + \dfrac{x^2y^2}{2} + g(y)\right) = x^3 + x^2y + g'(y).$$
We also know from the differential equation that
$$\dfrac{\partial f}{\partial y} = x^3 + x^2y,$$
so we have
$$x^3 + x^2y + g'(y) = x^3 + x^2y,$$
or when we subtract $x^3 + x^2y$ from both sides
$$g'(y) = 0.$$
Hence $g(y) = k$, a constant. Substituting our expression for $g(y)$ into our expression for $f(x,y)$ yields
$$f(x,y) = x^3y + \dfrac{x^2y^2}{2} + k.$$
All of the level sets of this surface make up the family of solutions of our exact differential equation. Subtracting our arbitrary constant from both sides of the equation gives us the family of solutions
$$x^3y + \dfrac{x^2y^2}{2} = C$$
for every real number $C$.
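As a final check (a sketch using SymPy's `idiff`), we can differentiate the implicit relation and substitute back into the original equation:

```python
import sympy as sp

x, y, C = sp.symbols('x y C')

# The implicit family of solutions found above: x**3*y + x**2*y**2/2 = C.
relation = x**3*y + x**2*y**2/2 - C

# The slope dy/dx defined implicitly by the relation.
dydx = sp.idiff(relation, y, x)

# Substitute into the original equation (3xy + y**2) + (x**2 + xy) y' = 0.
residual = (3*x*y + y**2) + (x**2 + x*y)*dydx
print(sp.simplify(residual))   # 0, so every level set is an integral curve
```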
Solving first order nonlinear differential equations can be difficult. After 400 years we have yet to discover a substitution, integrating factor, or trick that solves many nonlinear differential equations. As an example, let us look at a type of nonlinear differential equation that Jakob Bernoulli solved. He studied equations of the form
$$ y' + p(t)y = q(t)y^n.$$
If $n=0$ or $n=1$ then we have a linear equation and we can solve it using an integrating factor from Section 2.1. However if $n\neq 0,1$ then we have a nonlinear equation. Jakob Bernoulli discovered the substitution
$$v = y^{1-n}.$$
Using the chain rule one obtains
$$\dfrac{dv}{dt} = (1-n)y^{-n}\,\dfrac{dy}{dt}.$$
That is
$$\dfrac{y^{n}}{1-n}\,\dfrac{dv}{dt} = \dfrac{dy}{dt} = y'.$$
Substituting into our original differential equation gives us
$$\dfrac{y^{n}}{1-n}\,\dfrac{dv}{dt} + p(t)y = q(t)y^n$$
Dividing both sides by $y^n$ gives us
$$\dfrac{1}{1-n}\,\dfrac{dv}{dt} + p(t)v = q(t).$$
This is a linear ordinary differential equation, which we can solve with the integrating factor method from Section 2.1; once we have $v$, we recover the solution of the original equation from $y = v^{1/(1-n)}$.
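As a concrete, hypothetical illustration (not an example from the textbook), take $y' + y = t\,y^2$, so $n = 2$ and $v = y^{-1}$. The substitution gives the linear equation $-\dfrac{dv}{dt} + v = t$, that is, $\dfrac{dv}{dt} - v = -t$. A SymPy sketch comparing that route with solving the original equation directly:

```python
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')
v = sp.Function('v')

# Hypothetical Bernoulli equation: y' + y = t*y**2  (n = 2, so v = y**(-1)).
bernoulli = sp.Eq(y(t).diff(t) + y(t), t*y(t)**2)

# The substitution v = y**(1 - n) turns it into the linear equation v' - v = -t.
linear = sp.Eq(v(t).diff(t) - v(t), -t)

print(sp.dsolve(linear, v(t)))       # v(t) = C1*exp(t) + t + 1
print(sp.dsolve(bernoulli, y(t)))    # y(t) = 1/(C1*exp(t) + t + 1), consistent with y = 1/v
```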