
Section 1.6 Existence and Uniqueness of Solutions

If the initial value problem \(x' = f(t, x)\text{,}\) \(x(t_0) = x_0\) is linear, we have already shown that a solution exists and is unique. We now take up the question of existence and uniqueness of solutions for general first-order differential equations. The existence and uniqueness of solutions will prove to be very important, even when we consider applications of differential equations.

Subsection 1.6.1 The Existence and Uniqueness Theorem

The following theorem tells us that solutions to first-order differential equations exist and are unique under certain reasonable conditions.

Theorem 1.6.1. Existence and Uniqueness Theorem.

Let \(x' = f(t, x)\) have the initial condition \(x(t_0) = x_0\text{.}\) If \(f\) and \(\partial f/ \partial x\) are continuous functions on the rectangle
\begin{equation*} R = \left\{ (t, x) : 0 \leq |t - t_0| \lt a, 0 \leq |x - x_0| \lt b \right\}, \end{equation*}
then there exists a unique solution \(u = u(t)\) for \(x' = f(t, x)\) and \(x(t_0) = x_0\) on some interval \(|t - t_0| \lt h\) contained in the interval \(|t - t_0| \lt a\text{.}\)

Let us examine some consequences of the existence and uniqueness of solutions.

Example 1.6.2.

Consider the initial value problem
\begin{align*} x'(t) & = \frac{\sin(tx)}{x^2 + t^2},\\ x(0) & = 1. \end{align*}
In this case \(f(t,x) = \sin(tx)/(x^2 + t^2)\) is continuous at \((0,1)\) as is
\begin{equation*} \frac{\partial f}{\partial x} = \frac{t \cos(t x)}{t^{2} + x^{2}} - \frac{2 x \sin(t x)}{{\left(t^{2} + x^{2}\right)}^{2}}. \end{equation*}
Therefore, a solution to the initial value problem must exist. However, finding such a solution in terms of elementary functions may be quite difficult if not impossible.
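
Although a closed-form solution may be out of reach, we can at least check the continuity claims with a computer algebra system. The following is a minimal sketch, assuming the SymPy library; it recomputes \(\partial f/\partial x\) and evaluates both functions at \((0, 1)\text{.}\)

import sympy as sp

t, x = sp.symbols('t x', real=True)
f = sp.sin(t * x) / (x**2 + t**2)

# The partial derivative quoted above.
df_dx = sp.diff(f, x)
print(sp.simplify(df_dx))

# Both f and df/dx are defined at (0, 1), consistent with continuity there.
print(f.subs({t: 0, x: 1}), df_dx.subs({t: 0, x: 1}))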

Example 1.6.3.

Consider the initial value problem \(y' = y^{1/3}\) with \(y(0) = 0\) and \(t \geq 0\text{.}\) Separating the variables,
\begin{equation*} y^{-1/3}\, dy = dt. \end{equation*}
Thus,
\begin{equation*} \frac{3}{2} y^{2/3} = t + C \end{equation*}
or
\begin{equation*} y = \left( \frac{2}{3} ( t + C) \right)^{3/2}. \end{equation*}
If \(C = 0\text{,}\) the initial condition is satisfied and
\begin{equation*} y = \left( \frac{2}{3} t \right)^{3/2} \end{equation*}
is a solution for \(t \geq 0\text{.}\) However, we can find two additional solutions for \(t \geq 0\text{:}\)
\begin{align*} y & = - \left( \frac{2}{3} t \right)^{3/2},\\ y & \equiv 0. \end{align*}
This is especially troubling if we are looking for equilibrium solutions. Although \(y' = y^{1/3}\) is an autonomous differential equation and \(y \equiv 0\) satisfies it, solutions starting at \(y = 0\) are free to leave, so \(y = 0\) does not behave like an equilibrium solution. The problem is that
\begin{equation*} \frac{\partial}{\partial y} y^{1/3} = \frac{1}{3} y^{-2/3} \end{equation*}
is not defined at \(y = 0\text{.}\)
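
A quick numerical check supports this analysis. The sketch below, assuming NumPy (np.cbrt is the real cube root, which handles negative values), compares a finite-difference derivative of each candidate against \(y^{1/3}\) on \(t \gt 0\text{.}\)

import numpy as np

t = np.linspace(0.1, 2.0, 50)
h = 1e-6

candidates = [
    lambda t: (2 * t / 3)**1.5,    # y = ((2/3) t)^(3/2)
    lambda t: -(2 * t / 3)**1.5,   # y = -((2/3) t)^(3/2)
    lambda t: np.zeros_like(t),    # y = 0
]

for y in candidates:
    # Central-difference approximation of y' compared against y^(1/3).
    dydt = (y(t + h) - y(t - h)) / (2 * h)
    print(np.max(np.abs(dydt - np.cbrt(y(t)))))  # ~0, up to finite-difference error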

Example 1.6.4.

Suppose that \(y' = y^2\) with \(y(0) = 1\text{.}\) Since \(f(t,y) = y^2\) and \(\partial f/ \partial y = 2y\) are continuous everywhere, a unique solution exists near \(t = 0\text{.}\) Separating the variables,
\begin{equation*} \frac{1}{y^2} \; dy = dt, \end{equation*}
we see that
\begin{equation*} y = - \frac{1}{t + C}. \end{equation*}
Applying the initial condition \(y(0) = 1\) gives \(C = -1\text{,}\) so
\begin{equation*} y = \frac{1}{1-t}, \end{equation*}
and the solution exists on the interval \((-\infty, 1)\text{.}\) In the case that \(y(0) = -1\text{,}\) the solution is
\begin{equation*} y = - \frac{1}{t + 1}, \end{equation*}
and a solution exists on \((-1, \infty)\text{.}\) Solutions are only guaranteed to exist on an open interval containing the initial value and are very dependent on the initial condition.
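
Both solutions can be reproduced symbolically. Here is a short sketch, assuming SymPy (the exact form of the printed output may vary between versions):

import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')
ode = sp.Eq(y(t).diff(t), y(t)**2)

# y(0) = 1: the solution 1/(1 - t) is valid only on (-oo, 1).
print(sp.dsolve(ode, ics={y(0): 1}))   # Eq(y(t), -1/(t - 1))

# y(0) = -1: the solution -1/(t + 1) is valid only on (-1, oo).
print(sp.dsolve(ode, ics={y(0): -1}))  # Eq(y(t), -1/(t + 1))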

Remark 1.6.5. Solution Curves Cannot Cross.

The Existence and Uniqueness Theorem tells us that the integral curves of any differential equation satisfying the appropriate hypotheses cannot cross. If two curves did cross, we could take the point of intersection as the initial value for the differential equation, and we would no longer be guaranteed a unique solution.

Activity 1.6.1. Applying the Existence and Uniqueness Theorem.

Which of the following initial value problems are guaranteed to have a unique solution by the Existence and Uniqueness Theorem (Theorem 1.6.1)? In each case, justify your conclusion.
(a)
\(y' = 4 + y^3\text{,}\) \(y(0) = 1\)
(b)
\(y' = \sqrt{y}\text{,}\) \(y(1) = 0\)
(c)
\(y' = \sqrt{y}\text{,}\) \(y(1) = 1\)
(d)
\(x' = \dfrac{t}{x-2}\text{,}\) \(x(0) = 2\)
(e)
\(x' = \dfrac{t}{x-2}\text{,}\) \(x(2) = 0\)
(f)
\(y' = x \tan y\text{,}\) \(y(0) = 0\)
(g)
\(y' = \dfrac{1}{t} y + 2t\text{,}\) \(y(0) = 1\)

Subsection 1.6.2 Picard Iteration

It was Émile Picard (1856–1941) who developed the method of successive approximations to show the existence of solutions of ordinary differential equations. He proved that it is possible to construct a sequence of functions that converges to a solution of the differential equation. One of the first steps towards understanding Picard iteration is to realize that an initial value problem can be recast as an integral equation.

Theorem 1.6.6.

The function \(u = u(t)\) is a solution to the initial value problem
\begin{align*} x' & = f(t, x)\\ x(t_0) & = x_0, \end{align*}
if and only if \(u\) is a solution to the integral equation
\begin{equation*} x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds. \end{equation*}

Proof.

Suppose that \(u = u(t)\) is a solution to
\begin{align*} x' & = f(t, x)\\ x(t_0) & = x_0, \end{align*}
on some interval \(I\) containing \(t_0\text{.}\) Since \(u\) is continuous on \(I\) and \(f\) is continuous on \(R\text{,}\) the function \(F(t) = f(t, u(t))\) is also continuous on \(I\text{.}\) Integrating both sides of \(u'(t) = f(t, u(t))\) and applying the Fundamental Theorem of Calculus, we obtain
\begin{equation*} u(t) - u(t_0) = \int_{t_0}^t u'(s) \, ds = \int_{t_0}^t f(s, u(s)) \, ds. \end{equation*}
Since \(u(t_0) = x_0\text{,}\) the function \(u\) is a solution of the integral equation.
Conversely, assume that
\begin{equation*} u(t) = x_0 + \int_{t_0}^t f(s, u(s)) \, ds. \end{equation*}
If we differentiate both sides of this equation, we obtain \(u'(t) = f(t, u(t))\text{.}\) Since
\begin{equation*} u(t_0) = x_0 + \int_{t_0}^{t_0} f(s, u(s)) \, ds = x_0, \end{equation*}
the initial condition is fulfilled.
To show the existence of a solution to the initial value problem
\begin{align*} x' & = f(t, x)\\ x(t_0) & = x_0, \end{align*}
we will construct a sequence of functions, \(\{ u_n(t) \}\text{,}\) that will converge to a function \(u(t)\) that is a solution to the integral equation
\begin{equation*} x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds. \end{equation*}
We define the first function of the sequence using the initial condition,
\begin{equation*} u_0(t) = x_0. \end{equation*}
We derive the next function in our sequence using the right-hand side of the integral equation,
\begin{equation*} u_1(t) = x_0 + \int_{t_0}^t f(s, u_0(s)) \, ds. \end{equation*}
Subsequent terms in the sequence can be defined recursively,
\begin{equation*} u_{n+1}(t) = x_0 + \int_{t_0}^t f(s, u_n(s)) \, ds. \end{equation*}
Our goal is to show that \(u_n(t) \rightarrow u(t)\) as \(n \rightarrow \infty\text{.}\) Furthermore, we need to show that \(u\) is the continuous, unique solution to our initial value problem. We will leave the proof of Picard’s Theorem to a series of exercises (Exercise Group 1.6.5.5–13), but let us see how this works by developing an example.

Example 1.6.7.

Consider the exponential growth equation,
\begin{align*} \frac{dx}{dt} & = kx\\ x(0) & = 1. \end{align*}
We already know that the solution is \(x(t) = e^{kt}\text{.}\) We define the first few terms of our sequence \(\{ u_n(t) \}\) as follows:
\begin{align*} u_0(t) & = 1,\\ u_1(t) & = 1 + \int_0^t ku_0(s) \, ds\\ & = 1 + \int_0^t k \, ds\\ & = 1 + kt,\\ u_2(t) & = 1 + \int_0^t ku_1(s) \, ds\\ & = 1 + \int_0^t k(1 + ks) \, ds\\ & = 1 + kt + \frac{(kt)^2}{2}. \end{align*}
The next term in the sequence is
\begin{equation*} u_3(t) = 1 + kt + \frac{(kt)^2}{2} + \frac{(kt)^3}{2\cdot 3}, \end{equation*}
and the \(n\)th term is
\begin{align*} u_n(t) & = 1 + \int_0^t ku_{n-1}(s) \, ds\\ & = 1 + \int_0^t k\left(1 + ks + \frac{(ks)^2}{2!} + \frac{(ks)^3}{3!} + \cdots + \frac{(ks)^{n-1}}{(n-1)!}\right) \, ds\\ & = 1 + kt + \frac{(kt)^2}{2!} + \frac{(kt)^3}{3!} + \cdots + \frac{(kt)^n}{n!}. \end{align*}
However, this is just the \(n\)th partial sum of the power series for \(u(t) = e^{kt}\text{,}\) which is what we expected.
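
The recursion is straightforward to carry out with a computer algebra system. Below is a minimal sketch, assuming SymPy; the helper picard is our own construction, not a library routine. It reproduces the iterates above for \(x' = kx\text{,}\) \(x(0) = 1\text{.}\)

import sympy as sp

t, s, k = sp.symbols('t s k')

def picard(f, t0, x0, n):
    # Picard iterates u_0, ..., u_n for x' = f(t, x), x(t0) = x0.
    u = sp.sympify(x0)
    iterates = [u]
    for _ in range(n):
        u = x0 + sp.integrate(f(s, u.subs(t, s)), (s, t0, t))
        iterates.append(sp.expand(u))
    return iterates

for u in picard(lambda s, x: k * x, 0, 1, 3):
    print(u)
# prints the partial sums 1, 1 + k*t, 1 + k*t + k**2*t**2/2, ...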

Subsection 1.6.3 Important Lessons

  • Existence and uniqueness of solutions of differential equations has important implications. Let \(x' = f(t, x)\) have the initial condition \(x(t_0) = x_0\text{.}\) If \(f\) and \(\partial f/ \partial x\) are continuous functions on the rectangle
    \begin{equation*} R = \left\{ (t, x) : 0 \leq |t - t_0| \lt a, 0 \leq |x - x_0| \lt b \right\}, \end{equation*}
    there exists a unique solution \(u = u(t)\) for \(x' = f(t, x)\) and \(x(t_0) = x_0\) on some interval \(|t - t_0| \lt h\) contained in the interval \(|t - t_0| \lt a\text{.}\) In particular,
    • Solutions are only guaranteed to exist locally.
    • Uniqueness is especially important when it comes to finding equilibrium solutions.
    • Uniqueness of solutions tells us that the integral curves for a differential equation cannot cross.
  • The function \(u = u(t)\) is a solution to the initial value problem
    \begin{align*} x' & = f(t, x)\\ x(t_0) & = x_0, \end{align*}
    if and only if \(u\) is a solution to the integral equation
    \begin{equation*} x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds. \end{equation*}
  • Existence and uniqueness of solutions is proved by Picard iteration. This is of particular interest since the proof actually tells us how to construct a sequence of functions that converge to our solution.

Reading Questions 1.6.4 Reading Questions

2.

The initial value problem \(y' = \sqrt[5]{y}\) with \(y(0) = 0\) has two solutions, \(y(t) \equiv 0\) and \(y(t) = (4t/5)^{5/4}\) for \(t \geq 0\text{.}\) Why does this not contradict Theorem 1.6.1?

Exercises 1.6.5 Exercises

1.

Which of the following initial value problems are guaranteed to have a unique solution by the Existence and Uniqueness Theorem (Theorem 1.6.1)? In each case, justify your conclusion.
  1. \(y' = y^2 + y^3\text{,}\) \(y(0) = 1\)
  2. \(y' = \sqrt[4]{y}\text{,}\) \(y(1) = 0\)
  3. \(y' = \sqrt[4]{y}\text{,}\) \(y(1) = 1\)
  4. \(x' = \dfrac{t}{x^2 - 4}\text{,}\) \(x(0) = 2\)
  5. \(x' = \dfrac{t}{x^2 - 4}\text{,}\) \(x(2) = 0\)
  6. \(y' = x \sin y\text{,}\) \(y(0) = 0\)
  7. \(y' = \dfrac{1}{t - 1} y + 2t\text{,}\) \(y(1) = 1\)
Hint.
  1. There exists a unique solution to \(y' = y^2 + y^3\text{,}\) \(y(0) = 1\text{,}\) since \(f(t, y) = y^2 + y^3\) and \(\partial f(t, y)/\partial y = 2y + 3y^2\) are continuous at the point \((0, 1)\text{.}\)
  2. The Existence and Uniqueness Theorem does not apply to \(y' = \sqrt[4]{y}\text{,}\) \(y(1) = 0\text{,}\) since \(f(t, y) = \sqrt[4]{y}\) is not continuous at \((1, 0)\text{.}\)
  3. There exists a unique solution to \(y' = \sqrt[4]{y}\text{,}\) \(y(1) = 1\text{,}\) since \(f(t, y) = \sqrt[4]{y}\) and \(\partial f(t, y)/\partial y = y^{-3/4}/4\) are both continuous at the point \((1, 1)\text{.}\)
  4. The Existence and Uniqueness Theorem does not apply to \(x' = t/(x^2 - 4)\text{,}\) \(x(0) = 2\text{,}\) since \(f(t, x) = t/(x^2 - 4)\) is not continuous at \((0, 2)\text{.}\)
  5. There exists a unique solution to \(x' = t/(x^2 - 4)\text{,}\) \(x(2) = 0\text{,}\) since \(f(t, x) = t/(x^2 - 4)\) and \(\partial f(t, x)/\partial x = - 2tx/(x^2 - 4)^2\) are both continuous at the point \((2, 0)\text{.}\)
  6. There exists a unique solution to \(y' = x \sin y\text{,}\) \(y(0) = 0\text{,}\) since \(f(x, y) = x \sin y\) and \(\partial f(x, y)/\partial y = x \cos y\) are both continuous at the point \((0, 0)\text{.}\)
  7. The Existence and Uniqueness Theorem does not apply to \(y' = 1/(t - 1) y + 2t\text{,}\) \(y(1) = 1\text{,}\) since \(f(t, y) = 1/(t - 1) y + 2t\) is not continuous at \((1, 1)\text{.}\)

2.

Find an explicit solution to the initial value problem
\begin{align*} y' & = \frac{1}{(t - 1)(y + 1)}\\ y(0) & = 1. \end{align*}
Use your solution to determine the interval of existence.

3.

Consider the initial value problem
\begin{align*} y' & = 3y^{2/3}\\ y(0) & = 0. \end{align*}
  1. Show that the constant function, \(y(t) \equiv 0\text{,}\) is a solution to the initial value problem.
  2. Show that
    \begin{equation*} y(t) = \begin{cases} 0, & t \leq t_0 \\ (t - t_0)^3, & t \gt t_0 \end{cases} \end{equation*}
    is a solution for the initial value problem, where \(t_0\) is any real number. Hence, there exists an infinite number of solutions to the initial value problem.
  3. Explain why this example does not contradict the Existence and Uniqueness Theorem.
Hint.
(b) Make sure that the derivative of \(y(t)\) exists at \(t = t_0\text{.}\)

4.

Consider the initial value problem
\begin{align*} y' & = 2ty + t\\ y(0) & = 1. \end{align*}
  1. Use the fact that \(y' = 2ty + t\) is a first-order linear differential equation to find a solution to the initial value problem.
  2. Let \(\phi_0(t) = 1\) and use Picard iteration to find \(\phi_n(t)\text{.}\)
  3. Show that the sequence \(\{ \phi_n(t) \}\) converges to the exact solution that you found in part (a) as \(n \to \infty\text{.}\)

Proof of the Existence and Uniqueness Theorem.

In Exercise Group 1.6.5.5–13, you will prove the Existence and Uniqueness Theorem for first-order differential equations.
5.
Use the Fundamental Theorem of Calculus to show that the function \(u = u(t)\) is a solution to the initial value problem
\begin{align*} x' & = f(t, x)\\ x(t_0) & = x_0, \end{align*}
if and only if \(u\) is a solution to the integral equation
\begin{equation*} x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds. \end{equation*}
6.
If \(\partial f/ \partial x\) is continuous on the rectangle
\begin{equation*} R = \left\{ (t, x) : 0 \leq |t - t_0| \lt a, 0 \leq |x - x_0| \lt b \right\}, \end{equation*}
prove that there exists a \(K \gt 0\) such that
\begin{equation*} |f(t, x_1) - f(t, x_2) | \leq K |x_1 - x_2| \end{equation*}
for all \((t, x_1)\) and \((t, x_2)\) in \(R\text{.}\)
7.
Define the sequence \(\{ u_n \}\) by
\begin{align*} u_0(t) & = x_0,\\ u_{n+1}(t) & = x_0 + \int_{t_0}^t f(s, u_n(s)) \, ds, \qquad n = 0, 1, 2, \ldots. \end{align*}
Use the result of the previous exercise to show that
\begin{equation*} |f(t, u_n(t)) - f(t, u_{n-1}(t) )| \leq K|u_n(t) - u_{n-1}(t) |. \end{equation*}
8.
Show that there exists an \(M \gt 0\) such that
\begin{equation*} |u_1(t) - x_0| \leq M | t - t_0|. \end{equation*}
9.
Show that
\begin{equation*} |u_2(t) - u_1(t)| \leq \frac{KM | t - t_0|^2}{2}. \end{equation*}
10.
Use mathematical induction to show that
\begin{equation*} |u_n(t) - u_{n -1}(t)| \leq \frac{K^{n-1}M |t - t_0|^n}{n!}. \end{equation*}
11.
Since
\begin{equation*} u_n(t) = u_1(t) + [u_2(t) - u_1(t)] + \cdots + [u_n(t) - u_{n-1}(t)], \end{equation*}
we can view \(u_n(t)\) as a partial sum for the series
\begin{equation*} u_0(t) + \sum_{n=1}^\infty [u_n(t) - u_{n-1}(t)]. \end{equation*}
If we can show that this series converges absolutely, then our sequence will converge to a function \(u(t)\text{.}\) Show that
\begin{equation*} \sum_{n=1}^\infty |u_n(t) - u_{n-1}(t)| \leq \frac{M}{K} \sum_{n=1}^\infty \frac{(K |t - t_0|)^n}{n!} \leq \frac{M}{K} \left( e^{K|h|} - 1 \right), \end{equation*}
where \(h\) is the maximum distance between \((t_0, x_0)\) and the boundary of the rectangle \(R\text{.}\) Since this series converges, the sequence \(u_n(t)\) converges to a continuous function \(u(t)\) that solves our equation. (We must use a theorem from advanced calculus here to ensure uniform convergence; see Exercise 1.6.5.14. Any sequence of continuous functions that converges uniformly must converge to a continuous function.)
12.
To show uniqueness, assume that \(u(t)\) and \(v(t)\) are both solutions to
\begin{equation*} x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds. \end{equation*}
Show that
\begin{equation*} |u(t) - v(t)|\leq K \int_{t_0}^t |u(s) - v(s)| \, ds. \end{equation*}
13.
  1. Define
    \begin{equation*} \phi(t) = \int_{t_0}^t |u(s) - v(s)| \, ds; \end{equation*}
    then \(\phi(t_0) = 0\) and \(\phi(t) \geq 0\) for \(t \geq t_0\text{.}\) (A similar argument will work for \(t \leq t_0\text{.}\)) Show that
    \begin{equation*} \phi'(t) = |u(t) - v(t)|. \end{equation*}
  2. Since
    \begin{equation*} |u(t) - v(t)| - K \int_{t_0}^t |u(s) - v(s)| \, ds \leq 0, \end{equation*}
    we know that
    \begin{equation*} \phi'(t) - K \phi(t) \leq 0. \end{equation*}
    Use this fact to show that
    \begin{equation*} \frac{d}{dt} \left[ e^{-Kt} \phi(t) \right] \leq 0. \end{equation*}
    Conclude that
    \begin{equation*} \phi(t) = \int_{t_0}^t |u(s) - v(s)| \, ds = 0 \end{equation*}
    for all \(t \geq t_0\text{,}\) and therefore \(u(t) = v(t)\) for \(t \geq t_0\text{.}\)

14.

Let \(\phi_n(x) = x^n\) for \(0 \leq x \leq 1\) and show that
\begin{equation*} \lim_{n \rightarrow \infty} \phi_n(x) = \begin{cases} 0, & 0 \leq x \lt 1 \\ 1, & x = 1. \end{cases} \end{equation*}
This is an example of a sequence of continuous functions that does not converge to a continuous function, which helps explain the need for uniform convergence in the proof of the Existence and Uniqueness Theorem.
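
A few numerical values make the failure of uniform convergence concrete. In this sketch, assuming NumPy, for every \(n\) there are points \(x \lt 1\) where \(x^n\) remains far from the pointwise limit \(0\text{.}\)

import numpy as np

x = np.array([0.5, 0.9, 0.99, 0.999, 1.0])
for n in (10, 100, 1000):
    # phi_n(x) = x^n: values at points near x = 1 stay close to 1, however large n is.
    print(n, np.round(x**n, 4))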