II. Finite Difference Methods for Solving PDEs
III. Numerical Examples
$$ \renewcommand{PDut}{\frac{\partial u}{\partial t}} \renewcommand{PDux}{\frac{\partial u}{\partial x}} \renewcommand{PDutt}{\frac{\partial ^2u}{\partial t^2}} \renewcommand{PDuxx}{\frac{\partial ^2u}{\partial x^2}} \renewcommand{PDuyy}{\frac{\partial ^2u}{\partial y^2}} a(x,t)\PDutt + 2b(x,t)\frac{\partial ^2u}{\partial t\partial x} + c(x,t)\PDuxx + d(x,t)\PDut + e(x,t)\PDux = f(t,x,u) $$
It can be classified into three categories according to the sign of the discriminant $b^2 - ac$: hyperbolic ($b^2 - ac > 0$), parabolic, and elliptic. In general, the type of the equation makes a big difference in how the solutions behave and in how we can solve it most effectively.
Parabolic: $b^2(x,t) - a(x,t)c(x,t) = 0$
The canonical form is the heat equation: $$ \large{ \PDut = d \;\PDuxx } $$
Elliptic: $b^2(x,y) - a(x,y)c(x,y) < 0$ (both independent variables are spatial, hence $x$ and $y$)
The canonical form is the Poisson equation: $$ \large{ \PDuxx + \PDuyy = f(x,y) } $$
To complete the problem, the PDE is supplemented with an initial condition
$$ \large{ u(x, t=0) = u_0(x) } $$
and boundary conditions on the boundary $\partial Q$ of the spatial domain $Q$, written here in a general form that contains Dirichlet ($b=0$) and Neumann ($a=0$) conditions as special cases:
$$ \large{ a \; u + b\; \PDux = c, \;\forall \; x \in \partial Q, \;\forall \; t } $$
A parabolic equation of particular interest here is the Black-Scholes equation for the value $u(S,t)$ of a European option with volatility $\sigma$ and risk-free rate $r$ (in the general form above, $a = b = 0$ and $c = \frac{1}{2}\sigma^2S^2$, so $b^2 - ac = 0$):
$$ \renewcommand{PDuS}{\frac{\partial u}{\partial S}} \renewcommand{PDuSS}{\frac{\partial ^2u}{\partial S^2}} \PDut + \frac{1}{2}\sigma^2S^2\PDuSS + rS\PDuS - ru = 0 $$
For a European call with strike $K$ and maturity $T$, the terminal and boundary conditions are
\begin{aligned} &u(S,T) = \max(S-K, 0); \\ &u(0,t) = 0; \\ &u(S,t) = S - Ke^{-r(T-t)}, \;\mbox{as } \; S \rightarrow \infty \end{aligned}
For a European put they are
\begin{aligned} \renewcommand{FDut}{\frac{u_{i,k+1}-u_{i,k}}{\triangle t}} \renewcommand{FDutb}{\frac{u_{i,k}-u_{i,k-1}}{\triangle t}} \renewcommand{FDutc}{\frac{u_{i,k+1}-u_{i,k-1}}{2\triangle t}} \renewcommand{FDutt}{\frac{u_{i,k+1}-2u_{i,k}+u_{i,k-1}}{\triangle t^2}} \renewcommand{FDux}{\frac{u_{i+1,k}-u_{i,k}}{\triangle x}} \renewcommand{FDuxb}{\frac{u_{i,k}-u_{i-1,k}}{\triangle x}} \renewcommand{FDuxc}{\frac{u_{i+1,k}-u_{i-1,k}}{2\triangle x}} \renewcommand{FDuxx}{\frac{u_{i+1,k}-2u_{i,k}+u_{i-1,k}}{\triangle x^2}} &u(S,T) = \max(K-S, 0); \\ &u(0,t) = Ke^{-r(T-t)}; \\ &u(S,t) = 0, \;\mbox{as } \; S \rightarrow \infty \end{aligned}
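These payoff and boundary values translate directly into code. A minimal sketch in Python/NumPy is given below; the function names and the argument `S_max` (a finite truncation of the $S \rightarrow \infty$ boundary) are my own choices, not part of the text.

```python
import numpy as np

def call_terminal(S, K):
    """Terminal condition u(S, T) = max(S - K, 0) for a European call."""
    return np.maximum(S - K, 0.0)

def put_terminal(S, K):
    """Terminal condition u(S, T) = max(K - S, 0) for a European put."""
    return np.maximum(K - S, 0.0)

def call_boundaries(t, K, r, T, S_max):
    """Boundary values u(0, t) = 0 and u(S_max, t) ~ S_max - K e^{-r(T-t)}."""
    return 0.0, S_max - K * np.exp(-r * (T - t))

def put_boundaries(t, K, r, T, S_max):
    """Boundary values u(0, t) = K e^{-r(T-t)} and u(S_max, t) ~ 0."""
    return K * np.exp(-r * (T - t)), 0.0
```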
To apply a finite difference method, we discretize the computational domain with grid points
$$t_0 = 0, t_1, t_2, \cdots, t_M = T$$
in time and
$$x_0 = x_{min}, x_1, x_2, \cdots, x_N = x_{max}$$
in space. The grid is uniform, i.e. $t_{k+1} = t_k + \triangle t, \;\triangle t = \frac{T}{M}$ and $x_{i+1} = x_i + \triangle x, \;\triangle x = \frac{x_{max}-x_{min}}{N}$.
The grid values of the solution are denoted
$$ u_{i,k} = u(x_i, t_k), \;\mbox{for } i = 0,1,\cdots,N, \mbox{ and } k = 0,1,\cdots,M $$
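Such a uniform grid is one call to `numpy.linspace` per dimension; the concrete values of $T$, $M$, $x_{min}$, $x_{max}$, $N$ below are purely illustrative.

```python
import numpy as np

T, M = 1.0, 200                      # final time and number of time steps (illustrative)
x_min, x_max, N = 0.0, 200.0, 100    # spatial interval and number of space steps (illustrative)

t, dt = np.linspace(0.0, T, M + 1, retstep=True)        # t_0, ..., t_M with dt = T/M
x, dx = np.linspace(x_min, x_max, N + 1, retstep=True)  # x_0, ..., x_N with dx = (x_max - x_min)/N

u = np.zeros((N + 1, M + 1))         # u[i, k] approximates u(x_i, t_k)
```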
On this grid, partial derivatives are replaced by finite difference quotients. The most common approximations and their orders of accuracy are:

Partial Derivative | Finite Difference | Type | Order of Accuracy |
---|---|---|---|
$\PDux$ | $\FDux$ | forward | 1st in $x$ |
$\PDux$ | $\FDuxb$ | backward | 1st in $x$ |
$\PDux$ | $\FDuxc$ | central | 2nd in $x$ |
$\PDuxx$ | $\FDuxx$ | symmetric | 2nd in $x$ |
$\PDut$ | $\FDut$ | forward | 1st in $t$ |
$\PDut$ | $\FDutb$ | backward | 1st in $t$ |
$\PDut$ | $\FDutc$ | central | 2nd in $t$ |
$\PDutt$ | $\FDutt$ | symmetric | 2nd in $t$ |
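The orders in the table can be checked numerically: halving the step size should roughly halve a first-order error and quarter a second-order one. A small sketch, using $u(x) = \sin x$ as an arbitrary test function:

```python
import numpy as np

f, fprime = np.sin, np.cos
x0 = 1.0

for h in [1e-1, 5e-2, 2.5e-2]:
    fwd = (f(x0 + h) - f(x0)) / h                # forward difference, O(h)
    cen = (f(x0 + h) - f(x0 - h)) / (2 * h)      # central difference, O(h^2)
    print(f"h={h:.4f}  forward error={abs(fwd - fprime(x0)):.2e}  "
          f"central error={abs(cen - fprime(x0)):.2e}")
```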
For the numerical solution it is convenient to measure time backwards from maturity, i.e. to replace $t$ by the time to maturity $T-t$, so that the terminal payoff becomes an initial condition and we can march forward in time. In this variable the Black-Scholes equation reads
$$ \small \PDut - \frac{1}{2}\sigma^2S^2\PDuSS - rS\PDuS + ru = 0 $$
Approximating the time derivative by a forward difference and the spatial derivatives by central differences (with $x_i$ denoting the grid points in $S$) gives the explicit scheme
$$ \small \FDut - \frac{1}{2} \sigma^2 x_i^2 \FDuxx - r x_i \FDuxc + r u_{i,k} = 0 $$
Solving for $u_{i,k+1}$ yields the explicit update
\begin{aligned} \small u_{i,k+1} & = \left\{ \frac{1}{2}\sigma^2 x_i^2 \frac{\triangle t}{\triangle x^2} - r x_i \frac{\triangle t}{2\triangle x} \right\} u_{i-1,k} \\ & + \left\{ 1 - \sigma^2 x_i^2 \frac{\triangle t}{\triangle x^2} -r \triangle t \right\} u_{i,k} \\ & + \left\{ \frac{1}{2}\sigma^2 x_i^2 \frac{\triangle t}{\triangle x^2} + r x_i \frac{\triangle t}{2\triangle x} \right\} u_{i+1,k} \end{aligned}
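Combined with the call payoff and boundary conditions, this update gives a complete explicit solver. The sketch below assumes NumPy; the parameter values, the truncation of the domain at $x_{max} = 4K$, and the number of time steps (chosen large enough to satisfy the stability restriction derived below) are illustrative choices, not prescribed by the text.

```python
import numpy as np

def bs_call_explicit(K=100.0, r=0.05, sigma=0.2, T=1.0, x_max=400.0, N=200, M=20000):
    """Explicit FDM for a European call, marching forward in time to maturity."""
    x, dx = np.linspace(0.0, x_max, N + 1, retstep=True)
    dt = T / M
    i = np.arange(1, N)                         # interior grid indices
    a = 0.5 * sigma**2 * x[i]**2 * dt / dx**2 - r * x[i] * dt / (2 * dx)  # coeff of u_{i-1,k}
    b = 1.0 - sigma**2 * x[i]**2 * dt / dx**2 - r * dt                    # coeff of u_{i,k}
    c = 0.5 * sigma**2 * x[i]**2 * dt / dx**2 + r * x[i] * dt / (2 * dx)  # coeff of u_{i+1,k}

    u = np.maximum(x - K, 0.0)                  # payoff = initial data in time to maturity
    for k in range(1, M + 1):
        tau = k * dt                            # time to maturity at the new level
        u_new = np.empty_like(u)
        u_new[1:N] = a * u[0:N-1] + b * u[1:N] + c * u[2:N+1]
        u_new[0] = 0.0                          # u(0, t) = 0 for a call
        u_new[N] = x_max - K * np.exp(-r * tau) # u(S, t) ~ S - K e^{-r(T-t)} for large S
        u = u_new
    return x, u                                 # option values at time t = 0

x, u = bs_call_explicit()
print("value at S = K:", np.interp(100.0, x, u))
```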
For any FDM used to solve practical problems, we should ask:
1) How accurate is the method?
2) Does it converge?
3) What is the best choice of step sizes?
For the explicit scheme, the local truncation error is second order in space and first order in time:
$$ T(x,t) = O(\triangle x^2 + \triangle t) $$
Convergence also requires stability, which we study with von Neumann analysis. Consider the difference between two solutions of the scheme, e.g. the exact finite difference solution $u_{j,k}$ and a perturbed (rounded) one $v_{j,k}$,
$$ \epsilon_{j,k} = v_{j,k} - u_{j,k} $$
which again satisfies the FDM due to linearity. Expand the error in a finite Fourier series
$$ \epsilon(x,t) = \sum_{m= 1}^{M} e^{at}e^{il_mx} $$
where $e^{at}$ is a special form of the amplitude, and $l_m$ is the wavenumber: $l_m = \frac{\pi m}{L}, m = 1, \cdots, M, \mbox{ and } M = \frac{L}{\triangle x}$.
By linearity it is enough to follow a single mode on the grid,
$$ \epsilon_{j,k} = e^{a t_k}e^{il_mx_j} $$
Its growth over one time step is measured by the amplification factor
$$ \frac{\epsilon_{j,k+1}}{\epsilon_{j,k}} = e^{a\triangle t} $$
and the scheme is stable if
$$ \left| e^{a\triangle t} \right| < 1 $$
Substituting this single mode into the explicit scheme gives
\begin{aligned} e^{a (t_k + \triangle t)}e^{il_mx_j} & = \left\{ \frac{1}{2}\sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} - r x_j \frac{\triangle t}{2\triangle x} \right\} e^{a t_k}e^{il_mx_{j-1}} \\ & + \left\{ 1 - \sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} -r \triangle t \right\} e^{a t_k}e^{il_mx_j} \\ & + \left\{ \frac{1}{2}\sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} + r x_j \frac{\triangle t}{2\triangle x} \right\} e^{a t_k}e^{il_mx_{j+1}} \end{aligned}
Dividing through by $e^{a t_k}e^{il_mx_j}$:
\begin{aligned} e^{a\triangle t} & = \left\{ \frac{1}{2}\sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} - r x_j \frac{\triangle t}{2\triangle x} \right\} e^{-il_m\triangle x} \\ & + \left\{ 1 - \sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} -r \triangle t \right\} \\ & + \left\{ \frac{1}{2}\sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} + r x_j \frac{\triangle t}{2\triangle x} \right\} e^{il_m\triangle x} \end{aligned}
The terms proportional to $r$ are of lower order and do not affect the leading stability restriction, so we drop them:
\begin{aligned} e^{a\triangle t} & = \frac{1}{2}\sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} e^{-il_m\triangle x} \\ & + 1 - \sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} \\ & + \frac{1}{2}\sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} e^{il_m\triangle x} \end{aligned}
Using $e^{i\theta} + e^{-i\theta} = 2\cos\theta$ and $1 - \cos\theta = 2\sin^2(\theta/2)$,
\begin{aligned} e^{a\triangle t} &= \sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} \cos(l_m\triangle x) + 1 - \sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} \\ & = 1 - \sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} \cdot 2\sin^2(l_m\triangle x/2) \end{aligned}
The stability condition $\left| e^{a\triangle t} \right| < 1$ therefore reads
$$ \left| 1 - \sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} \cdot 2\sin^2(l_m\triangle x/2) \right| < 1 $$
Since $2\sin^2(l_m\triangle x/2)$ can be as large as $2$ and the constraint is tightest at $x_{max}$, the explicit scheme is stable only when
$$ \triangle t < \frac{\triangle x^2}{ \sigma^2 x_{max}^2 } $$
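In practice this restriction is enforced before time stepping by choosing the number of time steps $M$ from the spatial resolution; a minimal sketch (parameter values again illustrative):

```python
import numpy as np

sigma, x_max = 0.2, 400.0
N, T = 200, 1.0
dx = x_max / N

dt_max = dx**2 / (sigma**2 * x_max**2)     # stability bound for the explicit scheme
M = int(np.ceil(T / dt_max)) + 1           # smallest step count with dt strictly below the bound
dt = T / M
print(f"dx = {dx}, dt_max = {dt_max:.3e}, chosen M = {M}, dt = {dt:.3e}")
assert dt < dt_max
```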
The implicit scheme instead uses a backward difference in time, with the spatial derivatives evaluated at the newer time level $k$:
$$ \small \FDutb - \frac{1}{2} \sigma^2 x_i^2 \FDuxx - r x_i \FDuxc = - r u_{i,k} $$
Rearranged, each known value $u_{i,k-1}$ is a combination of three unknown values at level $k$, so a tridiagonal linear system has to be solved at every time step:
\begin{aligned} \small u_{i,k-1} & = \left\{ -\frac{1}{2}\sigma^2 x_i^2 \frac{\triangle t}{\triangle x^2} + r x_i \frac{\triangle t}{2\triangle x} \right\} u_{i-1,k} \\ & + \left\{ 1 + \sigma^2 x_i^2 \frac{\triangle t}{\triangle x^2} + r \triangle t \right\} u_{i,k} \\ & + \left\{ -\frac{1}{2}\sigma^2 x_i^2 \frac{\triangle t}{\triangle x^2} - r x_i \frac{\triangle t}{2\triangle x} \right\} u_{i+1,k} \end{aligned}
Its local truncation error has the same order as that of the explicit scheme,
$$ T(x,t) = O(\triangle x^2 + \triangle t) $$
but the von Neumann analysis now gives the amplification factor
$$ e^{a\triangle t} = \frac{1}{1 + \sigma^2 x_j^2 \frac{\triangle t}{\triangle x^2} \cdot 2\sin^2(l_m\triangle x/2)} $$
whose modulus never exceeds one, so the implicit scheme is stable for any choice of $\triangle t$ and $\triangle x$ (unconditionally stable).
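Each implicit step therefore amounts to one tridiagonal solve. The sketch below uses `scipy.linalg.solve_banded` for that solve and reuses the call payoff and boundary values from before; the parameter values are illustrative.

```python
import numpy as np
from scipy.linalg import solve_banded

def bs_call_implicit(K=100.0, r=0.05, sigma=0.2, T=1.0, x_max=400.0, N=200, M=200):
    """Implicit FDM for a European call: one tridiagonal solve per time step."""
    x, dx = np.linspace(0.0, x_max, N + 1, retstep=True)
    dt = T / M
    i = np.arange(1, N)
    A = -0.5 * sigma**2 * x[i]**2 * dt / dx**2 + r * x[i] * dt / (2 * dx)  # coeff of u_{i-1,k}
    B = 1.0 + sigma**2 * x[i]**2 * dt / dx**2 + r * dt                     # coeff of u_{i,k}
    C = -0.5 * sigma**2 * x[i]**2 * dt / dx**2 - r * x[i] * dt / (2 * dx)  # coeff of u_{i+1,k}

    # Banded storage for solve_banded with one sub- and one super-diagonal.
    ab = np.zeros((3, N - 1))
    ab[0, 1:] = C[:-1]    # super-diagonal
    ab[1, :] = B          # main diagonal
    ab[2, :-1] = A[1:]    # sub-diagonal

    u = np.maximum(x - K, 0.0)                       # payoff as initial data in time to maturity
    for k in range(1, M + 1):
        tau = k * dt
        lo, hi = 0.0, x_max - K * np.exp(-r * tau)   # boundary values at the new level
        rhs = u[1:N].copy()
        rhs[0] -= A[0] * lo                          # move known boundary terms to the RHS
        rhs[-1] -= C[-1] * hi
        u[1:N] = solve_banded((1, 1), ab, rhs)
        u[0], u[N] = lo, hi
    return x, u

x, u = bs_call_implicit()
print("value at S = K:", np.interp(100.0, x, u))
```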
The Crank-Nicolson scheme averages the explicit and implicit approximations of the spatial terms:
\begin{aligned} \small \FDut & = \frac{1}{4} \sigma^2 x_i^2 \left\{ \FDuxx + \frac{u_{i+1,k+1}-2u_{i,k+1}+u_{i-1,k+1}}{\triangle x^2} \right\} \\ & + \frac{1}{2} r x_i \left\{ \FDuxc + \frac{u_{i+1,k+1}-u_{i-1,k+1}}{2\triangle x} \right\} \\ & - \frac{1}{2} r \left\{ u_{i,k} + u_{i,k+1} \right\} \end{aligned}
Like the implicit scheme it requires a tridiagonal solve per step and is unconditionally stable, and its truncation error is second order in both space and time:
$$ T(x,t) = O(\triangle x^2 + \triangle t^2) $$
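A Crank-Nicolson step likewise reduces to a tridiagonal solve, with the right-hand side built from the explicit half of the operator. The sketch below follows the same conventions (and assumptions) as the implicit solver above.

```python
import numpy as np
from scipy.linalg import solve_banded

def bs_call_crank_nicolson(K=100.0, r=0.05, sigma=0.2, T=1.0, x_max=400.0, N=200, M=200):
    """Crank-Nicolson FDM for a European call: averages the explicit and implicit operators."""
    x, dx = np.linspace(0.0, x_max, N + 1, retstep=True)
    dt = T / M
    i = np.arange(1, N)
    alpha = 0.5 * sigma**2 * x[i]**2 * dt / dx**2
    beta = r * x[i] * dt / (2 * dx)

    # Left-hand (implicit half) tridiagonal coefficients ...
    lA = -0.5 * (alpha - beta)          # coeff of u_{i-1,k+1}
    lB = 1.0 + alpha + 0.5 * r * dt     # coeff of u_{i,k+1}
    lC = -0.5 * (alpha + beta)          # coeff of u_{i+1,k+1}
    # ... and right-hand (explicit half) coefficients.
    rA = 0.5 * (alpha - beta)
    rB = 1.0 - alpha - 0.5 * r * dt
    rC = 0.5 * (alpha + beta)

    ab = np.zeros((3, N - 1))           # banded storage for solve_banded
    ab[0, 1:] = lC[:-1]
    ab[1, :] = lB
    ab[2, :-1] = lA[1:]

    u = np.maximum(x - K, 0.0)
    for k in range(1, M + 1):
        tau = k * dt
        lo, hi = 0.0, x_max - K * np.exp(-r * tau)
        rhs = rA * u[0:N-1] + rB * u[1:N] + rC * u[2:N+1]
        rhs[0] -= lA[0] * lo            # known boundary values go to the right-hand side
        rhs[-1] -= lC[-1] * hi
        u[1:N] = solve_banded((1, 1), ab, rhs)
        u[0], u[N] = lo, hi
    return x, u

x, u = bs_call_crank_nicolson()
print("value at S = K:", np.interp(100.0, x, u))
```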