Convergence rates for adaptive weak approximation of stochastic differential equations

by Kyoung-Sook Moon, Anders Szepessy, Raúl Tempone, and Georgios E. Zouraris
Year: 2005

Bibliography

Moon, Kyoung-Sook; Szepessy, Anders; Tempone, Raúl; Zouraris, Georgios E. "Convergence rates for adaptive weak approximation of stochastic differential equations." Stoch. Anal. Appl. 23 (2005), no. 3, 511–558.

Abstract

Convergence rates of adaptive algorithms for weak approximations of Itô stochastic differential equations are proved for the Monte Carlo Euler method. Two algorithms, based either on optimal stochastic time steps or on optimal deterministic time steps, are studied. The analysis of their computational complexity combines the error expansions with a posteriori leading order term introduced in Szepessy et al. [Szepessy, A., R. Tempone, and G. Zouraris. 2001. Comm. Pure Appl. Math. 54:1169–1214] and an extension of the convergence results for adaptive algorithms approximating deterministic ordinary differential equations, derived in Moon et al. [Moon, K.-S., A. Szepessy, R. Tempone, and G. Zouraris. 2003. Numer. Math. 93:99–129]. The main step in the extension is the proof of the almost sure convergence of the error density. Both adaptive algorithms are proven to stop with an asymptotically optimal number of steps, up to a problem-independent factor defined in the algorithm. Numerical examples illustrate the behavior of the adaptive algorithms, motivating when stochastic and deterministic adaptive time steps are more efficient than constant time steps and when adaptive stochastic steps are more efficient than adaptive deterministic steps.
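For orientation, the sketch below illustrates the baseline method the abstract refers to: the Monte Carlo forward Euler (Euler–Maruyama) method with constant time steps for the weak approximation of E[g(X(T))]. It is not the paper's adaptive algorithm (which refines time steps using a posteriori error density estimates); the drift, diffusion, functional g, and all parameters are illustrative assumptions chosen for this sketch.

```python
# Minimal sketch (not the paper's adaptive algorithm): weak approximation of
# E[g(X(T))] for an Ito SDE dX = a(t,X) dt + b(t,X) dW by the Monte Carlo
# forward Euler (Euler-Maruyama) method with constant time steps.
# Drift a, diffusion b, functional g, and all parameters are assumptions.
import numpy as np

def monte_carlo_euler(a, b, g, x0, T, n_steps, n_samples, rng):
    """Estimate E[g(X(T))] with constant time step dt = T / n_steps."""
    dt = T / n_steps
    x = np.full(n_samples, x0, dtype=float)
    t = 0.0
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=n_samples)  # Wiener increments
        x = x + a(t, x) * dt + b(t, x) * dW                 # Euler step
        t += dt
    values = g(x)
    mean = values.mean()
    # Statistical (sampling) error estimate; the time-discretization error is
    # what the adaptive step-size algorithms in the paper control.
    stderr = values.std(ddof=1) / np.sqrt(n_samples)
    return mean, stderr

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Geometric Brownian motion with known mean E[X(T)] = x0 * exp(r T).
    r, sigma, x0, T = 0.05, 0.2, 1.0, 1.0
    est, err = monte_carlo_euler(lambda t, x: r * x,
                                 lambda t, x: sigma * x,
                                 lambda x: x, x0, T,
                                 n_steps=100, n_samples=10**5, rng=rng)
    print(f"estimate {est:.5f} +- {1.96 * err:.5f}, "
          f"exact {x0 * np.exp(r * T):.5f}")
```

In this constant-step baseline, the total error splits into a time-discretization part and a Monte Carlo sampling part; the paper's adaptive algorithms instead choose stochastic or deterministic time steps from an a posteriori error expansion so that the number of steps is asymptotically optimal.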

DOI: 10.1081/SAP-200056678


Keywords

Adaptive mesh refinement algorithm; Almost sure convergence; Computational complexity; Monte Carlo methods; Stochastic differential equations