Model Approximation and Robustness


Continuous- and discrete-time Markov chain approximations are important in predictive modeling and computationally efficient filtering for a number of reasons.  To dramatically increase the class of signals that can be filtered by our various methods, we have concerned ourselves with Markov chain approximations of the extremely complex signals arising in multiple-target tracking and in other spatial filtering problems, such as tracking pollution or bacteria within a sheet of water.  The computer-tractable approximation can then be substituted for the real signal, with robustness results such as those of Kushner or of Bhatt and Karandikar (1999, 2002) justifying the use of the simplified model.

With regard to constructing a general, powerful method of approximating complicated signals with Markov chains, Kouritzin and Long have made crucial innovations that advance the existing approximations of Ludwig Arnold, Peter Kotelenez, and Douglas Blount.  These innovations allow for driving noise sources, more computationally efficient implementations of novel Markov chain approximations, and a larger selection of spatial processes that can be so approximated.  They include enlarging the class of elliptic operators and the class of reaction functions that can be used in Markov chain approximations of stochastic reaction-diffusion equations, as well as improvements that permit much slower rates within these approximations, reducing the computational requirements.  Kouritzin and Long's alterations dramatically slow the Markov chain state-change rates, often yielding a hundredfold increase in simulation speed over the previous version of the method.

In an attempt to attract Stantec or other potential industry partners with interests in environmental monitoring, the aforementioned innovations were first incorporated into a stochastic reaction-diffusion equation motivated by pollution distribution.  (See the 1995 book by Kallianpur and Xiong for background on this problem.)  Kouritzin and Long (2002) established the convergence of Markov chain approximations to stochastic reaction-diffusion equations in both the quenched and annealed senses.  Their first convergence result, a quenched law of large numbers, establishes convergence in probability of the Markov chains to the pathwise-unique mild solution of the stochastic reaction-diffusion equation for each fixed path of the driving Poisson measure source.  Their second result, an annealed law of large numbers, establishes convergence in probability of the Markov chains to the mild solution while treating the Poisson source as a random medium for the Markov chains.  These results are vital for applying filtering theory to the pollution dispersion-tracking problem, as they can be combined with the robustness results of Kushner or of Bhatt and Karandikar and the aforementioned particle filtering methods to create a computer-workable algorithm.
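Schematically, and with notation assumed here only for illustration (X^N the Markov chains, X the mild solution, eta a realization of the driving Poisson measure, and a suitable norm over a time horizon [0,T]), the two modes of convergence can be contrasted as:

```latex
% Quenched LLN: the driving Poisson path \eta is frozen,
% and only the internal particle randomness is averaged.
\lim_{N \to \infty} P\bigl( \|X^N - X\|_T > \epsilon \,\bigm|\, \eta \bigr) = 0
  \quad \text{for almost every } \eta;

% Annealed LLN: the Poisson source is treated as a random
% medium and is part of the randomness being averaged.
\lim_{N \to \infty} P\bigl( \|X^N - X\|_T > \epsilon \bigr) = 0 .
```

The quenched statement is the stronger one pathwise, while the annealed statement is the form most directly useful when the source itself is unknown, as in the filtering application.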

In more detail, Kouritzin and Long considered a stochastic model of groundwater pollution that can be written mathematically as a stochastic reaction-diffusion equation.  In the context of simulating the transport of a chemical or bacterial contaminant through a sheet of water, they extended a well-established method of approximating reaction-diffusion equations with Markov chains by allowing convection, certain Poisson measure driving sources, and a larger class of reaction functions.  This work applies to Lockheed Martin's interest in detecting and classifying oil slicks and vessel traces or wakes.

A weighted L2 Hilbert space was chosen to symmetrize the elliptic operator in the stochastic reaction-diffusion equation, and the existence of, and convergence to, pathwise-unique mild solutions of the equation was considered.  The region [0, L1] × [0, L2] was divided into L1N × L2N cells, and the Markov chain approximation on these cells was analyzed as N → ∞.
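In mild form, and with notation assumed here for illustration (A the symmetrized elliptic operator with semigroup e^{tA}, R the reaction function, f the dispersion kernel of the sources, and M the driving Poisson measure), such an equation reads roughly:

```latex
u_t \;=\; e^{tA} u_0
      \;+\; \int_0^t e^{(t-s)A}\, R(u_s)\, ds
      \;+\; \int_0^t\!\!\int e^{(t-s)A}\, f(\cdot, z)\, M(ds, dz).
```

It is to this mild solution that the cell-based Markov chains converge as the grid is refined.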

The particles in the cells evolve in time according to births and deaths from the reaction, random walks from diffusion and drift, and area-dependent births from the Poisson noise sources.  In this stochastic particle model, the formalism allows for two kinds of randomness:  the external fluctuation coming from the Poisson driving sources and the internal fluctuation from the reaction and drift-diffusion at the particle level.  Independent standard Poisson processes defined on another probability space were used to construct the Markov chains by the random time-changes method.
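As a concrete illustration, the event structure of such a particle model can be sketched in a few lines of Python.  The grid size and the rates below are illustrative stand-ins, not the N-scaled rates of Kouritzin and Long; the point is only the mechanism: per-particle exponential clocks for diffusion jumps and reaction births and deaths, plus a constant-rate Poisson source in each cell.

```python
import random

def simulate(nx=4, ny=4, T=0.5, diff=1.0, birth=0.2, death=0.1, src=0.05,
             n0=3, seed=0):
    """Gillespie-style sketch of the particle Markov chain (illustrative
    rates): particles jump to neighbouring cells (diffusion), branch and
    die (reaction), and are injected by a per-cell Poisson source."""
    rng = random.Random(seed)
    cells = {(i, j): n0 for i in range(nx) for j in range(ny)}
    t = 0.0
    while True:
        # one exponential clock per cell; rate = particle count times the
        # per-particle rates, plus the constant source rate for that cell
        rates = {c: n * (diff + birth + death) + src for c, n in cells.items()}
        total = sum(rates.values())
        t += rng.expovariate(total)
        if t > T:
            break
        # choose the cell where the event occurs, proportionally to its rate
        u = rng.uniform(0.0, total)
        for c, r in rates.items():
            u -= r
            if u <= 0.0:
                break
        i, j = c
        n = cells[c]
        # choose the event type within the cell
        u = rng.uniform(0.0, rates[c])
        if u < src:                           # Poisson source: inject a particle
            cells[c] = n + 1
        elif u < src + n * birth:             # reaction birth
            cells[c] = n + 1
        elif u < src + n * (birth + death):   # reaction death
            cells[c] = n - 1
        else:                                 # diffusion: jump to a neighbour
            di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            cells[c] = n - 1
            cells[((i + di) % nx, (j + dj) % ny)] += 1  # periodic boundary
    return cells

grid = simulate()
print(sum(grid.values()))  # total particle count at time T
```

Embedding each event type in an independent standard Poisson process run at a state-dependent rate, as above, is precisely the random time-change construction mentioned in the text, here in its simplest simulable form.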

In a second work on the stochastic model of water pollution, written mathematically as a stochastic partial differential equation driven by Poisson measure noise, Kouritzin, Long, and Sun (2002) establish a more general annealed law of large numbers.  It shows convergence in probability of the Markov chains to the solution of the stochastic reaction-diffusion equation while treating the Poisson source as a random medium for the Markov chains.  The proof method differs substantially from that of Kouritzin and Long (2002), which used the weak convergence method.  Here, the Cauchy criterion (for convergence in probability) is applied directly to the Markov chains, exploiting the regularity of the Green's function through a delicate iteration technique.  (The usual Gronwall's Lemma does not work in this case.)
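The kind of iteration that replaces Gronwall's Lemma can be indicated schematically (a generic singular-kernel version, not the paper's actual estimate, which involves the Green's function of the specific operator).  Suppose an error functional phi satisfies an integral bound with a kernel that is singular on the diagonal:

```latex
\varphi(t) \;\le\; \alpha \;+\; C \int_0^t (t-s)^{-\gamma}\, \varphi(s)\, ds,
\qquad 0 \le \gamma < 1 .
```

The standard Gronwall argument fails because the kernel is unbounded at s = t, but substituting the inequality into itself k times produces factors of order t^{k(1-\gamma)} / \Gamma(1 + k(1-\gamma)), whose sum converges; letting k grow therefore still yields a Gronwall-type bound on phi.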


The results of Kouritzin and Long (2002) and Kouritzin, Long and Sun (2002) have enabled the creation of prototype stochastic models of pollution.  Methods for efficient simulation of the models have been implemented in computer code.

Kouritzin, Long, Ballantyne, and H. Chan have created a simulation of a stochastic reaction-diffusion equation which can represent the transport of a pollutant or bacteria through a river or the leaching of a pollutant through a ground water system, including adsorption effects, all in a manner amenable to filtering. 

The stochastic models of the spread of pollution developed at MITACS-PINTS have the following general features:

- The models employ Markov chain approximations to nonlinear SPDEs representing stochastic reaction-diffusion equations.

- The equations include convective forces, as would be found in a flowing water system, and Poisson generating sources to model contamination from sites such as factories, storage ponds and agricultural facilities.

- The Markov chain approximations used in the models converge to the exact solution of the stochastic reaction-diffusion equation in both the annealed and the quenched senses.

- The approximations provide the basis for further work on applying filtering techniques to track the sources of contaminants given only imperfect, noise-corrupted samples at a few locations.

- These concept proofs provide an effective foundation for incorporating novel filtering techniques into different models.  These techniques are also used to model other reactive flows, such as heat diffusion through a substance of varying heat capacity or a heat-activated internal reaction.


Lucic and Heunis (2002) study signal robustness in the extreme case of singular perturbations, with the goal of characterizing the limiting nonlinear filter (if any) as the perturbation parameter tends to zero.  The signal arises from a singularly perturbed stochastic differential equation with a small parameter, in the case where the dynamics of the signal are conditioned by the observation process.  They show that the nonlinear filter solves a particular measure-valued martingale problem, and then show that the limiting nonlinear filter exists and characterize it completely.  The approach uses the solvability of Poisson-type operator equations to construct a limiting measure-valued martingale problem, together with the uniqueness-in-law results of Lucic and Heunis (2001), to show that this limiting martingale problem is well posed and that its solution corresponds to the limiting nonlinear filter.

Kouritzin and Xiong (2002) study observation robustness to demonstrate the asymptotic correctness of the classical, non-instrumentable continuous-time observations via instrumentable coloured-noise approximations.  In particular, they consider observations of the schematic form Y = h(X) + ξ, where ξ is an Ornstein-Uhlenbeck process, in the limit where the correlation time of ξ tends to zero.  In this limit the integrated observation noise converges to Brownian motion, and they show that the filter also converges to the classical-observation filter.  The non-integrated observations can be instrumented, so this result demonstrates that the classical observations are a natural idealized or limiting object.  The result generalizes, in some manner, previous results by Kunita (1993), Mandal and Mandrekar (2000), Gawarecki and Mandrekar (2000), and Bhatt and Karandikar (2001).  The method is to use the Kurtz-Xiong particle approach to derive an FKK-like filtering equation, and uniqueness for this equation, based upon the coloured-noise observations; tightness is then proved for the filter distributions, and a unique limit is identified using the uniqueness result of Bhatt and Karandikar (1995).
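The coloured-noise observation setting can be made concrete with a small bootstrap particle filter in Python.  Everything below is an illustrative stand-in, not the construction of Kouritzin and Xiong: the signal is taken to be a scalar mean-reverting diffusion, the observation is the signal plus an Ornstein-Uhlenbeck noise with correlation time eps, and the likelihood uses the stationary OU variance as an approximation.

```python
import math, random

def ou_observation_filter(n_steps=50, dt=0.05, eps=0.1, n_part=500, seed=2):
    """Bootstrap particle filter sketch for a signal observed through
    coloured (Ornstein-Uhlenbeck) noise.  Illustrative dynamics only."""
    rng = random.Random(seed)
    x = 0.0    # true signal: dX = -X dt + dB (a stand-in model)
    xi = 0.0   # OU observation noise: d(xi) = -(xi/eps) dt + (1/eps) dW
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_part)]
    weights = [1.0 / n_part] * n_part
    estimates = []
    for _ in range(n_steps):
        # propagate the truth and the coloured observation noise
        x += -x * dt + math.sqrt(dt) * rng.gauss(0.0, 1.0)
        xi += -(xi / eps) * dt + (math.sqrt(dt) / eps) * rng.gauss(0.0, 1.0)
        y = x + xi  # non-integrated, instrumentable observation
        # propagate particles with the (assumed known) signal dynamics
        particles = [p - p * dt + math.sqrt(dt) * rng.gauss(0.0, 1.0)
                     for p in particles]
        # reweight, treating y - p as stationary OU noise (approximation):
        # stationary variance of xi above is 1 / (2 * eps)
        var = 1.0 / (2.0 * eps)
        weights = [w * math.exp(-(y - p) ** 2 / (2.0 * var))
                   for w, p in zip(weights, particles)]
        s = sum(weights)
        weights = [w / s for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # multinomial resampling back to uniform weights
        particles = rng.choices(particles, weights=weights, k=n_part)
        weights = [1.0 / n_part] * n_part
    return estimates

est = ou_observation_filter()
```

Shrinking eps makes xi decorrelate faster, so its time integral approaches Brownian motion; the convergence result says the corresponding filters then approach the classical white-noise-observation filter.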

Douglas Blount and Michael Kouritzin have derived Hölder continuity for processes related to the Zakai equation of filtering theory.  Blount and Kouritzin obtained a criterion that gives Hölder continuity results in Hilbert space for a class of solutions of stochastic evolution equations.  The class includes the superstable processes with critical binary branching and Ornstein-Uhlenbeck-type SPDEs with a suitable eigenfunction expansion for the drift operator.  It should also give regularity results for some types of SPDEs arising from filtering theory.  The resulting paper, "Hölder continuity for spatial and path processes via spectral analysis", appeared in Probability Theory and Related Fields and was the subject of an invited talk in a session on stochastic analysis at the AMS meeting held in January 2001 in New Orleans.