Computer Implementation


1.BENCHMARK/NON-FILTERING PROBLEMS

In consultation with our corporate partners, we developed the following twelve benchmark problems.  They are close enough to our partners' real interests to be useful to them but still differ enough that we are allowed to describe them fully without compromising proprietary information.  We do not subject the reader to all the finer details of our models herein, but they are available upon request.  Also, solutions to many of these problems can be found on our web page: http://www.math.ualberta.ca/Pints.  Clearly, some of these problems contain unrealistic simplifications.  These simplifications were arrived at through consultation with our corporate partners and deemed reasonable with respect to their real interests.  The development of these problems itself represents a large amount of work, performed in the November 2000 to November 2002 timeframe.

The Altitude Tracking Problem tracks the altitude of an aircraft based upon barometric altimeter readings.  The altitude signal is modeled as a two-dimensional (position, vertical velocity) Itô equation, and the observation is the altimeter reading after transmission to a ground station.  In particular, each observation is formed by taking the altitude of the aircraft at the observation time, adding noise due to local weather variations as well as measurement noise, rounding the result to the closest one hundred feet, and then adding transmission noise.  The measurement noise is typically taken to be i.i.d. Gaussian, while the transmission noise is usually asymmetric with heavy tails.
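As a concrete illustration, this observation mechanism can be sketched as follows.  The specific noise laws and constants below (Pareto transmission tails, the standard deviations) are illustrative assumptions, not our calibrated values:

```python
import random

def altimeter_observation(altitude_ft, weather_sd=30.0, meas_sd=15.0,
                          trans_scale=5.0, rng=random):
    """One ground-station reading: true altitude plus weather and
    measurement noise, rounded to the nearest 100 ft, then corrupted
    by asymmetric, heavy-tailed transmission noise (illustrative)."""
    noisy = altitude_ft + rng.gauss(0.0, weather_sd) + rng.gauss(0.0, meas_sd)
    quantized = 100.0 * round(noisy / 100.0)
    # Shifted Pareto draw: heavy right tail, asymmetric about zero.
    transmission = trans_scale * (rng.paretovariate(1.5) - 1.0)
    return quantized + transmission
```

Setting the noise parameters to zero recovers the pure rounding-to-100-feet quantization of the altitude.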

The Search and Rescue Problem models a dinghy lost on the ocean surface that must be located, tracked, predicted, and historically tracked using highly corrupted infrared camera readings.  These cameras sense the body heat of occupants as well as combustion heat produced by a motor, but they are also subject to temperature fluctuations of the ocean itself.  The dinghy has a seven-dimensional state space consisting of planar position, planar speed, orientation, angular velocity, and motion type (adrift, rowing, or motoring).  The first six states evolve according to an Itô equation and the last state changes according to a Markov chain.  The forward operator decomposes into sums of operators with no more than four effective states due to independence.  The observations are synthetic spatial pixel-by-pixel heat readings over the area of detection, modeled as noise superimposed upon a small constant heat reading over the dinghy surface.  Usually, the observation noise is set to be additive and independent in space and time with some fixed distribution; however, non-additive, non-independent noise has also been considered and constructed.  Various dinghy sizes have been considered, from 1 to 500 pixels in area.  For a 125-pixel dinghy in a 65,536-pixel detectable area, we consider pixel-by-pixel signal-to-noise ratios as extreme as 1 to 41, which is approximately one-eighth the minimum detectable by the human eye and about the minimum possible (using the optimal filter) for reasonably reliable tracking.
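A minimal sketch of one synthetic observation frame follows; the grid size, dinghy block, heat level, and Gaussian pixel noise are illustrative stand-ins for the models described above:

```python
import random

def infrared_frame(grid=256, dinghy_xy=(100, 120), dinghy_side=11,
                   heat=0.5, noise_sd=1.0, rng=random):
    """Synthetic pixel-by-pixel heat image: a small constant heat
    block over the dinghy plus i.i.d. additive noise everywhere.
    A 256x256 grid gives the 65,536-pixel detectable area; an
    11x11 block approximates the 125-pixel dinghy."""
    x0, y0 = dinghy_xy
    frame = []
    for i in range(grid):
        row = []
        for j in range(grid):
            signal = heat if (x0 <= i < x0 + dinghy_side and
                              y0 <= j < y0 + dinghy_side) else 0.0
            row.append(signal + rng.gauss(0.0, noise_sd))
        frame.append(row)
    return frame
```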

The Fish Tracking Problem models a fish swimming in a tank, neglecting fish deformation and wire-frame effects.  The fish reflects off the sides of the tank and is modeled as a two-dimensional Skorohod stochastic differential equation; we neglect vertical motion in this example.  As opposed to the previous example, we model only position.  For simplicity, we take the observable fish shape to be a constant square block of varying sizes.  The observation model is identical to that of the Search and Rescue Problem; only the type of camera differs: instead of an infrared camera, we assume a digital vision camera.

The Performer Prediction Problem predicts the future state of a single performer moving around on a stage.  Her three-dimensional position, forward speed (which can be negative), and orientation are modeled as an Itô equation constrained to stay on the stage and biased to face the audience.  The vertical velocity is taken to be independent of the other state variables.  Acoustic measurements are formed by timing ultrasonic sound pulses from each of four perimeter speakers to a microphone on the performer's person.  Noise is present due to reflections and blockage of the sound waves, meaning observations can be missed.  The observations are modeled as the correct travel time of the sound wave, multiplied by a position-dependent Bernoulli random variable (indicating a missed observation), plus additive noise.
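A sketch of a single time-of-flight observation follows.  The function names, the fixed miss probability, and the nominal speed of sound are illustrative assumptions (in the full model the Bernoulli miss probability is position-dependent):

```python
import math
import random

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

def travel_time_obs(performer_xyz, speaker_xyz, p_miss, noise_sd=1e-4, rng=random):
    """Observation from one speaker: the correct travel time, multiplied
    by a Bernoulli indicator for a missed observation, plus additive noise."""
    dist = math.dist(performer_xyz, speaker_xyz)
    detected = 0.0 if rng.random() < p_miss else 1.0
    return detected * (dist / SPEED_OF_SOUND) + rng.gauss(0.0, noise_sd)
```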

The Pod Tracking Problem is concerned with tracking a pod constrained to move along a two-dimensional manifold in three-dimensional space.  Physically, the pod is a sensor external to an aircraft, connected to it by a pole, and we need to estimate the position of the pod based on a second sensor in the plane.  The signal is modeled by a Stratonovich equation with coefficient vector fields designed to keep the pod on the manifold.  The simplest model used has reflections at the boundary, a deterministic pull back to the origin proportional to the distance from the origin along the manifold, and additive noise, i.e., it is like an Ornstein-Uhlenbeck process; the simplest manifold is the surface of a hemisphere.  However, these assumptions are not quite correct, and more complicated models and manifolds need to be employed for high fidelity.  The observations can be either:  1) the projection of the pod position onto a plane plus additive uniformly distributed noise, or 2) angle-only information, consisting of the two angles of the pod relative to an outward normal plus additive noise.

The Weakly-Interacting Multi-Target Problem involves tracking an unknown number of (seven-dimensional) dinghies that interact weakly to avoid collision and to maneuver together.  The model is devised so that a lone dinghy undergoes the motion described in the Search and Rescue Problem.  There can be anywhere from zero to six dinghies, and we must detect their presence, determine the correct number, determine the dinghy locations, and predict where they will be at some future time.  The observations are as described in the Search and Rescue Problem, except there are multiple dinghy images superimposed with the observation noise on the detectable region.

The Volatility Parameter Problem is concerned with estimating the future volatility and the parameters of a stock-volatility model based upon price information.  In particular, the volatility signal is a non-negative mean-reverting Itô equation model, and the discrete-time observations are the logarithms of the returns process, defined as the current price divided by the previous price.  To allow heavy tails, we model each observation as some nominal value (deterministic drift) plus the volatility times a stable random variable.  Most often, we need to estimate the parameters of the signal and observation processes using a combined parameter estimation and filtering algorithm.  These parameters include:  a volatility time constant, the mean volatility in log scale, the amplitude of the volatility noise, the index α of a centered symmetric α-stable law, and a price drift constant.
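The observation model can be simulated directly once symmetric α-stable innovations are available.  The sketch below uses the Chambers-Mallows-Stuck construction for the symmetric case (α ≠ 1 is assumed so the simple form of the formula applies); the function names are ours:

```python
import math
import random

def sym_stable(alpha, rng=random):
    """Symmetric alpha-stable sample (center 0, unit spread) via the
    Chambers-Mallows-Stuck method; assumes alpha != 1."""
    u = (rng.random() - 0.5) * math.pi      # uniform on (-pi/2, pi/2)
    w = rng.expovariate(1.0)                # standard exponential
    return (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
            * (math.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))

def log_return(drift, volatility, alpha, rng=random):
    """One observation: nominal drift plus volatility times a
    symmetric alpha-stable innovation, as in the model above."""
    return drift + volatility * sym_stable(alpha, rng)
```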

The Ordered Multi-Target Tracking Problem is concerned with tracking an unknown number of targets (collectively the signal) on a compact interval.  These targets repel their nearest neighbours and the boundaries, but not so strongly that they cannot cross or leave the interval and die.  Each particle has its own "power" and "shape" that are independent of everything, evolve slowly, and can be represented on (0, ∞)².  There are small birth and death rates:  a new particle is positioned uniformly over the interval with a birth rate independent of the number of living particles or their positions, and a constant death rate applies to each living particle.  The initial conditions and rates are constructed to anticipate between thirty and sixty targets most of the time.  We observe a time-dependent spatial impulse response consisting of some unknown background function plus the collection of particle shapes, centered at their locations and multiplied by their powers.  This is all obscured by additive noise, and there are parameters to be estimated in the background function as well as in the signal itself.  Hence, at each observation time, the observation is a time-dependent, incompletely determined function of the signal plus noise at that time.  The observations focus on different parts of the signal at different times.
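A sketch of one such observation on a one-dimensional grid follows; Gaussian bumps stand in for the particle shapes, which is an illustrative assumption rather than the report's shape model:

```python
import math
import random

def impulse_response(xs, targets, background, noise_sd=0.1, rng=random):
    """Spatial observation: an unknown background function plus each
    target's shape centred at its location and scaled by its power,
    all obscured by additive Gaussian noise (illustrative).
    targets: list of (location, power, width) triples."""
    obs = []
    for x in xs:
        y = background(x)
        for loc, power, width in targets:
            y += power * math.exp(-0.5 * ((x - loc) / width) ** 2)
        obs.append(y + rng.gauss(0.0, noise_sd))
    return obs
```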

The Random Environment Filtering Problem refers to the problem of filtering a non-Markov diffusion in a random environment, as well as determining close path-by-path approximations to this signal.  For a fixed environment, the signal can be constructed as a path-continuous Markov process using Dirichlet form theory.  However, the drift is too rough to interpret the solution as more than a formal solution to a stochastic differential equation, and the signal becomes non-Markov when the environment realization is incorporated.  The motivation for this problem comes from the Search and Rescue Problem when the dinghy is on the high seas and the sea provides a random environment through which the dinghy must maneuver.  The continuous-time observations are taken to have the classical form, except there can be correlation between the observation noise and the signal.  For implementation purposes, we also find it worthwhile to consider the filtering problems where:  I) the signal is constrained to live in a compact interval by reflections at the boundary, and II) in addition, the gradient of the Brownian sheet (or other random field) appearing as the "drift" in the diffusion in the random environment is replaced with a truncated (non-square-summable) Fourier expansion of the same.  On compact intervals, we treat the filtering problem in a random environment as a combined filtering-estimation problem:  in the case that the Fourier expansion is not truncated, it is functional estimation; otherwise, it is parametric.

The Submarine Location Problem involves the location and tracking of an underwater submarine based upon sound waves that it emits and that are received by sonobuoys.  Due to randomness, these sound waves evolve as a stochastic wave equation in a random environment.  The observations are the partial, corrupted measurements received by the collection of sonobuoys.  The signal can be taken to be the (infinite-dimensional) sound wave itself, requiring a second estimation problem of locating the submarine from the wave estimates.  In the stochastic control variant of the problem, one is allowed to place additional sonobuoys in optimal locations.

The Pollution Problem is the monitoring of the flow of bacterial or chemical pollution that reacts, drifts, and diffuses through a sheet of groundwater (or in a region of space).  The signal is therefore a stochastic reaction-diffusion equation, usually driven by Poisson measure noise.  This noise source is the random dumping of the pollutant into the water by various factories along its way.  The observations are then taken to be a collection of corrupted measurements representing the average value over a lake or pinpoint measurements at well sites.  One is interested in answering vital questions such as "Which factory is the biggest polluter?", "In which regions is the water safe to drink?", and "What will the water quality be like at this point in the future?", all based upon filtering theory.

The Option Pricing Problem is to produce a path-space empirical measure that can be used to price all path-space and terminal options simultaneously for a collection of stocks.  The model consists of a collection of stocks that jointly influence interest rates through the empirical measure of the stocks.  Such randomness in the interest rates is natural and means that prices brought back to the current time are subject to random weighting.  Thus, a resampled particle system like SERP can be expected to outperform the straight Monte Carlo method in terms of performance per unit of computation.  Naturally, this problem is not filtering, but it is still estimation and is solvable with minor modifications to our filtering algorithms.
 

2.OBJECT ORIENTED PROGRAMMING

We have a long and growing list of both sample problems and filtering algorithms.  Therefore, to ease the burden of new implementations and reduce duplication of code segments, we are modifying the object-oriented interfaces in all our algorithms to improve the ability to swap in various problems and filters.  In addition, we are upgrading our display capabilities using external 3-D ray-tracing code.  The first end goal of this effort is to trivialize the introduction of new problems, algorithms, and comparisons, allowing users with little programming experience to create, implement, and test new algorithms with ease.  The second end goal is to create a more flexible, expressive display mechanism.  We spare the reader the details of how we are approaching these goals but mention that very considerable effort is being spent, with the consent and encouragement of our corporate partners.
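The spirit of this swap-in design can be conveyed with a small interface sketch.  The class and method names below are hypothetical illustrations, not our actual code:

```python
from abc import ABC, abstractmethod

class Problem(ABC):
    """Minimal interface a benchmark problem might expose."""
    @abstractmethod
    def propagate(self, state, dt):
        """Evolve the signal model one time step."""
    @abstractmethod
    def likelihood(self, state, observation):
        """Weight a state against one observation."""

class Filter(ABC):
    """Minimal interface a filtering algorithm might expose."""
    @abstractmethod
    def update(self, problem, observation):
        """Incorporate one observation via the problem's model."""

def run(problem, filt, observations):
    """Any Problem can be paired with any Filter via the shared interface."""
    for obs in observations:
        filt.update(problem, obs)
    return filt
```

With such interfaces, adding a new benchmark problem or a new filter requires implementing only the abstract methods, and every existing pairing continues to work.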
 
 

3.PARTICLE FILTERS

Previously, we implemented and compared the classical weighted particle filter, Del Moral and Salut's interacting particle filter, and MIBR on our benchmark Search and Rescue, Fish Tracking, and Performer problems.  There was a very significant improvement in going from the weighted filter to the interacting filter to MIBR, as reported previously.  Therefore, we will only compare SERP to the best of the earlier implemented methods, MIBR.
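For reference, the classical weighted particle filter amounts to sequential importance sampling without resampling.  The sketch below is a generic one-dimensional illustration, not our production implementation:

```python
def weighted_particle_filter(particles, weights, propagate, likelihood, observations):
    """Classical weighted particle filter: particles evolve under the
    signal model and weights are multiplied by each observation's
    likelihood, then normalized.  No resampling is performed, which is
    what the interacting and branching variants improve upon."""
    for obs in observations:
        particles = [propagate(x) for x in particles]
        weights = [w * likelihood(x, obs) for w, x in zip(weights, particles)]
        total = sum(weights)
        weights = [w / total for w in weights]
    # Weighted-mean state estimate.
    estimate = sum(w * x for w, x in zip(weights, particles))
    return particles, weights, estimate
```

Without resampling, the weights degenerate over time onto a few particles, which is the practical weakness the later methods address.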

We have implemented and tested SERP on the Search and Rescue, Fish Tracking, Performer, and Option Pricing problems, empirically solving the stochastic optimal control problem for a feedback resampling-control parameter as a function of the particle-system state for each problem.  Work is ongoing to implement SERP on the Weakly-Interacting Multi-Target Problem.  Our unveiling and first demonstration of SERP (on the Fish Tracking Problem) occurred at Lockheed Martin Naval Electronics and Surveillance Systems in February 2002.  Early results showed a marginal increase in speed and a six to ten percent improvement in performance over MIBR using the same number of particles.  However, further work on streamlining the code (as we did earlier for MIBR and all other methods) and on the stochastic control selection problem produced an even faster, more accurate SERP algorithm, typically beating MIBR by ten percent in speed and twenty percent in estimation.  We now know that this improvement in performance increases as we consider larger, more challenging problems.  The elegance of the solution and the improved performance of the method convinced Lockheed Martin to patent the algorithm and implementation, and we agreed to work with them on that.

It is believed that SERP will work on any problem for which the classical weighted particle method works.  Therefore, based upon the works of Kurtz and Xiong (1998) and Kouritzin and Xiong (2002), we believe that SERP can replace MIBR everywhere and can even work on problems where there is correlation between the observation noise and the signal.  Moreover, the optimal control problem can favour either rapid localization or reduced longer-term error, depending upon which is more critical in the problem at hand, so there is extra flexibility that is not present in earlier particle filters.

When used with discrete-time observations, most of the computational effort for SERP (and other particle methods) is usually spent evolving particles forward using Euler or Milstein approximations.  Therefore, explicit solutions can be used (as described above), when applicable, to create an incredibly fast and accurate method.  We sometimes refer to particle simulation by use of explicit solutions as supercharging particles.
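To make the supercharging idea concrete, compare an Euler-Maruyama step with the exact one-step transition for a simple Ornstein-Uhlenbeck signal dX = -θX dt + σ dW.  This is an illustrative model chosen because its explicit solution is well known, not one of the benchmark signals:

```python
import math
import random

def ou_euler(x, theta, sigma, dt, rng=random):
    """One Euler-Maruyama step: carries O(dt) discretization error,
    so many small steps are needed between observations."""
    return x - theta * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)

def ou_exact(x, theta, sigma, dt, rng=random):
    """Exact one-step OU transition (a 'supercharged' particle move):
    no discretization error, and one draw covers the whole interval."""
    mean = x * math.exp(-theta * dt)
    var = sigma ** 2 * (1.0 - math.exp(-2.0 * theta * dt)) / (2.0 * theta)
    return mean + math.sqrt(var) * rng.gauss(0.0, 1.0)
```

One exact draw per inter-observation interval replaces many Euler sub-steps, which is where the speed and accuracy gains come from.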

To cope with the more technical problems of our corporate partners, we implemented the following variants of MIBR.  First, a cemetery state was introduced as a mechanism to perform detection as well as tracking, as opposed to tracking a signal that is known to exist in the domain.  Second, a blurring of the sort used in simulated annealing, with the same reduction of the blurring over time, was applied to the particle dynamics and the inverse of the observation function to resolve problems with highly directed signal dynamics and very definite observation signatures.  Third, the particle control technique was modified not to duplicate particles about which there is no information, but instead to resuscitate a recently deleted particle, thereby removing artifacts near areas of the domain, such as the edges, that often contain particles about which there is little knowledge.  This is especially helpful in the case of a strong signal with little noise filtered using few particles.  Most importantly, it was realized that the historical information stored to provide the path-space filter could be used to handle time-dependent observation functions.  For example, if the observations depend on the path of the signal over the last few moments of time, the branching filter can still optimally calculate the probability that the signal is at a certain state or had taken a particular path, because it can access data regarding the potential past paths of the signal while assessing the current observation.  Many signals in the real world leave observable trails but are hard to detect directly, which is why this discovery has wide applicability.

Simulations have been constructed to show the use of advanced historical, or path-space, tracking for MIBR and SERP.  The current simulations outline a use for the technique in the search and rescue situation, which had been used previously to showcase the particle methods.  The new simulations offer an updated view of the path-space capabilities of the MIBR and SERP filters.  It is important to note that the filter provides a full approximate distribution for the seven-dimensional dinghy state at past, present, and future times, as well as full path-space information.  In the anti-narcotics smuggling example, the path-space estimate can be used to construct the path over which the smugglers may have discarded or deposited their narcotics, as well as future positions where the smugglers could be intercepted.

Adaptations to MIBR and SERP allow them to estimate the parameters of the signal rather than assume a completely known model.  It is important that any such filter also retain its efficiency, so careful work was required.  Hitherto, the parameter-inclusion filter, the GMM, and the recursive least squares filter methods have been implemented, with the parameter-inclusion filter being the clear loser.  The recursive least squares filter estimation algorithm works well in practice, but mathematical proofs are yet to be constructed.
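For the reader unfamiliar with recursive least squares, the scalar textbook recursion below illustrates the kind of sequential update involved.  It is a generic sketch of the standard method, not our filter-coupled estimator:

```python
def rls_scalar(xs, ys, theta0=0.0, p0=1000.0, forget=1.0):
    """Scalar recursive least squares for y_k = theta * x_k + noise.
    theta0/p0 are the initial estimate and (large) initial covariance;
    forget < 1 discounts old data for slowly drifting parameters."""
    theta, p = theta0, p0
    for x, y in zip(xs, ys):
        k = p * x / (forget + p * x * x)     # gain
        theta = theta + k * (y - theta * x)  # correct by the residual
        p = (p - k * x * p) / forget         # shrink the covariance
    return theta
```

Each new data pair refines the estimate in O(1) work, which is what makes the method attractive to run alongside a filter.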
 

4.REST IMPLEMENTATION

We have implemented and tested REST, initializing it with very few cells of large size in which to localize the signal.  As time progresses, REST automatically fine-tunes its cell size, splitting the cells in the most profitable dimension; if the signal becomes lost, it recombines cells in the most critical dimension.  We deem comparisons between REST and particle filters fair if either the particle filter uses a time discretization fine enough in its Euler or Milstein approximations of the particle evolutions to match the accuracy of REST closely, or a discretized operator is used in REST so that it collects, compares, and performs the net effect of a few operations in order to save computations, reducing its accuracy to that of the particle systems.  Otherwise, it is clear that REST is more accurate, but the particle system may be faster.  With the noted equalization, REST usually significantly outperforms the particle filters, including SERP.  Hitherto, we have implemented REST on the Fish Tracking Problem and found it to be the clear winner in terms of computation and performance, measured by the time to localize the fish and the average longer-term error.  The advantages of REST are speculated to be that it introduces far less simulation noise and that it initially tries only to determine which of a few large cells the signal is in, rather than hitting a home run and getting the precise signal location right away.  These conjectures have been partially borne out by simulations, especially simulations with very few particles, where the particle filters often do not find the fish.
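As a toy illustration of dimension-wise refinement, one plausible heuristic splits along the dimension with the greatest mass-weighted spread of cell centres.  This is our own illustrative stand-in for the idea of choosing "the most profitable dimension", not the actual REST criterion:

```python
def split_dimension(cells):
    """Pick a split dimension for a collection of cells, each given as
    (centre_tuple, mass).  The dimension with the largest mass-weighted
    variance of centres is deemed the most profitable to refine."""
    dims = len(cells[0][0])
    total = sum(mass for _, mass in cells)
    best_d, best_spread = 0, -1.0
    for d in range(dims):
        mean = sum(mass * c[d] for c, mass in cells) / total
        spread = sum(mass * (c[d] - mean) ** 2 for c, mass in cells) / total
        if spread > best_spread:
            best_d, best_spread = d, spread
    return best_d
```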

We also implemented REST on the Performer problem.  However, discretization methods are at a huge disadvantage to particle filters in this problem because the forward operator decomposes into a sum of operators "with smaller dimension" due to independence; particle filters can make good use of this, but REST cannot, because the "particles" in REST do not evolve independently as the signal does.  This means that the particle filters effectively solve a smaller-dimensional problem than REST does.  This problem-inherited advantage proved too great for the usual superiority of REST to overcome.  Still, we were impressed that REST produced numbers of the same order as the particle filters.

Our computer code for REST is general, allowing it to work in any dimension and with a variety of signal and observation models.  However, it takes quite a few off-line mathematical calculations to produce the adjoint operator, complete with boundary values, for our more complicated multi-target problems.  Moreover, REST relies on fairly regular differential operators and domains.  Hence, its implementation for the multi-target problems is still ongoing.
 
 

5.IDEX IMPLEMENTATION AND PARAMETER ESTIMATION

The IDEX algorithm has been implemented, and an initial practical simulation has been completed for the Altitude Tracking and Volatility problems.  In addition, IDEX has been generalized to handle predictive models driven by Lévy process observations and, simultaneously, to estimate model parameters using least squares and maximum likelihood methods in the Volatility problem.  The program was applied to estimate mean-reverting stochastic volatility using symmetric stable-driven price returns as observations.  In particular, the kth observation was Y_k = μ + σ(t_k) ε_k, where μ is the deterministic drift, σ(t_k) is the stochastic volatility at time t_k, and the ε_k are independent symmetric α-stable random variables with centers zero and unit spread parameters.  Five parameters, including α, were simultaneously estimated.  (A complete description of the parameters was given previously when the Volatility Parameter Problem was introduced.)  The method was proven to work mathematically and experimentally in Kouritzin, Remillard and Chan (2001).  The implementation was done by C. Chan, Kouritzin and Wiebe.