Using a set of weighted samples (the particle method) to represent the distribution of interest makes it possible to perform filtering under very general hypotheses; this family of approaches, often referred to as sequential Monte Carlo (SMC) or particle filtering, has attracted considerable attention over the last two decades. However, the particle method suffers from problems such as sample depletion, heavy computational cost, and the challenges raised by multi-target tracking (MTT). To address these problems and challenges, this thesis investigates efficient particle filtering from two perspectives that differ in the number of objects of interest: single-object and multi-object. On the one hand, novel resampling schemes and fast implementations of particle filters are developed within the Bayesian framework. On the other hand, improved particle implementations of the probability hypothesis density (PHD) filter, namely particle PHD filters, are presented to deal with MTT.
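The generic predict–update–resample cycle underlying the particle filters discussed above can be sketched as follows. This is a minimal bootstrap (SIR) filter for an illustrative 1-D model; the random-walk transition, the Gaussian likelihood, and all noise parameters are assumptions for the example, not the models used in the thesis.

```python
import numpy as np

def bootstrap_particle_filter(measurements, n_particles=500, seed=0):
    """Minimal bootstrap (SIR) particle filter for a toy 1-D model."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)  # initial particle cloud
    estimates = []
    for z in measurements:
        # Predict: propagate each particle through an assumed random-walk model.
        particles = particles + rng.normal(0.0, 0.5, n_particles)
        # Update: weight particles by an assumed Gaussian measurement likelihood.
        weights = np.exp(-0.5 * ((z - particles) / 0.8) ** 2)
        weights /= weights.sum()
        estimates.append(float(weights @ particles))  # weighted-mean estimate
        # Resample: draw a new, equally weighted cloud (plain multinomial here).
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]
    return estimates
```

Resampling at every step, as done here, is exactly the stage where the sample-depletion and computational-cost issues mentioned above arise, which motivates the resampling work described next.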
Resampling is a critical step in the implementation of particle filters and is of both practical and theoretical significance. Firstly, existing resampling methods and recent developments are compared and classified into different categories, providing a comprehensive overview of resampling methods. General discussions of the statistical effects of resampling are given, with emphasis on robustness and tests of identical distribution. New deterministic, adaptive, and fast resampling schemes are then put forward separately. Further, to increase the computing speed of the particle filter, a fast likelihood computation method based on numerical fitting is proposed: the likelihood of the particles is obtained by numerically fitting a likelihood probability density function (Li-PDF) instead of computing it directly from the measurements. This is the first attempt to apply numerical fitting to enable real-time particle filtering.
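For concreteness, one standard low-variance scheme from the literature surveyed here, systematic resampling, can be sketched as below: a single uniform draw fixes a set of evenly spaced pointers into the cumulative weight distribution. This is a textbook baseline for illustration only; the deterministic and adaptive schemes proposed in the thesis are not reproduced here.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: one uniform draw, then evenly spaced pointers."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    # Evenly spaced pointers, jointly shifted by a single U(0, 1/n) offset.
    positions = (rng.uniform() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0  # guard against floating-point rounding
    # Index of the particle whose cumulative-weight bin each pointer falls in.
    return np.searchsorted(cumulative, positions)
```

Because only one random number is drawn regardless of the particle count, the scheme is O(n) and exhibits lower resampling variance than plain multinomial selection, which is why it is a common reference point when comparing resampling methods.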
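The likelihood-fitting idea can likewise be sketched: evaluate the expensive likelihood only on a coarse grid of state values, fit a cheap approximation, and read every particle's weight off the fit. The toy Gaussian likelihood, the grid size, and the use of piecewise-linear interpolation as the fitting step are all assumptions for this sketch; the thesis's actual Li-PDF construction is not reproduced.

```python
import numpy as np

def expensive_likelihood(x, z=2.0):
    # Stand-in for a costly measurement likelihood (toy Gaussian, assumed).
    return np.exp(-0.5 * (z - x) ** 2)

def fitted_likelihood(particles, n_grid=16):
    # Evaluate the costly likelihood only on a coarse grid ...
    grid = np.linspace(particles.min(), particles.max(), n_grid)
    values = expensive_likelihood(grid)
    # ... then obtain every particle's weight from the cheap fit,
    # replacing one expensive evaluation per particle with n_grid total.
    return np.interp(particles, grid, values)
```

The speed-up comes from decoupling the number of expensive likelihood evaluations from the number of particles, which is the property that makes the approach attractive for real-time filtering.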