2010 Summer Project Week: Microscopy extensions for ITK

Key Investigators

Gaetan Lehmann: INRA

Alex Gouaillard: CoSMo Software / A*STAR

Luis Ibanez: Kitware Inc.

Introduction

Fluorescence microscopy is a very common image acquisition modality, used in many fields including medical imaging and biological research. Depending on the techniques used, the images produced can be 2D or 3D, time-lapse, and contain several channels.
ITK provides many useful tools to analyze the images produced in fluorescence microscopy. The N-dimensionality of ITK is especially useful when dealing with 3D data sets. However, it contains very few methods to correct the aberrations introduced in the image during acquisition.
We propose to implement tools dedicated to the restoration of fluorescence microscopy images. Because no algorithm perfectly corrects those aberrations, we propose to implement several algorithms for each kind of problem, including the most common ones, which are well known and widely used, as well as some state-of-the-art algorithms.

Being able to simulate the noise behavior is very important because it lets the user compare the result of a restoration method with the known unblurred object.
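As an illustration of what such a simulation involves, here is a minimal NumPy/SciPy sketch (not ITK code; the function name and the optional Gaussian read-out noise term are assumptions made for this example) that blurs a known object with a PSF and adds Poisson photon noise:

```python
import numpy as np
from scipy.signal import fftconvolve

def simulate_acquisition(obj, psf, read_noise_sigma=0.0, rng=None):
    # Blur the known object with the PSF, then apply Poisson photon
    # noise; optionally add Gaussian read-out noise on top.
    rng = np.random.default_rng() if rng is None else rng
    blurred = fftconvolve(obj, psf, mode="same")
    # Poisson sampling requires non-negative rates; clip tiny FFT
    # round-off negatives before drawing.
    noisy = rng.poisson(np.clip(blurred, 0, None)).astype(float)
    if read_noise_sigma > 0:
        noisy += rng.normal(0.0, read_noise_sigma, size=noisy.shape)
    return noisy
```

The simulated image can then be fed to any restoration filter and the result compared with `obj`.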

Deconvolution

Non-blind deconvolution

This is the most common deconvolution type. All these methods assume that the point spread function (PSF) is known. For practical reasons, most of them also assume that the PSF is spatially invariant in the image. Some methods focus on computational efficiency at the cost of restoration quality - often the linear algorithms - and some on restoration quality at the cost of computational complexity - usually the iterative algorithms. The fluorescence noise follows a Poisson distribution, but is often treated as Gaussian noise to simplify the computations. This assumption is generally valid when the SNR is high - most of the time for wide-field microscopy. Because of the noise amplification produced by the deconvolution process, most of the methods are regularized. In fact, the only unregularized methods are iterative, and are effectively regularized by stopping the iteration process before convergence.
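The two noise models mentioned above correspond to two image formation models, where g is the observed image, f the unblurred object, h the PSF, * the convolution, and n additive Gaussian noise:

```latex
% Additive Gaussian model assumed by the linear methods:
g = h \ast f + n
% Photon-counting (Poisson) model, closer to the fluorescence physics:
g \sim \mathcal{P}(h \ast f)
```

All the non-blind methods below estimate f from g under one of these two models.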

Space invariant PSF - Linear algorithms

All the proposed linear algorithms assume additive Gaussian noise. Other algorithms, combined with wavelet transforms, produce better results.

Regularized Linear Least Squares (truncated SVD) #

Maximum A Posteriori Linear Least Squares #

Wiener #

Tikhonov Miller #
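To make the linear, frequency-domain approach concrete, here is a minimal NumPy sketch of the Wiener filter (an illustration, not ITK code; the constant noise-to-signal ratio `nsr` and the helper names are assumptions of this example):

```python
import numpy as np

def _psf_otf(psf, shape):
    # Zero-pad the PSF to the image shape and shift its center to the
    # origin so that its FFT is a valid optical transfer function.
    padded = np.zeros(shape)
    padded[tuple(slice(0, s) for s in psf.shape)] = psf
    for axis, s in enumerate(psf.shape):
        padded = np.roll(padded, -(s // 2), axis=axis)
    return np.fft.fftn(padded)

def wiener_deconvolve(image, psf, nsr=1e-2):
    # Frequency-domain Wiener filter: F = conj(H) G / (|H|^2 + nsr),
    # where nsr is an assumed constant noise-to-signal power ratio.
    H = _psf_otf(psf, image.shape)
    G = np.fft.fftn(image)
    F = np.conj(H) * G / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifftn(F))
```

A larger `nsr` damps the high frequencies more strongly, trading resolution for noise suppression; the other linear methods differ mainly in how this damping term is chosen.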

Space Invariant - Iterative Algorithms

van Cittert #: No noise assumption. A single convolution is required per iteration.

Jansson-van Cittert #: This is a constrained version of the van Cittert algorithm. The pixel values are constrained in a range of values.
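The two variants above can be sketched in a few lines of NumPy (an illustration, not ITK code; the function signature is an assumption of this example):

```python
import numpy as np
from scipy.signal import fftconvolve

def van_cittert(image, psf, iterations=20, alpha=1.0, bounds=None):
    # f_{k+1} = f_k + alpha * (g - h * f_k): one convolution per
    # iteration. Passing bounds=(low, high) yields the Jansson-van
    # Cittert variant, which clips the estimate to a range of values.
    estimate = image.copy()
    for _ in range(iterations):
        residual = image - fftconvolve(estimate, psf, mode="same")
        estimate = estimate + alpha * residual
        if bounds is not None:
            estimate = np.clip(estimate, bounds[0], bounds[1])
    return estimate
```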

Landweber #: The noise is assumed to follow an additive Gaussian model. The algorithm requires two convolutions per iteration and the step size is fixed, which makes it converge slowly. It can optionally be constrained to avoid negative values, as expected in fluorescence microscopy.
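A minimal NumPy sketch of the Landweber iteration (an illustration, not ITK code), showing where the two convolutions per iteration come from - one with the PSF and one with its adjoint (the flipped PSF):

```python
import numpy as np
from scipy.signal import fftconvolve

def landweber(image, psf, iterations=20, alpha=1.0, nonneg=True):
    # f_{k+1} = f_k + alpha * h^T * (g - h * f_k): two convolutions
    # per iteration (the PSF and its adjoint, i.e. the flipped PSF).
    psf_adj = psf[tuple(slice(None, None, -1) for _ in psf.shape)]
    estimate = image.copy()
    for _ in range(iterations):
        residual = image - fftconvolve(estimate, psf, mode="same")
        estimate = estimate + alpha * fftconvolve(residual, psf_adj, mode="same")
        if nonneg:
            # Optional constraint: fluorescence intensities are non-negative.
            estimate = np.clip(estimate, 0, None)
    return estimate
```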

Richardson-Lucy #: The noise is assumed to follow a Poisson model. The algorithm requires two convolutions per iteration and the step size is fixed, which makes it converge slowly. The non-negativity property is ensured without additional constraint. This algorithm is also known as Maximum Likelihood Expectation Maximization (ML-EM).
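The multiplicative update that gives Richardson-Lucy its built-in non-negativity can be sketched as follows (an illustration, not ITK code; the flat initial estimate is a common but arbitrary choice):

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=20, eps=1e-12):
    # Multiplicative ML-EM update for Poisson noise:
    # f_{k+1} = f_k * ( h^T * ( g / (h * f_k) ) )
    # Two convolutions per iteration; starting from a positive estimate,
    # every factor is non-negative, so no extra constraint is needed.
    psf_adj = psf[tuple(slice(None, None, -1) for _ in psf.shape)]
    estimate = np.full_like(image, image.mean())
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, eps)
        estimate = estimate * fftconvolve(ratio, psf_adj, mode="same")
    return estimate
```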

Damped Richardson-Lucy #: This is a simple regularized version of the Richardson-Lucy algorithm. It is widely used in astronomy.

Tikhonov-Miller regularized Richardson-Lucy #: This is a regularized version of the Richardson-Lucy algorithm using the usual Tikhonov-Miller penalization.

Total Variation regularized Richardson-Lucy: This is a regularized version of the Richardson-Lucy algorithm which also minimizes the Total Variation of the image to limit noise amplification. Ref: Dey et al. 3D Microscopy Deconvolution using Richardson-Lucy Algorithm with Total Variation Regularization. hal.archives-ouvertes.fr (2004)
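A NumPy sketch of the TV-regularized update (an illustration under the usual formulation with a single regularization weight `lam`; the divergence is computed with simple central differences, which is only one possible discretization):

```python
import numpy as np
from scipy.signal import fftconvolve

def _tv_divergence(f, eps=1e-8):
    # div( grad f / |grad f| ), the Total Variation gradient term,
    # discretized with central differences.
    grads = np.gradient(f)
    norm = np.sqrt(sum(g ** 2 for g in grads)) + eps
    return sum(np.gradient(g / norm, axis=i) for i, g in enumerate(grads))

def tv_richardson_lucy(image, psf, iterations=20, lam=0.002, eps=1e-12):
    # Richardson-Lucy with Total Variation regularization:
    # f_{k+1} = f_k / (1 - lam * div(grad f / |grad f|)) * h^T(g / (h * f_k))
    # With lam = 0 this reduces to the plain Richardson-Lucy iteration.
    psf_adj = psf[tuple(slice(None, None, -1) for _ in psf.shape)]
    estimate = np.full_like(image, image.mean())
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, eps)
        correction = fftconvolve(ratio, psf_adj, mode="same")
        estimate = estimate * correction / (1.0 - lam * _tv_divergence(estimate))
    return estimate
```

The weight `lam` balances data fidelity against smoothness; choosing it is exactly the parameter estimation problem discussed below.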

Maximum Entropy regularized Richardson-Lucy #: This is a regularized version of the Richardson-Lucy algorithm, obtained by imposing a maximum entropy constraint on the unblurred image.

Good regularized Richardson-Lucy: This is a regularized Richardson-Lucy algorithm which also minimizes Good's roughness.

Iterative Constrained Tikhonov-Miller #: The noise is assumed to follow an additive Gaussian model, and the process is regularized with the usual penalization. Non-negativity is enforced at each iteration and convergence is accelerated with a conjugate gradient method.

Verveer's Maximum A Posteriori (MAPGG, MAPGE, MAPPG, MAPPE, MAPGR and MAPPR): The noise is assumed to follow either a Gaussian or a Poisson model. Three different priors can be imposed on the unblurred image: a Gaussian model, a maximum entropy, or Good's roughness. The process is accelerated with a Newton acceleration.

Markov Random Field MAP: The noise is assumed to follow a Poisson model. The unblurred image is assumed to be a realization of a Markov random field. The problem is solved with a split gradient technique, which is known to converge slowly, but the result quality is enhanced compared to other state-of-the-art algorithms.

Generalized Vector-Valued Total Variation: The noise is assumed to follow an additive Gaussian model. The process also minimizes the total variation and is accelerated. The code is available for MATLAB with a BSD license.

Space variant PSF

Depth-variant ML-EM

PSF

The PSF is required for all the non-blind deconvolution methods. It can be obtained either by measuring it with beads or by computing it with a model. Both approaches have their advantages and drawbacks, and both should be available to the final user.

PSF simulation
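As a starting point, the simplest simulated PSF is a separable Gaussian (a rough approximation only; rigorous simulations use a diffraction model such as Gibson-Lanni, which this sketch does not implement; the function name is an assumption of this example):

```python
import numpy as np

def gaussian_psf(shape, sigma):
    # N-dimensional Gaussian PSF, centered and normalized to unit sum.
    # `sigma` may be a scalar or one value per axis (e.g. larger along z
    # to mimic the axial elongation of a microscope PSF).
    grids = np.meshgrid(*[np.arange(s) - s // 2 for s in shape], indexing="ij")
    sigma = np.broadcast_to(np.asarray(sigma, dtype=float), (len(shape),))
    r2 = sum((g / s) ** 2 for g, s in zip(grids, sigma))
    psf = np.exp(-0.5 * r2)
    return psf / psf.sum()
```

An anisotropic call such as `gaussian_psf((33, 33, 65), (1.5, 1.5, 4.0))` mimics the z-elongation typical of wide-field optics.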

Measurement / evaluation of the PSF

Regularization Parameters Estimation

Generalized Cross Validation (GCV): A good estimate of the Tikhonov-Miller regularization parameter can be computed with the generalized cross validation method.

Usability

On the implementation side, several goals should be reached.

observability #: The iterative deconvolution algorithms are usually quite long to run - several minutes or tens of minutes on today's computers. As a consequence, it is important that the process can be observed by the user, so the intermediate results can be validated visually. The various measurements made should also be available during the process. If the result is not the expected one, or if the expected result is reached sooner than expected, the user should be able to interrupt the process cleanly.

continue a completed process to increase the number of iterations #: The iterative filter should provide a mechanism to restart a completed deconvolution in order to increase the number of iterations.

preconditioning #: Preconditioning is a usual technique where the blurred image and/or the PSF are transformed at the beginning of the deconvolution in order to improve the deconvolution quality or to decrease the number of iterations needed to reach convergence. The usual preconditioning methods include Gaussian blurring, Wiener filtering, and various denoising methods (mostly wavelet based).
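A minimal sketch of the Gaussian-blurring variant (an illustration, not ITK code; the `sigma` default is arbitrary). Since smoothing the observed image is equivalent to smoothing the effective PSF, the same kernel is applied to both so the pair passed to the deconvolution stays consistent:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def precondition(image, psf, sigma=0.5):
    # Light Gaussian denoising applied before deconvolution. Blurring
    # the image by s turns g = h * f into s * g = (s * h) * f, so the
    # PSF must be smoothed with the same kernel and re-normalized.
    smoothed_image = gaussian_filter(image, sigma)
    smoothed_psf = gaussian_filter(psf, sigma)
    return smoothed_image, smoothed_psf / smoothed_psf.sum()
```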

multithreaded calculators: ITK does not currently provide any base class to easily implement multithreading in the calculators. We propose to implement such a base class and to use it in the calculators required in the other tasks.

FFTW wisdom integration #: FFTW provides a mechanism to store the optimized plans on disk. We propose to use this capability in the ITK FFTW filters and to make its usage transparent for the user.

Objective

The objective of the project week is to share the upcoming filters with the community, to avoid multiple groups working on the same algorithms. Interested persons can contact Gaetan or Alex directly.