Conditions:
The Bachelor Semester Projects and Master Semester Projects are reserved for
regular EPFL students or for students enrolled in an official mobility program.

Project Proposals

All Projects

Number of projects: 9

An off-the-grid algorithm in ImageJ for 3D single-molecule localization microscopy

Master Semester Project: Reserved

A new class of algorithms has recently emerged in the literature for the recovery of point-source signals from altered and noisy measurements. These methods perform the reconstruction without requiring any discretization of the domain, by solving an infinite-dimensional optimization problem. They interleave convex optimization updates (corresponding to adding a new point source) with non-convex optimization steps (corresponding to changing the intensities and positions of the point sources).

Single-molecule localization microscopy is an imaging technique in fluorescence microscopy that bypasses the diffraction limit and reaches nanoscale resolution for the imaging of sub-cellular structures in cells (e.g., microtubules). High-performance numerical solvers are needed to locate the positions of the fluorescent molecules precisely. The aforementioned class of algorithms currently obtains state-of-the-art results in this application.

The goal of this project is to implement one of these methods, called the Sliding Frank-Wolfe algorithm, in Java as an ImageJ/Fiji plugin so that it can be used by biologists. The plugin will include a user interface, permit the automatic processing of large datasets, and output super-resolved images. This project is a continuation of the semester project of Amandine Evard; the code and material are available. The idea is to add new features to the algorithm and the plugin (spline interpolation of the point-spread function, tracking of molecules in time, new modalities, ...).
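To give a feel for the algorithmic structure (not the actual plugin, which is to be written in Java), here is a simplified Python sketch of the greedy loop, assuming a 1D Gaussian point-spread function and replacing the non-convex sliding step by a search on a fine candidate grid; all names and parameter values are illustrative:

```python
import numpy as np

def psf(x, t, sigma=0.05):
    # Toy Gaussian point-spread function centred at t
    return np.exp(-(x - t) ** 2 / (2 * sigma ** 2))

def sliding_frank_wolfe(y, x, n_iter=3, sigma=0.05):
    """Simplified sketch of the Sliding Frank-Wolfe loop:
    greedily add one point source, then re-fit all amplitudes.
    (The true algorithm also slides the positions non-convexly.)"""
    positions, amps = [], []
    fine = np.linspace(x.min(), x.max(), 2001)   # candidate positions
    for _ in range(n_iter):
        resid = y - sum(a * psf(x, t, sigma) for a, t in zip(amps, positions))
        # Frank-Wolfe step: new source where the residual correlates most with the PSF
        corr = np.array([np.dot(resid, psf(x, t, sigma)) for t in fine])
        positions.append(fine[np.argmax(np.abs(corr))])
        # Convex step: refit all amplitudes by least squares
        A = np.stack([psf(x, t, sigma) for t in positions], axis=1)
        amps = list(np.linalg.lstsq(A, y, rcond=None)[0])
    return np.array(positions), np.array(amps)

# Toy example: two well-separated molecules
x = np.linspace(0, 1, 200)
y = 1.0 * psf(x, 0.3) + 0.5 * psf(x, 0.7)
pos, amp = sliding_frank_wolfe(y, x, n_iter=2)
```

In this noiseless toy setting, the two recovered positions land near 0.3 and 0.7 after two greedy iterations.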

Single-particle cryo-electron microscopy (cryo-EM) is a Nobel Prize-winning technology that aims to characterize the 3D structure of proteins at the atomic level. The electron microscope first images numerous (~100k) replicates of a protein, positioned at various orientations. Algorithms then reconstruct a high-resolution 3D structure from the acquired images.
The main challenge in cryo-EM reconstruction, compared to traditional tomographic setups, is that the angles at which the images were taken are unknown. Another challenge is that the images are extremely noisy and blurred. The sheer number of images per protein (~100k), as well as the number of imaged proteins (~4k), should however enable a data-driven approach to overcome these challenges.
Project goal: Design a neural network to estimate the angular relation between images of a protein. The developed neural network will be trained and tested on simulated and real data.
Prerequisites: Experience with Python programming. Experience with (Deep) Machine Learning (with any framework) is desirable. No experience in biology is required. Experience in imaging is a plus.
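To fix ideas, simulated training pairs could be generated along these lines. This is a hedged numpy sketch with toy in-plane rotations only (real cryo-EM involves full 3D orientations, noise, and blur); all names are illustrative:

```python
import numpy as np

def random_rotation_z(rng):
    """Toy in-plane rotation about the z-axis (real cryo-EM uses full SO(3))."""
    theta = rng.uniform(0, 2 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]]), theta

def project(points, R, size=32):
    """Rotate a 3D point cloud and project it onto the xy-plane as a 2D histogram."""
    xy = (points @ R.T)[:, :2]
    img, _, _ = np.histogram2d(xy[:, 0], xy[:, 1],
                               bins=size, range=[[-1, 1], [-1, 1]])
    return img

rng = np.random.default_rng(0)
protein = rng.normal(scale=0.3, size=(500, 3))   # stand-in for atomic coordinates
R1, t1 = random_rotation_z(rng)
R2, t2 = random_rotation_z(rng)
pair = (project(protein, R1), project(protein, R2))
label = (t2 - t1) % (2 * np.pi)   # relative angle the network should predict
```

A network would then take `pair` as input and regress `label`, trained on many such samples.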

In an inverse problem, the objective is to reconstruct a signal from a set of measurements. This is typically achieved by solving an optimization problem with a data-fidelity term that enforces the consistency between the reconstructed signal and the measured data. When the problem is ill-posed, a common technique is to add a regularization term that is based on our prior knowledge of the form of the signal. A regularization parameter then balances the weight between the data-fidelity and regularization terms. The choice of this parameter is crucial, and it is typically hard to tune. Hence, homotopy methods aim to solve the optimization problem for all possible values of the regularization parameter, so that the user can then choose a suitable one. The goal of this project is to investigate such methods for some specific discrete inverse problems, starting with a literature review, and to implement an algorithm in practice. As the project is somewhat exploratory, the student should be able to take initiative and to work autonomously. He or she should also have a strong mathematical interest, particularly in the field of optimization.
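For the special case of a quadratic data-fidelity term, an L1 regularizer, and a trivial (identity) forward model, the minimizer has a closed form (soft-thresholding), and the solution path is piecewise linear in the regularization parameter; this is the structure that homotopy methods exploit to compute the entire path from a few breakpoints. A minimal illustrative sketch:

```python
import numpy as np

def soft_threshold(y, lam):
    # Closed-form minimizer of 0.5*(x - y)^2 + lam*|x|, applied elementwise
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Sweep the regularization parameter: each coefficient shrinks linearly
# in lam until it hits zero, so the whole path is piecewise linear.
y = np.array([3.0, -1.5, 0.5])
lams = np.linspace(0.0, 3.5, 8)
path = np.stack([soft_threshold(y, lam) for lam in lams])  # one row per lam
```

At `lam = 0` the data is reproduced exactly; for `lam` above the largest magnitude in `y`, the solution is identically zero.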

Computed tomography (CT) is a method for creating a 3D image of a sample. It proceeds by computationally combining many projection images (e.g., the X-ray image a doctor might take of a broken bone) taken from different angles. In some applications, a full 180° view of the object is not feasible. In the last decade, iterative procedures that exploit sparsity have been used for limited-angle CT reconstruction.
In this project, the student will apply a dictionary-learning algorithm for limited-angle CT to exploit redundancies in the measurements.
Good Matlab skills are required for this project.
For further details, don't hesitate to send an email.
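For intuition, the limited-angle forward model can be sketched as follows. This is a toy numpy version with nearest-neighbour parallel-beam projections (the project itself is expected in Matlab); all names and sizes are illustrative:

```python
import numpy as np

def parallel_projection(img, theta):
    """Line integrals of `img` along direction theta (nearest-neighbour binning)."""
    n = img.shape[0]
    ys, xs = np.mgrid[0:n, 0:n] - (n - 1) / 2.0
    # Signed distance of each pixel centre to the detector axis
    t = xs * np.cos(theta) + ys * np.sin(theta)
    bins = np.clip(np.round(t + (n - 1) / 2.0).astype(int), 0, n - 1)
    return np.bincount(bins.ravel(), weights=img.ravel(), minlength=n)

# Limited-angle sinogram: only 120 of the full 180 degrees are observed
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0                     # toy square phantom
angles = np.deg2rad(np.linspace(0, 120, 60, endpoint=False))
sinogram = np.stack([parallel_projection(img, a) for a in angles])
```

The missing angular wedge is exactly what makes the inverse problem ill-posed and motivates sparsity-based (e.g., dictionary-learning) priors.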

Poisson processes are used to model sparse piecewise-smooth signals. They are pure jump processes characterized by the law of the jumps and the average density of knots. The properties of the law of the jumps are intimately linked with the asymptotic behavior of the process. The goal of this project is to link the decay rate of the jump probabilities with the (almost-sure) inclusion of the process in the space of tempered distributions. This has recently been shown using advanced mathematical tools. Here, the aim is to find an elementary proof by studying the existence of moments of the law of the jumps. The student should have a solid understanding of functional analysis and probability theory.
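Such a process is easy to simulate, which helps build intuition: knot locations follow a Poisson law, and jump sizes are i.i.d. samples from the jump law. The sketch below is illustrative (names and defaults are assumptions); heavy-tailed choices of `jump_sampler` are precisely the cases where inclusion in the tempered distributions becomes delicate:

```python
import numpy as np

def compound_poisson(T=10.0, rate=2.0, jump_sampler=None, rng=None):
    """Sample one path of a piecewise-constant jump process on [0, T]:
    knot locations are Poisson(rate) distributed, jump sizes are i.i.d."""
    rng = rng or np.random.default_rng(0)
    jump_sampler = jump_sampler or (lambda n: rng.standard_normal(n))
    n_jumps = rng.poisson(rate * T)
    knots = np.sort(rng.uniform(0, T, n_jumps))   # jump locations
    jumps = jump_sampler(n_jumps)                 # samples from the law of the jumps
    return knots, np.cumsum(jumps)                # value after each knot

knots, values = compound_poisson()
```

Swapping `jump_sampler` for a heavy-tailed law (e.g., one without finite moments) changes the growth of the path, which is the phenomenon the project studies.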

Continuous-domain signals can be reconstructed from their discrete measurements using variational approaches. The reconstructed signal is then defined as a minimizer of a functional, which is composed of a convex combination of two terms, namely data fidelity (quadratic) and regularization. We have recently proved that, when using the L1 norm for the regularization, this reconstructed signal is sparse in the continuous domain. The sparsity is enforced via a regularization operator: for example, the derivative leads to piecewise-constant solutions (total variation), while the second derivative yields piecewise-linear solutions. The goal of this project is to investigate novel regularization operators, namely fractional derivatives, which allow one to vary continuously, e.g., between the first and the second derivative. The work also includes the design and implementation of an algorithm for signal reconstruction. The student should have strong mathematical interests (particularly in functional analysis) and a basic knowledge of optimization theory.
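As an illustration of the first-derivative case (total variation), the discrete denoising problem can be solved by projected gradient ascent on the dual; the toy sketch below shows the piecewise-constant nature of the solutions. The implementation details are illustrative assumptions, not the project's prescribed algorithm:

```python
import numpy as np

def tv_denoise_1d(y, lam, n_iter=1000):
    """Solve min_x 0.5*||x - y||^2 + lam*||Dx||_1 (D = finite differences)
    by projected gradient ascent on the dual variable p, with |p_i| <= lam."""
    p = np.zeros(len(y) - 1)
    for _ in range(n_iter):
        Dtp = np.concatenate(([-p[0]], p[:-1] - p[1:], [p[-1]]))  # D^T p
        x = y - Dtp                                # primal iterate
        p = np.clip(p + 0.25 * np.diff(x), -lam, lam)  # step 0.25 <= 1/||D||^2
    return y - np.concatenate(([-p[0]], p[:-1] - p[1:], [p[-1]]))

# Noisy step signal: the TV solution flattens the noise but keeps the jump
rng = np.random.default_rng(0)
y = np.concatenate([np.zeros(50), np.ones(50)]) + 0.05 * rng.standard_normal(100)
x = tv_denoise_1d(y, lam=1.0)
```

The output is essentially two plateaus separated by a single jump, which is the hallmark of total-variation regularization; fractional-order operators would interpolate between this behavior and piecewise-linear solutions.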

Broadly speaking, machine learning algorithms aim to find an input-output relationship based on a (usually large) set of training examples. This is typically achieved by solving an optimization problem with a regularization term that reduces overfitting. In particular, sparsity-promoting regularizers lead to model simplifications and make the handling of large training datasets easier. We recently developed a theory for sparse learning that characterizes the form of the solution to this optimization problem in the continuous domain (i.e., the learned output is a parametric function). Based on these theoretical results, a grid-based discretization scheme has been proposed and implemented. The goal of this project is to study the convergence of these discretized problems in relation to the underlying continuous-domain problem in different setups. The ideal outcome would be to obtain bounds on the error rate as the grid gets finer. The student should have a solid understanding of functional analysis and convex optimization.
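The flavor of the grid-based discretization can be conveyed with a toy problem: atoms centred on a grid, an L1 penalty on the coefficients, and ISTA as the solver; as the grid is refined, the solution stays sparse. All choices below (atoms, sizes, parameters) are illustrative assumptions:

```python
import numpy as np

def ista_lasso(A, y, lam, n_iter=2000):
    """ISTA for min_c 0.5*||A c - y||^2 + lam*||c||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    c = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = c - (A.T @ (A @ c - y)) / L    # gradient step on the quadratic term
        c = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return c

# Grid discretization of a continuous-domain problem: atoms are Gaussians
# centred on an increasingly fine grid of the interval [0, 1].
x = np.linspace(0, 1, 50)
y = np.exp(-(x - 0.4) ** 2 / 0.005)        # data generated by a single off-grid atom
for n_grid in (10, 20, 40):
    centers = np.linspace(0, 1, n_grid)
    A = np.exp(-(x[:, None] - centers[None, :]) ** 2 / 0.005)
    c = ista_lasso(A, y, lam=0.1)          # sparse coefficients on this grid
```

The interesting question, studied in the project, is how (and how fast) these discrete solutions approach the continuous-domain one as the grid spacing shrinks.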

The theory of reproducing kernel Hilbert spaces includes a large class of positive-definite kernels that can be used in different machine-learning problems. Among the many choices, Gaussian kernels are the most popular and classical ones. Recently, we have developed a theory for continuous-domain sparse learning using an adaptive kernel expansion. However, this theory is currently incompatible with Gaussian kernels. The goal of this project is to extend our theory to cover the Gaussian kernels. It requires a study beyond tempered distributions in order to consider Gaussian kernels as the Green's functions of suitable operators. The student should have a solid understanding of functional analysis and distribution theory, and a basic knowledge of convex optimization.
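As background, the classical (non-adaptive) use of a Gaussian kernel follows the representer theorem: the RKHS solution of a ridge problem is a finite kernel expansion around the data points. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

def gaussian_kernel(x1, x2, sigma=0.2):
    # Positive-definite Gaussian kernel matrix k(x1_i, x2_j)
    return np.exp(-(x1[:, None] - x2[None, :]) ** 2 / (2 * sigma ** 2))

def kernel_ridge_fit(x, y, lam=1e-3, sigma=0.2):
    """Representer theorem: the solution is f(t) = sum_i a_i k(t, x_i),
    with coefficients a = (K + lam*I)^{-1} y for the ridge problem."""
    K = gaussian_kernel(x, x, sigma)
    return np.linalg.solve(K + lam * np.eye(len(x)), y)

x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x)
a = kernel_ridge_fit(x, y)
f = gaussian_kernel(x, x) @ a      # fitted values at the training points
```

The adaptive sparse-learning theory mentioned above goes beyond this fixed expansion, and extending it to Gaussian kernels is precisely the open point of the project.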

Blebbing is a very dynamic phenomenon that plays an important role during apoptosis, cell migration, and cell division. Using time-lapse microscopy techniques (phase contrast and fluorescence), biologists can observe blebs, which are spherical protrusions that appear and disappear on the membrane of the cell.
The goal of the project is to design and implement image-analysis algorithms, based on active contours and curve optimization, that take blebbing into account. It requires an automatic segmentation of the cell over the multichannel sequence of images and a local extraction of the bulges to quantify blebbing. The project will be implemented in Java as an ImageJ plugin with a user interface that allows manual editing of the bleb outlines.