Abstract: Since upcoming telescopes will observe thousands of strong lensing systems,
creating fully automated analysis pipelines for these images becomes
increasingly important. In this work, we take a step in that direction by
developing the first end-to-end differentiable strong lensing pipeline. Our
approach leverages and combines three important computer science developments:
(a) convolutional neural networks, (b) efficient gradient-based sampling
techniques, and (c) deep probabilistic programming languages. The latter
automate parameter inference and enable the combination of generative deep
neural networks and physics components in a single model. Here, we
demonstrate that it is possible to combine a convolutional neural network
trained on galaxy images as a source model with a fully-differentiable and
exact implementation of gravitational lensing physics in a single probabilistic
model. This does away with hyperparameter tuning for the source model, enables
the simultaneous optimization of nearly one hundred source and lens parameters
with gradient-based methods, and allows the use of efficient gradient-based
posterior sampling techniques. These features make this automated inference
pipeline potentially suitable for processing large volumes of data. By
analyzing mock lensing systems with different signal-to-noise ratios, we show
that lensing parameters are reconstructed with percent-level accuracy. More
generally, we consider this work one of the first steps toward establishing
differentiable probabilistic programming techniques in the particle
astrophysics community, which have the potential to significantly accelerate
and improve many complex data analysis tasks.
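To make the core idea concrete, the following is a minimal sketch (not the paper's actual pipeline) of a differentiable lensing forward model written in JAX. A singular isothermal sphere (SIS) lens maps image-plane coordinates to the source plane via the lens equation, and a simple Gaussian blob stands in for the paper's neural-network source model. Because the whole forward model is differentiable, gradients of the pixel-level loss with respect to all lens and source parameters simultaneously are available through `jax.grad`; this is the property that enables the joint gradient-based optimization and sampling described above. All names and parameter choices here are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def lensed_image(params, x, y):
    """Differentiable forward model: SIS lens + Gaussian source."""
    theta_e, lx, ly, sx, sy, sigma = params
    # SIS deflection: alpha = theta_E * (theta - theta_lens) / |theta - theta_lens|
    dx, dy = x - lx, y - ly
    r = jnp.sqrt(dx**2 + dy**2) + 1e-8  # epsilon avoids division by zero
    # Lens equation: source-plane coordinates beta = theta - alpha(theta)
    bx = x - theta_e * dx / r
    by = y - theta_e * dy / r
    # Evaluate the (stand-in) source brightness at the ray-traced positions
    return jnp.exp(-((bx - sx)**2 + (by - sy)**2) / (2.0 * sigma**2))

# Image-plane pixel grid
xs = jnp.linspace(-2.0, 2.0, 64)
x, y = jnp.meshgrid(xs, xs)

# Mock "observed" data generated from known parameters
true = jnp.array([1.0, 0.0, 0.0, 0.1, 0.0, 0.2])
data = lensed_image(true, x, y)

def loss(params):
    return jnp.mean((lensed_image(params, x, y) - data) ** 2)

# Gradient of the loss with respect to all six lens + source
# parameters at once, obtained by automatic differentiation
init = jnp.array([0.8, 0.1, -0.1, 0.0, 0.1, 0.3])
g = jax.grad(loss)(init)
```

In the paper's setting, the Gaussian source would be replaced by a trained generative network and the point estimate by gradient-based posterior sampling (e.g. Hamiltonian Monte Carlo), but the differentiability requirement is the same.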