Total Generalized Variation regularization for X-ray tomography – Experimental Data

Summary: Tomography means reconstructing the internal structure of a physical body from X-ray images of the body taken from different directions. Mathematically, the problem is to recover a non-negative function f(x) from a collection of line integrals of f. Here we show how to solve the tomography problem using Total Generalized Variation (TGV) regularization on experimental data. TGV is an edge-preserving reconstruction method for ill-posed inverse problems, introduced in 2010 by Bredies, Kunisch, and Pock [2]. While Total Variation favors sharp, piecewise constant reconstructions, the strength of TGV lies in its awareness of higher-order smoothness.

Authors: Kristian Bredies, Tatiana Bubba and Samuli Siltanen.

Software: Tested on MATLAB R2018a.
Data: The open data set lotusroot.
Download package as zip file: CodesTGV

Literature: Kristian Bredies, Recovering piecewise smooth multichannel images by minimization of convex functionals with Total Generalized Variation penalty, Lecture Notes in Computer Science 8293, 44–77 (2014).

Please read also the posts Total variation regularization for X-ray tomography and Total variation regularization for X-ray tomography – Experimental data for an introduction to TV on simulated and experimental data. A summary of the experimental data can be found on arXiv.

The usual Total Variation (TV) seminorm [1] is defined as

$$\mathrm{TV}(f) = \sup\left\{ \int_\Omega f \,\mathrm{div}\, v \, dx \;:\; v \in C_c^1(\Omega, \mathbb{R}^d),\ \|v\|_\infty \le 1 \right\},$$

which for smooth f equals $\int_\Omega |\nabla f(x)|\,dx$.
TV is usually a suitable model for images with edges. However, natural images are often piecewise smooth rather than piecewise constant, due to shading, for instance. To overcome this limitation of TV, one can consider higher-order derivatives, for example

$$\mathrm{TV}^2(f) = \sup\left\{ \int_\Omega f \,\mathrm{div}^2 v \, dx \;:\; v \in C_c^2(\Omega, S^{d\times d}),\ \|v\|_\infty \le 1 \right\},$$

where $S^{d\times d}$ denotes the set of symmetric $d\times d$ matrices. The limitation here is that solutions of variational problems with a TV² penalty cannot have jump discontinuities, so object boundaries inevitably become blurry.

The idea behind Total Generalized Variation (TGV) [2] is to incorporate smoothness information on different scales by combining first- and second-order derivatives:

$$\mathrm{TGV}_{\alpha,\beta}^2(f) = \min_{w}\; \alpha \int_\Omega |\nabla f - w|\,dx + \alpha\beta \int_\Omega |\mathcal{E}w|\,dx,$$

where the minimum is taken over vector fields $w$ and $\mathcal{E}w = \tfrac{1}{2}(\nabla w + \nabla w^{T})$ denotes the symmetrized gradient.
In this way, TGV can measure smooth regions as well as jump discontinuities in a convex manner.

In the following we compare the reconstructions obtained by minimizing the functional

$$\frac{1}{2}\|Af - m\|_2^2 + \mathrm{TGV}_{\alpha,\beta}^2(f) + \iota_{\ge 0}(f)$$

for different values of the parameters α and β. Here, A is the measurement matrix, m the experimental data, f a discretization of the object to be reconstructed, α the regularization parameter, and $\iota_{\ge 0}$ the indicator function of the nonnegative orthant. The discretization of TGV is carried out by finite differences, as described in detail in [3].
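
To make this concrete, here is a minimal Matlab sketch of how the discrete TGV² expression can be evaluated with forward differences and zero boundary extension, in the spirit of [3]. The helper tgv2_value and its conventions are illustrative assumptions, not code from the downloadable package; note that the TGV value itself is the minimum of this expression over (W1, W2), a minimization that the reconstruction algorithm performs jointly with f.

```matlab
function val = tgv2_value(F, W1, W2, alpha, beta)
% Evaluate alpha*||grad F - W||_1 + alpha*beta*||E(W)||_1 for an image F
% and a two-component vector field (W1, W2), using forward differences
% with zero extension at the boundary. Illustrative sketch only.
    Fx = [diff(F,1,2), zeros(size(F,1),1)];   % d/dx, zero at right boundary
    Fy = [diff(F,1,1); zeros(1,size(F,2))];   % d/dy, zero at bottom boundary
    t1 = sum(sum(sqrt((Fx - W1).^2 + (Fy - W2).^2)));  % first-order term
    % Symmetrized gradient E(W): diagonal entries W1x, W2y and
    % off-diagonal entry (W1y + W2x)/2.
    W1x = [diff(W1,1,2), zeros(size(W1,1),1)];
    W1y = [diff(W1,1,1); zeros(1,size(W1,2))];
    W2x = [diff(W2,1,2), zeros(size(W2,1),1)];
    W2y = [diff(W2,1,1); zeros(1,size(W2,2))];
    off = (W1y + W2x)/2;
    % Pointwise Frobenius norm of the symmetric 2x2 matrix E(W).
    t2 = sum(sum(sqrt(W1x.^2 + W2y.^2 + 2*off.^2)));
    val = alpha*t1 + alpha*beta*t2;
end
```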

For the solution of the minimization problems above, any numerical algorithm for convex-concave saddle-point problems can be used. The algorithm implemented in the downloadable package above is a primal-dual ascent-descent method with primal extragradient, as described in the famous 2011 paper by Chambolle and Pock [4]. The bulk of the code is in tomo_tgv.m. The script recon_tgv.m sets up the experiment: it loads the data and sets the values of the different parameters (α, β, and the maximum number of iterations).
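
The full TGV iteration in tomo_tgv.m carries both the image f and the auxiliary field w; for readability, the following Matlab sketch shows the same primal-dual scheme [4] applied to the simpler TV-regularized problem min_f ½‖Af − m‖² + α‖Df‖₂,₁ + ι≥0(f). The helper finite_diff_matrix and all variable names are assumptions for illustration, not the package's interface.

```matlab
% Chambolle-Pock sketch for TV tomography; step sizes satisfy sigma*tau*L^2 <= 1.
D = finite_diff_matrix(n);        % hypothetical sparse forward-difference operator
K = [A; D];                       % stacked operator of the saddle-point problem
L = normest(K);                   % estimate of the operator norm
tau = 1/L;  sigma = 1/L;
f = zeros(n^2,1);  fbar = f;
y1 = zeros(size(A,1),1);          % dual variable for the data-fidelity term
y2 = zeros(2*n^2,1);              % dual variable for the gradient term
for k = 1:maxiter
    % Dual ascent: proximal maps of the convex conjugates.
    y1 = (y1 + sigma*(A*fbar - m))/(1 + sigma);
    y2 = y2 + sigma*(D*fbar);
    mag = sqrt(y2(1:n^2).^2 + y2(n^2+1:end).^2);   % pointwise magnitude
    y2 = y2 ./ max(1, [mag; mag]/alpha);           % project onto |y2| <= alpha
    % Primal descent: proximal map of G, i.e. projection onto f >= 0.
    fold = f;
    f = max(f - tau*(A'*y1 + D'*y2), 0);
    % Primal extragradient (over-relaxation) step.
    fbar = 2*f - fold;
end
```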

In the following computational examples we use the 256×256 lotus root dataset obtainable from here (documentation in arXiv). This dataset uses 120 projection directions. We set the maximum number of iterations to 10000 and β = 2, and we vary the regularization parameter α in the range {10⁻⁴, 10⁻⁵, 10⁻⁶} (see Figure 1). As always, the value of the regularization parameter is chosen to balance the data mismatch term against the prior information encoded by the TGV penalty.
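
In recon_tgv.m this amounts to a few lines of setup. The sketch below shows the general shape of such a script; the data file name, the variable names, and the tomo_tgv call signature are assumptions for illustration and not necessarily those of the package.

```matlab
% Illustrative setup in the spirit of recon_tgv.m; names are assumptions.
load LotusData.mat A m            % measurement matrix and sinogram (assumed names)
alpha   = 1e-5;                   % regularization parameter
beta    = 2;                      % ratio of second- to first-order TGV weights
maxiter = 10000;                  % maximum number of primal-dual iterations
f = tomo_tgv(A, m, alpha, beta, maxiter);    % assumed call signature
imagesc(reshape(f, 256, 256)), axis image off, colormap gray
```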

Figure 1: Reconstructions with 10000 iterations and β = 2.

We can also play around with the value of β, keeping the regularization parameter α fixed (and the maximum number of iterations equal to 10000), to see how this affects the reconstructions. For instance, we can fix α = 10⁻⁵ and vary β in {1.5, 2, 2.5} (see Figure 2).

Figure 2: True object and reconstructions with 10000 iterations and α = 10⁻⁵ for varying β.

References

[1] L. Rudin, S. Osher, and E. Fatemi, Nonlinear total variation based noise removal algorithms, Physica D 60, 259–268 (1992).

[2] K. Bredies, K. Kunisch, and T. Pock, Total Generalized Variation, SIAM Journal on Imaging Sciences 3, 492–526 (2010).

[3] K. Bredies, Recovering piecewise smooth multichannel images by minimization of convex functionals with Total Generalized Variation penalty, Lecture Notes in Computer Science 8293, 44–77 (2014).

[4] A. Chambolle and T. Pock, A first-order primal-dual algorithm for convex problems with applications to imaging, Journal of Mathematical Imaging and Vision 40, 120–145 (2011).
