The prime goal of this project is the development of mathematical analysis and numerical algorithms
for variational methods for data that are high-dimensional, vector-valued (e.g. color data), sparse
and, in particular, defined on manifolds. Our research on methods for filtering such data is
motivated by, but not restricted to, biological applications. The biological data serve as benchmark
problems for the numerical algorithms to be developed during the course of the project. The
mathematical analysis, however, should be universally applicable.

Our particular biological application is the quantitative analysis of cell motion and division in model
organisms such as the zebrafish during early embryonic development. The available imaging data, recorded
with a 3D fluorescence microscope, are spatio-temporal and of high resolution. An important
characteristic of these data is that all the relevant information, in this case fluorescence-marked cells,
is concentrated in the neighborhood of a manifold: the surface of the embryo's yolk. In addition, the shape
of this surface varies over time.

[1] C. Kirisits, L. F. Lang, and O. Scherzer. Optical flow on evolving surfaces with space and time regularisation. J. Math. Imaging Vision, 52(1):55–70, 2015.

[2] C. Kirisits, L. F. Lang, and O. Scherzer. Decomposition of optical flow on the sphere. GEM. Int. J. Geomath., 5(1):117–141, 2014.

Regularization is an approach to the approximate reconstruction of an unknown functional dependency from
available noisy data. It is typically based on a compromise between fitting the given data and keeping
the complexity of the fitted model low.
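This compromise is classically realized by Tikhonov regularization; as a schematic sketch (the operator $A$, noisy data $y^\delta$, and parameter $\alpha$ are generic placeholders, not tied to a specific problem):

```latex
% Tikhonov regularization: trade off data fit against complexity
x_\alpha^\delta \;=\; \operatorname*{argmin}_{x}
  \ \underbrace{\|A x - y^{\delta}\|^{2}}_{\text{data fit}}
  \;+\; \alpha \underbrace{\|x\|^{2}}_{\text{complexity penalty}},
% where \|y - y^{\delta}\| \le \delta models the noise level and
% \alpha > 0 is the regularization parameter to be chosen.
```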

Starting from pioneering works in the mid-sixties, a huge body of regularization theory has been built around
the issue of choosing the regularization parameter. More recently, the theory has opened a new line of
research on the adaptive choice of the regularization space, where many questions are still open.

[1] Massimo Fornasier, Valeriya Naumova, and Sergei V. Pereverzyev. Parameter Choice Strategies for Multipenalty Regularization. SIAM J. Numer. Anal., Vol. 52(4), 2014, pp. 1770–1794, DOI:10.1137/130930248

[2] Shuai Lu and Sergei V. Pereverzev. Regularization Theory for Ill-posed Problems. Selected Topics. Inverse and Ill-Posed Problems Series (Vol. 58). Berlin, Boston: De Gruyter. 2014

[3] Christian Gerhards, Sergiy Pereverzyev Jr., and Pavlo Tkachenko. A parameter choice strategy for the inversion of multiple observations. Advances in Computational Mathematics, Vol. 43(1), 2017, pp. 101–112

[4] Sergei V. Pereverzyev and Peter Mathé. Complexity of linear ill-posed problems in Hilbert space. Journal of Complexity, Vol. 38, 2017, pp. 50–67

Natural and social phenomena usually emerge from the behavior of complex systems consisting of interacting
components or variables. In practice, we do not have direct access to the “laws” governing the underlying
relationships between them; instead, we are faced with a data set recorded from the possibly interacting
variables.

The learning problem consists of inferring a function that maps between the variables in a predictive fashion,
so that this function can be used to predict results from future inputs. It is well known that regularization
theory can be profitably used in the context of learning theory.

The starting point is a representation of the learning problem as a discretized version of some ill-posed equation
in a hypothesis space. Various regularization methods may then be applied to the corresponding normal equation. This
opens the door to the use of the whole arsenal of regularization methods in the context of learning.
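As a toy illustration of this pipeline, the sketch below regularizes the linear system of a discretized kernel learning problem, i.e. kernel ridge regression. The kernel, the parameter values, and the sample data are illustrative choices, not taken from the cited papers.

```python
# Minimal sketch of regularized learning: solve the regularized
# system (K + lam*n*I) c = y for a kernel model.
# All parameters and data below are illustrative, not from the papers.
import math

def gauss_solve(A, b):
    """Solve A c = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    c = [0.0] * n
    for i in range(n - 1, -1, -1):
        c[i] = (M[i][n] - sum(M[i][j] * c[j] for j in range(i + 1, n))) / M[i][i]
    return c

def kernel(s, t, sigma=0.5):
    """Gaussian kernel defining the hypothesis space (an RKHS)."""
    return math.exp(-(s - t) ** 2 / (2 * sigma ** 2))

def fit(xs, ys, lam=1e-8):
    """Tikhonov-regularized system: (K + lam*n*I) c = y."""
    n = len(xs)
    K = [[kernel(xs[i], xs[j]) + (lam * n if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    return gauss_solve(K, ys)

def predict(xs, c, t):
    """Evaluate the learned function at a new input t."""
    return sum(cj * kernel(t, xj) for cj, xj in zip(c, xs))

# Learn sin(2*pi*x) from five samples; small lam => near-interpolation.
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [math.sin(2 * math.pi * x) for x in xs]
c = fit(xs, ys)
```

Increasing `lam` trades fidelity to the samples for stability, which is exactly the compromise that parameter choice rules address.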

[1] Galyna Kriukova, Oleksandra Panasiuk, Sergei V. Pereverzyev, Pavlo Tkachenko. A linear functional strategy for regularized ranking. Neural Networks, Vol. 73, January 2016, pp 26-35, DOI:10.1016/j.neunet.2015.08.012.

[2] Galyna Kriukova, Sergei V. Pereverzyev, Pavlo Tkachenko. On the convergence rate and some applications of regularized ranking algorithms. Journal of Complexity, Available online 25 September 2015, DOI:10.1016/j.jco.2015.09.004.

[3] Galyna Kriukova, Sergiy Pereverzyev Jr., and Pavlo Tkachenko. Nyström type subsampling analyzed as a regularized projection. Inverse Problems, Vol. 33(7), 2017, 074001

[4] Pavlo Tkachenko and Sergei V. Pereverzyev. Regularization by the Linear Functional Strategy with Multiple Kernels. Frontiers in Applied Mathematics and Statistics, 2017, p. 9

With 366 million people affected by diabetes, this disease is now a global healthcare challenge. The treatment of diabetes is one of the most difficult therapies to manage and requires the development of special technologies. We work on developing new mathematical tools for diabetes therapy management, where the key problem is to predict the future evolution of a diabetic patient's blood glucose levels from available current and past information about therapeutically valuable factors.

[1] Sampath Sivananthan, Valeriya Naumova, Chiara Dalla Man, Andrea Facchinetti, Eric Renard, Claudio Cobelli, and Sergei V. Pereverzyev. Assessment of blood glucose predictors: the prediction-error grid analysis. Diabetes Technology & Therapeutics. Vol. 13(8), 2011, pp. 787-96. DOI:10.1089/dia.2011.0033

[2] Valeriya Naumova, Sergey V. Pereverzyev, Sivananthan Sampath. A Meta-Learning Approach to the Regularized Learning – Case Study: Blood Glucose Prediction. Neural Networks, vol. 33, 2012, pp. 181-193. DOI: 10.1016/j.neunet.2012.05.004

[3] Pavlo Tkachenko, Galyna Kriukova, Marharyta Aleksandrova, Oleg Chertov, Eric Renard, et al. Prediction of nocturnal hypoglycemia by an aggregation of previously known prediction approaches: proof of concept for clinical application. Computer Methods and Programs in Biomedicine, Vol. 134, 2017, pp. 179–186

[4] Sampath Sivananthan, Pavlo Tkachenko, Eric Renard, and Sergei V. Pereverzev. Glycemic Control Indices and Their Aggregation in the Prediction of Nocturnal Hypoglycemia From Intermittent Blood Glucose Measurements. Journal of Diabetes Science and Technology, Vol. 10, 2016, pp. 1245–1250

The classical water wave problem is concerned with the flow of a perfect fluid of unit density, subject to the forces
of gravity and surface tension. This problem is mathematically described by the Euler equations with a free surface over
a flat bottom. In this project, we consider two-dimensional inviscid periodic travelling waves with vorticity.
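In the frame moving with the wave, this problem is often recast via a stream function; a schematic version of the resulting free boundary problem is (signs and normalizations vary across the literature, and surface tension is omitted):

```latex
% Stream function formulation for steady periodic waves with vorticity
% (schematic, following the Constantin--Strauss setting):
\Delta \psi = \gamma(\psi) \quad \text{in } -d < y < \eta(x),
\qquad \psi = 0 \ \text{on } y = \eta(x),
\qquad \psi = -p_0 \ \text{on } y = -d,
% with the Bernoulli condition on the free surface:
|\nabla\psi|^2 + 2g\,(y + d) = Q \quad \text{on } y = \eta(x),
% where \gamma is the vorticity function, p_0 the relative mass flux,
% and Q a constant; the unknown surface \eta is the free boundary.
```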

Our goal is to find the free surface of the water wave and, furthermore, to obtain other characteristics of the
water flow, such as the velocity field and the pressure beneath the wave. Both quantitative and qualitative
knowledge of these characteristics allows for more accurate simulation of the relevant water waves in the laboratory.

We follow two approaches:

The first relies on the work in [1] and requires the numerical and analytical study of partial differential
equations; see [2] and [3].

The second relies on a non-local formulation of the problem, see [4].

[1] A. Constantin and W. Strauss. Exact steady periodic water waves with vorticity, Comm. Pure Appl. Math., 2004.

[2] A. Constantin, K. Kalimeris, and O. Scherzer. A penalization method for calculating the flow beneath travelling water waves of large amplitude, SIAM Journal on Applied Mathematics, 2015.

[3] A. Constantin, K. Kalimeris, and O. Scherzer. Approximations of steady periodic water waves in flows with constant vorticity, Nonlinear Analysis: Real World Applications, 2015.

[4] A. S. Fokas and K. Kalimeris. A novel non-local formulation of water waves, Lectures on the Theory of Water Waves, London Mathematical Society Lecture Note Series, to appear January 2016.

We are interested in mean-curvature related equations, which appear naturally in different problems such as phase transitions, image processing and fluid mechanics. In particular, we focus on mean curvature flow and prescribed mean curvature surfaces (H-surfaces), such as minimal surfaces.

It is well known that H-surfaces are linked to the minimization of total variation (TV), so continuity results for TV-minimizers can be seen as non-contact properties for H-surfaces. Taking advantage of this geometric structure, we proved several continuity results for constrained minimizers of TV.

- "Continuity results for TV-minimizers", to appear in Indiana University Mathematics Journal. https://www.iumj.indiana.edu/IUMJ/Preprints/7393.pdf
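The link between TV minimization and H-surfaces can be sketched as follows (a generic constrained problem; the weight $g$ is a placeholder):

```latex
% Constrained TV minimization (schematic):
\min_{u} \ \int_{\Omega} |Du| \;+\; \int_{\Omega} g\,u \,\mathrm{d}x ,
% by the coarea formula, the level sets E_t = \{u > t\} of a minimizer
% minimize the prescribed curvature functional
P(E;\Omega) + \int_{E} g \,\mathrm{d}x ,
% so their boundaries are H-surfaces with mean curvature prescribed by g.
```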

In addition, total variation minimization arises naturally when studying velocity profiles of solid cylindrical inclusions flowing in a Bingham fluid under gravity. These profiles can be shown analytically to be solutions of a rather simple geometric problem (of Cheeger type) and, for simple shapes, can even be computed explicitly.

- Ian A. Frigaard, José A. Iglesias, Gwenael Mercier, Christiane Pöschl, and Otmar Scherzer, Critical Yield Numbers of Rigid Particles Settling in Bingham Fluids and Cheeger Sets, SIAM J. Appl. Math., 77(2), 638–663, 2017. http://epubs.siam.org/doi/10.1137/16M10889770
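The geometric problem mentioned above is of Cheeger type; schematically, one minimizes a perimeter-to-volume ratio (notation here is generic):

```latex
% Cheeger constant of a domain \Omega (schematic):
h(\Omega) \;=\; \min_{E \subseteq \Omega} \frac{P(E)}{|E|},
% sets attaining the minimum are Cheeger sets; in the Bingham fluid
% setting, critical yield numbers are expressed through such ratios.
```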

Photoacoustic tomography is a hybrid imaging modality which combines the high contrast of optical tomography with the high resolution of ultrasound imaging. One of its applications is breast imaging for early cancer detection, since optical tomography provides high contrast, which discriminates between healthy and unhealthy tissue, whereas ultrasound provides the high spatial resolution that is important for detecting anomalies early.

We are studying the photoacoustic model taking into consideration that both the density and the bulk modulus of the material are spatially varying. We are investigating how to simultaneously reconstruct the absorption density, bulk modulus and material density from photoacoustic measurements using photoacoustic sectional imaging.
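A standard model for the acoustic part is the wave equation with variable coefficients (the precise source coupling depends on the photoacoustic setup, so this is only a sketch):

```latex
% Pressure p in a medium with density \rho(x) and bulk modulus \kappa(x):
\frac{1}{\kappa(x)}\,\partial_t^2 p
  \;-\; \nabla\!\cdot\!\Big(\frac{1}{\rho(x)}\,\nabla p\Big) = 0,
\qquad p(x,0) = H(x), \quad \partial_t p(x,0) = 0,
% where H encodes the absorbed optical energy; the inverse problem asks
% for H together with \rho and \kappa from boundary measurements of p.
```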

In different imaging scenarios, such as medical and biological applications, several images of the same or similar objects are often taken, and in many cases one is interested in the shape and position of the objects rather than their contrast. This leads to the problem of automatically finding deformations matching a set of given observations, which appears as a tool to control the acquisition environment (registration), as a way to quantify observed changes, or as a first step in shape statistics.

One such situation is when the visible geometrical features of the surface of the objects are the main concern, rather than
their interiors. One can then think of the objects as surfaces embedded in space. For this surface matching problem, we have
considered a minimization problem for a deformation energy depending on the signed distance functions to the surfaces, which
is inspired by nonlinear models for elastic shells. Thanks to this level set framework, our model [1] has the right geometric
invariances, reflects resistance to surface expansion/compression and curvature mismatches, does not develop microstructure,
and allows for straightforward discretization through adaptive finite elements.

Another such problem is that of optical flow, in which the apparent motion undergone by the objects in an image sequence is sought. Due to the lack of information along object edges, it is an ill-posed problem, leading to variational regularization approaches. Additionally, when more than two frames are considered simultaneously, time regularization is needed to ensure consistency. In [2] we proposed using the norm of the convective acceleration along the flow as time regularization and demonstrated improved results from a simple semi-implicit scheme for it compared to using the usual time derivative.
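Schematically, the contrast between the two time regularizers is the following (weights and the spatial regularizer are omitted; $f$ is the image sequence, $v$ the flow):

```latex
% Optical flow with convective time regularization (schematic):
\int \big(\partial_t f + \nabla f \cdot v\big)^2 \,\mathrm{d}x\,\mathrm{d}t
  \;+\; \beta \int \big|\partial_t v + (v\cdot\nabla)\,v\big|^2
  \,\mathrm{d}x\,\mathrm{d}t,
% the convective acceleration \partial_t v + (v\cdot\nabla)v replaces the
% plain derivative \partial_t v used in standard time regularization.
```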

[1] J.A. Iglesias, M. Rumpf, O. Scherzer. Shape Aware Matching of Implicit Surfaces Based on Thin Shell Energies. arXiv:1509.06559

[2] J.A. Iglesias, C. Kirisits. Convective regularization for optical flow. (English summary) Variational methods, 184–201, Radon Ser. Comput. Appl. Math. 18. De Gruyter, Berlin, 2017.

[3] J.A. Iglesias, B. Berkels, M. Rumpf, O. Scherzer. A Thin Shell Approach to the Registration of Implicit Surfaces. In Proceedings of the 18th International Workshop on Vision, Modeling and Visualization (VMV 2013), pp. 89-96, 2013.

The question of detecting interfaces from remote measurements appears naturally in many applications, such as
non-destructive testing, medical imaging and seismology. We are interested in PDE models related to acoustics,
electromagnetism and elasticity. Depending on the application, we can distinguish two kinds of measurements:

(A) Repeated remote measurements at a fixed frequency. In this case, we can generate the scattered fields of some special
solutions of the underlying PDE models. These special solutions are of Green type or of Fourier type (i.e.
geometric optics solutions). The interface is then detected by analysing the singularities of these special solutions.

(B) Remote measurements corresponding to multiple frequencies. The motivation is that the low-frequency measurements
encode the large features of the interfaces (such as their size), while the high-frequency measurements contain the
smaller details. To justify and quantify these properties, we need to analyse the scattered fields at low and high
frequencies and derive explicit frequency-dependent convergence rates of optimization methods.

[1] M. Kar and M. Sini. Reconstruction of interfaces from the elastic farfield measurements using CGO solutions. SIAM J. Math. Anal. 46(2014), no. 4, 2650-2691.

[2] M. Sini and N. T. Thanh. Convergence rates of recursive Newton-type methods for multifrequency scattering problems. ESAIM Math. Model. Numer. Anal. 49 (2015), no. 2, 459-480.

[3] L. Rondi and M. Sini. Stable determination of a scattered wave from its far-field pattern: the high frequency asymptotics. Arch. Ration. Mech. Anal. 218 (2015), no. 1, 1-54.

[4] A. Mantile, A. Posilicano, M. Sini. Uniqueness in inverse acoustic scattering with unbounded gradient across Lipschitz surfaces. arXiv:1702.05312.

Wave propagation in the presence of small inhomogeneities is well studied, via homogenization theory, in the case where these inhomogeneities are periodically distributed in the background. Our aim is to avoid this periodicity assumption and provide perturbative formulas for the scattered waves taking into account all the parameters describing the cluster of small inhomogeneities (such as their number, their maximum size, the minimum distance between them and the possible jumps across their interfaces). Applications of our formulas to material sciences (e.g. metamaterials and cloaking) are immediate.

[1] D. P. Challa and M. Sini. On the justification of the Foldy-Lax approximation for the acoustic scattering by small rigid bodies of arbitrary shapes. Multiscale Model. Simul. 12 (2014), no. 1, 55-108.

[2] A. Alsaedi, B. Ahmed, D. P. Challa, M. Kirane, M. Sini. A cluster of many small holes with negative imaginary surface impedances may generate a negative refraction index. Math. Meth. Appl. Sci. DOI: 10.1002/mma.3805 (2015).

[3] F. Al-Musallam, D. P. Challa, M. Sini. The equivalent medium for the elastic scattering by many small rigid bodies and applications. IMA J. Appl. Math. 81 (2016), no. 6, 1020–1050.

This project is concerned with the mathematical and numerical analysis of imaging modalities using electric or magnetic nanoparticles as contrast agents. An electric (resp. magnetic) nanoparticle is characterized by a large contrast of its relative permittivity $\varepsilon$ (resp. relative permeability $\mu$), which is of the order $a^{-\alpha}$, with $\alpha>0$, where $a$ is its relative radius, $a \ll 1$ (estimated at a few nanometers), keeping the relative speed of propagation moderate in terms of $a$. The general idea of such imaging modalities is to collect the data remotely before and after injecting (or delivering) the nanoparticles into the targeted region. The belief is that by contrasting these data we can recover the inner values of the permittivity of the tissue in that region. Two imaging modalities are of particular interest:

Imaging exploiting electric nanoparticles.

The main idea here is that injecting electric-type nanoparticles locally enhances
the electric field (and hence the magnetic field). In mathematical terms, this
enhancement translates into the ability to recover the total fields at the
'center' of the nanoparticles from remote measurements. The mathematical model
to consider here is the full Maxwell system with highly contrasted transmission
conditions.

Imaging exploiting magnetic nanoparticles.

The model here is photoacoustic imaging using injected magnetic nanoparticles
(such as gold nanoparticles). Exciting the injected gold nanoparticles with
magnetic incident waves at certain frequencies creates heat in their
surroundings, which in turn creates a propagating acoustic pressure. This
acoustic pressure can be measured away from the location of the tumor. The goal
is to recover the electric permittivity from these measured data.

[1] A. Alsaedi, F. Alzahrani, D. P. Challa, M. Kirane, M. Sini. Extraction of the index of refraction by embedding multiple small inclusions. Inverse Problems 32 (2016), no 4, 045004, 18 pp.

[2] D. P. Challa, A. P. Choudhury, M. Sini. Mathematical imaging using electric or magnetic nanoparticles as contrast agents. arXiv:1705.01498