In [6] we define an analogue of the Ginzburg-Landau functional on graphs, minimizers of which we use as estimators for the classification problem.

There are two terms: one represents data fidelity, and the other penalises soft classifications and neighbouring nodes that are given different labels.
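For orientation, a common graph Ginzburg-Landau form from this literature is the following; the notation here is generic and not necessarily that of [6]:

```latex
E(u) \;=\; \varepsilon \sum_{i,j} w_{ij}\,(u_i - u_j)^2
\;+\; \frac{1}{\varepsilon} \sum_i W(u_i)
\;+\; \sum_{i \in Z} \lambda\,(u_i - y_i)^2 ,
```

where $W(u) = (u^2-1)^2$ is a double-well potential that penalises soft (non-binary) classifications, the first term penalises neighbouring nodes with different labels, and the last term is the data fidelity on a labelled subset $Z$.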

For example, the smoothing spline problem, i.e. the problem of fitting a smooth curve through data, can be written variationally as finding the 'best' curve (in some suitable function space) that fits the data.
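A minimal discrete analogue of this variational formulation is a penalized least-squares smoother: minimize a fidelity term plus a roughness penalty. This is an illustrative sketch (the penalty weight lam and the test data are made up), not the exact formulation discussed above.

```python
import numpy as np

def smooth(y, lam=10.0):
    """Minimize ||u - y||^2 + lam * ||D2 u||^2 over u,
    where D2 is the second-difference (roughness) operator."""
    n = len(y)
    # second-difference matrix D2: (n-2) x n
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    # normal equations: (I + lam * D2^T D2) u = y
    A = np.eye(n) + lam * D2.T @ D2
    return np.linalg.solve(A, y)

# noisy samples of a smooth curve
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)
u = smooth(y)
```

Raising lam trades fidelity to the data for smoothness of the fitted curve, which is exactly the balance the variational formulation expresses.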

The power of variational methods lies in their ability to deal with high-frequency oscillations.

The objective is to place the cluster centres optimally. Our results concern the convergence of minimizers û of E as the number of data points grows [2] and the associated rates [4].
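One standard instance of this centre-placement problem is the k-means energy, minimized by Lloyd's algorithm. The sketch below assumes that reading (the function name, data, and parameters are illustrative, not taken from the cited papers):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd iteration for the k-means energy
    sum_i min_j ||x_i - c_j||^2 over centre positions c_1..c_k."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centre
        d = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # move each centre to the mean of its assigned points
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centres[j] = pts.mean(0)
    return centres, labels

# two well-separated blobs of four points each
A = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.],
              [10., 10.], [10., 11.], [11., 10.], [11., 11.]])
centres, labels = kmeans(A, 2)
```

Each Lloyd step decreases the energy, which is why questions about convergence of minimizers as the data grows are natural for this kind of functional.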

One could also design custom energies tailored to particular behaviours.

An important question, driven by applications in machine learning, is how to partition the data set.

Graphical modelling is one way to develop a data-driven classification method.

Given a data set ⊂ X and a measure of dissimilarity ρ: X× X→ [0,∞) one can define a graph where the nodes are the data points and the edges between data points are weighted using ρ. A simple example would be the graph that connects all data points that fall within a certain distance of each other, i.e.

ρ(x,y) = c if |x-y|≤ ε and ρ(x,y) = 0 otherwise.
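The thresholded weights above can be sketched as a small adjacency-matrix construction; this is a minimal illustration, with the sample points, c, and ε made up:

```python
import numpy as np

def epsilon_graph(points, eps, c=1.0):
    """Weight matrix of the ε-neighbourhood graph:
    w[i, j] = c if |x_i - x_j| <= eps (and i != j), else 0."""
    X = np.asarray(points, dtype=float)
    # pairwise Euclidean distances
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    W = np.where(d <= eps, c, 0.0)
    np.fill_diagonal(W, 0.0)  # no self-loops
    return W

pts = [[0.0, 0.0], [0.5, 0.0], [3.0, 3.0]]
W = epsilon_graph(pts, eps=1.0)
# the two nearby points are connected; the distant one is isolated
```

The connectivity of the resulting graph depends sensitively on ε, which is one reason the scaling of such graphs with the number of data points matters for the convergence questions above.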