1 code implementation • 22 Apr 2024 • Anthony Baptista, Alessandro Barp, Tapabrata Chakraborti, Chris Harbron, Ben D. MacArthur, Christopher R. S. Banerji
To illustrate this idea, we present a computational framework to quantify the geometric changes that occur as data passes through successive layers of a DNN, and use this framework to motivate a notion of 'global Ricci network flow' that can be used to assess a DNN's ability to disentangle complex data geometries to solve classification problems.
no code implementations • 29 Nov 2023 • Andrea Marinoni, Pietro Liò, Alessandro Barp, Christian Jutten, Mark Girolami
The reliability of graph embeddings directly depends on how much the geometry of the continuous space matches the graph structure.
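The geometry-matching idea can be illustrated with a toy distortion computation (not from the paper): compare shortest-path distances on a 6-cycle with Euclidean distances under two hypothetical embeddings, one circular and one linear. The graph, embeddings, and distortion measure here are all illustrative choices.

```python
import numpy as np

n = 6
# Shortest-path distances on the cycle graph C_6.
d_graph = np.array([[min(abs(i - j), n - abs(i - j)) for j in range(n)]
                    for i in range(n)], dtype=float)

# A circular embedding respects the cyclic structure ...
theta = 2 * np.pi * np.arange(n) / n
circle = np.stack([np.cos(theta), np.sin(theta)], axis=1)
# ... whereas a line embedding must stretch the wrap-around edge.
line = np.stack([np.arange(n, dtype=float), np.zeros(n)], axis=1)

def distortion(points):
    # Worst-case expansion over worst-case contraction of graph distances.
    d_emb = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    ratio = d_emb[d_graph > 0] / d_graph[d_graph > 0]
    return ratio.max() / ratio.min()

print(distortion(circle) < distortion(line))  # True
```

The circular embedding matches the cycle's intrinsic geometry, so its distance ratios stay nearly constant, while the line embedding forces the two endpoints, adjacent in the graph, far apart.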
no code implementations • 16 Aug 2023 • Marcelo Hartmann, Bernardo Williams, Hanlin Yu, Mark Girolami, Alessandro Barp, Arto Klami
We use notions from Riemannian geometry to recast the optimisation of a function on Euclidean space as an optimisation problem on a Riemannian manifold with a warped metric, and then find the function's optimum along this manifold.
no code implementations • 10 Nov 2022 • Heishiro Kanagawa, Alessandro Barp, Arthur Gretton, Lester Mackey
Kernel Stein discrepancies (KSDs) measure the quality of a distributional approximation and can be computed even when the target density has an intractable normalizing constant.
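As a minimal illustration (not the paper's construction), the following sketch computes a 1-D Langevin KSD built from a unit-bandwidth Gaussian base kernel. Only the score function of the target is needed, which is why the intractable normalizing constant never appears.

```python
import numpy as np

# KSD^2 (V-statistic) for a 1-D target, Langevin Stein kernel built from a
# unit-bandwidth Gaussian base kernel. Only the score of the target is used.
def ksd_squared(x, score):
    d = x[:, None] - x[None, :]
    k = np.exp(-0.5 * d**2)
    s = score(x)
    # Stein kernel: s(x)s(y)k + s(x) d_y k + s(y) d_x k + d_x d_y k
    h = (s[:, None] * s[None, :] * k + s[:, None] * (d * k)
         + s[None, :] * (-d * k) + (1 - d**2) * k)
    return h.mean()

score = lambda x: -x  # score of N(0, 1); the normalizing constant drops out
rng = np.random.default_rng(0)
good = rng.standard_normal(500)        # samples from the target
bad = rng.standard_normal(500) + 2.0   # samples from a shifted distribution
print(ksd_squared(good, score) < ksd_squared(bad, score))  # True
```

Samples from the target give a near-zero discrepancy; samples from the shifted distribution give a clearly larger one.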
no code implementations • 26 Sep 2022 • Alessandro Barp, Carl-Johann Simon-Gabriel, Mark Girolami, Lester Mackey
Maximum mean discrepancies (MMDs) like the kernel Stein discrepancy (KSD) have grown central to a wide range of applications, including hypothesis testing, sampler selection, distribution approximation, and variational inference.
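A minimal sketch of a (biased, V-statistic) MMD² estimate with a Gaussian kernel; the bandwidth, shift, and sample sizes are arbitrary illustrative choices.

```python
import numpy as np

# Biased (V-statistic) estimate of MMD^2 with a Gaussian kernel.
def mmd_squared(x, y, bw=1.0):
    k = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / bw) ** 2)
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

rng = np.random.default_rng(1)
x = rng.standard_normal(400)
same = rng.standard_normal(400)        # same distribution as x
diff = rng.standard_normal(400) + 1.5  # mean-shifted distribution
print(mmd_squared(x, same) < mmd_squared(x, diff))  # True
```

The estimate is near zero when both samples come from the same distribution and clearly positive under a mean shift, which is the behaviour hypothesis tests and sample-quality measures rely on.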
no code implementations • 20 Mar 2022 • Alessandro Barp, Lancelot Da Costa, Guilherme França, Karl Friston, Mark Girolami, Michael I. Jordan, Grigorios A. Pavliotis
In this chapter, we identify fundamental geometric structures that underlie the problems of sampling, optimisation, inference and adaptive decision-making.
no code implementations • 23 Jul 2021 • Guilherme França, Alessandro Barp, Mark Girolami, Michael I. Jordan
Optimization tasks are crucial in statistical machine learning.
no code implementations • 6 May 2021 • Alessandro Barp, So Takao, Michael Betancourt, Alexis Arnaudon, Mark Girolami
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
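The simplest instance of such a recipe is the overdamped Langevin diffusion (identity diffusion matrix, no skew-symmetric part); a minimal discretised sketch, with step size and run length chosen purely for illustration:

```python
import numpy as np

# Unadjusted Langevin dynamics targeting N(0, 1): the simplest
# measure-preserving diffusion, discretised with a fixed step size.
rng = np.random.default_rng(2)
grad_log_p = lambda x: -x  # score of the N(0, 1) target
eps = 0.05                 # illustrative step size
x, samples = 3.0, []
for t in range(20000):
    x = x + eps * grad_log_p(x) + np.sqrt(2 * eps) * rng.standard_normal()
    if t >= 1000:          # discard burn-in
        samples.append(x)
samples = np.asarray(samples)
print(samples.mean(), samples.var())
```

After burn-in, the empirical mean and variance are close to the target's 0 and 1 (up to an O(eps) discretisation bias, since this chain has no Metropolis correction).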
no code implementations • 16 Jun 2020 • Carl-Johann Simon-Gabriel, Alessandro Barp, Bernhard Schölkopf, Lester Mackey
More precisely, we prove that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel k, whose reproducing kernel Hilbert space (RKHS) functions vanish at infinity, metrizes the weak convergence of probability measures if and only if k is continuous and integrally strictly positive definite (i.s.p.d.).
no code implementations • NeurIPS 2019 • Alessandro Barp, Francois-Xavier Briol, Andrew B. Duncan, Mark Girolami, Lester Mackey
We provide a unifying perspective of these techniques as minimum Stein discrepancy estimators, and use this lens to design new diffusion kernel Stein discrepancy (DKSD) and diffusion score matching (DSM) estimators with complementary strengths.
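A toy sketch of classical (Hyvärinen) score matching, the special case that DSM generalises: fit the scale of a zero-mean Gaussian by minimising the objective J(u) = E[½ s_u(x)² + s_u′(x)] over a grid of precisions. The model family and grid search are illustrative choices, not the paper's estimators.

```python
import numpy as np

# Classical score matching for a zero-mean Gaussian model with precision u:
# model score s_u(x) = -u * x, so s_u'(x) = -u, and the objective
# J(u) = E[0.5 * s_u(x)^2 + s_u'(x)] needs no normalizing constant.
def J(u, x):
    return np.mean(0.5 * (u * x) ** 2 - u)

rng = np.random.default_rng(3)
data = rng.normal(0.0, 2.0, size=2000)  # true scale sigma = 2

# Grid search over the precision; the minimiser recovers 1 / E[x^2].
grid = np.linspace(0.05, 2.0, 400)
u_hat = grid[np.argmin([J(u, data) for u in grid])]
sigma_hat = 1.0 / np.sqrt(u_hat)
print(sigma_hat)  # close to the true scale 2
```

For this model the objective is quadratic in the precision, so the minimiser has the closed form u* = 1/E[x²]; the grid search simply makes that visible without any calculus.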
no code implementations • 13 Jun 2019 • Francois-Xavier Briol, Alessandro Barp, Andrew B. Duncan, Mark Girolami
While likelihood-based inference and its variants provide a statistically efficient and widely applicable approach to parametric inference, their application to models involving intractable likelihoods poses challenges.
1 code implementation • 9 May 2019 • Wilson Ye Chen, Alessandro Barp, François-Xavier Briol, Jackson Gorham, Mark Girolami, Lester Mackey, Chris J. Oates
Stein Points are a class of algorithms for approximating a target distribution with a set of representative points, which proceed by sequentially minimising a Stein discrepancy between the empirical measure and the target and, hence, require the solution of a non-convex optimisation problem to obtain each new point.
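A toy 1-D sketch of the greedy scheme (not the paper's implementation): each new point minimises the resulting squared KSD over a fixed grid, so exhaustive search stands in for the non-convex inner optimisation. The kernel, bandwidth, grid, and point budget are all illustrative choices.

```python
import numpy as np

# Langevin Stein kernel for a 1-D target score, Gaussian base kernel.
def stein_kernel(x, y, score):
    d = x - y
    k = np.exp(-0.5 * d**2)
    return (score(x) * score(y) * k + score(x) * (d * k)
            + score(y) * (-d * k) + (1 - d**2) * k)

score = lambda x: -x                # target N(0, 1)
grid = np.linspace(-4.0, 4.0, 801)  # exhaustive search replaces the
points = []                         # non-convex inner optimisation
for _ in range(20):
    # Adding a point c changes the squared KSD by h(c, c) + 2 * sum_i h(c, x_i).
    def gain(c):
        return stein_kernel(c, c, score) + 2 * sum(stein_kernel(c, p, score)
                                                   for p in points)
    points.append(min(grid, key=gain))
points = np.asarray(points)
print(points.mean(), points.std())
```

The diagonal term h(c, c) penalises clustering, so the greedy points spread out to cover the target; their empirical mean and spread end up close to the target's 0 and 1.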
no code implementations • 8 May 2017 • Alessandro Barp, Francois-Xavier Briol, Anthony D. Kennedy, Mark Girolami
The aim of this review is to provide a comprehensive introduction to the geometric tools used in Hamiltonian Monte Carlo at a level accessible to statisticians, machine learners and other users of the methodology with only a basic understanding of Monte Carlo methods.
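A central geometric tool in Hamiltonian Monte Carlo is the leapfrog (Störmer–Verlet) integrator, whose symplecticity keeps the Hamiltonian nearly conserved over long trajectories; a minimal sketch for a standard normal target, with step size and trajectory length chosen for illustration:

```python
import numpy as np

# Leapfrog integrator for H(q, p) = U(q) + p^2 / 2: symplectic and
# time-reversible, so the energy error stays bounded along the trajectory.
def leapfrog(q, p, grad_U, eps, steps):
    p = p - 0.5 * eps * grad_U(q)  # initial half kick
    for _ in range(steps - 1):
        q = q + eps * p            # full drift
        p = p - eps * grad_U(q)    # full kick
    q = q + eps * p
    p = p - 0.5 * eps * grad_U(q)  # final half kick
    return q, p

U = lambda q: 0.5 * q**2  # potential energy of a standard normal target
grad_U = lambda q: q
H = lambda q, p: U(q) + 0.5 * p**2

q1, p1 = leapfrog(1.0, 0.5, grad_U, eps=0.1, steps=20)
print(abs(H(q1, p1) - H(1.0, 0.5)) < 1e-2)  # True: energy nearly conserved
```

The near-conservation of H is what gives HMC its high Metropolis acceptance rates even for long proposals; a non-symplectic integrator of the same order would drift instead.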