Department Colloquium in the Winter Term
Talks in Winter Semester 2022/23
At the Colloquium of the Department of Mathematics, international researchers report on their work. The colloquium is held on Wednesdays in "Hörsaal 3" (MI 00.06.011).
Everyone interested is welcome to attend.
These are the lecture dates in Winter Semester 2022/23:
- 16. November 2022: 14:30 Marco Mondelli
- 18. January 2023: 14:30 Richard Kueng + 16:00 Eduard Feireisl
- 08. February 2023: 14:30 Matthias Drton
Topics at the colloquium are:
Understanding gradient descent for over-parameterized deep neural networks: Insights from mean-field theory and the neural tangent kernel
16. November 2022, 14:30 - 15:30
Prof. Marco Mondelli
Training a neural network is a non-convex problem that exhibits spurious and disconnected local minima. Yet, in practice, neural networks with millions of parameters are successfully optimized using gradient descent methods. In this talk, I will give some theoretical insights into why this is possible and discuss two approaches to studying the behavior of gradient descent. The first takes a mean-field view, relating the dynamics of stochastic gradient descent (SGD) to a certain Wasserstein gradient flow in probability space. I will show how this idea allows one to study the connectivity, convergence, and implicit bias of the solutions found by SGD. In particular, I will focus on a recent work proving that, among the many functions that interpolate the data, ReLU networks at convergence implement a simple piecewise linear map of the inputs. The second approach is the analysis of the Neural Tangent Kernel. I will show how to bound its smallest eigenvalue in deep networks with minimum over-parameterization, and discuss the implications for memorization and optimization.
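The phenomenon described in the abstract can be illustrated with a toy sketch (my own illustration, not code from the talk; the width, step size, and target function are arbitrary choices): plain gradient descent on an over-parameterized one-hidden-layer ReLU network steadily decreases the training loss despite the non-convex landscape, once the width far exceeds the number of data points.

```python
import numpy as np

# Toy sketch: gradient descent on f(x) = (1/sqrt(m)) * sum_j a_j * relu(w_j x + b_j),
# with width m much larger than the number of data points n.
rng = np.random.default_rng(0)

n, m = 5, 200                        # 5 data points, width 200 (over-parameterized)
X = rng.normal(size=(n, 1))
y = np.sin(3.0 * X[:, 0])            # an arbitrary smooth target

W = rng.normal(size=(m, 1))          # hidden weights
b = rng.normal(size=m)               # hidden biases
a = rng.normal(size=m)               # output weights
scale = 1.0 / np.sqrt(m)             # NTK-style output scaling

def forward():
    H = np.maximum(X @ W.T + b, 0.0)     # ReLU activations, shape (n, m)
    return H, scale * (H @ a)

lr, losses = 0.05, []
for _ in range(2000):
    H, pred = forward()
    r = pred - y                         # residuals
    losses.append(0.5 * np.mean(r ** 2))
    mask = (H > 0).astype(float)         # ReLU derivative
    G = np.outer(r, a) * mask            # backprop through the hidden layer
    a -= lr * scale * (H.T @ r) / n
    W -= lr * scale * (G.T @ X) / n
    b -= lr * scale * G.mean(axis=0)

print(f"training loss: {losses[0]:.4f} -> {losses[-1]:.6f}")
```

The mean-field and NTK analyses mentioned in the abstract make this empirical observation rigorous in the limit of large width.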
Nobody is perfect: Problems with models of perfect fluids
18. January 2023, 16:00 - 17:00
Prof. Eduard Feireisl
Institute of Mathematics of the Czech Academy of Sciences
We review some recent results on the Euler system describing the motion of a perfect (meaning inviscid) compressible fluid.
The main topics include:
1. Existence and density of "wild" initial data giving rise to infinitely many solutions
2. Solutions with anomalous (discontinuous) energy profiles
3. Violation of determinism in the class of weak solutions
4. Possibilities for restoring order in chaos
Learning to predict ground state properties of gapped Hamiltonians: Provably efficient machine learning for quantum many-body problems
18. January 2023, 14:30 - 15:30
Prof. Richard Kueng
JKU Institute for Integrated Circuits
Classical machine learning (ML) provides a potentially powerful approach to solving challenging quantum many-body problems in physics and chemistry. However, the advantages of ML over traditional methods have not been firmly established. In this work, we prove that classical ML algorithms can efficiently predict ground-state properties of gapped Hamiltonians after learning from other Hamiltonians in the same quantum phase of matter. By contrast, under a widely accepted conjecture, classical algorithms that do not learn from data cannot achieve the same guarantee.
Our proof technique combines mathematical signal processing with quantum many-body physics and also builds upon the recently developed framework of classical shadows. I will try to convey the main proof ingredients and also present numerical experiments that address the anti-ferromagnetic Heisenberg model and Rydberg atom systems.
This is joint work with Hsin-Yuan (Robert) Huang, Giacomo Torlai, Victor Albert, and John Preskill.
Consistent tests of independence via rank statistics
08. February 2023, 14:30 - 15:30
Prof. Matthias Drton
Technical University of Munich
A number of modern applications call for statistical tests that are able to consistently detect non-linear dependencies between a pair of random variables based on a random sample drawn from the pair's joint distribution. This has led to renewed interest in the classical problem of designing measures of correlation. When the considered random variables are continuous, it is appealing to define correlations on the basis of the ranks of the data points as the resulting tests become distribution-free. In this lecture, I will first review recent progress on rank correlations that yield consistent tests. In a second part, I will turn to the problem of detecting dependence between random vectors and discuss how to construct consistent and distribution-free tests with the help of a recently introduced approach to define multivariate ranks using optimal transport.
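The distribution-free property mentioned in the abstract can be seen in a minimal sketch (my own toy example, not material from the talk): a permutation test of independence based on Spearman's rank correlation depends on the data only through ranks, so its null distribution is the same for all continuous marginals. Note that Spearman's rho detects monotone dependence only; the consistent rank correlations discussed in the talk improve on exactly this point.

```python
import numpy as np

rng = np.random.default_rng(1)

def ranks(v):
    # rank of each entry (1 = smallest); ties are negligible for continuous data
    return np.argsort(np.argsort(v)) + 1

def spearman_rho(x, y):
    # Spearman's rho = Pearson correlation of the ranks
    rx, ry = ranks(x), ranks(y)
    return np.corrcoef(rx, ry)[0, 1]

# a dependent sample: y is a noisy monotone function of x
x = rng.normal(size=100)
y = x ** 3 + 0.1 * rng.normal(size=100)

observed = spearman_rho(x, y)
# permutation null: shuffling y breaks any dependence on x
null = np.array([spearman_rho(x, rng.permutation(y)) for _ in range(999)])
p_value = (1 + np.sum(np.abs(null) >= abs(observed))) / 1000
print(f"rho = {observed:.3f}, p = {p_value:.3f}")
```

Because the permutation distribution of the rank statistic does not depend on the marginal distributions of x and y, the same critical values apply to any continuous data, which is the appeal of rank-based tests highlighted in the abstract.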
Further information can be found on the overview page of the Colloquium of the Department of Mathematics.