Faculty Colloquium Winter 2022/23

Talks in the winter semester 2022/23

16 November 2022, 14:30 – 8 February 2023, 17:00
Faculty Colloquium in Winter 2022/23

At the Faculty Colloquium of the Department of Mathematics, international researchers report on their work. The colloquium takes place on selected Wednesdays in lecture hall 3 (MI 00.06.011). Everyone who is interested is cordially invited.

These are the talk dates in the winter semester 2022/23:

  • 16 November 2022, 14:30: Marco Mondelli
  • 18 January 2023: Eduard Feireisl + Richard Kueng
  • 8 February 2023: Matthias Drton

Topics at the Faculty Colloquium

Understanding gradient descent for over-parameterized deep neural networks: Insights from mean-field theory and the neural tangent kernel

16 November 2022, 14:30 – 15:30

Prof. Marco Mondelli

IST Austria

Training a neural network is a non-convex problem that exhibits spurious and disconnected local minima. Yet, in practice, neural networks with millions of parameters are successfully optimized using gradient descent methods. In this talk, I will give some theoretical insights into why this is possible and discuss two approaches to studying the behavior of gradient descent. The first takes a mean-field view and relates the dynamics of stochastic gradient descent (SGD) to a certain Wasserstein gradient flow in probability space. I will show how this idea allows us to study the connectivity, convergence, and implicit bias of the solutions found by SGD. In particular, I will focus on a recent work proving that, among the many functions that interpolate the data, ReLU networks at convergence implement a simple piecewise linear map of the inputs. The second approach is the analysis of the Neural Tangent Kernel. I will show how to bound its smallest eigenvalue in deep networks with minimum over-parameterization, and discuss implications for memorization and optimization.
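For readers who want a concrete picture of the second object mentioned in the abstract, the following is a minimal sketch (in JAX, not taken from the talk) of the empirical Neural Tangent Kernel of a small over-parameterized ReLU network and its smallest eigenvalue; the architecture, sizes, and function names are illustrative assumptions, not the setting analyzed by the speaker.

    # Empirical NTK sketch: K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>
    import jax
    import jax.numpy as jnp

    def init_params(key, d_in, d_hidden):
        k1, k2 = jax.random.split(key)
        return {
            "W1": jax.random.normal(k1, (d_hidden, d_in)) / jnp.sqrt(d_in),
            "w2": jax.random.normal(k2, (d_hidden,)) / jnp.sqrt(d_hidden),
        }

    def f(params, x):
        # Two-layer ReLU network with scalar output.
        return params["w2"] @ jax.nn.relu(params["W1"] @ x)

    def empirical_ntk(params, X):
        # Per-example gradient of the output w.r.t. all parameters.
        grads = jax.vmap(lambda x: jax.grad(f)(params, x))(X)
        # Flatten each parameter gradient and stack into an (n, #params) matrix.
        flat = jnp.concatenate(
            [g.reshape(X.shape[0], -1) for g in jax.tree_util.tree_leaves(grads)],
            axis=1,
        )
        return flat @ flat.T  # n x n NTK Gram matrix

    key = jax.random.PRNGKey(0)
    X = jax.random.normal(key, (20, 5))                # 20 inputs in dimension 5
    params = init_params(key, d_in=5, d_hidden=200)    # over-parameterized width
    K = empirical_ntk(params, X)
    print("smallest NTK eigenvalue:", jnp.linalg.eigvalsh(K)[0])

A positive smallest eigenvalue of this Gram matrix is what guarantees, in NTK-style analyses, that gradient descent can fit (memorize) the training data; the talk concerns how little over-parameterization suffices for such a bound in deep networks.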

Further information is available on the Faculty Colloquium overview page.