Johannes Alt (University of Bonn)
Spectral Phases of Erdős–Rényi graphs
We consider the Erdős–Rényi graph on $N$ vertices with edge probability $d/N$. It is well known that the structure of this graph changes drastically when $d$ is of order $\log N$. Below this threshold it develops inhomogeneities which lead to the emergence of localized eigenvectors, while the majority of eigenvectors remains delocalized. In this talk, I will present the phase diagram depicting these localized and delocalized phases and our recent progress in establishing it rigorously.
This is based on joint works with Raphael Ducatez and Antti Knowles.
Arzu Ahmadova (University of Duisburg-Essen)
Convergence results for gradient systems in artificial neural network training
The field of artificial neural network (ANN) training has garnered significant attention in recent years, with researchers exploring various mathematical techniques for optimizing the training process. Among these, deep neural networks, or deep learning, have gained popularity due to their ability to learn complex and abstract features from large datasets. In particular, this talk focuses on advancing the current understanding of gradient flow and gradient descent optimization methods. Our aim is to establish a solid mathematical convergence theory for continuous-time gradient flow equations and gradient descent processes, and to show that bounded trajectories of gradient system dynamics converge to the set of extrema.
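The two dynamics referred to above can be written out explicitly; assuming a differentiable loss function $L$ and a step size $\gamma > 0$ (notation chosen here for illustration, not fixed in the abstract), they read:

```latex
% Gradient flow (continuous time) and gradient descent (discrete time)
\dot{\theta}(t) = -\nabla L\bigl(\theta(t)\bigr), \qquad
\theta_{n+1} = \theta_n - \gamma \, \nabla L(\theta_n), \quad \gamma > 0.
```

The convergence results then concern bounded solutions $t \mapsto \theta(t)$, respectively bounded sequences $(\theta_n)_{n \ge 0}$, of these systems, whose limit points are shown to lie in the set of extrema of $L$.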
This talk is based on a joint work with Martin Hutzenthaler.
Dominik Schmid (University of Bonn)
Biased random walk on dynamical percolation
We consider a biased random walk on dynamical percolation and discuss the existence and the properties of the linear speed as a function of the bias. In particular, we establish a simple criterion to decide whether the speed is increasing or decreasing for large bias. This talk is based on joint work with Sebastian Andres, Nina Gantert, and Perla Sousi.