Talk by Lionel Gil of the Institut de Physique de Nice, Thursday 3 July at 2pm

Lionel Gil, CNRS research scientist at the Institut de Physique de Nice (INPHYNI), Université Côte d'Azur, will give a one-hour seminar on Thursday 3 July 2025 at 2pm in room 007 of the Euclide building at Les Algorithmes.
Title: Generalization emerges from local optimization in a self-organized learning network
Abstract:
We design and analyze a new approach to self-organizing computing networks, driven only by local optimization rules without relying on a global error function. Traditional neural networks with a fixed topology are made up of identical nodes and derive their expressive power from an appropriate adjustment of connection weights. In contrast, our network stores new knowledge in the nodes accurately and instantaneously, in the form of a lookup table. Only then is some of this information structured and incorporated into the network geometry. The training error is initially zero by construction and remains so throughout the dynamics of the network topology evolution. The latter involves a small number of local topological transformations, such as splitting or merging nodes and adding binary connections between them, which can be applied repeatedly. The choice of which operation to apply is driven solely by optimization of expressivity at the local scale. One key feature of connectionist networks is their ability to generalize, i.e., their capacity to correctly answer questions outside of their training set. We show, on numerous examples of classification tasks, that our self-organized networks systematically reach such a state of perfect generalization when the number of learned examples becomes sufficiently large. We report on the dynamics of the change of state and show that it is abrupt, with the distinctive characteristics of a first-order phase transition, a phenomenon already observed for traditional learning networks and known as grokking.