POSTPONED
From Hopfield Inference to Federated Learning: challenges and solutions in Teacher–Student models
Gianluca Manzan
LISN, Paris-Saclay University
Seminar in the MLP@P series (Machine Learning Physics @ Plateau), organized jointly by LISN and IPhT.
Where: LISN, building 660, room 2014 (2nd floor)
In this work, we investigate inference in neural networks through the teacher–student framework, which provides a controlled setting to quantify how well a student model learns the underlying signal from data generated by a teacher. Beginning with the Hopfield model, interpreted as a dual formulation of associative memory, we characterize the transition between the non-informative and learning phases as a function of dataset size, noise level, and temperature. Extending the analysis to Restricted Boltzmann Machines, we show how the choice of unit priors and regularization shapes the emergence of the signal-retrieval phase and thus determines learning efficiency.
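To fix notation, here is a minimal sketch of one standard parametrization of this setting (the noise model and symbols below follow common conventions in the teacher–student Hopfield literature and are our assumptions, not necessarily the talk's exact setup): the teacher is a binary pattern \xi \in \{-1,+1\}^N, the dataset consists of M noisy copies \eta^a with \eta_i^a = \chi_i^a \xi_i and \mathbb{E}[\chi_i^a] = r (so r tunes the noise level), and the student is a Hopfield network whose Hebbian couplings are built from the examples:

\[
H_N(\sigma \mid \eta) \;=\; -\frac{1}{2N} \sum_{i,j=1}^{N} \Big( \sum_{a=1}^{M} \eta_i^a \eta_j^a \Big)\, \sigma_i \sigma_j .
\]

Learning is then monitored by the teacher–student overlap (Mattis magnetization)

\[
m \;=\; \frac{1}{N} \sum_{i=1}^{N} \xi_i \, \langle \sigma_i \rangle_\beta ,
\]

with m = 0 in the non-informative phase and m > 0 in the learning phase, the transition between the two being controlled by the dataset size M, the noise level r, and the temperature 1/\beta.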
We then address the limitations of the single-student scenario by introducing a collective learning strategy in which multiple student networks are coupled during inference. Recent advances in the statistical physics of learning show that interactions among students enhance generalization, thereby facilitating recovery of the teacher. Our analysis of y interacting Hopfield students confirms this cooperative effect, demonstrating that coupling expands the region of successful inference by lowering the data requirements.
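As an illustration of what coupling during inference can mean (the pairwise ferromagnetic form below is an assumption, not necessarily the exact interaction analyzed in the talk), y students \sigma^{(1)}, \dots, \sigma^{(y)}, each carrying the Hamiltonian built from its own dataset \eta^{(A)}, may be joined through an alignment term:

\[
H^{(y)} \;=\; \sum_{A=1}^{y} H_N\big(\sigma^{(A)} \mid \eta^{(A)}\big) \;-\; \frac{J}{N} \sum_{A<B} \sum_{i=1}^{N} \sigma_i^{(A)} \sigma_i^{(B)}, \qquad J > 0 .
\]

For J = 0 this reduces to y independent single-student problems; for J > 0 the coupling rewards configurations on which the students agree, which is the mechanism that enlarges the retrieval region at fixed dataset size.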
This collective perspective finds a natural application in federated learning (FL), a decentralized paradigm in which multiple clients collaboratively train local models without sharing their private data. In FL, each client performs local updates on its own dataset and communicates only model parameters. The cooperative mechanism observed in coupled teacher–student systems provides a theoretical analogue of this framework: just as interacting students benefit from mutual alignment, federated clients collectively enrich the global solution while retaining data locality.
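For readers less familiar with FL, the sketch below illustrates the analogy with a minimal federated-averaging loop in the spirit of FedAvg (McMahan et al., 2017); the linear teacher, client data, and hyperparameters are illustrative assumptions, not the models studied in the talk.

    # Minimal FedAvg-style sketch (assumed protocol: plain parameter
    # averaging; the talk's exact setting may differ).
    import numpy as np

    rng = np.random.default_rng(0)

    def local_update(w, X, y, lr=0.1, epochs=5):
        # A client's local training: a few gradient steps on its own
        # least-squares data; only the updated weights leave the client.
        for _ in range(epochs):
            w = w - lr * X.T @ (X @ w - y) / len(y)
        return w

    # Common teacher and private client datasets (illustrative sizes).
    d, n_clients, n_samples = 10, 5, 50
    w_teacher = rng.standard_normal(d)
    data = []
    for _ in range(n_clients):
        X = rng.standard_normal((n_samples, d))
        data.append((X, X @ w_teacher + 0.1 * rng.standard_normal(n_samples)))

    # Federated rounds: broadcast the global model, update locally, average.
    w_global = np.zeros(d)
    for _ in range(20):
        local_models = [local_update(w_global.copy(), X, y) for X, y in data]
        w_global = np.mean(local_models, axis=0)  # server-side aggregation

    # Cosine overlap between the aggregated model and the teacher.
    overlap = w_global @ w_teacher / (np.linalg.norm(w_global) * np.linalg.norm(w_teacher))
    print(f"teacher-student overlap: {overlap:.3f}")

Each round plays the role of one interaction among students: local updates keep the data private, while parameter averaging supplies the mutual alignment that, in the coupled teacher–student picture, lowers the amount of data needed to recover the teacher.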
