
CECAM discussion meeting “Coarse-graining with Machine Learning in molecular dynamics” (Sanofi Campus Gentilly, December 4-6, 2018)
This event is organized by Paraskevi Gkeka (Sanofi), Tony Lelièvre, Pierre Monmarché and Gabriel Stoltz; see the webpage.

Workshop “Computational Statistics and Molecular Simulation: A Practical Cross-Fertilization” (BIRS-Oaxaca, November 12-16, 2018)
This event is organized by Gabriel Stoltz in collaboration with Christian Robert and Luke Bornn; see the webpage.

Workshop Advances in Computational Statistical Physics (September 17th-21st, 2018, CIRM)
This event is organized by Tony Lelièvre and Gabriel Stoltz in collaboration with Greg Pavliotis; see the webpage.

Workshop Simulation and probability: recent trends (5-8 June 2018, Rennes)
Particle Monte Carlo simulation methods usually refer to a now ubiquitous strategy in which a sample of random “particles” approximating a target probability distribution is simulated using importance splitting and/or killing of particles according to appropriate rules. This workshop aims at gathering, in a single mathematically oriented event, some recent trends related to these particle methods, in particular: (i) the simulation of rare events and large deviation theory; and (ii) the filtering/assimilation of high-dimensional data.
This event is co-organized by Mathias Rousset; see the webpage.

Working Group
 March 21st, 10:00 at CERMICS (B211 seminar room). Etienne Bernard will present the article by Scott Armstrong and Jean-Christophe Mourrat: Variational methods for the kinetic Fokker-Planck equation.
 January 14th, 1:00 pm at CERMICS (B211 seminar room). Marylou Gabrié (Département de Physique, Ecole Normale Supérieure de Paris) about:
Entropy and mutual information in models of deep neural networks.
Abstract: In this talk we will consider information theory for deep neural networks, which is receiving rising interest in the community, either as an interesting observable to study optimisation algorithms such as stochastic gradient descent, or as a tool to further improve the quality of learning. Nevertheless, it is in practice computationally intractable to compute entropies and mutual information in large neural networks. In this talk, we will instead consider a class of models of deep neural networks for which an expression for these information-theoretic quantities can be derived with the replica method from statistical physics. This work was done in collaboration with Andre Manoel (Owkin), Clément Luneau (EPFL), Jean Barbier (EPFL), Nicolas Macris (EPFL), Florent Krzakala (LPS ENS) and Lenka Zdeborova (IPHT CEA).
 January 9th, 3:00 pm at CERMICS (B211 seminar room). Christophe Poquet (Institut Camille Jordan, Université Lyon 1) about:
Slow/fast dynamics and the emergence of periodic behaviour in mean-field excitable models.
Abstract: We will see how, by viewing nonlinear Fokker-Planck equations (describing the dynamics of an infinite population of noisy units in mean-field interaction) as slow/fast systems, one can describe the emergence of periodic behaviour induced by the combined effect of the interaction and the noise. We will focus in particular on the case where the internal dynamics of each unit is given by the FitzHugh-Nagumo model.
 December 14th, 10:00 at CERMICS (B211 seminar room). Grégoire Ferré will present the article by Jianfeng Lu, Yulong Lu and James Nolen: Scaling limit of the Stein variational gradient descent part I: the mean field regime.
 December 7th, 10:00, at CERMICS (B211 seminar room). Zofia Trstanova (Edinburgh) about:
Diffusion maps: a tool for local and global sampling in high-dimensional systems
Abstract: In this talk, I will discuss the use of diffusion maps for dimensionality reduction and for approximating the generator of Langevin dynamics from simulation data. I will consider both global and local perspectives on diffusion maps, depending on whether or not the data distribution has been fully explored. In the first case, diffusion maps are used to identify the metastable sets and to approximate the corresponding committor functions describing transitions between them. I will also discuss the use of diffusion maps within the metastable sets, formalising the locality via the concept of the quasi-stationary distribution and justifying the convergence of diffusion maps within a local equilibrium. I will demonstrate both approaches on simple toy models and on higher-dimensional molecular dynamics problems, providing technical details about the practical implementation.
 November 22nd, 15:30, at CERMICS (B211 seminar room). Maxime Sangnier (LPSM) about:
What can a statistician expect from GANs?
Abstract: Generative Adversarial Networks (GANs) are a class of generative algorithms that have been shown to produce state-of-the-art samples, especially in the domain of image creation. The fundamental principle of GANs is to approximate the unknown distribution of a given data set by optimizing an objective function through an adversarial game between a family of generators and a family of discriminators. In this talk, we illustrate some statistical properties of GANs, focusing on the deep connection between the adversarial principle underlying GANs and the Jensen-Shannon divergence, together with some optimality characteristics of the problem. We also analyze the role of the discriminator family and study the large sample properties of the estimated distribution.
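For reference, the connection between the adversarial objective and the Jensen-Shannon divergence mentioned in the abstract is the following standard identity (from Goodfellow et al.'s original GAN analysis, not specific to this talk). Writing p for the data distribution and q for the distribution of the generator's samples:

```latex
% GAN value function:
V(G, D) = \mathbb{E}_{x \sim p}[\log D(x)] + \mathbb{E}_{x \sim q}[\log(1 - D(x))]
% For fixed q, the optimal discriminator is
D^*(x) = \frac{p(x)}{p(x) + q(x)},
% and plugging it back into V gives
\max_D V(G, D) = 2\,\mathrm{JSD}(p \,\|\, q) - \log 4,
% so minimizing over generators drives q toward p in Jensen--Shannon divergence.
```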
 November 9th, 10:00, at CERMICS (B211 seminar room). Florent Hédin about:
The Generalized Parallel Replica dynamics for the long time simulation of metastable biochemical systems
Abstract: Metastability is one of the major obstacles encountered when performing long molecular dynamics simulations, and many methods have been developed to address this challenge. The “Parallel Replica” (ParRep) dynamics is known in the materials science community for making it possible to simulate very long trajectories of metastable Langevin dynamics, but it relies on assumptions that can hardly be transposed to the world of biochemical simulations. The more recently developed “Generalized ParRep” variant solves those issues, but it had so far not been applied to significant systems of interest.
I will present the first publicly available implementation of the Generalized Parallel Replica method, targeting frequently encountered metastable biochemical systems, such as conformational equilibria or the dissociation of protein-ligand complexes. It will be shown that the resulting C++ implementation exhibits strong linear scalability, providing up to 70% of the maximum possible speedup on several hundreds of CPUs.
 October 18th, 14:00, at CERMICS (room F103). Upanshu Sharma about: Effective dynamics for non-reversible SDEs
Abstract: Starting with a stochastic differential equation and a coarse-graining map, Legoll and Lelièvre have proposed an effective dynamics which approximates the true projected dynamics. In recent years much attention has been devoted to the study of these equations, especially in the setting of reversible SDEs. In this talk, I will present recent results on effective dynamics starting from non-reversible SDEs and their link to free energy. This is joint work with F. Legoll and T. Lelièvre.
 August 27th, 10:00, at CERMICS, seminar room. Yannis Pantazis about: Uncertainty Quantification and Sensitivity Analysis, with additional discussion on generative models using machine learning.
 June 20th, 10:00, at CERMICS, seminar room. Takahiro Nemoto about: Cloning algorithm to measure large deviation functions of dynamical quantities: principles & applications.
Abstract: Large deviations of nonequilibrium time-extensive quantities have been extensively studied in the last decade, in systems ranging from (a)thermally fluctuating particles (Brownian particles, biological motors, granular particles…) and exactly solvable lattice gas models (ASEP, KPZ, KCMs…) to high-dimensional chaotic dynamics (FPU chain, climate models…). By definition, studying large deviations is difficult, since the fluctuations leading to their occurrence are hardly ever observed. In this seminar, I will present an algorithm which allows the observation of these rare events in numerical simulations. The algorithm is based on population dynamics (a.k.a. the splitting or diffusion quantum Monte Carlo method) [1]: an ensemble of copies of the system is simulated, and the dynamics of the population includes a selection-mutation process. Namely, rare copies are multiplied (have descendants) while typical ones are killed (become extinct), so as to select atypical trajectories of interest. After introducing this algorithm in a pedagogical way, I will present recent applications of the algorithm to active Brownian particles, a model of self-propelled particles, which show unexpected dynamical phase transitions to flocking/jammed states in their rare events [2].
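As a rough illustration of the selection-mutation mechanism described in the abstract, here is a minimal Python sketch of one population-dynamics (cloning) step. This is not the speaker's code: the double-well potential, time step, and multinomial resampling scheme are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def cloning_step(walkers, observable, k, dt):
    """One selection-mutation step of a population-dynamics (cloning) algorithm.

    walkers    : (N,) array of walker positions (copies of the system)
    observable : function x -> quantity whose time average is tilted by the parameter k
    Returns the resampled population and a running estimate of the scaled
    cumulant generating function (log of the mean cloning weight per unit time).
    """
    # Mutation: each copy evolves independently; here, overdamped Langevin
    # dynamics in the double-well potential V(x) = (x^2 - 1)^2.
    grad_V = lambda x: 4.0 * x * (x**2 - 1.0)
    walkers = (walkers - grad_V(walkers) * dt
               + np.sqrt(2.0 * dt) * rng.standard_normal(walkers.shape))

    # Selection: weight each copy by exp(k * f(x) * dt). Rare copies with large
    # weight are multiplied (have descendants), typical ones are killed
    # (become extinct); resampling keeps the population size fixed.
    weights = np.exp(k * observable(walkers) * dt)
    probs = weights / weights.sum()
    indices = rng.choice(walkers.size, size=walkers.size, p=probs)
    return walkers[indices], np.log(weights.mean()) / dt
```

Iterating this step and averaging the returned log-mean-weight over a long trajectory gives a numerical estimate of the large deviation (scaled cumulant generating) function at tilt k.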
[1] Cristian Giardinà, Jorge Kurchan and Luca Peliti, Phys. Rev. Lett. 96, 120603 (2006).
[2] T.N., Michael E. Cates, Étienne Fodor, Robert L. Jack and Julien Tailleur, arXiv:1805.02887 (2018).
 May 16th, 14:00, at CERMICS (seminar room). Richard Kraaij about: Fluctuations for a dynamic Curie-Weiss model of self-organized criticality.
Abstract: The Curie-Weiss model of self-organized criticality (CW model of SOC) was introduced by Cerf and Gorny (2014, 2016) as a modification of the Curie-Weiss model of ferromagnetism that drives itself into a critical state. We consider a dynamic variant of this model, i.e. a system of interacting SDEs, and study its dynamical fluctuations.
Based on joint work with Francesca Collet (Delft) and Matthias Gorny (Paris-Sud).
 April 30th, 10:00, at CERMICS (seminar room). Manon Michel about: Rejection-free Monte Carlo algorithms.
 April 10th, 14:00, at CERMICS. Tony Lelièvre, about: Reading group on “Metastability” by Bovier and Den Hollander.
 April 6th, 14:00, at CERMICS. Julien Reygner, about: Reading group on “Metastability” by Bovier and Den Hollander.
 March 28th, 10:00, at CERMICS. Grégoire Ferré, about: Molecular dynamics without dynamics.
Abstract: Many applications in statistics and statistical physics require the computation of high-dimensional integrals: think, for example, of a physical system at equilibrium distributed according to a Gibbs distribution. By computing such averages, one can e.g. derive the average pressure of the system, or other relevant quantities. Since numerical quadrature is intractable, and since it is impossible to draw samples directly from the target distribution, practitioners generally resort to Markov chain Monte Carlo (MCMC) techniques, running a long simulation whose stationary distribution is close to the target distribution.
Although this technique has proved very powerful in practice, it may suffer from various problems: computational cost, metastability, stability of numerical schemes… In a recent paper, Liu and Wang propose a different point of view on the problem of drawing samples from a complex target distribution. Their idea is to learn a transformation between a simple initial measure and the target by minimizing the relative entropy between the two measures, a seemingly simple idea. Surprisingly, however, they come up with a closed form for the gradient of this entropy, which involves the Stein operator, well known in probability theory. In addition, by optimizing over a reproducing kernel Hilbert space, they derive a practical algorithm that they use to learn complex distributions in problems with up to 10,000 dimensions. The algorithm has the appealing interpretation of a mean-field model that balances minimizing the energy against avoiding concentration effects.
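The kernelized update of Liu and Wang's algorithm (Stein variational gradient descent) can be sketched in one dimension as follows. This is not the paper's implementation; the RBF kernel, bandwidth h, and step size eps are illustrative assumptions.

```python
import numpy as np

def svgd_step(x, grad_log_p, eps=0.1, h=0.5):
    """One Stein variational gradient descent update for 1D particles.

    x          : (n,) array of particle positions
    grad_log_p : gradient of the log target density (the "score")
    The update transports all particles along
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad_log_p(x_j) + d/dx_j k(x_j, x_i) ],
    balancing a drift toward high-density regions (first term)
    against a repulsion that avoids concentration (second term).
    """
    diff = x[:, None] - x[None, :]        # diff[i, j] = x_i - x_j
    K = np.exp(-diff**2 / (2.0 * h))      # RBF kernel k(x_j, x_i)
    grad_K = diff * K / h                 # d/dx_j k(x_j, x_i)
    phi = (K @ grad_log_p(x) + grad_K.sum(axis=1)) / x.size
    return x + eps * phi
```

For a standard Gaussian target, grad_log_p is simply x -> -x, and iterating the step moves an arbitrary initial cloud of particles toward a kernel-smoothed approximation of the Gaussian.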
This triggered our interest: can we draw samples from a complex distribution without resorting to MCMC techniques in problems arising in statistical physics? The goal of the talk is to present the general idea of the algorithm, point out its advantages and weaknesses, and show its behavior on toy examples. Knowing whether this technique can be used for very rough energy landscapes is an open question. However, even though the algorithm is not applicable as such, the tools developed by the authors seem promising for various purposes, like reducing metastability or exploring the phase space, which should stimulate the discussion!
 March 12th, 10:00, at CERMICS. François Portier, about: Monte Carlo integration with a growing number of control variates.
Abstract: The use of control variates is a well-known variance reduction technique in Monte Carlo integration. If the optimal linear combination of control variates is estimated by ordinary least squares, and if the number of control variates is allowed to grow to infinity, the convergence rate can be accelerated, the new rate depending on the interplay between the integrand and the control functions. The standardized error is still asymptotically normal, and the asymptotic variance can still be estimated by the residual variance in the underlying regression model. The ordinary least squares estimator is shown to be superior to other, possibly simpler control variate estimators, even at equal computation time. The method is applied to increase the precision of the method of maximum simulated likelihood to deal with latent variables. Its performance is found to be particularly good for two reasons: the integrands are smooth and can thus be approximated well by polynomial or spline control functions, and the number of integrands is large, reducing the computational cost since the Monte Carlo integration weights need to be calculated only once. This is joint work with Johan Segers.
 January 29th, 10:00, at CERMICS. Tony Lelièvre, about: Reading group on “Metastability” by Bovier and Den Hollander.
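As a minimal illustration of the ordinary-least-squares control-variate estimator described in François Portier's abstract above: regressing the sampled integrand on mean-zero control functions and reading off the intercept. The Gaussian example and Hermite-polynomial controls below are assumptions for illustration, not taken from the talk.

```python
import numpy as np

def cv_estimate(f_vals, controls):
    """Monte Carlo estimate of E[f] using control variates via OLS.

    f_vals   : (n,) sampled values of the integrand
    controls : (n, m) sampled values of m control functions with known mean zero
    Regressing f on the controls, the fitted intercept is the
    control-variate estimator of E[f]; the optimal linear combination
    of controls is estimated jointly by least squares.
    """
    n = f_vals.size
    X = np.column_stack([np.ones(n), controls])  # intercept + controls
    coef, *_ = np.linalg.lstsq(X, f_vals, rcond=None)
    return coef[0]  # intercept = control-variate estimate of E[f]
```

When the integrand lies exactly in the span of the controls plus a constant (e.g. f(x) = x^2 under a standard Gaussian, with Hermite controls x and x^2 - 1), the residual vanishes and the estimator recovers the integral exactly, illustrating the accelerated rates discussed in the abstract.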