T-I
Goal: derive the equilibrium phase diagram of the simplest spin-glass model, the Random Energy Model (REM). The REM is defined by assigning to each of the $2^N$ configurations $\alpha$ of the system a random energy $E_\alpha$. The random energies are independent, taken from a Gaussian distribution of zero mean and variance $N/2$,
$$ P(E) = \frac{1}{\sqrt{\pi N}}\, e^{-E^2/N}. $$
Key concepts: average value vs typical value, large deviations, rare events, saddle point approximation, self-averaging quantities, freezing transition.
Some probability notions relevant at large $N$
- We will consider random variables $A_N$ which depend on a parameter $N$ and which have the scaling $A_N \sim e^{N a_N}$; the scaling means that the rescaled variable $a_N = \frac{1}{N}\log A_N$ has a well-defined distribution that remains of $O(1)$ when $N \to \infty$. A standard example is given by the partition functions of disordered systems with $N$ degrees of freedom, $Z_N = \sum_{\alpha} e^{-\beta E_\alpha}$: here $Z_N \sim e^{-\beta N f_N}$ and $f_N = -\frac{1}{\beta N}\log Z_N$, where $f_N$ is the free-energy density.
- A random variable $a_N$ depending on a parameter $N$ is self-averaging when the width of its distribution goes to zero as $N \to \infty$. When the random variable is not self-averaging, it remains distributed in the limit $N \to \infty$. If $a_N$ is self-averaging, then
$$ \lim_{N \to \infty}\left( a_N - \langle a_N \rangle \right) = 0 \quad \text{(in probability).} $$
- Let $P_N(A)$ and $p_N(a)$ be the distributions of $A_N$ and $a_N$. In general, quantities like $a_N$ have a distribution that for large $N$ takes the form $p_N(a) \sim e^{-N I(a)}$, where $I(a)$ is some positive function and $\min_a I(a) = 0$. This is called a large deviation form for the probability distribution, with speed $N$. This distribution is of $O(1)$ for the value $a_{\rm typ}$ such that $I(a_{\rm typ}) = 0$: this value is the typical value of $a_N$; all the other values of $a$ are associated to a probability that is exponentially small in $N$: they are exponentially rare.
- Averages over distributions having a large deviation form can usually be computed with the saddle point approximation for large $N$. Let us consider a function $g$. If $g(a_N)$ is a function of $a_N$ which scales slower than exponentially with $N$, then
$$ \langle g(a_N) \rangle = \int da\, p_N(a)\, g(a) \sim \int da\, e^{-N I(a)}\, g(a) \;\xrightarrow{N \to \infty}\; g(a_{\rm typ}), $$
because the integral is dominated by the region where $I(a) = 0$, since all the other contributions are exponentially suppressed. This also implies
$$ \lim_{N \to \infty} \langle a_N \rangle = a_{\rm typ}. $$
- We define the typical value of $A_N$ as
$$ A_N^{\rm typ} \equiv e^{N a_{\rm typ}} \qquad \text{or} \qquad A_N^{\rm typ} \equiv e^{N \langle a_N \rangle} = e^{\langle \log A_N \rangle}, $$
which are equivalent definitions since $\lim_{N \to \infty} \langle a_N \rangle = a_{\rm typ}$.
Notice that while the average value of $a_N$ coincides with the typical value (choose $g(a) = a$ in the formula above), this is in general not the case for quantities growing exponentially fast like $A_N$: the average value of these exponentially scaling quantities is in general much larger than the typical value, meaning that the general inequality (a consequence of Jensen's inequality)
$$ A_N^{\rm typ} = e^{\langle \log A_N \rangle} \;\le\; \langle A_N \rangle $$
is strict. When $A_N = Z_N$, the quantity on the left-hand side is $Z_N^{\rm typ} = e^{-\beta N f_N^{\rm q}}$ with $f_N^{\rm q} = -\frac{1}{\beta N}\langle \log Z_N \rangle$, which we will also call the quenched free energy in the following; the quantity on the right-hand side is different: we call it annealed and define it with $\langle Z_N \rangle = e^{-\beta N f_N^{\rm a}}$, i.e. $f_N^{\rm a} = -\frac{1}{\beta N}\log \langle Z_N \rangle$. A numerical illustration of this gap is sketched below.
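To make the distinction concrete, here is a minimal numerical sketch (Python; the Gaussian choice for $a_N$ and the parameters MU, SIGMA are illustrative assumptions, not part of the model). It compares the quenched average $\langle \frac{1}{N}\log A_N \rangle$ with the annealed one $\frac{1}{N}\log\langle A_N \rangle$ for $A_N = e^{N x}$, with $x$ Gaussian of mean MU and variance SIGMA$^2/N$; in this case the exact annealed value is MU + SIGMA$^2/2$, strictly above the typical value MU.

```python
# Minimal sketch (illustrative parameters): quenched vs annealed average of an
# exponentially scaling random variable A_N = exp(N * x), with x Gaussian of
# mean MU and variance SIGMA**2 / N, so that a_N = x stays O(1) as N grows.
import numpy as np

rng = np.random.default_rng(0)
MU, SIGMA = 0.0, 1.0        # assumed values, chosen only for illustration
N_SAMPLES = 100_000

for N in (10, 50, 200):
    x = rng.normal(MU, SIGMA / np.sqrt(N), size=N_SAMPLES)   # samples of a_N
    log_A = N * x                                             # log A_N
    quenched = log_A.mean() / N                               # <(1/N) log A_N> -> MU
    shift = log_A.max()                                       # stable log-sum-exp
    annealed = (shift + np.log(np.mean(np.exp(log_A - shift)))) / N   # (1/N) log <A_N>
    print(f"N={N:4d}  typical/quenched ~ {quenched:+.3f}  "
          f"empirical annealed ~ {annealed:+.3f}  exact annealed = {MU + SIGMA**2 / 2:+.3f}")
```

For the larger values of $N$ the empirical annealed average falls well below its exact value MU + SIGMA$^2/2$: the samples that dominate $\langle A_N \rangle$ are exponentially rare and simply do not appear in a finite sample, which is precisely the average-vs-typical phenomenon described above.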
Problem 1.1: the energy landscape of the REM
In this problem we study the random variable $\mathcal{N}(E)$, that is, the number of configurations having energy $E_\alpha \in [E, E + dE]$. We show that for large $N$ it scales as $\mathcal{N}(E) \sim e^{N \Sigma(\epsilon)}$, with $\epsilon = E/N$ the energy density. We show that the typical value of $\Sigma(\epsilon)$, the quenched entropy of the model (see sketch), is given by:
$$ \Sigma_{\rm typ}(\epsilon) = \log 2 - \epsilon^2 \qquad \text{for } |\epsilon| \le \sqrt{\log 2}, $$
while typically no configurations are found at $|\epsilon| > \sqrt{\log 2}$. The point where the entropy vanishes, $\epsilon_{\rm gs} = -\sqrt{\log 2}$, is the energy density of the ground state. The entropy is maximal at $\epsilon = 0$: the largest number of configurations has vanishing energy density.
- Averages: the annealed entropy. We begin by computing the annealed entropy $\Sigma_{\rm a}(\epsilon)$, which is defined by the average $\langle \mathcal{N}(E) \rangle \sim e^{N \Sigma_{\rm a}(\epsilon)}$. Compute this function using the representation $\mathcal{N}(E) = \sum_{\alpha=1}^{2^N} \chi_\alpha(E)$ [with $\chi_\alpha(E) = 1$ if $E_\alpha \in [E, E+dE]$ and $\chi_\alpha(E) = 0$ otherwise]. When does $\Sigma_{\rm a}(\epsilon)$ coincide with $\Sigma_{\rm typ}(\epsilon)$?
- Self-averaging. For $|\epsilon| < \sqrt{\log 2}$ the quantity $\frac{1}{N}\log \mathcal{N}(E)$ is self-averaging: its distribution concentrates around the average value when $N \to \infty$. Show that $\langle \mathcal{N}(E)^2 \rangle - \langle \mathcal{N}(E) \rangle^2 \le \langle \mathcal{N}(E) \rangle$; using the central limit theorem, show that $\frac{1}{N}\log \mathcal{N}(E)$ is self-averaging when $\Sigma_{\rm a}(\epsilon) > 0$. This is no longer true in the region where the annealed entropy is negative: why does one expect fluctuations to be relevant in this region?
- Rare events vs typical values. For $|\epsilon| > \sqrt{\log 2}$ the annealed entropy is negative: the average number of configurations with those energy densities is exponentially small in $N$. This implies that the probability to find configurations with those energies is exponentially small in $N$: these configurations are rare. Do you have an idea of how to show this, using the expression for $\langle \mathcal{N}(E) \rangle$? What is the typical value of $\mathcal{N}(E)$ in this region? Justify why the point where the entropy vanishes coincides with the ground-state energy density of the model. (A numerical sketch of the landscape is given after this list.)
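As a complement to these questions, here is a minimal numerical sketch of the landscape (Python; it assumes the Gaussian convention with variance $N/2$ stated at the beginning, and a small size $N = 20$ so that all $2^N$ energies can be drawn explicitly). It compares the measured counts $\mathcal{N}(E)$ of a single sample with their exact averages bin by bin, showing self-averaging where $\Sigma_{\rm a}(\epsilon) > 0$ and typically empty bins beyond $|\epsilon| \approx \sqrt{\log 2}$.

```python
# Minimal sketch (assumes E_alpha Gaussian with mean 0 and variance N/2, as above):
# draw one REM sample, count configurations per energy-density bin, and compare
# the measured counts N(eps) with their exact averages <N(eps)> = 2^N * P(bin).
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
N = 20                                       # 2**20 ~ 10^6 configurations
sigma = sqrt(N / 2.0)                        # std of a single energy
energies = rng.normal(0.0, sigma, size=2**N)
eps = energies / N                           # energy densities

edges = np.linspace(-1.0, 1.0, 21)           # bins in energy density
counts, _ = np.histogram(eps, bins=edges)

def gauss_cdf(x):
    return 0.5 * (1.0 + erf(x / (sigma * sqrt(2.0))))

for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    avg = 2**N * (gauss_cdf(N * hi) - gauss_cdf(N * lo))   # <N(eps)> in this bin
    print(f"eps in [{lo:+.2f},{hi:+.2f})   measured = {n:8d}   average = {avg:12.3e}")
# Where the annealed entropy log2 - eps^2 is positive, the measured count tracks
# the average (self-averaging); beyond |eps| ~ sqrt(log 2) ~ 0.83 the average drops
# below 1 and the measured count is typically zero (rare events).
```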
Comment: this analysis of the landscape suggests that in the large-$N$ limit, the fluctuations due to the randomness become relevant when one looks at the bottom of the energy landscape, close to the ground-state energy. We show below that this intuition is correct, and corresponds to the fact that the partition function has an interesting behaviour at low temperature.
Problem 1.2: the free energy and the freezing transition
We now compute the equilibrium phase diagram of the model, and in particular the quenched free-energy density $f(\beta)$, which controls the scaling of the typical value of the partition function, $Z_{\rm typ} \sim e^{-\beta N f(\beta)}$. We show that the free energy equals
$$ f(\beta) = \begin{cases} -\left( \dfrac{\log 2}{\beta} + \dfrac{\beta}{4} \right) & \beta \le \beta_c = 2\sqrt{\log 2}, \\[4pt] -\sqrt{\log 2} & \beta > \beta_c. \end{cases} $$
At $T_c = 1/\beta_c = 1/(2\sqrt{\log 2})$ a transition occurs, often called the freezing transition: in the whole low-temperature phase, the free energy is “frozen” at the value that it has at the critical temperature $T_c$.
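A sketch of the saddle-point structure behind this statement, written with the notation of Problem 1.1 (the detailed derivation is the object of the first question below):
$$ Z = \int dE\, \mathcal{N}(E)\, e^{-\beta E} \sim \int d\epsilon\, e^{N\left[\Sigma_{\rm typ}(\epsilon) - \beta \epsilon\right]} \qquad\Longrightarrow\qquad -\beta f(\beta) = \max_{|\epsilon| \le \sqrt{\log 2}} \left[ \Sigma_{\rm typ}(\epsilon) - \beta \epsilon \right]. $$
The freezing corresponds to the maximiser reaching the boundary $\epsilon_{\rm gs} = -\sqrt{\log 2}$ of the interval where configurations typically exist.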
- The thermodynamic transition and the freezing. The partition function of the REM reads
$$ Z = \sum_{\alpha=1}^{2^N} e^{-\beta E_\alpha} = \int dE\, \mathcal{N}(E)\, e^{-\beta E}. $$
Using the behaviour of the typical value of $\mathcal{N}(E)$ determined in Problem 1.1, derive the free energy of the model (hint: perform a saddle-point calculation). What is the order of this thermodynamic transition?
- Entropy. What happens to the entropy of the model when the critical temperature is reached, and in the low-temperature phase? What does this imply for the partition function $Z$?
- Fluctuations, and back to average vs typical. Similarly to what we did for the entropy, one can define an annealed free energy from $\langle Z \rangle$, as $f^{\rm a}(\beta) = -\lim_{N \to \infty} \frac{1}{\beta N}\log \langle Z \rangle$: show that in the whole low-temperature phase this is smaller than the quenched free energy obtained above. Putting all the results together, justify why the average of the partition function in the low-$T$ phase is “dominated by rare events”. (A numerical sketch of the freezing follows this list.)
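Finally, a minimal numerical sketch of the freezing (Python; again it assumes the Gaussian convention with variance $N/2$, and a small size $N = 20$, so finite-size corrections are visible). It compares $-\frac{1}{\beta N}\log Z$, measured on a single sample, with the quenched prediction quoted at the beginning of this problem.

```python
# Minimal sketch (assumes E_alpha Gaussian with mean 0 and variance N/2, as above):
# measure -(1/(beta*N)) * log Z for one REM sample and compare with the quenched
# prediction f(beta) = -(log2/beta + beta/4) above T_c, and -sqrt(log2) below.
import numpy as np

rng = np.random.default_rng(2)
N = 20
LOG2 = np.log(2.0)
BETA_C = 2.0 * np.sqrt(LOG2)

def f_quenched(beta):
    return -(LOG2 / beta + beta / 4.0) if beta <= BETA_C else -np.sqrt(LOG2)

energies = rng.normal(0.0, np.sqrt(N / 2.0), size=2**N)   # one REM sample

for beta in (0.5, 1.0, BETA_C, 2.5, 4.0):
    log_terms = -beta * energies
    shift = log_terms.max()                                # stable log-sum-exp
    logZ = shift + np.log(np.sum(np.exp(log_terms - shift)))
    f_measured = -logZ / (beta * N)
    print(f"beta={beta:4.2f}   measured f = {f_measured:+.4f}   "
          f"predicted f = {f_quenched(beta):+.4f}")
```

Above $T_c$ the measured value tracks $-(\log 2/\beta + \beta/4)$, while for $\beta > \beta_c$ it saturates close to $-\sqrt{\log 2}$, the frozen value.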
Comment: the low-$T$ phase of the REM is a frozen phase, characterized by the fact that the free energy is temperature independent, and that the typical value of the partition function is very different from the average value. In fact, the low-$T$ phase is also a glass phase: it is a phase where a peculiar symmetry, the so-called replica symmetry, is broken. We come back to these concepts in the next sets of problems.