T-I
Problem 1: the energy landscape of the REM
In this exercise we characterize the energy landscape of the REM, by determining the number <math> \mathcal{N}(E) </math> of configurations having energy <math> E_\alpha \in [E, E+dE] </math>. This quantity is a random variable. For large <math> N </math>, we will show that its typical value is given by

<math> \mathcal{N}(E) = \exp\left(N \Sigma\left( \frac{E}{N} \right) + o(N)\right), \qquad \Sigma(\epsilon) = \log 2 - \epsilon^2 \quad \text{for } |\epsilon| \leq \sqrt{\log 2}, </math>

while typically no configurations are found with <math> |\epsilon| > \sqrt{\log 2} </math>.
The function <math> \Sigma(\epsilon) </math> is the entropy of the model, and it is sketched in Fig. X. The point where the entropy vanishes, <math> \epsilon^* = -\sqrt{\log 2} </math>, is the energy density of the ground state, consistent with what we obtained with extreme value statistics. The entropy is maximal at <math> \epsilon = 0 </math>: the largest number of configurations has vanishing energy density.
# <em> The annealed entropy.</em> We begin by computing the annealed entropy <math> \Sigma^A </math>, which is the function that controls the behaviour of the average number of configurations at a given energy, <math> \overline{\mathcal{N}(E)}= \text{exp}\left(N \Sigma^A\left( \frac{E}{N} \right)+ o(N)\right) </math>. Compute this function using the representation <math> \mathcal{N}(E)dE= \sum_{\alpha=1}^{2^N} \chi_\alpha(E) dE </math> [with <math> \chi_\alpha(E)=1</math> if <math> E_\alpha \in [E, E+dE]</math> and <math> \chi_\alpha(E)=0</math> otherwise], together with the distribution <math> p(E)</math> of the energies of the REM configurations. When does <math> \Sigma^A </math> coincide with the entropy defined above?
# <em> Self-averaging quantities.</em> For <math> |\epsilon| \leq \sqrt{\log 2} </math> the quantity <math> \mathcal{N}(E) </math> is self-averaging. This means that its distribution concentrates around the average value <math> \overline{\mathcal{N}}(E) </math> when <math> N \to \infty </math>. Show that this is the case by computing the second moment <math> \overline{\mathcal{N}^2} </math> and using the central limit theorem. Show that this is no longer true in the region where the annealed entropy is negative.
# <em> Average vs typical number.</em> For <math> |\epsilon| > \sqrt{\log 2} </math> the annealed entropy is negative, meaning that the average number of configurations with those energy densities is exponentially small in <math> N </math>. This implies that configurations with those energies are exponentially rare: do you have an idea of how to show this, using the expression for <math> \overline{\mathcal{N}(E)} </math>? Why is the entropy <math> \Sigma </math>, controlling the typical value of <math> \mathcal{N}(E) </math>, zero in this region? Why does the point where the entropy vanishes coincide with the ground state energy density of the model?

This will be responsible for the fact that the partition function is not self-averaging in the low-temperature phase, as we discuss below; a minimal numerical illustration of the two regimes follows.
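To make points 2 and 3 concrete, here is a small numerical sketch (not part of the original exercise). It assumes the standard REM convention in which the <math> 2^N </math> energies are i.i.d. Gaussian with variance <math> N/2 </math>, consistent with the edge <math> \pm\sqrt{\log 2} </math> above; the system size, bin width, and number of samples are arbitrary illustrative choices.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def rem_counts(N, eps_bins, n_samples=20):
    """Sample REM landscapes and histogram the energy densities eps = E/N.

    Assumption: the 2^N energies are i.i.d. Gaussian with variance N/2,
    the convention for which Sigma_A(eps) = log 2 - eps^2.
    """
    counts = np.zeros((n_samples, len(eps_bins) - 1))
    for s in range(n_samples):
        E = rng.normal(0.0, np.sqrt(N / 2.0), size=2**N)
        counts[s], _ = np.histogram(E / N, bins=eps_bins)
    return counts

N = 20
eps_bins = np.linspace(-1.2, 1.2, 25)
counts = rem_counts(N, eps_bins)
eps = 0.5 * (eps_bins[1:] + eps_bins[:-1])

for e, m, s in zip(eps, counts.mean(axis=0), counts.std(axis=0)):
    sigma_A = np.log(2) - e**2
    # Where Sigma_A > 0 the counts are exponentially large and their relative
    # fluctuations are small (self-averaging); where Sigma_A < 0 most samples
    # contain no configuration at all, and the average is carried by rare ones.
    rel = s / m if m > 0 else float("nan")
    print(f"eps={e:+.2f}  Sigma_A={sigma_A:+.3f}  mean count={m:10.1f}  rel fluct={rel:.2f}")
</syntaxhighlight>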
The REM: the free energy and the freezing transition
We now compute the equilibrium phase diagram of the model, and in particular the free energy density <math> f </math> defined by <math> Z = \exp\left(-\beta N f + o(N)\right) </math>. The partition function reads

<math> Z = \sum_{\alpha=1}^{2^N} e^{-\beta E_\alpha} = \int dE \, \mathcal{N}(E) \, e^{-\beta E}. </math>

We have determined above the behaviour of the typical value of <math> \mathcal{N}(E) </math> for large <math> N </math>. The typical value of the partition function is therefore

<math> Z_{\text{typ}} = \int_{-\sqrt{\log 2}}^{\sqrt{\log 2}} d\epsilon \; e^{N \left[ \Sigma(\epsilon) - \beta \epsilon \right] + o(N)}. </math>
# <em> The critical temperature.</em> In the limit of large <math> N </math>, the integral defining <math> Z_{\text{typ}} </math> can be computed with the saddle point method; show that a transition occurs at a critical temperature <math> T_c = \frac{1}{2\sqrt{\log 2}} </math>, and that the free energy density reads (a numerical check is sketched after this list)

<math> f = \begin{cases} -T \log 2 - \frac{1}{4T} & \text{if } T \geq T_c, \\ -\sqrt{\log 2} & \text{if } T < T_c. \end{cases} </math>
# <em> Freezing: the entropy.</em> The thermodynamic transition of the REM is often called a freezing transition. What happens to the entropy of the model when the critical temperature is reached, and in the low-temperature phase?
# <em> Quenched vs annealed free energy.</em> Compute the annealed free energy density obtained from <math> \overline{Z} </math>: when does it differ from the typical (quenched) one computed above? Domination by rare events: which exponentially rare configurations dominate the average <math> \overline{Z} </math> in the low-temperature phase?
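As a quick consistency check of the saddle-point computation in point 1, one can maximize <math> \Sigma(\epsilon) - \beta \epsilon </math> numerically over <math> |\epsilon| \leq \sqrt{\log 2} </math> and compare with the piecewise formula above. This is a sketch under the same Gaussian convention as before; the grid resolution and the list of temperatures are arbitrary choices.

<syntaxhighlight lang="python">
import numpy as np

log2 = np.log(2.0)
Tc = 1.0 / (2.0 * np.sqrt(log2))                          # critical temperature
eps = np.linspace(-np.sqrt(log2), np.sqrt(log2), 200001)  # saddle-point grid

def f_saddle(T):
    """Free energy from the saddle point: -T max_eps [Sigma(eps) - eps/T]."""
    exponent = (log2 - eps**2) - eps / T   # Sigma(eps) - beta*eps
    return -T * exponent.max()

def f_piecewise(T):
    """The piecewise formula quoted in point 1."""
    return -T * log2 - 1.0 / (4.0 * T) if T >= Tc else -np.sqrt(log2)

for T in [0.2, 0.4, Tc, 0.8, 1.5]:
    print(f"T={T:.3f}  f_saddle={f_saddle(T):+.6f}  f_piecewise={f_piecewise(T):+.6f}")
</syntaxhighlight>

Below <math> T_c </math> the maximizer sticks to the edge <math> \epsilon^* = -\sqrt{\log 2} </math>, which is the mechanism behind the freezing of the free energy.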
Freezing, heavy tails, condensation
The freezing transition can also be understood in terms of extreme value statistics, as discussed in the lecture. Define <math> \delta E_\alpha = E_\alpha + N \sqrt{\log 2} </math>, the energy of configuration <math> \alpha </math> measured with respect to the typical ground-state value, and <math> z_\alpha = e^{-\beta \, \delta E_\alpha} </math>, so that <math> Z = e^{\beta N \sqrt{\log 2}} \sum_\alpha z_\alpha </math>.
# <em> Heavy tails.</em> Compute the distribution of the variables <math> \delta E_\alpha </math> and show that for <math> |\delta E_\alpha| \ll N </math> this is an exponential. Using this, compute the distribution of the <math> z_\alpha </math> and show that it is a power law,

<math> P(z) \sim \frac{1}{z^{1+\mu}}, \qquad \mu = \frac{T}{T_c}. </math>

What happens when <math> T < T_c </math>, i.e. <math> \mu < 1 </math>? How does the behaviour of the partition function change at the transition point? Is this consistent with the behaviour of the entropy?
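To see the condensation mechanism at work, here is a small sketch (an illustration, not the original derivation): it draws the <math> z_\alpha </math> directly from the Pareto law <math> P(z) = \mu z^{-(1+\mu)} </math>, <math> z \geq 1 </math>, as an i.i.d. caricature of the near-ground-state statistics, and measures the fraction of the sum carried by its largest term. The helper name <code>pareto_z</code> and all sample sizes are arbitrary.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

def pareto_z(mu, n):
    """Draw n samples with P(z) = mu * z^{-(1+mu)} for z >= 1 (inverse CDF)."""
    return (1.0 - rng.random(n)) ** (-1.0 / mu)

n = 10**5
for mu in [2.0, 1.2, 0.8, 0.4]:        # mu = T/T_c; mu < 1 is the frozen phase
    z = pareto_z(mu, n)
    # Fraction of the sum carried by the single largest term: it vanishes for
    # large n when mu > 1, but stays of order one when mu < 1 (condensation).
    print(f"mu={mu:.1f}  max(z)/sum(z) = {z.max() / z.sum():.4f}")
</syntaxhighlight>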
# <em> Inverse participation ratio.</em> The low-temperature behaviour of the partition function can be characterized in terms of a standard measure of condensation (or localization), the Inverse Participation Ratio (IPR), defined as:

<math> Y = \sum_{\alpha=1}^{2^N} w_\alpha^2, \qquad w_\alpha = \frac{z_\alpha}{\sum_\beta z_\beta}. </math>

Show that when <math> z </math> is power-law distributed with exponent <math> \mu </math>, <math> Y </math> remains of order one for <math> \mu < 1 </math>, and that

<math> \overline{Y} = 1 - \mu. </math>
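The prediction <math> \overline{Y} = 1 - \mu </math> can be checked with the same i.i.d. Pareto caricature as above (again a sketch; sample sizes are arbitrary, and finite-size corrections decay slowly as <math> \mu \to 1 </math>):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)

def mean_ipr(mu, n_terms=10**4, n_samples=200):
    """Estimate the average IPR for i.i.d. Pareto-distributed weights z."""
    z = (1.0 - rng.random((n_samples, n_terms))) ** (-1.0 / mu)
    w = z / z.sum(axis=1, keepdims=True)   # normalized weights w_alpha
    return (w**2).sum(axis=1).mean()       # Y = sum_alpha w_alpha^2, averaged

for mu in [0.2, 0.5, 0.8]:
    # Finite-size corrections are slow for mu close to 1, so the mu=0.8
    # estimate sits slightly above the asymptotic value 1 - mu.
    print(f"mu={mu:.1f}  <Y> = {mean_ipr(mu):.3f}   (prediction: {1 - mu:.3f})")
</syntaxhighlight>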
(Note: this last point can be assigned as homework.)