LBan-1

From Disordered Systems Wiki
Revision as of 17:08, 6 August 2025


Overview

This lesson is structured in three parts:

  • Self-averaging and disorder in statistical systems

Disordered systems are characterized by a random energy landscape; nevertheless, in the thermodynamic limit, physical observables become deterministic. This property, known as self-averaging, does not always hold for the partition function, which is the quantity we can actually compute. When it holds, the annealed average <math>\ln \overline{Z}</math> and the quenched average <math>\overline{\ln Z}</math> coincide; otherwise we have

<center> <math> \overline{\ln Z} < \ln \overline{Z} </math> </center>
  • The Random Energy Model

We study the Random Energy Model (REM) introduced by Bernard Derrida. In this model, each configuration is assigned an independent energy drawn from a Gaussian distribution with extensive variance. The model exhibits a freezing transition at a critical temperature, below which the free energy becomes dominated by the lowest energy states.

  • Extreme value statistics and saddle-point analysis

The results obtained from a saddle-point approximation can be recovered using the tools of extreme value statistics.

Part I

Random energy landscape

In a system with <math>N</math> degrees of freedom, the number of configurations grows exponentially with <math>N</math>. For simplicity, consider Ising spins that take two values, <math>\sigma_i = \pm 1</math>, located on a lattice of size <math>L</math> in <math>d</math> dimensions. In this case, <math>N = L^d</math> and the number of configurations is <math>M = 2^N = e^{N \log 2}</math>.

In the presence of disorder, the energy associated with a given configuration becomes a random quantity. For instance, in the Edwards-Anderson model:

<center> <math> E = \sum_{\langle i,j \rangle} J_{ij} \sigma_i \sigma_j, </math> </center>

where the sum runs over nearest neighbors <math>\langle i,j \rangle</math>, and the couplings <math>J_{ij}</math> are independent and identically distributed (i.i.d.) Gaussian random variables with zero mean and unit variance.

The energy of a given configuration is a random quantity because each system corresponds to a different realization of the disorder. In an experiment, this means that each of us has a different physical sample; in a numerical simulation, it means that each of us has generated a different set of couplings Jij.


To illustrate this, consider a single configuration, for example the one where all spins are up. The energy of this configuration is given by the sum of all the couplings between neighboring spins:

<center> <math> E[\sigma_1 = 1, \sigma_2 = 1, \ldots] = \sum_{\langle i,j \rangle} J_{ij}. </math> </center>

Since the couplings are random, the energy associated with this particular configuration is itself a Gaussian random variable, with zero mean and a variance proportional to the number of terms in the sum, that is, of order <math>N</math>. The same reasoning applies to each of the <math>M = 2^N</math> configurations. So, in a disordered system, the entire energy landscape is random and sample-dependent.
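This scaling is easy to check numerically. The sketch below is an illustration, not part of the lecture: it assumes a 2D square lattice with periodic boundaries (so there are <math>2L^2</math> couplings) and verifies that the energy of the all-up configuration is Gaussian with zero mean and variance equal to the number of couplings, i.e. of order <math>N</math>.

```python
import numpy as np

rng = np.random.default_rng(0)

def all_up_energy(L, rng):
    """Energy of the all-up configuration: the sum of all couplings.
    On a 2D periodic lattice there is one horizontal and one vertical
    bond per site, i.e. 2*L*L couplings in total."""
    J = rng.normal(0.0, 1.0, size=2 * L * L)
    return J.sum()

L = 16                                    # N = L^2 = 256 spins
samples = np.array([all_up_energy(L, rng) for _ in range(5000)])

# Gaussian with zero mean and variance = number of bonds = 2N = 512
print(samples.mean(), samples.var())
```
Each call corresponds to a fresh disorder realization, so the spread of `samples` is exactly the sample-to-sample randomness discussed above.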


Self-averaging observables

A crucial question is whether the macroscopic properties measured on a given sample are themselves random or not. Our everyday experience suggests that they are not: materials like glass, ceramics, or bronze have well-defined, reproducible physical properties that can be reliably controlled for industrial applications.

From a more mathematical point of view, it means that the free energy <math>F_N(\beta) = N f_N(\beta)</math> and its derivatives (magnetization, specific heat, susceptibility, etc.) concentrate, in the limit <math>N \to \infty</math>, around a well-defined value. These observables are called self-averaging. This means that

<center> <math> \lim_{N \to \infty} f_N(\beta) = \lim_{N \to \infty} f_N^{\text{typ}}(\beta) = \lim_{N \to \infty} \overline{f_N(\beta)} = f(\beta) </math> </center>

Hence <math>f_N(\beta)</math> becomes effectively deterministic and its sample-to-sample fluctuations vanish in relative terms:

<center> <math> \lim_{N \to \infty} \frac{\overline{f_N^2(\beta)}}{\overline{f_N(\beta)}^2} = 1. </math> </center>

The partition function

The partition function

<center> <math> Z_N = \exp(-\beta N f_N(\beta)) </math> </center>

is itself a random variable in disordered systems. Analytical methods can capture the statistical properties of this variable. We can define two ways of averaging over the disorder realizations:

  • The annealed average corresponds to the calculation of the moments of the partition function. The annealed free energy is
<center> <math> f^{\text{ann.}} = -\frac{1}{\beta N} \ln \overline{Z_N} </math> </center>
  • The quenched average corresponds to the average of the logarithm of the partition function, which is self-averaging.
<center> <math> f(\beta) \approx \overline{f_N(\beta)} = -\overline{\ln Z_N(\beta)}/(\beta N) </math> </center>
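The gap between the two averages is a consequence of Jensen's inequality, and it can be seen on a toy example. The sketch below is not the REM: it assumes a hypothetical lognormal partition function <math>Z = e^g</math> with <math>g</math> Gaussian of variance <math>s^2</math>, for which exactly <math>\overline{\ln Z} = 0</math> while <math>\ln \overline{Z} = s^2/2</math>.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (not the REM): lognormal partition function Z = e^g,
# with g ~ N(0, s^2). Exactly: E[ln Z] = 0 while ln E[Z] = s^2 / 2.
s = 2.0
Z = np.exp(rng.normal(0.0, s, size=200_000))

quenched = np.log(Z).mean()    # estimates E[ln Z] = 0
annealed = np.log(Z.mean())    # estimates ln E[Z] = s^2/2 = 2
print(quenched, annealed)
```
The annealed estimate is pulled up by the rare, exponentially large values of <math>Z</math>, which is precisely the mechanism that breaks self-averaging of the partition function.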


Do these two averages coincide?

If the partition function is self-averaging in the thermodynamic limit, then

<center> <math> \lim_{N \to \infty} Z_N(\beta) = \lim_{N \to \infty} Z_N^{\text{typ}}(\beta) = \lim_{N \to \infty} \overline{Z_N(\beta)} = e^{-\beta N f(\beta)} </math> </center>

As a consequence, the annealed and the quenched averages coincide.

If the partition function is not self-averaging, only the typical partition function concentrates, while extremely rare disorder realizations contribute disproportionately to its moments:

<center> <math> \lim_{N \to \infty} Z_N^{\text{typ}}(\beta) = e^{-\beta N f(\beta)} < \lim_{N \to \infty} \overline{Z_N(\beta)} = e^{-\beta N f^{\text{ann.}}(\beta)} </math> </center>


There are then two main strategies to determine the deterministic value of the observable:

  • Compute directly the quenched average, using methods such as the replica trick and the Parisi solution.
  • Determine the typical value <math>Z_N^{\text{typ}}(\beta)</math> and evaluate <math>f(\beta) = -\frac{1}{\beta N} \ln Z_N^{\text{typ}}(\beta)</math>.

Part II

Random Energy Model

The Random Energy Model (REM) neglects the correlations between the <math>M = 2^N</math> configurations. The energy associated to each configuration is an independent Gaussian variable with zero mean and variance <math>N</math>. The simplest solution of the model is obtained in the microcanonical ensemble.


Microcanonical calculation

Step 1: Number of states.

Let <math>\mathcal{N}_N(E) \, dE</math> be the number of states with energy in the interval <math>(E, E + dE)</math>. It is a random number and we use the representation

<center> <math> \mathcal{N}_N(E) \, dE = \sum_{\alpha=1}^{2^N} \chi_\alpha(E) \, dE </math> </center>

with <math>\chi_\alpha(E) = 1</math> if <math>E_\alpha \in [E, E + dE]</math> and <math>\chi_\alpha(E) = 0</math> otherwise. We can compute its average

<center> <math> \overline{\mathcal{N}_N(E)} = \sum_{\alpha=1}^{2^N} \overline{\chi_\alpha(E)} = \frac{2^N}{\sqrt{2 \pi N}} \exp\left( -\frac{E^2}{2N} \right) \sim \exp\left[ N \left( \ln 2 - \epsilon^2/2 \right) \right] </math> </center>

Here <math>\epsilon = E/N</math> is the energy density and the annealed entropy density in the thermodynamic limit is

<center> <math> s^{\text{ann.}}(\epsilon) = \ln 2 - \epsilon^2/2 </math> </center>
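This entropy density can be checked numerically under the stated REM assumptions (<math>2^N</math> independent Gaussian energies of variance <math>N</math>): count the states in a small window around a given <math>\epsilon</math> and compare <math>\ln \mathcal{N} / N</math> with <math>s^{\text{ann.}}(\epsilon)</math>. At accessible sizes, finite-size corrections of order <math>\log N / N</math> (the Gaussian prefactor and the window width) are still visible.

```python
import numpy as np

rng = np.random.default_rng(2)

N = 20
E = rng.normal(0.0, np.sqrt(N), size=2**N)   # 2^N i.i.d. energies, variance N
eps = E / N

# Empirical entropy density in a narrow window around eps = 0.5
counts = ((eps > 0.45) & (eps < 0.55)).sum()
s_emp = np.log(counts) / N
s_ann = np.log(2) - 0.5**2 / 2

# s_emp approaches s_ann up to O(log(N)/N) finite-size corrections
print(s_emp, s_ann)
```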

Step 2: Self-averaging.

Let us now compute the second moment

<center> <math> \overline{\mathcal{N}_N^2(E)} = \sum_{\alpha=1}^{2^N} \overline{\chi_\alpha \Big( \sum_{\beta \neq \alpha} \chi_\beta \Big)} + \sum_{\alpha=1}^{2^N} \overline{\chi_\alpha^2} \approx \overline{\mathcal{N}_N(E)} \left( \overline{\mathcal{N}_N(E)} - e^{-\frac{E^2}{2N}} \right) + \overline{\mathcal{N}_N(E)} </math> </center>

We can then check the self-averaging condition:

<center> <math> \frac{\overline{\mathcal{N}_N^2(E)}}{\overline{\mathcal{N}_N(E)}^2} \approx 1 + \frac{1}{\overline{\mathcal{N}_N(E)}} </math> </center>

A critical energy density <math>\epsilon^* = \sqrt{2 \ln 2}</math> separates a self-averaging regime for <math>|\epsilon| < \epsilon^*</math> from a non-self-averaging regime for <math>|\epsilon| > \epsilon^*</math>. In the first regime, <math>\overline{\mathcal{N}_N(E)}</math> is exponentially large and its value is deterministic (average, typical and median coincide). In the second regime, <math>\overline{\mathcal{N}_N(E)}</math> is exponentially small but nonzero. The typical value instead is exactly zero, <math>\mathcal{N}_N^{\text{typ}}(E) = 0</math>: for most disorder realizations there are no configurations with <math>|\epsilon| > \epsilon^*</math>, and only a vanishingly small fraction of rare samples gives a positive contribution to the average. As a result, the quenched average of the entropy density is:

<center> <math> s(\epsilon) = \begin{cases} \ln 2 - \frac{\epsilon^2}{2}, & \text{for } |\epsilon| < \epsilon^* \\ -\infty, & \text{for } |\epsilon| > \epsilon^* \end{cases} </math> </center>
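The gap between the typical and the average <math>\mathcal{N}</math> beyond <math>\epsilon^*</math> can be seen directly in a simulation: for a threshold <math>\epsilon_0 > \epsilon^*</math>, most samples contain no such state at all, while rare samples keep the average positive. A sketch (the values of <math>N</math> and <math>\epsilon_0</math> below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)

N, eps0 = 12, 1.3            # eps0 above eps* = sqrt(2 ln 2) ≈ 1.18
n_samples = 2000

# For each disorder sample, count states with energy density above eps0
counts = np.array([
    (rng.normal(0.0, np.sqrt(N), size=2**N) > eps0 * N).sum()
    for _ in range(n_samples)
])

frac_zero = (counts == 0).mean()
# Typical sample: no such state; the average is carried by rare samples
print(frac_zero, counts.mean())
```
The median of `counts` is zero while its mean is positive, which is exactly the statement <math>\mathcal{N}_N^{\text{typ}}(E) = 0 < \overline{\mathcal{N}_N(E)}</math>.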

Back to canonical ensemble: the freezing transition

The annealed partition function is the average of the partition function over the disorder:

<center> <math> \overline{Z_N(\beta)} = \int d\epsilon \, \overline{\mathcal{N}_N(\epsilon)} \, e^{-\beta N \epsilon} = \int d\epsilon \, \exp\left[ N \left( \ln 2 - \epsilon^2/2 - \beta \epsilon \right) \right]. </math> </center>

Using the saddle point for large <math>N</math> we find <math>\epsilon_{\min} = -\beta</math> and thus

<center> <math> f^{\text{ann.}}(\beta) = -\ln 2/\beta - \beta/2 </math> </center>

The quenched partition function is obtained by replacing the mean with the typical value:

<center> <math> Z_N^{\text{typ.}}(\beta) = \int d\epsilon \, \mathcal{N}_N^{\text{typ.}}(\epsilon) \, e^{-\beta N \epsilon} = \int_{-\epsilon^*}^{\epsilon^*} d\epsilon \, \exp\left[ N \left( \ln 2 - \epsilon^2/2 - \beta \epsilon \right) \right]. </math> </center>

Using the saddle point for large <math>N</math> we find a critical inverse temperature <math>\beta_c = \sqrt{2 \ln 2}</math> separating two phases:

  • For <math>\beta < \beta_c</math>, <math>\epsilon_{\min} = -\beta</math> and the annealed calculation works.
  • For <math>\beta > \beta_c</math>, <math>\epsilon_{\min} = -\epsilon^* = -\beta_c</math> and the free energy freezes to a temperature-independent value. As a result, the quenched average of the free energy density is:
<center> <math> f(\beta) = \begin{cases} -\ln 2/\beta - \beta/2, & \text{for } \beta < \beta_c \\ -\sqrt{2 \ln 2}, & \text{for } \beta > \beta_c \end{cases} </math> </center>
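The piecewise formula can be cross-checked by maximizing the saddle-point exponent numerically over the band <math>|\epsilon| \le \epsilon^*</math>. The sketch below uses a brute-force grid search purely as an illustration of the saddle-point argument:

```python
import numpy as np

def f_typ(beta):
    """Quenched free energy density from the saddle point: maximize
    ln2 - eps^2/2 - beta*eps on a grid over the band |eps| <= eps*."""
    eps_star = np.sqrt(2 * np.log(2))
    eps = np.linspace(-eps_star, eps_star, 200_001)
    exponent = np.log(2) - eps**2 / 2 - beta * eps
    return -exponent.max() / beta          # Z_typ ~ e^{-beta N f}

beta_c = np.sqrt(2 * np.log(2))

print(f_typ(0.5 * beta_c))   # high T: matches -ln2/beta - beta/2
print(f_typ(2.0 * beta_c))   # frozen phase: -sqrt(2 ln 2), beta-independent
```
For <math>\beta < \beta_c</math> the maximum sits at the interior point <math>\epsilon = -\beta</math>; for <math>\beta > \beta_c</math> it sticks to the band edge <math>-\epsilon^*</math>, which is the freezing mechanism.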

Part III

Detour: Extreme Value Statistics

Consider the REM spectrum of <math>M</math> energies <math>E_1, \ldots, E_M</math> drawn from a distribution <math>p(E)</math>. It is useful to introduce the cumulative probability of finding an energy smaller than <math>E</math>

<center> <math> P(E) = \int_{-\infty}^{E} dx \, p(x) </math> </center>

We also define:

<center> <math> E_{\min} = \min(E_1, \ldots, E_M), \qquad Q_M(E) \equiv \text{Prob}(E_{\min} > E) </math> </center>

The statistical properties of <math>E_{\min}</math> are derived using three key relations:

  • First relation:
<center> <math> Q_M(E) = (1 - P(E))^M </math> </center>

This relation is exact but depends on M and the precise form of p(E).

  • Second relation:
<center> <math> P(E_{\min}^{\text{typ}}) = 1/M </math> </center>

This is an estimation of the typical value of the minimum. It is a crucial relation that will be used frequently.

  • Third relation:
<center> <math> Q_M(E) = e^{M \log(1 - P(E))} \approx \exp(-M P(E)) </math> </center>

This is an approximation valid for large <math>M</math> and around the typical value of the minimum energy. It allows one to extract the universal scaling.
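The three relations can be compared numerically for Gaussian energies: the exact <math>Q_M(E) = (1 - P(E))^M</math>, the large-<math>M</math> approximation <math>\exp(-M P(E))</math>, and the empirical survival probability of sampled minima. A sketch with <math>\sigma = 1</math> and an arbitrary choice of <math>M</math>:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(4)

def P(E):
    """Gaussian cumulative distribution, zero mean, sigma = 1."""
    return 0.5 * (1 + erf(E / np.sqrt(2)))

M = 1000
# Minima of M Gaussian energies, repeated over many independent samples
mins = rng.normal(size=(5000, M)).min(axis=1)

E = -3.2                           # a point near the typical minimum
Q_emp = (mins > E).mean()          # empirical survival probability
Q_exact = (1 - P(E))**M            # first relation
Q_approx = np.exp(-M * P(E))       # third relation
print(Q_emp, Q_exact, Q_approx)
```
All three numbers agree: the exponential approximation is already excellent at <math>M = 1000</math> near the typical minimum.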

Back to REM

Let us analyze in detail the case of a Gaussian distribution with zero mean and variance <math>\sigma^2</math>. Using integration by parts, we can write:

<center> <math> P(E) = \int_{-\infty}^{E} \frac{dx}{\sqrt{2 \pi \sigma^2}} \, e^{-\frac{x^2}{2 \sigma^2}} = \frac{1}{2 \sqrt{\pi}} \int_{\frac{E^2}{2 \sigma^2}}^{\infty} \frac{dt}{\sqrt{t}} \, e^{-t} = \frac{\sigma}{\sqrt{2 \pi} |E|} \, e^{-\frac{E^2}{2 \sigma^2}} - \frac{1}{4 \sqrt{\pi}} \int_{\frac{E^2}{2 \sigma^2}}^{\infty} \frac{dt}{t^{3/2}} \, e^{-t} </math> </center>

The asymptotic expansion for <math>E \to -\infty</math> is:

<center> <math> P(E) \approx \frac{\sigma}{\sqrt{2 \pi} |E|} e^{-\frac{E^2}{2 \sigma^2}} + O\left( \frac{e^{-\frac{E^2}{2 \sigma^2}}}{|E|^2} \right) </math> </center>

In this case it is convenient to introduce the function <math>A(E)</math> defined by <math>P(E) = \exp(A(E))</math>. Hence we have:

<center> <math> A_{\text{gauss}}(E) = -\frac{E^2}{2 \sigma^2} - \log\left( \frac{\sqrt{2 \pi} |E|}{\sigma} \right) + \ldots </math> </center>

From the second relation we impose <math>A_{\text{gauss}}(E_{\min}^{\text{typ}}) = -\ln M</math>. For large <math>M</math> we get:

<center> <math> E_{\min}^{\text{typ}} = -\sigma \sqrt{2 \log M} \left( 1 - \frac{1}{4} \frac{\log(4 \pi \log M)}{\log M} + \ldots \right) </math> </center>
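This asymptotic formula can be checked against sampled minima. Since <math>P(E_{\min}^{\text{typ}}) = 1/M</math> implies <math>Q_M(E_{\min}^{\text{typ}}) \approx e^{-1}</math>, the typical value sits at the <math>1 - e^{-1} \approx 0.632</math> quantile of the distribution of <math>E_{\min}</math>. A sketch (the choices of <math>M</math> and the number of trials below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)

M, trials, sigma = 10_000, 1000, 1.0

# Asymptotic formula for the typical minimum
logM = np.log(M)
E_typ = -sigma * np.sqrt(2 * logM) * (1 - np.log(4 * np.pi * logM) / (4 * logM))

# P(E_typ) = 1/M gives Q_M(E_typ) ≈ e^{-1}, so E_typ sits at the
# 1 - e^{-1} ≈ 0.632 quantile of the sampled minima
mins = np.array([rng.normal(0.0, sigma, size=M).min() for _ in range(trials)])
print(np.quantile(mins, 1 - np.exp(-1)), E_typ)
```
Already at <math>M = 10^4</math> the formula, including its <math>\log \log M</math> correction, matches the sampled quantile to a few percent.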