<Strong> Goal </Strong>: final lecture on KPZ and directed polymers in finite dimension. We will show that for <math>d>2</math> a "glass transition" takes place.




= Directed Polymer in finite dimension =


== State of the Art ==


The directed polymer in random media belongs to the KPZ universality class. The behavior of this system is well understood in one dimension and in the mean-field case, more precisely for the directed polymer on the Cayley tree. In particular:
 
* In <math>d=1</math>, we have <math>\theta=1/3</math> and a glassy regime present at all temperatures. The model is integrable through a non-standard Bethe Ansatz, and the distribution of <math>E_{\min}</math> for a given boundary condition is of the Tracy–Widom type.
 
* In <math>d=\infty</math>, for the Cayley tree, an exact solution exists, predicting a freezing transition to a 1RSB phase (<math>\theta=0</math>).
 
In finite dimensions greater than one, no exact solutions are available. Numerical simulations indicate <math>\theta > 0</math> in <math>d=2</math> and a glassy regime present at all temperatures. The case <math>d > 2</math> remains particularly intriguing.
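The <math>d=1</math> exponent can be checked with a minimal numerical sketch (an illustration, not part of the original notes): the ground-state energy of a polymer on a <math>1+1</math> dimensional lattice with independent Gaussian site energies is obtained by dynamic programming, and its sample-to-sample variance should grow as <math>t^{2\theta}</math> with <math>\theta=1/3</math>. The lattice sizes and number of disorder samples below are arbitrary illustrative choices.

<syntaxhighlight lang="python">
import numpy as np

# Sketch: lattice directed polymer in 1+1 dimensions with i.i.d. Gaussian site energies.
# E_min(t, x) = min over the two parents of E_min(t-1, .) + V(t, x).
# The disorder variance of E_min should scale as t^(2*theta) with theta = 1/3.
rng = np.random.default_rng(0)

def ground_state_energy(t_max, rng):
    E = np.array([rng.normal()])                  # apex of the light cone
    for t in range(1, t_max + 1):
        V = rng.normal(size=t + 1)                # fresh disorder at time t
        parents = np.minimum(np.append(E, np.inf), np.insert(E, 0, np.inf))
        E = parents + V
    return E.min()

for t_max in (64, 128, 256, 512):
    samples = [ground_state_energy(t_max, rng) for _ in range(200)]
    print(t_max, np.var(samples))                 # expect growth ~ t^(2/3)
</syntaxhighlight>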


==Let's do replica!==
To make progress in disordered systems, we need to analyze the moments of the partition function. The first moment provides the annealed average and the second moment tells us about the fluctuations. In particular, the partition function is self-averaging if
<center> <math>
\frac{\overline{Z(x,t)^2}}{ (\overline{Z(x,t)})^2}=1  \, .
</math></center>
In this case the annealed and the quenched averages coincide in the thermodynamic limit. This strict condition is sufficient, but not necessary. It is enough to show that, for large ''t'',
<center> <math>
\frac{\overline{Z(x,t)^2}}{ (\overline{Z(x,t)})^2} < \text{const} \, .
</math></center>
In the following, we compute these moments via a replica calculation, considering polymers starting at <math>0</math> and ending at <math>x</math>.
To proceed, we only need two ingredients:
* The random potential <math>V(x,\tau)</math> is a Gaussian field characterized by
<center> <math> \overline{V(x,\tau)} = 0, \qquad \overline{V(x,\tau) V(x',\tau')} = D \, \delta^d(x-x') \, \delta(\tau - \tau'). </math> </center>
* Since the disorder is Gaussian, averages of exponentials can be computed using Wick’s theorem:
<center> <math> \overline{\exp(W)} = \exp\!\Big[\overline{W} + \frac{1}{2}\big(\overline{W^2} - \overline{W}^2\big)\Big], </math> </center>
for any Gaussian random variable <math>W</math>.
 
These two properties are all we need to carry out the replica calculation below.
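The second ingredient is easy to verify numerically for a scalar Gaussian variable (a toy check, with arbitrary mean and variance):

<syntaxhighlight lang="python">
import numpy as np

# Monte Carlo check of E[exp(W)] = exp(E[W] + Var(W)/2) for a Gaussian W.
rng = np.random.default_rng(1)
mu, sigma = -0.3, 0.8                       # arbitrary mean and standard deviation
W = rng.normal(mu, sigma, size=10**6)
print(np.exp(W).mean())                     # Monte Carlo estimate
print(np.exp(mu + 0.5 * sigma**2))          # Wick / cumulant formula
</syntaxhighlight>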
 
==First Moment==
 
 
<center> <math> \overline{Z(x,t)} = \int_{x(0)=0}^{x(t)=x} \mathcal{D}x(\tau) \exp\Big[-\frac{1}{T}\int_0^t d\tau \frac{1}{2}(\partial_\tau x)^2\Big] \overline{\exp\Big[-\frac{1}{T} \int_0^t d\tau V(x(\tau),\tau)\Big]} </math> </center>
 
Here <math>W = -\frac{1}{T}\int_0^t d\tau\, V(x(\tau),\tau)</math> has zero mean, while its variance contains the short-distance divergence <math>\delta_0 \equiv \delta^d(0)</math>:
<center> <math> T^2 \overline{W^2} = \int_0^t d\tau_1 d\tau_2\, \overline{V(x(\tau_1),\tau_1) V(x(\tau_2),\tau_2)} = D t \delta_0. </math> </center>
Hence,
 
<center> <math> \overline{Z(x,t)} = \frac{1}{(2\pi t T)^{d/2}} \exp\Big[-\frac{x^2}{2 t T}\Big] \exp\Big[\frac{D t \delta_0}{2 T^2}\Big] = Z_{free}(x,t,T)  \exp\Big[\frac{D t \delta_0}{2 T^2}\Big].  </math> </center>
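As a side remark (a one-line consequence of the expression above, not spelled out in the lecture), taking the logarithm gives
<center> <math> \ln \overline{Z(x,t)} = \frac{D \delta_0}{2T^2}\, t - \frac{x^2}{2 t T} - \frac{d}{2} \ln(2\pi t T), </math> </center>
so for large <math>t</math> and fixed <math>x</math> the annealed free energy per unit length is <math>-\tfrac{T}{t} \ln \overline{Z(x,t)} \to -\tfrac{D \delta_0}{2T}</math>.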
==Second Moment==
For the second moment we need two replicas:
 
* Step 1
<center> <math> \overline{Z(x,t)^2} = \int \mathcal{D}x_1 \int \mathcal{D}x_2 \exp\!\Bigg[-\frac{1}{2T}\int_0^t d\tau \Big((\partial_\tau x_1)^2 + (\partial_\tau x_2)^2\Big)\Bigg] \; \overline{\exp\!\Bigg[-\frac{1}{T} \int_0^t d\tau V(x_1(\tau),\tau) - \frac{1}{T} \int_0^t d\tau V(x_2(\tau),\tau)\Bigg]}. </math> </center>
 
* Step 2: Wick’s Theorem
 
<center> <math> \overline{Z(x,t)^2} = \exp\!\Bigg[\frac{D t \delta_0}{T^2}\Bigg] \int \mathcal{D}x_1 \int \mathcal{D}x_2 \exp\!\Bigg[-\int_0^t d\tau \Big(\frac{1}{2T}\big((\partial_\tau x_1)^2 + (\partial_\tau x_2)^2\big) - \frac{D}{T^2}\delta^d[x_1(\tau)-x_2(\tau)]\Big)\Bigg]. </math> </center>
 
* Step 3: Change of Coordinates
 
Let <math>X = (x_1+x_2)/2</math> and <math>u = x_1 - x_2</math>. Then:
 
<center> <math> \overline{Z(x,t)^2} = (\overline{Z(x,t)})^2 \frac{\displaystyle \int_{u(0)=0}^{u(t)=0} \mathcal{D}u \exp\!\Bigg[-\int_0^t d\tau \Big(\frac{1}{4T} (\partial_\tau u)^2 - \frac{D}{T^2} \delta^d[u(\tau)]\Big)\Bigg]} {Z_{free}(u=0,t,2T)}. </math> </center>
 
Here,
 
<center> <math> Z_{free}^2(x,t,T) = Z_{free}(X=x,t,T/2) \, Z_{free}(u=0,t,2T), \qquad Z_{free}(u=0,t,2T) = (4 \pi T t)^{-d/2}. </math> </center>
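These Gaussian identities follow from <math>Z_{free}(x,t,T) = (2\pi t T)^{-d/2} e^{-x^2/(2tT)}</math>; as a quick check (not spelled out in the lecture),
<center> <math> Z_{free}(X=x,t,T/2)\, Z_{free}(u=0,t,2T) = \frac{e^{-x^2/(tT)}}{(\pi t T)^{d/2}} \cdot \frac{1}{(4\pi t T)^{d/2}} = \frac{e^{-x^2/(tT)}}{(2\pi t T)^{d}} = Z_{free}^2(x,t,T). </math> </center>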
 
=== Two-Replica Propagator ===
 
Define the propagator:
 
<center> <math> W(0,t) = \int_{u(0)=0}^{u(t)=0} \mathcal{D}u \exp\Big[-\int_0^t d\tau \Big(\frac{1}{4T} (\partial_\tau u)^2 - \frac{D}{T^2} \delta^d[u(\tau)]\Big)\Big]. </math> </center>
 
By the Feynman-Kac formula:
 
<center> <math> \partial_t W(x,t) = -\hat H W(x,t), \quad \hat H = -T \nabla^2 - \frac{D}{T^2} \delta^d[u]. </math> </center>
The single-particle potential is time-independent and attractive. (The prefactor of the Laplacian follows from the standard Feynman–Kac dictionary: the elastic term <math>\frac{1}{4T}(\partial_\tau u)^2</math> corresponds to a particle of mass <math>m=1/(2T)</math>, hence a diffusion coefficient <math>1/(2m)=T</math>.) Long-time behavior is governed by the low-energy eigenstates.
 
 
For <math>d \le 2</math>, the attractive potential always produces a bound state with energy <math>E_0<0</math>. Hence, at long times:
<center> <math> W(x,t) \sim e^{|E_0| t}  </math> </center>
This exponential growth means that annealed and quenched averages never coincide: <math>\overline{\ln Z(x,t)} \ll \ln\overline{Z(x,t)}</math> at all temperatures, i.e. the glassy regime is present at any temperature.
 
For <math>d > 2</math>, the low-energy behavior depends on the strength of <math>D/T^2</math> (a numerical illustration is sketched below):
* High temperature: the spectrum is positive and continuous. Annealed and quenched averages coincide, and the exponent <math>\theta=0</math>.
* Low temperature: a bound state appears. There is no replica symmetry breaking (RSB), but <math>\overline{\ln Z(x,t)} \ll \ln\overline{Z(x,t)}</math>. Numerical simulations find <math>\theta>0</math>.
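The mechanism can be illustrated numerically. The sketch below (an illustration; the square well of width <math>a</math> is an arbitrary regularization of the delta function, and <math>g</math> stands for <math>D/T^2</math>) diagonalizes the regularized Hamiltonian on a finite-difference grid: on the full line (the <math>d=1</math> situation) any attraction produces a bound state, while in the <math>d=3</math> s-wave radial problem a bound state appears only above a finite threshold in <math>g</math>, i.e. below a finite temperature.

<syntaxhighlight lang="python">
import numpy as np

# Ground-state energy of H = -T d^2/du^2 - g * 1(|u|<a) on a finite-difference grid.
# radial=True : reduced radial s-wave problem, phi(0)=0 imposed by the stencil (mimics d=3).
# radial=False: full line (mimics d=1).
T, a, L, n = 1.0, 1.0, 40.0, 600          # illustrative parameters

def ground_energy(g, radial):
    u = np.linspace(L / n, L, n) if radial else np.linspace(-L, L, 2 * n)
    du = u[1] - u[0]
    m = len(u)
    kinetic = (-T / du**2) * (np.diag(-2.0 * np.ones(m))
                              + np.diag(np.ones(m - 1), 1)
                              + np.diag(np.ones(m - 1), -1))
    potential = np.diag(np.where(np.abs(u) < a, -g, 0.0))
    return np.linalg.eigvalsh(kinetic + potential)[0]

for g in (1.0, 2.0, 3.0, 5.0):
    print(f"g={g}:  E0(d=3) = {ground_energy(g, True):+.3f}   "
          f"E0(d=1) = {ground_energy(g, False):+.3f}")
# d=1 binds (E0<0) for every g>0; d=3 binds only above g ~ pi^2 T/(4 a^2) ~ 2.47
</syntaxhighlight>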
=Back to REM: condensation of the Gibbs measure=
 
 
Thanks to the computation of <math>\overline{n(x)}</math>, we can identify an important fingerprint of the glassy phase.  Let's compare the weight of the ground state against the weight of all other states:
<center> 
<math>
\frac{\sum_\alpha z_\alpha}{z_{\alpha_{\min}}} = 1 + \sum_{\alpha \ne \alpha_{\min}} \frac{z_\alpha}{z_{\alpha_{\min}}} = 1 + \sum_{\alpha \ne \alpha_{\min}} e^{-\beta(E_\alpha - E_\min)} \sim 1 + \int_0^\infty dx\, \frac{d\overline{n(x)}}{dx} \, e^{-\beta x}= 1+ \int_0^\infty dx\, \frac{e^{x/b_M}}{b_M} \, e^{-\beta x}
</math>
</center>
 
=== Behavior in Different Phases:===
* '''High-Temperature Phase (<math> \beta < \beta_c = 1/b_M = \sqrt{2 \log2}</math>):''' 
: In this regime, the total weight of the excited states dominates over the weight of the ground state. The ground state is therefore not deep enough to overcome the finite entropy contribution. As a result, the probability of sampling the same configuration twice from the Gibbs measure is exponentially small in  the system size.
 
 
* '''Low-Temperature Phase (<math> \beta > \beta_c =1/ b_M = \sqrt{2 \log2}</math>):''' 
: In this regime, the integral is finite: 
<center> 
<math>
\int_0^\infty dx \, e^{ (1/b_M-\beta) x}/b_M = \frac{1}{\beta b_M-1}  = \frac{\beta_c}{\beta - \beta_c}
</math> 
</center> 
In this regime, the total weight of the excited states is of the same order as the weight of the ground state. Consequently, the ground state is occupied with finite probability, reminiscent of Bose–Einstein condensation. However, unlike the directed polymer in finite dimension, this condensation involves not only the ground state but also the first excited states.
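This condensation is easy to observe numerically. The sketch below (an illustration, not part of the original notes) draws <math>2^N</math> independent Gaussian energies of variance <math>N</math>, a convention for which <math>\beta_c=\sqrt{2\log 2}</math>, and measures the Gibbs weight of the ground state together with the probability that two independent draws from the Gibbs measure fall on the same configuration. Both vanish with <math>N</math> for <math>\beta<\beta_c</math> and stay finite for <math>\beta>\beta_c</math>.

<syntaxhighlight lang="python">
import numpy as np

# Random Energy Model: 2^N i.i.d. energies E ~ N(0, N), so that beta_c = sqrt(2 ln 2).
rng = np.random.default_rng(2)
beta_c = np.sqrt(2 * np.log(2))

def condensation_observables(N, beta, n_samples=100):
    w_gs, y2 = [], []
    for _ in range(n_samples):
        E = rng.normal(0.0, np.sqrt(N), size=2**N)
        w = np.exp(-beta * (E - E.min()))     # weights, shifted by E_min for stability
        w /= w.sum()
        w_gs.append(w.max())                  # Gibbs weight of the ground state
        y2.append(np.sum(w**2))               # prob. that two draws coincide (overlap q=1)
    return np.mean(w_gs), np.mean(y2)

for N in (8, 12, 16):
    print(N, condensation_observables(N, 0.5 * beta_c),
             condensation_observables(N, 2.0 * beta_c))
</syntaxhighlight>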
 
====Overlap Distribution and Replica Symmetry Breaking:====
The structure of states can be further characterized through the overlap between two configurations <math>\alpha</math> and <math>\gamma</math>, defined as
 
<center> <math> q_{\alpha,\gamma} = \frac{1}{N} \sum_{i=1}^N \sigma_i^\alpha \sigma_i^\gamma, </math> </center>
 
which takes values in the interval <math>[-1,1]</math>. The distribution <math>P(q)</math> of the overlap between two configurations sampled from the Gibbs measure distinguishes the two phases:
 
At high temperature (<math>\beta < \beta_c</math>), the system is replica symmetric and the overlap distribution is concentrated at zero:
 
<center> <math>P(q) = \delta(q).</math> </center>
 
At low temperature (<math>\beta > \beta_c</math>), the system exhibits one-step replica symmetry breaking, and the overlap distribution becomes
 
<center> <math>P(q) = \tfrac{\beta_c}{\beta}\,\delta(q) + \Bigl(1 - \tfrac{\beta_c}{\beta}\Bigr)\,\delta(1-q).</math> </center>
 
 
== Finite Dimensional Systems==
In finite dimensions, the fluctuations of the ground-state energy are characterized by an exponent <math>\theta</math>:
 
<center> <math>\overline{\big(E_{\min} - \overline{E_{\min}}\big)^2} \sim L^{2\theta},</math> </center>
 
where <math>L</math> is the linear size of the system and <math>N = L^D</math> is the number of degrees of freedom.
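At finite temperature the same exponent can be estimated from the free-energy fluctuations. The sketch below (an illustration, not part of the original notes) computes the free energy of the <math>1+1</math> dimensional lattice polymer by a transfer-matrix recursion on log-weights; the variance is expected to grow as <math>t^{2/3}</math> at both temperatures, although at high temperature larger sizes may be needed to reach the asymptotic regime.

<syntaxhighlight lang="python">
import numpy as np

# Free energy of the 1+1 dimensional lattice polymer at inverse temperature beta,
# via the recursion Z(t,x) = [Z(t-1,x) + Z(t-1,x-1)] * exp(-beta*V(t,x)),
# carried out on log-weights (log-sum-exp) for numerical stability.
rng = np.random.default_rng(3)

def free_energy(t_max, beta, rng):
    logZ = np.array([-beta * rng.normal()])
    for t in range(1, t_max + 1):
        left = np.append(logZ, -np.inf)
        right = np.insert(logZ, 0, -np.inf)
        m = np.maximum(left, right)
        logZ = m + np.log(np.exp(left - m) + np.exp(right - m)) - beta * rng.normal(size=t + 1)
    m = logZ.max()
    return -(m + np.log(np.exp(logZ - m).sum())) / beta   # free endpoint

for beta in (0.5, 2.0):
    for t_max in (64, 128, 256, 512):
        samples = [free_energy(t_max, beta, rng) for _ in range(100)]
        print(beta, t_max, np.var(samples))    # expect ~ t^(2/3) at both temperatures
</syntaxhighlight>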


When <math>\theta < 0</math>, the critical temperature vanishes with increasing system size, leading to the absence of a glass transition. This scenario occurs in many low-dimensional systems, such as the Edwards–Anderson model in two dimensions.


When <math>\theta > 0</math>, one must extend the definition of this exponent to finite temperatures and consider the fluctuations of the free energy <math>F(L,\beta)</math>. Three representative cases are:
* '''Directed polymer in <math>N=1,2</math>:'''
The fluctuations of the ground state exhibit a positive, temperature-independent exponent <math>\theta</math>. In this situation, only the glassy phase exists, and


<center><math>P(q) = \delta(1-q),</math></center> because producing an excitation with vanishing overlap with the ground state is very costly.
* '''Directed polymer in <math>N=3</math>:'''
The exponent <math>\theta</math> depends on the temperature: it vanishes above the glass transition and becomes strictly positive below it. Accordingly,
<center><math>P(q) = \delta(1-q)</math> at low temperature, and <math>P(q) = \delta(q)</math> at high temperature.</center>
* '''Directed polymer on the Cayley tree:'''
The behavior is analogous to the Random Energy Model: <math>\theta = 0</math> in both phases. At high temperature,
<center><math>P(q) = \delta(q),</math></center> while at low temperature the system exhibits the one-step replica symmetry breaking picture:
<center> <math>P(q) = \tfrac{\beta_c}{\beta}\,\delta(q) + \Bigl(1 - \tfrac{\beta_c}{\beta}\Bigr)\,\delta(1-q).</math> </center>