L-8

From Disordered Systems Wiki
'''Goal.''' We introduce the Anderson model and study the statistical properties of its eigenstates.
In one dimension disorder leads to localization of all eigenstates, which can be understood using products of random matrices.


= Anderson model (tight-binding model) =


We consider non-interacting particles hopping between nearest-neighbour sites of a lattice in the presence of disorder.
The Hamiltonian reads


<math display="block">
H =
- t \sum_{\langle i,j\rangle} (c_i^\dagger c_j + c_j^\dagger c_i)
+
\sum_i V_i c_i^\dagger c_i .
</math>


The random variables <math>V_i</math> represent on-site disorder. 
For simplicity we set <math>t=1</math>.


We assume that the disorder variables are independent and identically distributed, uniformly in the interval
<math display="block">
V_i \in \left(-\frac{W}{2},\frac{W}{2}\right).
</math>


In one dimension, the single-particle Hamiltonian is represented by the tridiagonal matrix


<math display="block">
H =
\begin{pmatrix}
V_1 & -1 & 0 & 0 & \dots \\
-1 & V_2 & -1 & 0 & \dots \\
0 & -1 & V_3 & -1 & \dots \\
0 & 0 & -1 & \ddots & -1 \\
\dots & \dots & \dots & -1 & V_L
\end{pmatrix}.
</math>


We study the statistical properties of the eigenvalue problem


<math display="block">
H\psi = \epsilon \psi ,
\qquad
\sum_{n=1}^L |\psi_n|^2 = 1 .
</math>


In the one-particle sector, this becomes a discrete Schrödinger equation.
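As a concrete illustration, the eigenvalue problem can be solved numerically by building the tridiagonal matrix and diagonalizing it. The following is a minimal numpy sketch; the values <math>L=200</math> and <math>W=2</math> are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: chain length L, disorder strength W, hopping t
L, W, t = 200, 2.0, 1.0
V = rng.uniform(-W / 2, W / 2, size=L)      # iid on-site energies V_i

# Tridiagonal Anderson Hamiltonian: V_i on the diagonal, -t on the off-diagonals
H = np.diag(V) - t * (np.eye(L, k=1) + np.eye(L, k=-1))

eps, psi = np.linalg.eigh(H)                # eigenvalues eps[a], eigenvectors psi[:, a]

# eigh returns orthonormal eigenvectors, so each state obeys sum_n |psi_n|^2 = 1
norms = (np.abs(psi) ** 2).sum(axis=0)
```

All <math>L</math> eigenstates are obtained at once; the normalization condition above is automatically satisfied by the numerical eigenvectors.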


----


== Density of states ==


Without disorder the dispersion relation is

<math display="block">
\epsilon(k) = -2\cos k,
\qquad
k\in(-\pi,\pi).
</math>

The energy band is therefore

<math display="block">
-2 < \epsilon < 2 .
</math>
 
The density of states is
 
<math display="block">
\rho(\epsilon)
=
\int_{-\pi}^{\pi}
\frac{dk}{2\pi}
\delta(\epsilon-\epsilon(k))
=
\frac{1}{\pi\sqrt{4-\epsilon^2}}
\qquad
(\epsilon\in(-2,2)).
</math>
 
In the presence of disorder the density of states broadens and becomes sample dependent.
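The formula for <math>\rho(\epsilon)</math> can be checked numerically on the clean chain. The sketch below compares the spectrum with the integrated density of states <math>N(\epsilon)=\tfrac12+\arcsin(\epsilon/2)/\pi</math>, obtained by integrating <math>\rho</math> (the size <math>L=2000</math> is an arbitrary choice).

```python
import numpy as np

L, t = 2000, 1.0

# Clean chain (W = 0): compare the spectrum with the analytic density of states
H0 = -t * (np.eye(L, k=1) + np.eye(L, k=-1))
eps = np.sort(np.linalg.eigvalsh(H0))

# Integrated DOS following from rho(e) = 1/(pi sqrt(4 - e^2)):
#   N(e) = 1/2 + arcsin(e/2)/pi
N_exact = 0.5 + np.arcsin(np.clip(eps / 2.0, -1.0, 1.0)) / np.pi
N_emp = (np.arange(L) + 0.5) / L            # empirical fraction of levels below eps

max_err = np.max(np.abs(N_exact - N_emp))   # vanishes as L grows
```

Comparing integrated quantities avoids the binning ambiguities of a histogram near the band edges, where <math>\rho</math> diverges.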
 
== Transfer matrices ==
 
The discrete Schrödinger equation reads
 
<math display="block">
-\psi_{n+1} - \psi_{n-1} + V_n \psi_n = \epsilon \psi_n .
</math>
 
It can be rewritten as
 
<math display="block">
\begin{pmatrix}
\psi_{n+1} \\
\psi_n
\end{pmatrix}
=
T_n
\begin{pmatrix}
\psi_n \\
\psi_{n-1}
\end{pmatrix}
</math>
 
with
 
<math display="block">
T_n =
\begin{pmatrix}
V_n-\epsilon & -1 \\
1 & 0
\end{pmatrix}.
</math>
 
Iterating gives
 
<math display="block">
\begin{pmatrix}
\psi_{n+1} \\
\psi_n
\end{pmatrix}
=
\Pi_n
\begin{pmatrix}
\psi_1 \\
\psi_0
\end{pmatrix},
\qquad
\Pi_n = T_n T_{n-1} \cdots T_1 .
</math>
 
Thus the wavefunction is controlled by a product of random matrices.
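The recursion can be implemented directly as a product of <math>2\times2</math> matrices. Below is a minimal sketch (<math>n=50</math>, <math>W=2</math>, <math>\epsilon=0</math> are arbitrary choices); since <math>\det T_n = 1</math>, the product satisfies <math>\det \Pi_n = 1</math>, which gives a simple consistency check.

```python
import numpy as np

rng = np.random.default_rng(2)

n, W, energy = 50, 2.0, 0.0                 # arbitrary illustrative parameters
Pi = np.eye(2)
for _ in range(n):
    V = rng.uniform(-W / 2, W / 2)
    T = np.array([[V - energy, -1.0],       # transfer matrix T_n
                  [1.0, 0.0]])
    Pi = T @ Pi                             # Pi_n = T_n T_{n-1} ... T_1

# Propagate the initial data (psi_1, psi_0) = (1, 0) through the chain
psi = Pi @ np.array([1.0, 0.0])

# Each T_n has unit determinant, so det Pi_n = 1 exactly
det_Pi = np.linalg.det(Pi)
```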
 
=== Remark ===
 
The transfer matrix formulation rewrites the problem as a recursion, starting from initial data and propagating the solution along the chain.
 
This corresponds to a Cauchy problem rather than a boundary value problem. Nevertheless, the growth properties of the solutions contain direct information about the nature of the eigenstates: exponential growth is associated with localization, while bounded solutions correspond to extended states.
 
== Furstenberg theorem (physical formulation) ==
 
The exponential growth of the product of random matrices
<math display="block">
\Pi_n = T_n T_{n-1} \cdots T_1
</math>
can be understood as a matrix analogue of the law of large numbers. A set of simple sufficient conditions for this behavior is provided by a theorem of Harry Furstenberg.
 
To quantify this growth, we introduce the norm of a matrix <math>A</math> as
<math display="block">
\|A\|^2 = \frac{a_{11}^2 + a_{12}^2 + a_{21}^2 + a_{22}^2}{2}.
</math>
 
In physical terms, the mechanism behind the theorem relies on two key ingredients:
 
=== (i) Stretching ===
 
The transfer matrices must be able, with nonzero probability, to expand vectors.
 
More precisely, there exist realizations of <math>T_n</math> such that for some direction <math>v</math>,
<math display="block">
\|T_n v\| = \sigma_{\max}(T_n)\, \|v\|,
\qquad \sigma_{\max}(T_n) > 1.
</math>
 
Equivalently, <math>\sigma_{\max}^2(T_n)</math> is the largest eigenvalue of
<math display="block">
T_n^T T_n.
</math>
 
This ensures that typical products contain episodes of exponential amplification.
 
=== (ii) Mixing of directions ===
 
At each step, the transfer matrix changes the direction of the vector <math>(\psi_n,\psi_{n-1})</math>.
 
In the clean case, the same matrix is applied at every step, so the angular dynamics is deterministic and can be written as
<math display="block">
\theta_{n+1}=F(\theta_n).
</math>
Once the initial direction is fixed, the whole sequence is fixed. The system retains a perfect memory of its initial orientation.
 
In the disordered case, the transfer matrix varies from step to step, and the angular dynamics becomes
<math display="block">
\theta_{n+1}=F(\theta_n,T_n).
</math>
The direction is then continuously reshuffled and progressively loses memory of its initial value.
 
More importantly, there is no finite set of directions that is invariant under all transfer matrices. As a consequence, the dynamics does not get trapped into special directions and effectively explores the projective space.
 
This absence of invariant directions is what is meant by '''mixing of directions'''.
 
=== Consequence ===
 
Under these conditions, the norm of the product grows exponentially with probability one:
<math display="block">
\lim_{n\to\infty} \frac{1}{n} \log \|\Pi_n\| = \gamma > 0,
</math>
where <math>\gamma</math> is the Lyapunov exponent.
 
This result is the matrix analogue of the fact that the logarithm of a product of independent random variables becomes self-averaging (exercise 15).
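The Lyapunov exponent can be estimated numerically by propagating a vector through the chain and renormalizing it at each step to avoid overflow. This is a sketch, not an optimized implementation; the parameters <math>W=2</math>, <math>\epsilon=0</math> and the chain length are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)

def lyapunov(W, energy, n=100_000):
    """Estimate gamma = lim (1/n) log ||Pi_n|| for the Anderson transfer matrices."""
    v = np.array([1.0, 0.0])
    log_norm = 0.0
    for _ in range(n):
        V = rng.uniform(-W / 2, W / 2)
        v = np.array([(V - energy) * v[0] - v[1], v[0]])   # apply T_n
        s = np.hypot(v[0], v[1])
        log_norm += np.log(s)    # accumulate the growth factor...
        v /= s                   # ...and renormalize to keep v of order one
    return log_norm / n

gamma = lyapunov(W=2.0, energy=0.0)         # small but positive for any W > 0
```

The renormalization implements the self-averaging of <math>\tfrac{1}{n}\log\|\Pi_n\|</math>: a single long trajectory suffices to estimate <math>\gamma</math>.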
 
== Verification of Furstenberg's hypotheses for the Anderson model ==
 
We now check explicitly that the transfer matrices of the one-dimensional Anderson model satisfy the two conditions stated above.
 
The transfer matrices are
<math display="block">
T_n =
\begin{pmatrix}
V_n-\epsilon & -1 \\
1 & 0
\end{pmatrix}.
</math>
 
=== (i) Stretching ===
 
We first verify that the matrices expand at least one direction with nonzero probability.
 
For a generic <math>2\times2</math> matrix, the maximal stretching factor is controlled by the largest eigenvalue of <math>T_n^T T_n</math>. Here
<math display="block">
T_n^T T_n =
\begin{pmatrix}
(V_n-\epsilon)^2+1 & -(V_n-\epsilon) \\
-(V_n-\epsilon) & 1
\end{pmatrix}.
</math>
 
The trace is
<math display="block">
\mathrm{Tr}(T_n^T T_n) = (V_n-\epsilon)^2 + 2,
</math>
and the determinant is
<math display="block">
\det(T_n^T T_n) = \det(T_n)^2 = 1.
</math>
 
Therefore the eigenvalues satisfy
<math display="block">
\lambda_+ \lambda_- = 1,
</math>
and the largest one obeys
<math display="block">
\lambda_+ \ge 1.
</math>
 
More precisely, as soon as <math>V_n-\epsilon \neq 0</math>, one has
<math display="block">
\lambda_+ > 1.
</math>
 
Thus the matrices expand at least one direction except for the fine-tuned case <math>V_n=\epsilon</math>, which has zero probability for a continuous disorder distribution. The stretching condition is therefore satisfied almost surely.
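These relations are easy to verify numerically. In the sketch below, <math>a = V_n-\epsilon = 0.7</math> is an arbitrary nonzero choice; the trace, determinant and eigenvalues of <math>T_n^T T_n</math> are checked against the formulas above.

```python
import numpy as np

a = 0.7                                     # a = V_n - eps, arbitrary nonzero value
T = np.array([[a, -1.0], [1.0, 0.0]])
M = T.T @ T

tr = np.trace(M)                            # = a^2 + 2
det = np.linalg.det(M)                      # = det(T)^2 = 1
lam_plus = (tr + np.sqrt(tr ** 2 - 4.0 * det)) / 2.0
lam_minus = (tr - np.sqrt(tr ** 2 - 4.0 * det)) / 2.0
# lam_plus * lam_minus = 1 and lam_plus > 1 whenever a != 0
```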
 
=== (ii) Mixing of directions ===
 
We now verify that the angular dynamics is not confined to a finite set of directions.
 
Define the angle
<math display="block">
\theta_n = \arctan\!\left(\frac{\psi_{n-1}}{\psi_n}\right),
</math>
which represents the direction of the vector <math>(\psi_n,\psi_{n-1})</math>.
 
The transfer matrix induces the map
<math display="block">
\theta_{n+1} =
\arctan\!\left(\frac{1}{V_n-\epsilon-\tan\theta_n}\right).
</math>
 
For fixed <math>\theta_n</math>, this expression depends continuously on the random variable <math>V_n</math>. If the disorder has a continuous distribution (such as the uniform distribution assumed here), the image of a given angle is not restricted to a finite set.
 
As a consequence, the sequence <math>\theta_n</math> is not confined to a finite set of directions. More precisely, there is no finite collection of directions that is invariant under all transfer matrices.
 
The randomness of <math>V_n</math> continuously reshuffles the direction and prevents the dynamics from being trapped into special directions. This ensures that the projective dynamics effectively explores the angular space.
 
This is what is meant by '''mixing of directions''' in the present context.
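The angle map is easy to iterate numerically. The sketch below (arbitrary parameters <math>W=2</math>, <math>\epsilon=0</math>, initial angle <math>0.3</math>) shows that the direction keeps fluctuating instead of settling onto a fixed set.

```python
import numpy as np

rng = np.random.default_rng(4)

n, W, energy = 10_000, 2.0, 0.0             # arbitrary illustrative parameters
theta = 0.3                                 # arbitrary initial direction
angles = np.empty(n)
for i in range(n):
    V = rng.uniform(-W / 2, W / 2)
    # random angle map theta_{n+1} = arctan(1 / (V_n - eps - tan theta_n))
    theta = np.arctan(1.0 / (V - energy - np.tan(theta)))
    angles[i] = theta

spread = angles.std()                       # the direction never settles down
```

In the clean case (<math>W=0</math>) the same map with fixed <math>V</math> would trace a deterministic orbit; here the randomness of <math>V_n</math> reshuffles the angle at every step.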
 
=== Conclusion ===
 
The transfer matrices of the one-dimensional Anderson model satisfy both conditions:
 
# they expand at least one direction with nonzero probability;
# they do not confine the angular dynamics to a finite set.
 
As a consequence,
<math display="block">
\lim_{n\to\infty} \frac{1}{n}\log \|\Pi_n\| = \gamma > 0,
</math>
so the Lyapunov exponent is positive for generic disorder.
 
----
 
== Localization length ==
 
The transfer-matrix recursion corresponds to fixing the wavefunction at one boundary and propagating it through the system.
 
Typical solutions grow exponentially
 
<math display="block">
|\psi_n|\sim e^{\gamma n}.
</math>
 
However, a physical eigenstate must satisfy boundary conditions at both ends of the system. Matching two such solutions leads to exponentially localized eigenstates
 
<math display="block">
|\psi_n|\sim e^{-|n-n_0|/\xi_{\text{loc}}}.
</math>
 
The localization length is
 
<math display="block">
\xi_{\text{loc}}(\epsilon)=\frac{1}{\gamma(\epsilon)}.
</math>
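This picture can be checked on a finite chain. The rough sketch below (the choices <math>W=4</math>, <math>L=400</math> and the effective-size measure are arbitrary) shows that a mid-band eigenstate occupies only a few sites, far fewer than <math>L</math>.

```python
import numpy as np

rng = np.random.default_rng(5)

L, W = 400, 4.0                             # arbitrary illustrative parameters
V = rng.uniform(-W / 2, W / 2, size=L)
H = np.diag(V) - (np.eye(L, k=1) + np.eye(L, k=-1))   # t = 1
eps, psi = np.linalg.eigh(H)

state = psi[:, np.argmin(np.abs(eps))]      # eigenstate closest to the band centre
p = np.abs(state) ** 2

# Effective number of occupied sites (participation number): of order xi_loc
# for a localized state, of order L for an extended one
n_occ = 1.0 / np.sum(p ** 2)
```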
 
Thus in one dimension arbitrarily weak disorder localizes all eigenstates. 
This result is consistent with the scaling theory of localization discussed earlier, which predicts that for <math>d\le2</math> disorder inevitably drives the system toward the insulating regime.
 
----
 
== Fluctuations ==
 
Quantities such as

<math>
|\psi_n|,\quad \|\Pi_n\|,\quad G
</math>

(with <math>G</math> the conductance) show strong sample-to-sample fluctuations, while their logarithms are self-averaging.
 
For instance
 
<math display="block">
\ln|\psi_n|
\sim
\gamma n + O(\sqrt n)
</math>
 
so that the logarithm of the wavefunction performs a random walk with a positive drift.
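The scalar analogue is simple to verify numerically: for a product of iid positive variables, the logarithm self-averages with drift <math>\overline{\ln x} < \ln \overline{x}</math>. In the sketch below, the uniform distribution on <math>(1/2,3/2)</math> is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(6)

# Scalar analogue of the Lyapunov exponent: for iid positive x_i,
# log Pi_N = sum_i log x_i self-averages to N * E[log x].
N, samples = 1000, 500
x = rng.uniform(0.5, 1.5, size=(samples, N))
log_pi = np.log(x).sum(axis=1)              # one value of log Pi_N per realization

drift = log_pi.mean() / N                   # estimate of E[log x]
exact = 1.5 * np.log(1.5) + 0.5 * np.log(2.0) - 1.0   # E[log x] for U(1/2, 3/2)
# Note drift < 0 = log E[x]: the typical product decays although E[x] = 1,
# while the fluctuations of log Pi_N grow like sqrt(N).
```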

Latest revision as of 13:36, 23 March 2026
