L-8

From Disordered Systems Wiki
==Transfer matrices and Lyapunov exponents==


=== Product of random variables and Central limit theorem ===
Consider a set of positive iid random variables <math>x_1,x_2,\ldots, x_N</math> with finite mean and variance and compute their product
<center><math>
\Pi_N=  \prod_{n=1}^N x_n, \quad \text{or} \quad  \ln \Pi_N=  \sum_{n=1}^N \ln x_n
</math></center>
For large N, the Central Limit Theorem predicts:
<center><math>
\ln \Pi_N= \gamma N + \gamma_2 \sqrt{N} \chi
</math></center>
* <math>\chi</math> is a Gaussian number of zero mean and unit variance
* <math>\gamma, \gamma_2</math> are independent of <math>N</math> and can be written as
<center><math>
\gamma =\overline{\ln x}< \ln \overline{x}, \quad \gamma_2= \sqrt{\overline{(\ln x)^2}-(\overline{\ln x})^2}
</math></center>
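The prediction <math>\gamma=\overline{\ln x}</math> is easy to check numerically. Below is a minimal sketch (not part of the lecture), assuming for concreteness that the <math>x_n</math> are uniform on <math>(1/2,3/2)</math>, so that <math>\overline{x}=1</math> while <math>\gamma=\overline{\ln x}<0=\ln\overline{x}</math>:

```python
import numpy as np

rng = np.random.default_rng(0)

# x_n uniform on (1/2, 3/2): positive, finite mean and variance, mean(x) = 1
a, b = 0.5, 1.5
N, samples = 10_000, 2_000
x = rng.uniform(a, b, size=(samples, N))
log_pi = np.log(x).sum(axis=1)            # ln Pi_N, one value per sample

# gamma = E[ln x] for the uniform distribution on (a, b), computed exactly
gamma = (b * np.log(b) - a * np.log(a)) / (b - a) - 1.0

print(log_pi.mean() / N)   # close to gamma
print(gamma)               # negative, although ln(mean x) = 0
```

Here the typical product decays exponentially even though the mean of <math>x</math> is exactly one, which is the content of the inequality <math>\overline{\ln x}<\ln\overline{x}</math>.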
=== Log-normal distribution ===
The distribution of <math>\Pi_N</math> is log-normal
<center><math>
P(\Pi_N) d \Pi_N = \frac{1}{ \sqrt{2 \pi \gamma_2^2 N}} \exp\left[-\frac{(\ln(\Pi_N)-\gamma N)^2}{2 \gamma_2^2 N}\right] \frac{d\Pi_N}{\Pi_N}
</math></center>
'''Quenched and annealed averages'''
To compute the moments of the log-normal distribution, it is convenient to introduce the variable
<center><math> X \equiv \ln(\Pi_N) </math></center> which is Gaussian distributed:
<center><math> p(X) = \frac{1}{ \sqrt{2 \pi \sigma^2}} \exp\left[-\frac{(X-\mu)^2}{2 \sigma^2}\right] </math></center>
with <math>\mu =\gamma N</math> and <math>\sigma^2=\gamma_2^2 N</math>.
The moments of <math>\Pi_N</math> can be easily computed: <center><math>\overline{\Pi_N^n} = \int dX \, e^{nX} p(X) = \exp\left[\mu n +\sigma^2 \frac{n^2}{2} \right]=\exp\left[(\gamma n +\gamma_2^2 \frac{n^2}{2})N \right]  </math> </center>
The variable <math>\Pi_N</math> is therefore not self-averaging (see Valentina's lecture 1) since its fluctuations grow with <math>N</math> faster than its mean:
<center><math> \frac{\overline{\Pi_N^2}}{(\overline{\Pi_N})^2}= \exp\left[\gamma_2^2 N \right] </math></center>
By contrast, <math>\ln \Pi_N</math> is self-averaging.
In particular, the mean <math>\overline{\Pi_N} = \exp[(\gamma+\gamma_2^2/2) N]</math> grows much faster than the typical value <math>\Pi_N^{\text{typ}} \equiv \exp(\gamma N)</math>.
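The moment formula can be verified directly by sampling the Gaussian variable <math>X</math>. The sketch below (illustrative, with arbitrary values of <math>\mu</math> and <math>\sigma</math>) checks <math>\overline{\Pi_N^n}=\exp(\mu n+\sigma^2 n^2/2)</math> for <math>n=1,2</math>:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 0.0, 0.5                  # X = ln Pi is Gaussian(mu, sigma^2)
X = rng.normal(mu, sigma, size=10**6)
Pi = np.exp(X)                        # Pi is log-normal

for n in (1, 2):
    empirical = (Pi**n).mean()
    predicted = np.exp(mu * n + sigma**2 * n**2 / 2)
    print(n, empirical, predicted)    # the two columns agree
```

Note that higher moments are dominated by increasingly rare samples in the right tail, so the empirical estimate degrades quickly with <math>n</math>: this is the same rare-event dominance that makes annealed averages differ from quenched ones.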


== Product of random matrices==

Revision as of 18:44, 3 March 2026

Goal: we will introduce the Anderson model and discuss its behaviour as a function of the dimension. In 1d, localization can be connected to the product of random matrices.

==Anderson model (tight binding model)==

We consider disordered non-interacting particles hopping between nearest-neighbour sites on a lattice. The Hamiltonian reads:

<center><math>
H=-t\sum_{\langle i,j\rangle} \left(c_i^\dagger c_j + c_j^\dagger c_i\right) + \sum_i V_i c_i^\dagger c_i
</math></center>

The single-particle Hamiltonian in 1d reads

<center><math>
H=\begin{pmatrix}
V_1 & -t & 0 & \cdots & 0\\
-t & V_2 & -t & \cdots & 0\\
0 & -t & V_3 & \ddots & \vdots\\
\vdots & \vdots & \ddots & \ddots & -t\\
0 & 0 & \cdots & -t & V_L
\end{pmatrix}
</math></center>

For simplicity we set the hopping <math>t=1</math>. The disorder potentials <math>V_i</math> are iid random variables drawn uniformly from the box <math>\left(-\frac{W}{2},\frac{W}{2}\right)</math>.

The final goal is to study the statistical properties of the eigensystem

<center><math>
H \psi = \epsilon \psi, \quad \text{with} \quad \sum_{n=1}^L |\psi_n|^2=1
</math></center>

==Density of states (DOS)==

In 1d and in the absence of disorder, the dispersion relation is <math>\epsilon(k)=-2\cos k</math>, <math>k \in (-\pi,\pi)</math>, so that <math>-2<\epsilon(k)<2</math>. From the dispersion relation, we compute the density of states (DOS):

<center><math>
\rho(\epsilon)=\int_{-\pi}^{\pi} \frac{dk}{2\pi}\, \delta\left(\epsilon-\epsilon(k)\right)=\frac{1}{\pi}\frac{1}{\sqrt{4-\epsilon^2}} \quad \text{for } \epsilon \in (-2,2)
</math></center>

In the presence of disorder the support of the DOS becomes larger, and the DOS displays sample-to-sample fluctuations. One can consider its mean value, averaged over disorder realizations.
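These statements can be checked by diagonalizing the single-particle matrix above. The sketch below (illustrative values of <math>L</math> and <math>W</math>, not part of the lecture) verifies that the clean spectrum lies in <math>(-2,2)</math> and that with disorder the spectrum stays within the Gershgorin bound <math>(-2-W/2,\,2+W/2)</math>:

```python
import numpy as np

rng = np.random.default_rng(2)
L, W, t = 400, 2.0, 1.0

def anderson_hamiltonian(V):
    # tridiagonal single-particle Hamiltonian with Dirichlet boundaries
    return np.diag(V) - t * (np.eye(L, k=1) + np.eye(L, k=-1))

eps_clean = np.linalg.eigvalsh(anderson_hamiltonian(np.zeros(L)))
eps_dis = np.linalg.eigvalsh(anderson_hamiltonian(rng.uniform(-W/2, W/2, L)))

print(eps_clean.min(), eps_clean.max())   # inside (-2, 2)
print(eps_dis.min(), eps_dis.max())       # inside (-2 - W/2, 2 + W/2)
```

A histogram of `eps_clean` reproduces the <math>1/(\pi\sqrt{4-\epsilon^2})</math> shape above, while a histogram of `eps_dis`, averaged over several draws of the disorder, gives the mean DOS.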

==Transfer matrices and Lyapunov exponents==

==Product of random matrices==

Let's consider again the Anderson model in 1d. The eigensystem is well defined in a box of size <math>L</math> with Dirichlet boundary conditions at the extremities of the box.

Here we will solve the second-order recursion imposing instead Cauchy boundary conditions on one side of the box. Let's rewrite the previous eigensystem in the following form

<center><math>
\begin{pmatrix}\psi_{n+1}\\ \psi_{n}\end{pmatrix}=\begin{pmatrix} V_n-\epsilon & -1\\ 1 & 0 \end{pmatrix}\begin{pmatrix}\psi_{n}\\ \psi_{n-1}\end{pmatrix}
</math></center>

We can continue the recursion

<center><math>
\begin{pmatrix}\psi_{n+1}\\ \psi_{n}\end{pmatrix}=\begin{pmatrix} V_n-\epsilon & -1\\ 1 & 0 \end{pmatrix}\begin{pmatrix} V_{n-1}-\epsilon & -1\\ 1 & 0 \end{pmatrix}\begin{pmatrix}\psi_{n-1}\\ \psi_{n-2}\end{pmatrix}
</math></center>

It is useful to introduce the transfer matrix and their product

<center><math>
T_n=\begin{pmatrix} V_n-\epsilon & -1\\ 1 & 0 \end{pmatrix}, \quad \Pi_n = T_n T_{n-1}\cdots T_1
</math></center>

The Schrödinger equation can then be written as

<center><math>
\begin{pmatrix}\psi_{n+1}\\ \psi_{n}\end{pmatrix}=\Pi_n \begin{pmatrix}\psi_{1}\\ \psi_{0}\end{pmatrix}=\begin{pmatrix} \pi_{11} & \pi_{12}\\ \pi_{21} & \pi_{22}\end{pmatrix}\begin{pmatrix}\psi_{1}\\ \psi_{0}\end{pmatrix}
</math></center>
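As a sanity check (a sketch, not part of the lecture), one can verify that iterating the scalar recursion <math>\psi_{n+1}=(V_n-\epsilon)\psi_n-\psi_{n-1}</math> and multiplying the transfer matrices produce the same <math>(\psi_{n+1},\psi_n)</math>:

```python
import numpy as np

rng = np.random.default_rng(3)
n_max, eps = 50, 0.3
V = rng.uniform(-1.0, 1.0, n_max + 1)     # V[1..n_max]; V[0] is unused

# scalar recursion psi_{n+1} = (V_n - eps) psi_n - psi_{n-1}
psi = np.zeros(n_max + 2)
psi[0], psi[1] = 0.0, 1.0                 # Cauchy data on one side
for n in range(1, n_max + 1):
    psi[n + 1] = (V[n] - eps) * psi[n] - psi[n - 1]

# the same state from the product Pi_n = T_n T_{n-1} ... T_1
Pi = np.eye(2)
for n in range(1, n_max + 1):
    Pi = np.array([[V[n] - eps, -1.0], [1.0, 0.0]]) @ Pi
vec = Pi @ np.array([psi[1], psi[0]])

print(np.allclose(vec, [psi[n_max + 1], psi[n_max]]))   # True
```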

===Furstenberg theorem===

We define the norm of a <math>2\times 2</math> matrix:

<center><math>
\|\Pi_n\|^2=\pi_{11}^2+\pi_{21}^2+\pi_{12}^2+\pi_{22}^2
</math></center>

For large <math>n</math>, the Furstenberg theorem ensures the existence of a non-negative Lyapunov exponent, namely

<center><math>
\lim_{n \to \infty} \frac{\ln \|\Pi_n\|}{n}=\gamma \ge 0
</math></center>

In the absence of disorder, <math>\gamma=0</math> for <math>\epsilon \in (-2,2)</math>. Generically the Lyapunov exponent is positive, <math>\gamma>0</math>, and depends on <math>\epsilon</math> and on the distribution of the <math>V_i</math>.
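Numerically, <math>\gamma</math> is estimated by applying the transfer matrices to a vector and renormalizing it at each step, so that only the logarithm of the norm is accumulated and no overflow occurs. A minimal sketch with illustrative parameters (disorder uniform on <math>(-W/2,W/2)</math> as above):

```python
import numpy as np

def lyapunov(eps, W, n=200_000, seed=4):
    # gamma = lim (1/n) ln ||Pi_n||, accumulated on a renormalized vector
    rng = np.random.default_rng(seed)
    v = np.array([1.0, 0.0])
    log_norm = 0.0
    for Vn in rng.uniform(-W / 2, W / 2, n):
        v = np.array([(Vn - eps) * v[0] - v[1], v[0]])
        s = np.hypot(v[0], v[1])
        log_norm += np.log(s)
        v /= s
    return log_norm / n

g1, g2 = lyapunov(0.0, 1.0), lyapunov(0.0, 2.0)
print(g1, g2)   # both positive; gamma grows with the disorder strength W
```

At the band centre and for weak disorder one expects <math>\gamma</math> of order <math>W^2/96</math> (a perturbative estimate, up to a known band-centre anomaly), with which the output above is consistent.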

===Consequences===

====Localization length====

Together with the norm, <math>|\psi_n|^2</math> also grows exponentially with <math>n</math>. We can write

<center><math>
\ln |\psi_n| \approx \gamma n + \gamma_2 \chi \sqrt{n}
</math></center>

which means that <math>\ln |\psi_n|</math> performs a random walk with a drift.


However, our initial goal is a properly normalized eigenstate at energy <math>\epsilon</math>. We need to switch from Cauchy boundary conditions, where one sets the initial data on one side, to Dirichlet or von Neumann boundary conditions, where one sets the behaviour at the two boundaries. The true eigenstate is obtained by matching two "Cauchy" solutions, one on each half of the box, and imposing the normalization. Hence, we obtain a localized eigenstate and we can identify

<center><math>
\xi_{\text{loc}}(\epsilon) = \gamma^{-1}(\epsilon)
</math></center>
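One way to see the localized eigenstates directly (a sketch, not part of the lecture) is to diagonalize the disordered Hamiltonian and compute the participation ratio <math>\mathrm{PR}=1/\sum_n |\psi_n|^4</math> of each normalized eigenstate: extended states have <math>\mathrm{PR}\sim L</math>, while localized states have <math>\mathrm{PR}\sim \xi_{\text{loc}} \ll L</math>.

```python
import numpy as np

rng = np.random.default_rng(5)
L, W = 400, 3.0

def spectrum(V):
    H = np.diag(V) - (np.eye(L, k=1) + np.eye(L, k=-1))   # t = 1
    return np.linalg.eigh(H)

def participation_ratio(psi):
    # columns of psi are normalized eigenvectors; PR = 1 / sum_n |psi_n|^4
    return 1.0 / np.sum(np.abs(psi)**4, axis=0)

_, psi_clean = spectrum(np.zeros(L))
_, psi_dis = spectrum(rng.uniform(-W/2, W/2, L))

pr_clean = np.median(participation_ratio(psi_clean))   # of order L
pr_dis = np.median(participation_ratio(psi_dis))       # of order xi_loc << L
print(pr_clean, pr_dis)
```

The gap between the two medians widens as <math>W</math> grows, consistently with a localization length that shrinks with increasing disorder.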


====Fluctuations====

We expect strong fluctuations of quantities like <math>|\psi_n|, \|\Pi_n\|, G, \ldots</math>, while their logarithms are self-averaging.