Goal. We introduce the Anderson model and study the statistical properties of its eigenstates. In one dimension disorder leads to localization of all eigenstates, which can be understood using products of random matrices.
Anderson model (tight-binding model)
We consider non-interacting particles hopping between nearest-neighbour sites of a lattice in the presence of disorder.
The Hamiltonian reads
$$H \;=\; \sum_i \varepsilon_i \, |i\rangle\langle i| \;-\; t \sum_{\langle i,j\rangle} |i\rangle\langle j| .$$
The random variables $\varepsilon_i$ represent the on-site disorder. For simplicity we set $t = 1$.
We assume that the disorder variables $\varepsilon_i$ are independent and identically distributed, uniformly in the interval $[-W/2,\, W/2]$, where $W$ measures the disorder strength.
In one dimension, the single-particle Hamiltonian is represented by the tridiagonal matrix
$$H = \begin{pmatrix}
\varepsilon_1 & -1 & & \\
-1 & \varepsilon_2 & -1 & \\
 & -1 & \varepsilon_3 & \ddots \\
 & & \ddots & \ddots
\end{pmatrix}.$$
We study the statistical properties of the eigenvalue problem $H\psi = E\,\psi$.
In the one-particle sector, this becomes a discrete Schrödinger equation.
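These definitions are easy to explore numerically. The following sketch (Python/NumPy; the system size and disorder strength are arbitrary illustrative choices, not values from the notes) builds the tridiagonal Hamiltonian with $t = 1$, diagonalizes it, and verifies the discrete Schrödinger equation at an interior site:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # number of lattice sites (arbitrary)
W = 2.0        # disorder strength: eps_n uniform in [-W/2, W/2]
eps = rng.uniform(-W / 2, W / 2, size=N)

# Tridiagonal Anderson Hamiltonian with hopping t = 1 and open boundaries
H = np.diag(eps) - np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)

energies, states = np.linalg.eigh(H)

# Check the discrete Schroedinger equation at an interior site:
# eps[n] psi[n] - psi[n+1] - psi[n-1] = E psi[n]
psi, E = states[:, 0], energies[0]
n = N // 2
lhs = eps[n] * psi[n] - psi[n + 1] - psi[n - 1]
assert abs(lhs - E * psi[n]) < 1e-10
```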
---
Density of states
Without disorder ($\varepsilon_n = 0$) the dispersion relation is
$$E(k) = -2\cos k .$$
The energy band is therefore $E \in [-2, 2]$.
The density of states is
$$\rho(E) = \frac{1}{\pi\sqrt{4 - E^2}}, \qquad |E| < 2 .$$
In the presence of disorder the density of states broadens and becomes sample dependent.
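The clean density of states can be checked against exact diagonalization. A small sketch (parameters are arbitrary choices) counts the eigenvalues of the disorder-free chain in a window around the band centre and compares with $\rho(0) = 1/2\pi$:

```python
import numpy as np

N = 2000
# Disorder-free chain: spectrum E(k) = -2 cos k, band [-2, 2]
H0 = -np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)
E0 = np.linalg.eigvalsh(H0)

# All levels lie inside the band
assert E0.min() > -2.0 and E0.max() < 2.0

# Fraction of levels in a window around E = 0 versus the prediction
# 2 * window * rho(0), with rho(0) = 1/(2 pi)
window = 0.2
frac = np.mean(np.abs(E0) < window)
pred = 2 * window / (2 * np.pi)
assert abs(frac - pred) < 0.01
```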
Transfer matrices
The discrete Schrödinger equation reads
$$\psi_{n+1} + \psi_{n-1} = (\varepsilon_n - E)\,\psi_n .$$
It can be rewritten as
$$\begin{pmatrix} \psi_{n+1} \\ \psi_n \end{pmatrix} = T_n \begin{pmatrix} \psi_n \\ \psi_{n-1} \end{pmatrix},$$
with
$$T_n = \begin{pmatrix} \varepsilon_n - E & -1 \\ 1 & 0 \end{pmatrix}.$$
Iterating gives
$$\begin{pmatrix} \psi_{N+1} \\ \psi_N \end{pmatrix} = T_N T_{N-1} \cdots T_1 \begin{pmatrix} \psi_1 \\ \psi_0 \end{pmatrix}.$$
Thus the wavefunction is controlled by a product of random matrices.
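As a consistency check, one can verify numerically that the product of transfer matrices reproduces the directly iterated recursion (a sketch with arbitrary parameter values):

```python
import numpy as np

rng = np.random.default_rng(1)
N, W, E = 50, 1.0, 0.3
eps = rng.uniform(-W / 2, W / 2, size=N)

# Direct recursion: psi[n+1] = (eps[n] - E) psi[n] - psi[n-1]
psi = np.zeros(N + 1)
psi[0], psi[1] = 0.0, 1.0          # initial data of the Cauchy problem
for n in range(1, N):
    psi[n + 1] = (eps[n] - E) * psi[n] - psi[n - 1]

# Same endpoint from the product of transfer matrices T_{N-1} ... T_1
v = np.array([1.0, 0.0])           # (psi[1], psi[0])
for n in range(1, N):
    T = np.array([[eps[n] - E, -1.0], [1.0, 0.0]])
    v = T @ v
assert np.allclose(v, [psi[N], psi[N - 1]])
```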
Remark
The transfer matrix formulation rewrites the problem as a recursion, starting from initial data and propagating the solution along the chain.
This corresponds to a Cauchy problem rather than a boundary value problem. Nevertheless, the growth properties of the solutions contain direct information about the nature of the eigenstates: exponential growth is associated with localization, while bounded solutions correspond to extended states.
Furstenberg theorem (physical formulation)
The exponential growth of the product of random matrices can be understood as a matrix analogue of the law of large numbers. A set of simple sufficient conditions for this behavior is provided by a theorem of Harry Furstenberg.
To quantify this growth, we introduce the norm of a matrix as
$$\|M\| = \max_{|x| = 1} |M x| .$$
In physical terms, the mechanism behind the theorem relies on two key ingredients:
(i) Stretching
The transfer matrices must be able, with nonzero probability, to expand vectors.
More precisely, there exist realizations of $T_n$ such that for some direction $u$,
$$|T_n u| > |u| , \qquad \text{i.e.} \qquad \|T_n\| > 1 .$$
Equivalently, $\|M\|^2$ is the largest eigenvalue of $M^{\mathsf T} M$.
This ensures that typical products contain episodes of exponential amplification.
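This characterization of the norm is easy to verify numerically: the spectral norm coincides with the square root of the largest eigenvalue of $M^{\mathsf T} M$ (a short NumPy check on a random matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((2, 2))

# ||M|| = max_{|x|=1} |M x| is the largest singular value of M,
# i.e. the square root of the largest eigenvalue of M^T M
lam_max = np.linalg.eigvalsh(M.T @ M).max()
assert np.isclose(np.linalg.norm(M, 2), np.sqrt(lam_max))
```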
(ii) Mixing of directions
At each step, the transfer matrix changes the direction of the vector $(\psi_n, \psi_{n-1})$. Let $\theta_n$ denote this direction.
In the clean case, the same matrix is applied at every step, so the angular dynamics is deterministic and can be written as
$$\theta_{n+1} = F(\theta_n) .$$
Once the initial direction is fixed, the whole sequence is fixed. The system retains a perfect memory of its initial orientation.
In the disordered case, the transfer matrix varies from step to step, and the angular dynamics becomes
$$\theta_{n+1} = F_{\varepsilon_n}(\theta_n) ,$$
where the map now depends on the random on-site energy. The direction is then continuously reshuffled and progressively loses memory of its initial value.
More importantly, there is no finite set of directions that is invariant under all transfer matrices. As a consequence, the dynamics does not get trapped into special directions and effectively explores the projective space.
This absence of invariant directions is what is meant by mixing of directions.
Consequence
Under these conditions, the norm of the product grows exponentially with probability one:
$$\lim_{N\to\infty} \frac{1}{N}\,\ln \| T_N T_{N-1} \cdots T_1 \| = \gamma > 0 ,$$
where $\gamma$ is the Lyapunov exponent.
This result is the matrix analogue of the fact that the logarithm of a product of independent random variables becomes self-averaging (exercise 15).
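The Lyapunov exponent can be estimated by propagating a generic vector through the chain and renormalizing it at every step, accumulating the logarithms of the norms so that no overflow occurs. A minimal sketch (the function name and parameter values are illustrative):

```python
import numpy as np

def lyapunov(E, W, N=200_000, seed=0):
    """Estimate gamma = lim (1/N) ln ||T_N ... T_1|| by propagating a
    generic vector and renormalizing it at every step."""
    rng = np.random.default_rng(seed)
    v = np.array([1.0, 0.0])
    log_norm = 0.0
    for eps in rng.uniform(-W / 2, W / 2, size=N):
        v = np.array([(eps - E) * v[0] - v[1], v[0]])   # apply T_n
        s = np.hypot(v[0], v[1])
        log_norm += np.log(s)
        v /= s                                          # keep |v| = 1
    return log_norm / N

gamma = lyapunov(E=0.0, W=1.0)
assert gamma > 0    # even weak disorder gives a positive exponent
```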
Verification of Furstenberg's hypotheses for the Anderson model
We now check explicitly that the transfer matrices of the one-dimensional Anderson model satisfy the two conditions stated above.
The transfer matrices are
$$T_n = \begin{pmatrix} \varepsilon_n - E & -1 \\ 1 & 0 \end{pmatrix}.$$
(i) Stretching
We first verify that the matrices expand at least one direction with nonzero probability.
For a generic matrix $M$, the maximal stretching factor is controlled by the largest eigenvalue of $M^{\mathsf T} M$. Here
$$T_n^{\mathsf T} T_n = \begin{pmatrix} (\varepsilon_n - E)^2 + 1 & -(\varepsilon_n - E) \\ -(\varepsilon_n - E) & 1 \end{pmatrix}.$$
The trace is $(\varepsilon_n - E)^2 + 2$ and the determinant is $1$.
Therefore the eigenvalues satisfy $\lambda_+ \lambda_- = 1$ and the largest one obeys $\lambda_+ \geq 1$.
More precisely, as soon as $\varepsilon_n \neq E$, one has $\lambda_+ > 1$, i.e. $\|T_n\| > 1$.
Thus the matrices expand at least one direction except for the fine-tuned case $\varepsilon_n = E$, which has zero probability for a continuous disorder distribution. The stretching condition is therefore satisfied almost surely.
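The algebra above can be confirmed numerically: for random draws of the on-site energy, the largest eigenvalue of $T_n^{\mathsf T} T_n$ exceeds $1$ whenever $\varepsilon_n \neq E$ (a sketch with arbitrary $E$ and $W$):

```python
import numpy as np

rng = np.random.default_rng(3)
E, W = 0.5, 2.0
for eps in rng.uniform(-W / 2, W / 2, size=100):
    a = eps - E
    T = np.array([[a, -1.0], [1.0, 0.0]])
    G = T.T @ T
    # trace a^2 + 2, determinant 1, as computed above
    assert np.isclose(np.trace(G), a**2 + 2)
    assert np.isclose(np.linalg.det(G), 1.0)
    # stretching: the largest eigenvalue exceeds 1 whenever eps != E
    assert np.linalg.eigvalsh(G).max() > 1.0 or np.isclose(a, 0.0)
```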
(ii) Mixing of directions
We now verify that the angular dynamics is not confined to a finite set of directions.
Define the angle $\theta_n$ by $\tan\theta_n = \psi_{n-1}/\psi_n$, which represents the direction of the vector $(\psi_n, \psi_{n-1})$.
The transfer matrix induces the map
$$\tan\theta_{n+1} = \frac{1}{\varepsilon_n - E - \tan\theta_n} .$$
For fixed $\theta_n$, this expression depends continuously on the random variable $\varepsilon_n$. If the disorder has a continuous distribution (such as the uniform distribution assumed here), the image of a given angle is not restricted to a finite set.
As a consequence, the sequence $(\theta_n)$ is not confined to a finite set of directions. More precisely, there is no finite collection of directions that is invariant under all transfer matrices.
The randomness of $\varepsilon_n$ continuously reshuffles the direction and prevents the dynamics from being trapped into special directions. This ensures that the projective dynamics effectively explores the angular space.
This is what is meant by mixing of directions in the present context.
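The reshuffling of directions can be seen by iterating the angular map for a random sequence of on-site energies: the visited angles spread over the whole projective line. A sketch (parameter values are illustrative; $E$ is taken slightly away from the band centre to avoid special cases):

```python
import numpy as np

rng = np.random.default_rng(4)
E, W, N = 0.2, 1.0, 10_000

theta = 0.3                        # arbitrary initial direction
angles = np.empty(N)
for i, eps in enumerate(rng.uniform(-W / 2, W / 2, size=N)):
    # induced map: tan(theta') = 1 / (eps - E - tan(theta))
    theta = np.arctan(1.0 / (eps - E - np.tan(theta)))
    angles[i] = theta

# The visited directions fill the projective line: every bin of a
# coarse histogram over (-pi/2, pi/2) is populated
hist, _ = np.histogram(angles, bins=12, range=(-np.pi / 2, np.pi / 2))
assert np.all(hist > 0)
```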
Conclusion
The transfer matrices of the one-dimensional Anderson model satisfy both conditions:
- they expand at least one direction with nonzero probability;
- they do not confine the angular dynamics to a finite set.
As a consequence,
$$\gamma = \lim_{N\to\infty} \frac{1}{N}\,\ln \| T_N \cdots T_1 \| > 0 ,$$
so the Lyapunov exponent is positive for generic disorder.
---
Localization length
The transfer-matrix recursion corresponds to fixing the wavefunction at one boundary and propagating it through the system.
Typical solutions grow exponentially,
$$|\psi_n| \sim e^{\gamma n} .$$
However a physical eigenstate must satisfy boundary conditions at both ends of the system. Matching two such solutions, one growing from each end, leads to exponentially localized eigenstates,
$$|\psi_n| \sim e^{-|n - n_0|/\xi} ,$$
centred at some site $n_0$. The localization length is
$$\xi = \frac{1}{\gamma} .$$
Thus in one dimension arbitrarily weak disorder localizes all eigenstates. This result is consistent with the scaling theory of localization discussed earlier, which predicts that for $d \leq 2$ disorder inevitably drives the system toward the insulating regime.
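Exponential localization is visible in exact diagonalization: the inverse participation ratio of an eigenstate, $\mathrm{PR} = 1/\sum_n |\psi_n|^4$, is of the order of $\xi$ and stays far below the system size. A sketch (the disorder strength is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(5)
N, W = 400, 4.0
eps = rng.uniform(-W / 2, W / 2, size=N)
H = np.diag(eps) - np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)
energies, states = np.linalg.eigh(H)

# Inverse participation ratio of the eigenstate nearest the band centre:
# a state spread over ~xi sites has PR = 1/sum|psi|^4 ~ xi << N
k = np.argmin(np.abs(energies))
psi = states[:, k]
pr = 1.0 / np.sum(psi**4)
assert pr < N / 5      # localized: supported on far fewer than N sites
```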
---
Fluctuations
Quantities such as
$$|\psi_N|^2 \sim e^{2\gamma N}$$
show strong sample-to-sample fluctuations, while their logarithm is self-averaging.
For instance
$$\ln|\psi_N| = \sum_{n=1}^{N} \ln\left|\frac{\psi_n}{\psi_{n-1}}\right| \simeq \gamma N + O(\sqrt{N}) ,$$
so that the logarithm of the wavefunction performs a random walk with a positive drift.
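The contrast between the self-averaging logarithm and the wildly fluctuating amplitude can be seen by sampling $\ln|\psi_N|$ over many disorder realizations (a sketch; parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
E, W, N, samples = 0.0, 2.0, 2000, 200

def log_psi(eps):
    """ln|psi_N| for one disorder realization, rescaling the two-term
    recursion at every step to avoid floating-point overflow."""
    prev, cur, log = 0.0, 1.0, 0.0
    for e in eps:
        prev, cur = cur, (e - E) * cur - prev
        s = abs(prev) + abs(cur)
        log += np.log(s)
        prev, cur = prev / s, cur / s
    return log + np.log(abs(cur))

logs = np.array([log_psi(rng.uniform(-W / 2, W / 2, size=N))
                 for _ in range(samples)])

# ln|psi_N| ~ gamma N + O(sqrt N): its relative fluctuations are small,
# while |psi_N|^2 = exp(2 ln|psi_N|) varies over many orders of magnitude
assert logs.mean() > 0
assert logs.std() < 0.3 * logs.mean()
```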