Web Graphs — PageRank-like Importance Measures

Appendix A — Perron-Frobenius Theorem

In 1907, Oskar Perron (1880-1975) published a theory of strictly positive matrices, which Georg Ferdinand Frobenius (1849-1917) extended in 1908, 1909, and 1912 to the case of nonnegative matrices [1]. The Perron-Frobenius theorem, which summarizes this theory, is in a sense the cornerstone of most convergence algorithms for stochastic matrices, and in particular of PageRank algorithms. We therefore thought it worthwhile to include a proof in this appendix: in addition to guaranteeing the convergence of the algorithms, the concept of flow is inherent to the proof (the strict inequality propagation lemma).

Theorem A.1 (Perron-Frobenius).

Let $A$ be an irreducible nonnegative square matrix of size $n$. Then, there exists an eigenvalue of $A$, denoted $\lambda_{\max}$, strictly positive, which has the following properties:

  1. There exist a left eigenvector $y^*$ and a right eigenvector $x^*$ associated with $\lambda_{\max}$ that are strictly positive.
  2. For any eigenvalue $\lambda$ of $A$, $|\lambda| \le \lambda_{\max}$.
  3. The eigenspace associated with $\lambda_{\max}$ has dimension $1$.
  4. For any nonnegative matrix $B$ less than or equal to $A$ (entrywise), and for any eigenvalue $\mu$ of $B$, $|\mu| \le \lambda_{\max}$, with equality only if $B = A$.
  5. If $A$ has cyclicity $k$, then the eigenvalues of $A$ with modulus $\lambda_{\max}$ are exactly the $\lambda_{\max}\,\omega^j$ for $j = 0, 1, \dots, k-1$, where $\omega = e^{2i\pi/k}$, and the associated eigenspaces have dimension $1$.
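Before turning to the proof, the first three properties can be checked numerically; the matrix below is an arbitrary irreducible example chosen purely for illustration, not one taken from the text:

```python
import numpy as np

# An arbitrary irreducible nonnegative matrix (illustrative choice only).
A = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 3.0],
              [2.0, 1.0, 0.0]])

eigvals, eigvecs = np.linalg.eig(A)
idx = int(np.argmax(eigvals.real))
lam_max = eigvals.real[idx]          # the Perron eigenvalue: real, positive
assert lam_max > 0

# Property 2: every eigenvalue has modulus at most lam_max.
assert all(abs(l) <= lam_max + 1e-9 for l in eigvals)

# Property 1: the associated eigenvector can be scaled strictly positive.
x = eigvecs[:, idx].real
x = x / x.sum()                      # normalize (also fixes an overall sign)
assert (x > 0).all()

# Property 3: lam_max is a simple eigenvalue (it appears exactly once).
assert sum(abs(l - lam_max) < 1e-8 for l in eigvals) == 1
```

The same check fails for reducible matrices (e.g. block-diagonal ones), where the Perron eigenvector need not be strictly positive.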

Proof.

To prove the Perron-Frobenius theorem, we will need the following lemma, which we shall call the strict inequality propagation lemma.

Lemma A.1.

If $A$ is a nonnegative irreducible matrix of size $n$, and if $u$ and $v$ are two nonnegative vectors such that $u \ge v$, with at least one component for which the inequality is strict, then $(I+A)^{n-1}u > (I+A)^{n-1}v$ (componentwise).

Indeed, let $i_0$ be a component for which the inequality is strict. Since the matrix is irreducible, for each component $i$, there exists $k \le n-1$ such that $(A^k)_{i\,i_0} > 0$. We deduce that $A^k u \ge A^k v$, with strict inequality at component $i$. By using the operator $(I+A)^{n-1} = \sum_{k=0}^{n-1} \binom{n-1}{k} A^k$, we ensure that each component will “benefit” by propagation from the strict inequality present at $i_0$. In other words, $(I+A)^{n-1}u > (I+A)^{n-1}v$.
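The lemma can be illustrated numerically: a single strict inequality in $u \ge v$ spreads to every component after applying $(I+A)^{n-1}$. The directed cycle below is an arbitrary choice of irreducible matrix, not one from the text:

```python
import numpy as np

n = 4
# Directed cycle on 4 vertices: a simple irreducible nonnegative matrix.
A = np.eye(n, k=1)
A[-1, 0] = 1.0

u = np.array([1.0, 1.0, 1.0, 1.0])
v = np.array([1.0, 1.0, 1.0, 0.5])   # u >= v, strict at the last component only

# (I + A)^(n-1) propagates the single strict inequality to every component.
M = np.linalg.matrix_power(np.eye(n) + A, n - 1)
assert (M @ u > M @ v).all()
```

The exponent $n-1$ matters: in this cycle the farthest vertex is at distance $n-1$, so a smaller power of $I+A$ would leave some components with equality.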

With this lemma in hand, we can now proceed to the proof itself.

  1. Consider the set $S$ of nonnegative vectors in $\mathbb{R}^n$ with norm $1$ (we use the $1$-norm: if $x = (x_1, \dots, x_n)$, $\|x\|_1 = \sum_{i=1}^n |x_i|$).

    For each $x$ in $S$, we set

    $$\lambda(x) = \min_{i \,:\, x_i > 0} \frac{(Ax)_i}{x_i}.$$

    Clearly, $\lambda(x)$ is a nonnegative real number, since it is bounded below by $0$ and above by $\max_i \sum_j a_{ij}$. Now consider $\lambda_{\max} = \sup_{x \in S} \lambda(x)$. $\lambda_{\max}$ is a (finite) strictly positive real number, since we have the bound:

    $$\lambda_{\max} \;\ge\; \lambda\!\left(\tfrac{1}{n}(1, \dots, 1)\right) \;=\; \min_i \sum_j a_{ij} \;>\; 0.$$

    To show that $\lambda_{\max}$ is an eigenvalue, consider a sequence $(x_m)$ of elements of $S$ such that $\lambda(x_m)$ converges to $\lambda_{\max}$. Since $S$ is compact, it is possible (by the Bolzano-Weierstrass theorem) to extract a subsequence that converges to a vector $x^*$.

    $x^*$ is a (right) eigenvector of $A$, associated with the eigenvalue $\lambda_{\max}$. Indeed, we have $Ax^* \ge \lambda_{\max} x^*$. If equality does not hold, then by Lemma A.1,

    $$A\,(I+A)^{n-1}x^* \;>\; \lambda_{\max}\,(I+A)^{n-1}x^*,$$

    which contradicts [2] the maximality of $\lambda_{\max}$.

    We therefore have $Ax^* = \lambda_{\max}x^*$, which proves that $x^*$ is a (right) eigenvector of $A$, associated with $\lambda_{\max}$. The strict positivity of $x^*$ is guaranteed by the strict inequality propagation lemma.

    By reasoning with ${}^tA$, we find an eigenvalue $\lambda'_{\max}$ associated with a strictly positive left eigenvector $y^*$ of $A$. To prove property 1, it suffices to show that $\lambda'_{\max} = \lambda_{\max}$. Without loss of generality, by possibly interchanging $A$ and ${}^tA$, assume $\lambda'_{\max} \le \lambda_{\max}$. It then suffices to take $x$ such that $Ax = \lambda_{\max}x$ [3] and to observe that $\lambda'_{\max}\,{}^ty^*x = {}^ty^*Ax = \lambda_{\max}\,{}^ty^*x$, which forces $(\lambda_{\max} - \lambda'_{\max})\,{}^ty^*x = 0$ with ${}^ty^*x > 0$, hence equality.
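The variational quantity used in this step (the minimum of $(Ax)_i/x_i$ over the support of $x$, whose supremum over the simplex is the Perron eigenvalue) is easy to probe numerically. The $2 \times 2$ matrix below is an arbitrary illustrative choice:

```python
import numpy as np

# Arbitrary 2x2 irreducible example (entries chosen for illustration only).
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])

def lam(x):
    """Largest t with A x >= t x componentwise: min of (Ax)_i / x_i over supp x."""
    s = x > 0
    return ((A @ x)[s] / x[s]).min()

w, V = np.linalg.eig(A)
lam_max = max(w.real)                # Perron eigenvalue, here 1 + sqrt(6)

# lam(x) <= lam_max for any nonnegative trial vector...
rng = np.random.default_rng(0)
for _ in range(1000):
    assert lam(rng.random(2)) <= lam_max + 1e-9

# ...and the supremum is attained at the Perron eigenvector.
x_star = V[:, np.argmax(w.real)].real
x_star = x_star / x_star.sum()       # normalize to the simplex
assert abs(lam(x_star) - lam_max) < 1e-8
```

This is the same quantity that makes the compactness argument work: random trial vectors never exceed the supremum, and the limit vector attains it.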

  2. The proof is the same as for showing that $\lambda(x) \le \lambda_{\max}$. If $Az = \lambda z$, with $z \neq 0$, then $A|z| \ge |\lambda|\,|z|$ (where $|z|$ denotes the vector of moduli $(|z_1|, \dots, |z_n|)$), hence $|\lambda| \le \lambda_{\max}$.

  3. The fact that $\lambda_{\max}$ is a simple eigenvalue is also proved using the strict inequality propagation lemma. Indeed, if the eigenspace has dimension greater than or equal to 2, there exists an eigenvector $z$ (which may be chosen real) not collinear with $x^*$. The vector $x^* - \theta z$, with $\theta = \min_{i \,:\, z_i > 0} x^*_i / z_i$, is also an eigenvector associated with $\lambda_{\max}$. By construction, it is nonzero, nonnegative, and at least one of its components is zero. By propagation of strict inequality, iterations of $(I+A)$ will make this component strictly positive, which is a contradiction and proves that $\lambda_{\max}$ is a simple eigenvalue.

  4. Same proof as property 2: if $Bz = \mu z$, with $z \neq 0$, then $A|z| \ge B|z| \ge |\mu|\,|z|$, hence $|\mu| \le \lambda_{\max}$.

    Equality case: $|\mu| = \lambda_{\max}$ forces $A|z| = B|z| = \lambda_{\max}|z|$ (otherwise the propagation lemma would contradict the maximality of $\lambda_{\max}$). $|z|$ is therefore strictly positive, since it is collinear with $x^*$. $A - B$ is a nonnegative matrix satisfying $(A - B)|z| = 0$, hence $B = A$.
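This strict monotonicity is easy to observe numerically: lowering a single entry of an irreducible nonnegative matrix strictly lowers the spectral radius. The matrices below are arbitrary illustrative choices:

```python
import numpy as np

# A: adjacency matrix of the complete graph K3 (spectral radius exactly 2).
A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
B = A.copy()
B[0, 1] = 0.5                        # B <= A entrywise and B != A

rho = lambda M: max(abs(np.linalg.eigvals(M)))
assert abs(rho(A) - 2.0) < 1e-9
assert rho(B) < rho(A)               # strict inequality, as property 4 predicts
```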

  5. Without loss of generality, by considering $A / \lambda_{\max}$, we may assume that $\lambda_{\max} = 1$. Consider the following equivalence relation on the components: two components $i$ and $j$ are equivalent if there exists a component $l$ such that $a_{il}$ and $a_{jl}$ are strictly positive, or a component $l$ such that $a_{li}$ and $a_{lj}$ are strictly positive (the relation is then extended by transitivity). Then, the cyclicity $k$ of $A$ equals the number of equivalence classes among the components. Indeed, the cyclicity of $A$ corresponds to the cyclicity of the underlying graph (the vertices corresponding to the components and the edges to the nonzero coefficients $a_{ij}$). Since the graph is strongly connected (because the matrix is irreducible), this cyclicity is found by placing in the same equivalence class the successors and predecessors of each component.

    Let $C_0$, …, $C_{k-1}$ be the equivalence classes for the predecessor/successor relation, arranged in order of succession (the successors of the components of $C_j$ lie in $C_{j+1 \bmod k}$).

    • If $\omega$ is a $k$-th root of unity, then $\omega$ is an eigenvalue of $A$, and an associated eigenvector is $x^\omega$, with

      $$x^\omega_i = \omega^j x^*_i \quad \text{for } i \in C_j.$$

      Indeed, for $i$ in $C_j$, we have

      $$(Ax^\omega)_i \;=\; \sum_{l \in C_{j+1}} a_{il}\,\omega^{j+1} x^*_l \;=\; \omega^{j+1} (Ax^*)_i \;=\; \omega^{j+1} x^*_i \;=\; \omega\, x^\omega_i.$$

    • Let $\lambda$ be an eigenvalue of modulus $1$, and $z$ an associated eigenvector. We then write:

      $$A|z| \;\ge\; |Az| \;=\; |\lambda z| \;=\; |z|.$$

      In fact, equality must hold, otherwise by the strict inequality propagation lemma, $1$ would no longer be the maximal eigenvalue. Note in passing that $|z|$ is collinear with $x^*$. For complex numbers, the absolute value of a sum equals the sum of the absolute values only if all (nonzero) elements have the same phase. Since the coefficients of $A$ are nonnegative, this implies that all predecessors of a given component under $A$ have the same phase. Moreover, by construction, the successors also have the same phase. We can deduce that all components in the same equivalence class have the same phase. Furthermore, if $\varphi_j$ is the phase of class $C_j$, then $\varphi_{j+1} = \varphi_j + \arg(\lambda) \pmod{2\pi}$. By cyclicity, and using the convention $\varphi_k = \varphi_0$, we have:

      $$k \arg(\lambda) \;=\; \varphi_k - \varphi_0 \;\equiv\; 0 \pmod{2\pi}.$$

      $\lambda$ is therefore a $k$-th root of unity.

      Finally, note that given everything stated above, the vector $z$ is collinear with the vector $x^\lambda$ defined by

      $$x^\lambda_i = \lambda^j x^*_i \quad \text{for } i \in C_j.$$

      The eigenspace associated with $\lambda$ therefore has dimension $1$.

      Q.E.D.
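Property 5 can be sketched numerically on an arbitrary example: a weighted directed $k$-cycle has cyclicity $k$, and its eigenvalues of maximal modulus are exactly $\lambda_{\max}$ times the $k$-th roots of unity. The weight $2$ below is an arbitrary choice:

```python
import numpy as np

k = 3
# Weighted directed k-cycle: irreducible, cyclicity k, Perron eigenvalue 2.
P = np.zeros((k, k))
for j in range(k):
    P[j, (j + 1) % k] = 2.0

w = np.linalg.eigvals(P)
lam_max = max(abs(w))
assert abs(lam_max - 2.0) < 1e-9

# The eigenvalues of maximal modulus are exactly lam_max * (k-th roots of unity).
roots = lam_max * np.exp(2j * np.pi * np.arange(k) / k)
for l in w:
    if abs(abs(l) - lam_max) < 1e-9:
        assert min(abs(l - r) for r in roots) < 1e-9
```

For this cycle every eigenvalue has maximal modulus; adding a chord to the cycle would reduce the cyclicity and leave only some of the roots on the maximal circle.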


  1. Helmut Wielandt (1910-2001) presented a simpler approach to the problem in 1950, which is the one used nowadays.
  2. Up to normalizing the vector $(I+A)^{n-1}x^*$ so that it lies in $S$.
  3. $\lambda_{\max}$ is an eigenvalue of $A$, so there exists an associated right eigenvector.