From: israel@math.ubc.ca (Robert Israel)
Subject: Re: Eigenvalues of Hermitian tridiagonal matrices
Date: 29 Aug 2000 00:27:31 GMT
Newsgroups: sci.math
In article <39A58D53.D7A3F41F@swarthmore.edu>,
Mike Seifert wrote:
> Hey there everyone... I'm trying to get a handle on the absolute
>values of the eigenvalues of a matrix W having a specific form, namely:
> - W_{ij} = 0 if i=j
> - W_{ij} = a_i if i=j+1
> - W_{ji} = -a_i if i=j+1
> - W_{ij} = 0 otherwise
> where {a_i} is a set of pure imaginary numbers. In other words, this
>matrix is tridiagonal, antisymmetric and Hermitian.
> I know that the eigenvalues are real, and that their sum is zero (since
>tr W = 0.) I can also use the absolute row/column sum norm to place a
>bound on the eigenvalues, and Schur's lemma to place a limit on the sum
>of the squares. Gerschgorin's theorem doesn't help me much: since the
>main diagonal entries are all zero, it's equivalent to the norms I mentioned.
> However, none of these use the fact that the matrix is tridiagonal or
>antisymmetric. Are there any obscure/arcane linear algebra formulas out
>there that might be useful?
You might consider W^2. Its eigenvalues are of course the squares of the
eigenvalues of W. Writing a_1, ..., a_{n-1} for the subdiagonal entries,
the nonzero matrix elements of W^2 are -(a_i^2 + a_{i-1}^2) along the main
diagonal (positive real, since the a_i are pure imaginary) and a_i a_{i+1}
two diagonals over, so Gerschgorin's theorem may be useful here.
Other facts:
1) If P is the matrix whose diagonal elements are alternately +1 and -1,
then P W P^(-1) = P W P = -W. Therefore the eigenvalues of W are
symmetric about 0: if w is an eigenvalue, so is -w, with the same
multiplicity (and P transforms eigenvectors for w to eigenvectors for -w).
2) det W = 0 if W has an odd number of rows. If the number of rows n is
even, det W is the product of a_j^2 over odd j (indexing the subdiagonal
entries a_1, ..., a_{n-1}), i.e. (-1)^{n/2} times the product of |a_j|^2
over odd j.
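Both facts are easy to check numerically; here is a numpy sketch (the
magnitudes b_j are made-up sample values, a_j = i*b_j):

```python
import numpy as np

# Build the tridiagonal W with subdiagonal a_j = i*b_j, superdiagonal -a_j
def build_W(b):
    n = len(b) + 1
    W = np.zeros((n, n), dtype=complex)
    for i, bi in enumerate(b):
        W[i + 1, i] = 1j * bi      # subdiagonal  a_{i+1}
        W[i, i + 1] = -1j * bi     # superdiagonal -a_{i+1}
    return W

W = build_W([1.0, 2.0, 0.5, 3.0])  # n = 5 (odd)
n = W.shape[0]

# Fact 1: conjugating by P = diag(+1, -1, +1, ...) negates W,
# so the spectrum is symmetric about 0
P = np.diag([(-1) ** i for i in range(n)])
assert np.allclose(P @ W @ P, -W)
eig = np.sort(np.linalg.eigvalsh(W))
assert np.allclose(eig, -eig[::-1])

# Fact 2: det = 0 for odd n; for even n it is the product of a_j^2 over odd j
assert abs(np.linalg.det(W)) < 1e-9                 # n = 5, odd
W4 = build_W([1.0, 2.0, 0.5])                       # n = 4, subdiagonal a_1..a_3
expected = (1j * 1.0) ** 2 * (1j * 0.5) ** 2        # a_1^2 * a_3^2
assert np.isclose(np.linalg.det(W4), expected)
```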
Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia
Vancouver, BC, Canada V6T 1Z2
==============================================================================
From: Dave Rusin
Subject: Re: Eigenvalues of Hermitian tridiagonal matrices
Date: Thu, 24 Aug 2000 20:48:38 -0500 (CDT)
To: mseifer1@swarthmore.edu
In article <39A58D53.D7A3F41F@swarthmore.edu> you write:
> Hey there everyone... I'm trying to get a handle on the absolute
>values of the eigenvalues of a matrix W having a specific form, namely:
>
> - W_{ij} = 0 if i=j
> - W_{ij} = a_i if i=j+1
> - W_{ji} = -a_i if i=j+1
> - W_{ij} = 0 otherwise
>
> where {a_i} is a set of pure imaginary numbers. In other words, this
>matrix is tridiagonal, antisymmetric and Hermitian.
>
> I know that the eigenvalues are real, and that their sum is zero (since
>tr W = 0.) I can also use the absolute row/column sum norm to place a
>bound on the eigenvalues, and Schur's lemma to place a limit on the sum
>of the squares. Gerschgorin's theorem doesn't help me much: since the
>main diagonal entries are all zero, it's equivalent to the norms I mentioned.
>
> However, none of these use the fact that the matrix is tridiagonal or
>antisymmetric. Are there any obscure/arcane linear algebra formulas out
>there that might be useful?
Probably, but since I don't know them, I'd rather not post to the newsgroup
(not yet, anyway). Meanwhile let me point out that you can try to
analyze this algebraically: I am _not_ sure this is the best way to
discuss the eigenvalues here, but at least it's something.
Let P_n be the characteristic polynomial det(W^(n) - xI) of the upper-left
n x n minor of your matrix W. Then, expanding the defining determinant
along the last row, we get the recursion
P_n = -x P_{n-1} + (a_n)^2 P_{n-2}
As you have noted, -(a_n)^2 is a positive real number A_n; multiplying
P_n by (-1)^n to make the leading coefficient +1, we see we need to
analyze the roots of the polynomials
Q1 = x
Q2 = x^2 - A2
Q3 = x^3 - (A2+A3)x
and so on, with Q_n = x Q_{n-1} - A_n Q_{n-2}.
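A quick numpy check of this recursion (the A_n values below are made up;
note the minus sign in Q_n = x Q_{n-1} - A_n Q_{n-2}, which is what forces
the roots to be real, as they must be for a Hermitian matrix):

```python
import numpy as np

# Sample positive values A_n = -(a_n)^2 = |a_n|^2, indexed 2..n as in the post
A = {2: 1.0, 3: 4.0, 4: 0.25, 5: 9.0}
n = 1 + len(A)

# Build Q_n via the recursion Q_n = x Q_{n-1} - A_n Q_{n-2}, Q_0 = 1, Q_1 = x
x = np.poly1d([1.0, 0.0])
Q = {0: np.poly1d([1.0]), 1: x}
for k in range(2, n + 1):
    Q[k] = x * Q[k - 1] - A[k] * Q[k - 2]

# Build W itself (subdiagonal a_k = i*sqrt(A_k), superdiagonal -a_k)
W = np.zeros((n, n), dtype=complex)
for k in range(2, n + 1):
    W[k - 1, k - 2] = 1j * np.sqrt(A[k])
    W[k - 2, k - 1] = -1j * np.sqrt(A[k])

# The roots of Q_n are exactly the eigenvalues of W
assert np.allclose(np.sort(Q[n].roots.real), np.sort(np.linalg.eigvalsh(W)))
```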
From the recursion we can formulate a description of the general Q_n:
these polynomials are alternately even or odd, and have the form
x^n - (\sum A_i) x^{n-2} + (\sum_{i < j-1} A_i A_j) x^{n-4} - ...
where the coefficient of x^{n - 2k} is (-1)^k times the sum of all
products of exactly k distinct A_i's, excluding all such summands
containing two consecutive A_i's as factors. That is, you could compute
Q_n formally by expanding \prod( x^2 - A_i ) in the ring
Z[x, A2, A3, ...]/(A2 A3, A3 A4, A4 A5, ...)
(you'd get a lot of extra factors of x, which can be ignored.)
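Here is a sketch verifying that closed form against the recursion (sample
A_i values are made up): the coefficient of x^{n-2k} is (-1)^k times the
sum over k-element subsets of {2,...,n} containing no two consecutive
indices.

```python
import numpy as np
from itertools import combinations

# Made-up sample values A_2..A_6
A = {2: 1.0, 3: 4.0, 4: 0.25, 5: 9.0, 6: 2.0}
n = 1 + len(A)

# Recursion: Q_n = x Q_{n-1} - A_n Q_{n-2}
x = np.poly1d([1.0, 0.0])
Q = {0: np.poly1d([1.0]), 1: x}
for m in range(2, n + 1):
    Q[m] = x * Q[m - 1] - A[m] * Q[m - 2]

# Closed form: sum over non-consecutive k-subsets of the index set {2,...,n}
coeffs = np.zeros(n + 1)                  # coeffs[d] = coefficient of x^d
for k in range(n // 2 + 1):
    for S in combinations(range(2, n + 1), k):
        if all(j - i > 1 for i, j in zip(S, S[1:])):   # no consecutive A_i's
            prod = 1.0
            for i in S:
                prod *= A[i]
            coeffs[n - 2 * k] += (-1) ** k * prod

assert np.allclose(coeffs[::-1], Q[n].coeffs)   # poly1d stores highest power first
```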
I don't think this description beats the recursion for computing the Q_n
rapidly, but maybe it will help you or someone else compute roots of the Q's.
I don't know if this is all known to you already. It looks like a
small advance since I can one-up your observation that the roots
sum to zero: better, they occur in _pairs_ which sum to zero.
You might find the crowd at sci.math.num-analysis to be better
at matrix algebra.
dave