I just came across a matrix of the form
$A:=\begin{pmatrix}
0&-\frac{c_0}{b_0}&0&\cdots&0\\-\frac{a_1}{b_1}&0&-\frac{c_1}{b_1}&\cdots&0\\0&-\frac{a_2}{b_2}&0&\ddots&\vdots\\ \vdots&\vdots&\ddots&\ddots&-\frac{c_{N-1}}{b_{N-1}}\\0&\cdots&\cdots&-\frac{a_N}{b_N}&0
\end{pmatrix}$ for some $N\in \mathbb{Z}^+$, where $a_n=-\frac{1}{2}\alpha(\beta^2n^2-rn)$, $b_n=1+\alpha(\beta^2n^2+r)$, and $c_n=-\frac{1}{2}\alpha(\beta^2n^2+rn)$, such that $\alpha, \beta, r$ are known real constants.

From the Gershgorin circle theorem, I know that all of its eigenvalues, in particular the maximum one, must lie in the union of the Gershgorin discs. However, despite the matrix being quite sparse, I could not get an explicit formula for its maximum eigenvalue.
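As a numerical sanity check of the Gershgorin bound, here is a small sketch; the values of $\alpha$, $\beta$, $r$, and $N$ below are placeholders, not from the original problem:

```python
import numpy as np

# Placeholder constants -- the real alpha, beta, r come from the application.
alpha, beta, r = 0.1, 0.5, 0.3
N = 10  # the matrix is (N+1) x (N+1)

n = np.arange(N + 1)
a = -0.5 * alpha * (beta**2 * n**2 - r * n)
b = 1 + alpha * (beta**2 * n**2 + r)
c = -0.5 * alpha * (beta**2 * n**2 + r * n)

A = np.zeros((N + 1, N + 1))
for i in range(N):
    A[i, i + 1] = -c[i] / b[i]          # superdiagonal: -c_n / b_n
    A[i + 1, i] = -a[i + 1] / b[i + 1]  # subdiagonal:   -a_n / b_n

# Gershgorin: every eigenvalue lies in a disc centered at A[i, i] = 0
# whose radius is the sum of absolute off-diagonal entries in that row.
radius = np.abs(A).sum(axis=1).max()
eigs = np.linalg.eigvals(A)
print(np.abs(eigs).max() <= radius + 1e-12)  # True
```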

I have tried solving the equation $Av=\lambda v$, where $v$ is an eigenvector and $\lambda$ is an eigenvalue, from which I obtain a recurrence relation, but I don't have an initial or boundary condition for it. The equation $\det(\lambda I-A)=0$, where $I$ is the identity matrix, also gives me a complicated equation that I can't solve.

Can anyone tell me what I have missed, or is this problem impossible to solve?

I think you got the penultimate entry in the rightmost column wrong. And probably the bottom row, too, unless your matrix is (N+1)x(N+1). But why bother with -a_i/b_i, -c_i/b_i etc at all? Why not just write a_i and c_i, and be done with it? It's the same problem.
–
TonyK May 27 '10 at 23:23

I think the problem TonyK suggests is a little bit more complicated, because the recurrence relations are more general. In another direction, I would like to mention that a particular and interesting case of the above matrices occurs when the matrix is symmetric. Probably you know, but if not: these are called Jacobi matrices and are related to orthogonal polynomials and Riemann-Hilbert problems. Percy Deift's book has a nice exposition of this: Orthogonal Polynomials and Random Matrices: A Riemann-Hilbert Approach.
–
Leandro May 27 '10 at 23:59

Leandro: TonyK merely suggested making the problem notationally nicer. They are the same problem. (For instance, to explicitly get from the original version to the new formulation, take the special case where all denominators are -1.) @unknown (google): I'm also wondering whether the entries are real or complex. I hope you don't mind that I incorporated TonyK's suggestions. If you want, you can revert the edit.
–
Jonas Meyer May 28 '10 at 1:06

Sorry, my question was a bit unclear. I hope it makes more sense now why I wrote -a_i/b_i, etc. I agree that it can be simplified according to TonyK. TonyK: My matrix is, in fact, (N+1)x(N+1).
–
user6358 May 28 '10 at 8:59

2 Answers

Let me try to expand the problem a little bit (it's too long for a usual comment).

Consider the determinant $D_N=D_N(\lambda;a_1,\dots,a_{N-1};b_1,\dots,b_{N-1})$ of the corresponding matrix $\lambda I-A$. Expanding the determinant along the first row gives
$$
D_N(\lambda;a_1,\dots,a_{N-1};b_1,\dots,b_{N-1})
=\lambda D_{N-1}(\lambda;a_2,\dots,a_{N-1};b_2,\dots,b_{N-1})
-a_1b_1D_{N-2}(\lambda;a_3,\dots,a_{N-1};b_3,\dots,b_{N-1});
$$
in other words,
$$
D_N/D_{N-1}=\lambda-\frac{a_1b_1}{D_{N-1}/D_{N-2}}
=\dots
=\lambda-\frac{a_1b_1}{\lambda-\dfrac{a_2b_2}{\lambda-\dfrac{a_3b_3}{\dots
-\dfrac{a_{N-1}b_{N-1}}{\lambda}}}}.
$$
In order to get some information about the asymptotics of the zero(s) of $D_N(\lambda)/D_{N-1}(\lambda)$, one really has to have some knowledge about the $a_ib_i$, $i=1,2,\dots$. This reduces the problem to one about the related family of orthogonal polynomials, and even if Deift's book is rather advanced, it is the best source on this.
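The continued-fraction identity above can be verified numerically for a small $N$; the off-diagonal data below are arbitrary placeholders, and $\lambda$ is taken large enough that no denominator in the continued fraction vanishes:

```python
import numpy as np

# Arbitrary placeholder data for a small test case.
rng = np.random.default_rng(0)
N = 6
lam = 4.0
a = rng.uniform(0.5, 1.5, N - 1)  # subdiagonal entries a_1..a_{N-1}
b = rng.uniform(0.5, 1.5, N - 1)  # superdiagonal entries b_1..b_{N-1}

A = np.diag(b, 1) + np.diag(a, -1)
D_N = np.linalg.det(lam * np.eye(N) - A)
# D_{N-1}(lam; a_2,...; b_2,...) is the minor dropping the first row and column.
D_N1 = np.linalg.det(lam * np.eye(N - 1) - A[1:, 1:])

# Evaluate lam - a1*b1/(lam - a2*b2/(... - a_{N-1}b_{N-1}/lam)) bottom-up.
cf = lam
for p in (a * b)[::-1]:
    cf = lam - p / cf

print(np.isclose(D_N / D_N1, cf))  # True
```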

Sorry, I edited while you were writing and made the notation inconsistent. I'm not sure of the best way to fix this.
–
Jonas Meyer May 28 '10 at 1:27


After Jonas, the question was re-edited again. I won't edit my answer again. My remark would be that $c_n/b_n\sim a_n/b_n\sim-\frac12$ as $n\to\infty$, and computing the determinant of the matrix with entries $\lambda$ along the main diagonal and a constant along the auxiliary diagonals is an easy task.
–
Wadim Zudilin May 28 '10 at 10:53
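The constant-coefficient case mentioned in the comment above has a classical closed form: an $M\times M$ tridiagonal matrix with zeros on the main diagonal and a constant $c$ on both auxiliary diagonals has eigenvalues $2c\cos\frac{k\pi}{M+1}$, $k=1,\dots,M$. A quick check, with an arbitrary $c$ and size $M$:

```python
import numpy as np

M, c = 8, 0.5  # arbitrary size and constant off-diagonal entry
T = np.diag(np.full(M - 1, c), 1) + np.diag(np.full(M - 1, c), -1)

# Known closed-form spectrum of the constant tridiagonal (Toeplitz) matrix.
k = np.arange(1, M + 1)
closed_form = np.sort(2 * c * np.cos(k * np.pi / (M + 1)))

print(np.allclose(np.sort(np.linalg.eigvalsh(T)), closed_form))  # True
```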

Solving this recurrence should give you the characteristic polynomial of the symmetric tridiagonal matrix. Finding the roots of this polynomial then gives all the eigenvalues of the original matrix, since the two matrices are similar.

To be more precise: the entries $\frac{c_i}{b_i}$ and $\frac{a_i}{b_i}$ may not be of the same sign, but since all the constants are known you can check whether this holds for every $i$. If it does, the procedure above works in theory. If they do not all have the same sign, it's back to the drawing board.
–
alext87 May 28 '10 at 13:46
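The symmetrization alluded to above can be sketched numerically: when the paired off-diagonal entries have the same sign, a diagonal similarity transform turns the tridiagonal matrix into a symmetric one with off-diagonal entries $\sqrt{u_iv_i}$, so the original matrix has a real spectrum. The names $u$, $v$ below are placeholders for the two off-diagonals:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 6
u = rng.uniform(0.2, 1.0, M - 1)  # subdiagonal entries
v = rng.uniform(0.2, 1.0, M - 1)  # superdiagonal entries; here u_i * v_i > 0

T = np.diag(v, 1) + np.diag(u, -1)

# Diagonal similarity D^{-1} T D with d_{i+1}/d_i = sqrt(u_i / v_i)
# sends both off-diagonals to sqrt(u_i * v_i).
d = np.concatenate(([1.0], np.cumprod(np.sqrt(u / v))))
S = np.diag(1 / d) @ T @ np.diag(d)
assert np.allclose(np.diag(S, 1), np.sqrt(u * v))
assert np.allclose(np.diag(S, -1), np.sqrt(u * v))

# Similar matrices share eigenvalues, so T inherits the real spectrum of S.
print(np.allclose(np.sort(np.linalg.eigvals(T).real),
                  np.linalg.eigvalsh(S)))  # True
```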