Right at least once with high probability


Lately I’ve come across a lot of linear algebra. Apparently it’s pretty important. One topic I always thought was pretty fantastic was spectral graph theory, or analyzing graphs from their adjacency or similar matrices. I never really got a chance to look into it at school, but I started trying to understand some very basic results which I think are pretty fascinating. Of course, after I proved these results I looked at more organized approaches to spectral graph theory only to find that I had missed the main point entirely. That’s why I wouldn’t call the content of this post spectral graph theory. It’s more just facts about the Laplacian matrix.

For a simple graph $G$ on $n$ vertices we'll define its Laplacian $L(G)$ (I will drop the $G$ from now on) as the $n \times n$ matrix with entries as follows:

$$L_{ij} = \begin{cases} \deg(v_i) & \text{if } i = j, \\ -1 & \text{if } v_i \text{ is adjacent to } v_j, \\ 0 & \text{otherwise.} \end{cases}$$
This matrix has a lot of interesting properties. It's obviously symmetric. It's also singular, since it maps $\mathbf{1}$ (the all-ones vector) to $\mathbf{0}$: the degree $\deg(v_i)$ is the only positive term in row $i$ and is exactly the number of $-1$s in that row. We can actually say more interesting things about its eigenvalues. First we'll show that $L$ is positive semidefinite, then we'll show that the multiplicity of the eigenvalue $0$ is exactly the number of components in $G$.
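To make the definition concrete, here's a quick sketch in Python (the example graph and variable names are my own, not from the post) that builds the Laplacian of a path on four vertices and checks the two properties just mentioned:

```python
import numpy as np

# Adjacency matrix of the path 0-1-2-3 (an arbitrary example graph).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])
D = np.diag(A.sum(axis=1))  # degree matrix
L = D - A                   # the Laplacian: degrees on the diagonal, -1 per edge

assert (L == L.T).all()             # symmetric
assert (L @ np.ones(4) == 0).all()  # singular: maps the all-ones vector to zero
```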

The fact that is positive semidefinite is proven in a short and elegant fashion on Wikipedia. Naturally I disregarded it and set out to prove it on my own. This resulted in a proof that I am not really pleased with since it unnecessarily makes use of some pretty high-powered ideas to prove what is probably a very simple result.

At any rate, there are two facts we're going to have to use that I won't prove here. The first is Sylvester's Criterion, which states that a symmetric matrix is positive definite if and only if its leading principal minors are positive. To review in case you forgot (because I did), a minor is the determinant of the matrix you get when you cut out a set of rows and a set of columns. A principal minor is the determinant of the matrix you get when the indices of those rows and columns are the same. A leading principal minor means you cut out the rows and columns from $1$ to $k$ (i.e. it's the determinant of the upper-left submatrix of size $k \times k$).

The second fact we're going to use is the extension of this to positive semidefinite matrices. It says that a symmetric matrix is positive semidefinite if and only if all its principal minors are nonnegative. I came across this fact when my friend linked me to his convex optimization homework here[1], where it is stated without proof. Unfortunately there doesn't seem to be an elegant justification even assuming Sylvester's Criterion. Overall though it is a bit useless as a test: Sylvester's Criterion at least provides a sort of computationally efficient way to test for positive definiteness, while the extension clearly does no such thing, since it requires checking all $2^n - 1$ principal minors.
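Both criteria are easy to check numerically for small matrices. Here's a sketch (the helper names are mine) applied to the Laplacian of a triangle, which is positive semidefinite but singular, so it passes the semidefinite test and fails the definite one:

```python
from itertools import combinations

import numpy as np

def leading_principal_minors(M):
    """Determinants of the upper-left k x k submatrices, k = 1..n."""
    return [np.linalg.det(M[:k, :k]) for k in range(1, M.shape[0] + 1)]

def all_principal_minors(M):
    """Determinants over every nonempty index subset (2^n - 1 of them)."""
    n = M.shape[0]
    return [np.linalg.det(M[np.ix_(S, S)])
            for k in range(1, n + 1)
            for S in combinations(range(n), k)]

# Laplacian of a triangle: positive semidefinite but singular.
L = np.array([[2, -1, -1], [-1, 2, -1], [-1, -1, 2]], dtype=float)

assert all(m >= -1e-9 for m in all_principal_minors(L))        # PSD test passes
assert not all(m > 1e-9 for m in leading_principal_minors(L))  # det(L) = 0, not PD
```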

With those out of the way, here's how we're going to proceed. We're going to deal only with Laplacians of connected graphs for now; the extension will be simple enough. We will encounter matrices which are almost Laplacian but have at least one diagonal element that is too large by an integer amount. Phrased another way, this is a matrix of the form $L + D$ where $L$ is a Laplacian and $D$ is a nonzero positive semidefinite diagonal matrix with integer entries. I'm going to call this a super-Laplacian. Be warned that this is probably horribly non-standard, although it shouldn't be.

Our first claim is that a super-Laplacian matrix is positive definite. This can be shown by induction on $n$ (the size of the matrix). The case where $n = 1$ is trivial. The inductive step needs a simple fact: if we take a super-Laplacian (or even a normal Laplacian) and throw out the same proper subset of rows and columns, we're going to get a super-Laplacian matrix. The way I understand this is that throwing out rows and columns is like deleting a set of vertices from the graph, except we're not reducing the degrees of the vertices that had edges going into the set we just threw out. Furthermore, since we're only working with connected graphs for the time being, some vertex in the set we threw out had to be adjacent to some remaining vertex (and so that remaining vertex's diagonal entry will be too large), so our modified matrix is super-Laplacian.

Getting back to the proof, we see that all the proper principal minors of a super-Laplacian are positive by the inductive hypothesis (each is the determinant of a smaller super-Laplacian). All that remains to be shown to satisfy Sylvester's Criterion is that the determinant of the matrix itself is positive. Any super-Laplacian can be constructed from a Laplacian matrix by adding $1$ to its diagonal one entry at a time. When we increase one diagonal entry of the Laplacian by $1$, what happens to its determinant? Consider the cofactor expansion along a row or column containing that entry. We increase the determinant (increase, since we're on the diagonal so the associated sign is positive) by the minor resulting from throwing out that vertex, i.e. the determinant of a super-Laplacian. By the inductive hypothesis this is positive, and since the Laplacian itself is singular with determinant $0$, we have added a positive quantity to a determinant that started at $0$. Further additions do not cause problems since each similarly increases the determinant by the determinant of a super-Laplacian. So our matrix has positive determinant, and by Sylvester's Criterion it is positive definite.
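The diagonal-bump step can be sanity-checked numerically. The following sketch (example matrix of my choosing) verifies that adding $1$ to a diagonal entry of a path's Laplacian increases the determinant by exactly the corresponding principal minor, i.e. the determinant of the super-Laplacian obtained by throwing out that vertex:

```python
import numpy as np

# Laplacian of the path 0-1-2 (connected); its determinant is 0.
L = np.array([[1, -1, 0], [-1, 2, -1], [0, -1, 1]], dtype=float)

i = 1
bumped = L.copy()
bumped[i, i] += 1  # increase one diagonal entry by 1

# Throw out row and column i: the resulting matrix is super-Laplacian.
minor = np.delete(np.delete(L, i, axis=0), i, axis=1)

# det went up by exactly the principal minor (cofactor along the diagonal).
assert np.isclose(np.linalg.det(bumped),
                  np.linalg.det(L) + np.linalg.det(minor))
```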

We're basically done now by the extended version of Sylvester's Criterion. A proper principal minor of a Laplacian is the determinant of a super-Laplacian matrix, hence positive, and the full determinant is $0$; so all principal minors are nonnegative and the Laplacian of a connected graph is positive semidefinite. Extension to graphs with $k$ components follows by relabeling vertices (or switching rows and columns, depending on how you want to think about it) to get a block diagonal matrix of the form

$$L = \begin{pmatrix} L_1 & & \\ & \ddots & \\ & & L_k \end{pmatrix}$$
where the $L_i$ are positive semidefinite since they're the Laplacians of the connected components. The block diagonal matrix is then positive semidefinite, which can be seen straight from the definition. So all Laplacians are positive semidefinite.

The other result we're going to show is a relation between the eigenvalues of the Laplacian and the number of components in the graph. What we just did already suggests as much, but we'll show it a bit more explicitly. Let $\lambda_1 \le \lambda_2 \le \cdots \le \lambda_n$ be the eigenvalues of $L$ ordered from least to greatest. Clearly they're all nonnegative, and there are $n$ of them (counting multiplicity) since $L$ is symmetric and hence diagonalizable. We're going to show that the multiplicity of the eigenvalue $0$ is exactly the number of components in $G$ (recall that every $L$ has eigenvector $\mathbf{1}$, and so $\lambda_1$ is always $0$).

If $G$ has $k$ components, then we can find $k$ linearly independent vectors in its null space. For ease of visualization we can relabel to form a block diagonal matrix as we did above, and then the independent eigenvectors are just the indicator vectors $(\mathbf{1}, \mathbf{0}, \dots, \mathbf{0})^T, \dots, (\mathbf{0}, \dots, \mathbf{0}, \mathbf{1})^T$ with the appropriate sizes of blocks. So $\lambda_1 = \cdots = \lambda_k = 0$.

If $G$ has fewer than $k$ components, say $m < k$ components, then we can find $m$ columns (and rows) to throw out so that we are left with a super-Laplacian matrix: throw out one vertex from each component, giving us a block diagonal matrix of super-Laplacians. This is full rank, so our original matrix has rank at least $n - m$, so the dimension of the null space of $L$ is at most $m < k$, and we can conclude that $\lambda_k > 0$.
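Numerically this is easy to observe. A quick sketch (the graph is my own example): the Laplacian of a triangle plus a disjoint edge has eigenvalue $0$ with multiplicity exactly $2$, matching the two components:

```python
import numpy as np

# Two components: a triangle on {0, 1, 2} and an edge {3, 4}.
A = np.zeros((5, 5))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4)]:
    A[u, v] = A[v, u] = 1
L = np.diag(A.sum(axis=1)) - A

eigs = np.linalg.eigvalsh(L)  # ascending order for symmetric matrices
num_zero = int(np.sum(np.isclose(eigs, 0)))
assert num_zero == 2  # multiplicity of eigenvalue 0 = number of components
```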

Like I said, this is far from the heart of spectral graph theory, but it's a fun thing to play around with since the concepts are really basic and easy to understand. Another thing to note that I haven't proven: the second smallest eigenvalue $\lambda_2$ of $L$ is called the algebraic connectivity of $G$. We saw a little bit of how that might work, since $\lambda_2 > 0$ if and only if $G$ is connected. Apparently a larger $\lambda_2$ means the graph is more connected, although not quite in the sense of $k$-vertex or $k$-edge connectivity. That more or less sums up the basic facts on Wikipedia, but I'm sure there are many more simple observations to be made.
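As a small illustration of algebraic connectivity (the helper names below are mine), the complete graph $K_4$ has a larger $\lambda_2$ than the path on four vertices, and a disconnected graph has $\lambda_2 = 0$:

```python
import numpy as np

def laplacian(n, edges):
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1
    return np.diag(A.sum(axis=1)) - A

def algebraic_connectivity(n, edges):
    # eigvalsh returns eigenvalues in ascending order; take the second smallest
    return np.linalg.eigvalsh(laplacian(n, edges))[1]

path = [(i, i + 1) for i in range(3)]                           # path on 4 vertices
complete = [(i, j) for i in range(4) for j in range(i + 1, 4)]  # K4

assert algebraic_connectivity(4, complete) > algebraic_connectivity(4, path)
assert np.isclose(algebraic_connectivity(4, [(0, 1), (2, 3)]), 0)  # disconnected
```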

Sources:
[1] The material for this course helped me a lot in trying to approach this problem, and of course provided the extended Sylvester's Criterion. I'm in the Stanford Online version and I really enjoy it.