Max Cut and the Smallest Eigenvalue

In the Max Cut problem, we are given an undirected graph $G=(V,E)$ and we want to find a partition $(C, V-C)$ of the set of vertices such that as many edges as possible have one endpoint in $C$ and one endpoint in $V-C$, and are hence cut by the partition.

It is easy, as recognized since the 1970s, to find a partition that cuts half of the edges and that, thus, is at least half as good as an optimal solution. No approximation better than 1/2 was known for this problem, until Goemans and Williamson famously proved that one could achieve a .878… approximation using semidefinite programming (SDP).

No other approximation algorithm achieving an approximation asymptotically better than 1/2 is known, and it seems that a fundamental difficulty is the following. Suppose we prove that a certain algorithm achieves approximation $c > \frac 12$. Then, given a graph in which the optimum is, say, $\frac 12 + \epsilon$, the algorithm and its analysis must provide a certificate that the optimum cut in the given graph is at most $\frac 1c \cdot \left( \frac 12 + \epsilon \right)$, and there is no general technique to prove upper bounds to the Max Cut optimum of a general graph other than Semidefinite Programming. (And see here and here for negative results showing that large classes of linear programming relaxations are unable to give such certificates.)

Spectral techniques can prove upper bounds to Max Cut in certain cases (and can be seen as special cases of the upper bounds provided by the Goemans-Williamson relaxation).

In the simplified case in which $G$ is a $d$-regular graph, let $A$ be the adjacency matrix of $G$ and $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$ be the eigenvalues of $A$; then it is easy to show that

$\displaystyle maxcut(G) \ \leq \ \frac 12 + \frac{ |\lambda_n| }{2d} \ \ \ \ \ (1)$

where $maxcut(G)$ is the fraction of edges cut by an optimal solution. Unfortunately (1) does not quite have an approximate converse: there are graphs where $\lambda_n = -d \cdot (1-o(1))$ but $maxcut(G) = \frac 12 + o(1)$.
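As a quick numerical sanity check (my own, not from the post), one can compare the bound (1) with the true optimum on a small regular graph; here the 5-cycle, with $d=2$ and $\lambda_n = -2\cos(\pi/5) \approx -1.618$:

```python
# Checking maxcut(G) <= 1/2 + |lambda_n|/(2d) on the 5-cycle (2-regular).
import itertools
import numpy as np

n, d = 5, 2
A = np.zeros((n, n))
for u in range(n):
    v = (u + 1) % n
    A[u, v] = A[v, u] = 1  # edges of the cycle C5

eigs = np.linalg.eigvalsh(A)   # eigenvalues in ascending order
lam_n = eigs[0]                # smallest eigenvalue, about -1.618 for C5

num_edges = n * d // 2
# Brute-force max cut: try all 2^n bipartitions of the vertex set.
best = 0
for mask in itertools.product([0, 1], repeat=n):
    cut = sum(1 for u in range(n) for v in range(u + 1, n)
              if A[u, v] and mask[u] != mask[v])
    best = max(best, cut)

maxcut_fraction = best / num_edges            # 4/5 = 0.8 for C5
spectral_bound = 0.5 + abs(lam_n) / (2 * d)   # about 0.9045
assert maxcut_fraction <= spectral_bound
```

On the 5-cycle the bound is about 0.9045 while the optimum cuts 4 of the 5 edges, so (1) holds with room to spare.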

The following fact, however, is always true and well known:

$\lambda_n = -d$ if and only if $G$ contains a bipartite connected component.
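This fact is easy to see numerically (a check of my own, not from the post): an even cycle is bipartite and has $\lambda_n = -d$ exactly, while an odd cycle has $\lambda_n$ bounded away from $-d$.

```python
# lambda_n = -d exactly when the graph has a bipartite connected component;
# compare an even cycle (bipartite) with an odd cycle (not bipartite).
import numpy as np

def cycle_adjacency(n):
    A = np.zeros((n, n))
    for u in range(n):
        A[u, (u + 1) % n] = A[(u + 1) % n, u] = 1
    return A

d = 2
lam_even = np.linalg.eigvalsh(cycle_adjacency(6))[0]  # C6 is bipartite
lam_odd = np.linalg.eigvalsh(cycle_adjacency(5))[0]   # C5 is not

assert abs(lam_even - (-d)) < 1e-9   # lambda_n = -d
assert lam_odd > -d + 0.3            # bounded away from -d
```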

Is there an “approximate” version of the above statement, characterizing the cases in which $d + \lambda_n$ is small? Surprisingly, as far as I know the question had not been considered before.

For comparison, the starting point of the theory of edge expansion is the related fact

$\lambda_2 = d$ if and only if $G$ is disconnected.

Which can be rephrased as:

$\lambda_2 = d$ if and only if there is a non-empty $S \subseteq V$, $|S| \leq \frac{|V|}{2}$, such that $edges(S, V-S) = 0$.

Cheeger’s inequality characterizes the case in which $d - \lambda_2$ is small:

If there is a non-empty $S \subseteq V$, $|S| \leq \frac{|V|}{2}$, such that $edges(S, V-S) \leq \epsilon \cdot d \cdot |S|$, then $\lambda_2 \geq d \cdot (1- 2\epsilon)$;

If $\lambda_2 \geq d \cdot (1-2\epsilon)$, then there is a non-empty $S \subseteq V$, $|S| \leq \frac{|V|}{2}$, such that $edges(S, V-S) \leq 2\sqrt{\epsilon} \cdot d \cdot |S|$.

For a subset $S \subseteq V$, and a bipartition $(S_1, S_2)$ of $S$, we say that an edge fails [to be cut by] the bipartition if it is incident on $S$ but it is not the case that one endpoint is in $S_1$ and one endpoint is in $S_2$. (This means that either both endpoints are in $S_1$, or both endpoints are in $S_2$, or one endpoint is in $S$ and one endpoint is not in $S$.) Then we can express the well-known fact about $\lambda_n$ as

If there is a non-empty $S$, and a bipartition $(S_1, S_2)$ of $S$ with at most $\epsilon \cdot d \cdot |S|$ failed edges, then $\lambda_n \leq -d \cdot (1-4\epsilon)$;

If $\lambda_n \leq -d \cdot (1-2\epsilon)$, then there is a non-empty $S$, and a bipartition $(S_1, S_2)$ of $S$ with at most $2\sqrt{\epsilon} \cdot d \cdot |S|$ failed edges.

The following notation makes the similarity with Cheeger’s inequality clearer. Define the edge expansion of a graph $G$ as

$\displaystyle h(G) := \min_{S :\ 0 < |S| \leq \frac{|V|}{2}} \ \frac{edges(S, V-S)}{d \cdot |S|}$

Let us define the bipartiteness ratio of $G$ as

$\displaystyle \beta(G) := \min_{S,\ (S_1,S_2) \ \textrm{bipartition of}\ S} \ \frac{failed(S_1,S_2)}{d \cdot |S|}$

that is, as the minimum, over sets $S$ and bipartitions $(S_1,S_2)$ of $S$, of the ratio between the number of failed edges of the bipartition and $d \cdot |S|$.
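For intuition, $\beta(G)$ can be computed by brute force on tiny graphs. This sketch (my own, not from the post) encodes a choice of $S$ and its bipartition as a labeling $y \in \{-1,0,+1\}^n$, with $S$ the support of $y$ and the sign giving the side:

```python
# Brute-force bipartiteness ratio on tiny graphs (3^n labelings).
import itertools
import numpy as np

def cycle_adjacency(n):
    A = np.zeros((n, n))
    for u in range(n):
        A[u, (u + 1) % n] = A[(u + 1) % n, u] = 1
    return A

def bipartiteness_ratio(A, d):
    """beta(G): minimize failed edges / (d*|S|) over labelings y in
    {-1,0,+1}^n, where S = support(y) and the sign gives the side."""
    n = len(A)
    edges = [(u, v) for u in range(n) for v in range(u + 1, n) if A[u, v]]
    best = float("inf")
    for y in itertools.product((-1, 0, 1), repeat=n):
        size = sum(1 for t in y if t != 0)
        if size == 0:
            continue
        # Failed = incident on S but not cut (cut <=> labels are +1 and -1).
        failed = sum(1 for (u, v) in edges
                     if (y[u] != 0 or y[v] != 0) and y[u] * y[v] != -1)
        best = min(best, failed / (d * size))
    return best
```

On the 6-cycle, which is bipartite, this gives $\beta = 0$; on the 5-cycle the best choice is $S = V$ with an alternating labeling, which leaves a single failed edge and gives $\beta = 1/10$.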

Then Cheeger’s inequality gives

$\displaystyle \frac{d - \lambda_2}{2d} \ \leq \ h(G) \ \leq \ \sqrt{2 \cdot \frac{d-\lambda_2}{d}}$

and our results give

$\displaystyle \frac{d + \lambda_n}{4d} \ \leq \ \beta(G) \ \leq \ \sqrt{2 \cdot \frac{d+\lambda_n}{d}}$

This translates into an efficient algorithm that, given a graph such that $\lambda_n \leq -d \cdot (1-\epsilon)$, finds a set $S$ and a bipartition of $S$ such that at least a $1 - 4\sqrt{\epsilon}$ fraction of the edges incident on $S$ are cut by the bipartition. Removing the vertices in $S$ and continuing recursively on the residual graph yields a .50769… approximation algorithm for Max Cut. (The algorithm stops making recursive calls, and uses a random partition, when the bipartition of $S$ found by the algorithm has too many failed edges.)
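In analogy with the sweep-cut proof of Cheeger’s inequality, the spectral step can be sketched as follows (a simplified illustration of my own, not the paper’s exact procedure): take an eigenvector $x$ of $\lambda_n$ and, for each threshold $t$, consider $S = \{v : |x_v| \geq t\}$ split by the sign of $x_v$, keeping the split with the fewest failed edges relative to $d\,|S|$.

```python
# Sweep-rounding sketch over an eigenvector of the smallest eigenvalue.
import numpy as np

def spectral_bipartition(A, d):
    """Return (ratio, y): the best failed-edge ratio found by sweeping a
    threshold t over |x|, where x is an eigenvector of lambda_n, and the
    labeling y in {-1,0,+1}^n realizing it (sign of x on S, 0 outside)."""
    n = len(A)
    _, eigvecs = np.linalg.eigh(A)
    x = eigvecs[:, 0]  # eigenvector of lambda_n (eigh sorts ascending)
    edges = [(u, v) for u in range(n) for v in range(u + 1, n) if A[u, v]]
    best_ratio, best_y = float("inf"), None
    for t in sorted(set(np.abs(x))):
        if t == 0:
            continue
        y = np.where(x >= t, 1, 0) - np.where(x <= -t, 1, 0)
        size = int(np.count_nonzero(y))
        if size == 0:
            continue
        failed = sum(1 for (u, v) in edges
                     if (y[u] != 0 or y[v] != 0) and y[u] * y[v] != -1)
        ratio = failed / (d * size)
        if ratio < best_ratio:
            best_ratio, best_y = ratio, y
    return best_ratio, best_y
```

A full Max Cut algorithm would then cut according to the bipartition found, delete $S$, and recurse on the residual graph, as described above. On a bipartite graph such as the 6-cycle the sweep recovers a proper two-coloring with no failed edges.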

The paper is entirely a by-product of the ongoing series of posts on edge expansion: the question of the relation between spectral techniques and Max Cut was asked by a commenter, and the probabilistic view of the proof of Cheeger’s inequality that I wrote up in this post was very helpful in understanding the gap between $\lambda_n$ and $-d$.