Overview: This reading group aims to cover great papers in theoretical computer science from 2009 to 2010. Each week one volunteer participant will present one of the selected papers. The presenter is expected to show complete proof details for the main results of the paper in an (at most) 2.5-hour talk. The reading group runs for 6 weeks starting the week of Aug. 2, 2010.

Finding a Steiner tree of small cost is one of the most famous combinatorial optimization problems, arising in many applications. Finding a minimum-cost Steiner tree is known to be NP-hard, but the problem admits a simple algorithm with approximation factor 2. A number of papers improved on this factor of 2, down to the previously best known 1.55, but all of them were purely combinatorial. Vazirani and Rajagopalan posed a question at SODA'99 concerning LP relaxations of the problem, in particular whether there is a relaxation with integrality gap smaller than 2. The current paper answers this question positively and also improves on the previously known upper bound, achieving a factor of 1.39.

As the paper is the sequel to a long line of research, it contains a lot of involved technical machinery, but it also includes a number of interesting combinatorial ideas. One of the main novel ideas is the iterative use of the LP relaxation: after obtaining a fractional solution to the LP, the authors randomly reduce the problem with probabilities derived from that solution, then rewrite and re-solve the LP and reduce again, repeating until a very simple instance remains.
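
The shape of this iterative loop can be sketched as follows. This is a toy illustration only, not the paper's algorithm: `solve_lp` is a hypothetical stand-in (here it just returns a uniform fractional solution, whereas the real algorithm solves the directed-component relaxation of Steiner tree), and "contracting" a sampled component is modeled by simply removing it from the instance.

```python
import random

def solve_lp(components):
    # Hypothetical stand-in for the LP solve: a uniform fractional
    # solution. The actual algorithm solves a Steiner tree relaxation.
    return {c: 1.0 / len(components) for c in components}

def iterative_rounding(components, rng):
    chosen = []
    components = list(components)
    while len(components) > 1:
        x = solve_lp(components)                 # re-solve the LP
        weights = [x[c] for c in components]
        # sample a component with probability proportional to its LP value
        pick = rng.choices(components, weights=weights, k=1)[0]
        chosen.append(pick)                      # "contract" the sample
        components.remove(pick)                  # reduce the instance
    return chosen

rng = random.Random(0)
order = iterative_rounding(["C1", "C2", "C3", "C4"], rng)
```

The point is only the control flow: solve, sample, reduce, repeat until the instance is trivial.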

I hope the talk will be interesting to those who like linear relaxation techniques or, more generally, optimization problems on combinatorial objects.

MAX CUT is the problem of finding a cut of maximum cardinality in a graph. The seminal result of Goemans and Williamson [GW95] achieves an approximation ratio of 0.878 via a semidefinite programming (SDP) relaxation. Assuming the Unique Games Conjecture, this is the best we can hope for from polynomial-time algorithms.

This paper describes a new spectral approximation algorithm for MAX CUT, which achieves an approximation ratio of 0.531. In particular, if an optimal solution cuts a 1 - \epsilon fraction of the edges, the algorithm finds a cut with a 1 - O(\sqrt{\epsilon}) fraction of the edges.

One ingredient of this result is an analog of Cheeger's inequality for the smallest eigenvalue of the adjacency matrix of a graph. The recursive algorithm presented also has a connection to the SDP relaxation of [GW95].
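
A minimal illustration of the spectral idea (not the paper's full recursive algorithm): when the smallest eigenvalue of the adjacency matrix is close to its minimum possible value, the corresponding eigenvector tends to take opposite signs across a large cut, so partitioning vertices by the sign of its entries already yields a good cut on bipartite-like graphs.

```python
import numpy as np

def spectral_cut(A):
    # eigh returns eigenvalues in ascending order, so column 0 is the
    # eigenvector of the smallest eigenvalue of the adjacency matrix
    vals, vecs = np.linalg.eigh(A)
    v = vecs[:, 0]
    side = v >= 0                      # partition vertices by sign
    n = A.shape[0]
    cut = sum(A[i, j] for i in range(n) for j in range(i + 1, n)
              if side[i] != side[j])   # total weight of crossing edges
    return side, cut

# 4-cycle: bipartite, so an optimal cut contains all 4 edges
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
side, cut = spectral_cut(A)
```

For the 4-cycle the smallest eigenvalue is -2 with eigenvector proportional to (1, -1, 1, -1), and the sign partition cuts all four edges.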

Among the important open questions of computational complexity, the Unique Games Conjecture [Kho02] is one of the very few that looks like it could ``go either way''. The conjecture states that for a certain constraint satisfaction problem, called Unique Games, it is NP-hard to distinguish between instances that are almost satisfiable (at least a (1 - \epsilon) fraction of the constraints can be satisfied) and almost completely unsatisfiable (at most an \epsilon fraction of the constraints can be satisfied).

On one hand, a sequence of works has shown that this conjecture has several important implications, in particular that for many important computational problems the currently known approximation algorithms have optimal approximation ratios. For example, Raghavendra [Rag08] showed that if the UGC is true, then every constraint satisfaction problem has a sharp approximation threshold \tau: for every \epsilon > 0, one can achieve a (\tau - \epsilon)-approximation in polynomial time, but obtaining a (\tau + \epsilon)-approximation is NP-hard.

On the other hand, people have tried to attack the conjecture by designing various algorithms for Unique Games, even in special cases. For example, Kolla [Kol10] showed that Unique Games on graphs with expansion is easy, in the sense that given an almost satisfiable Unique Games instance whose ``label-extended graph'' (think of the constraint graph associated with the instance) has few large eigenvalues, one can efficiently find a good solution to the instance.

In this paper, the authors present a subexponential time algorithm for Unique Games on general graphs, by proving the following theorem.

Theorem. There is an absolute constant \alpha > 0 and an exp(kn^{\epsilon^{\alpha}})-time algorithm that, given a (1 - \epsilon)-satisfiable Unique Games instance with alphabet size k, outputs an assignment satisfying a (1 - \epsilon^{\alpha}) fraction of the constraints.

The main component of the algorithm is a new graph decomposition result, which states that for every \delta > 0 and every regular n-vertex graph G, by changing at most a \delta fraction of G's edges one can break G into disjoint parts such that the stochastic adjacency matrix of the induced graph on each part has at most n^{\epsilon} eigenvalues larger than 1 - \eta (where \epsilon and \eta depend polynomially on \delta). Combining this decomposition with Kolla's algorithm [Kol10] for graphs with few large eigenvalues yields a good solution for the given instance.
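
To make the decomposition criterion concrete, here is a small hedged illustration of the quantity being controlled: the number of eigenvalues of the stochastic adjacency matrix A/d of a d-regular part that exceed 1 - \eta. The graph and threshold below are made up for the example.

```python
import numpy as np

def large_eigenvalue_count(A, eta):
    d = A.sum(axis=1)[0]            # assumes a d-regular graph
    W = A / d                       # stochastic (random-walk) matrix
    vals = np.linalg.eigvalsh(W)    # eigenvalues of the symmetric W
    return int(np.sum(vals > 1 - eta))

# Complete graph K4: W has eigenvalues 1 and -1/3 (multiplicity 3),
# so exactly one eigenvalue exceeds 1 - 0.1
A = np.ones((4, 4)) - np.eye(4)
count = large_eigenvalue_count(A, 0.1)
```

The decomposition guarantees that each part, after the edge changes, has at most n^{\epsilon} such eigenvalues, which is exactly the regime where Kolla's algorithm applies.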

[KhoMos]
The authors consider the problem of approximately solving a system of homogeneous linear equations over the reals, where each equation contains at most three variables. Informally, they show that it is NP-hard to distinguish whether there is a non-trivial (not all zero) assignment that satisfies a 1 - \delta fraction of the equations, or whether every non-trivial assignment fails to satisfy a constant fraction of the equations even approximately (with margin \Omega(\sqrt{\delta})).
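
The notion of "approximately satisfying with a margin" can be made concrete with a toy check: an equation a \cdot x = 0 is approximately satisfied if the residual |a \cdot x| is at most the margin. The system and margin below are invented for illustration; the hardness result concerns much more intricate instances.

```python
import numpy as np

def frac_approx_satisfied(eqs, x, margin):
    # fraction of homogeneous equations whose residual |a . x| <= margin
    residuals = np.abs(eqs @ x)
    return float(np.mean(residuals <= margin))

# three homogeneous equations, each with at most three variables
eqs = np.array([[1.0, -1.0,  0.0],
                [0.0,  1.0, -1.0],
                [1.0,  0.0, -1.0]])
x = np.array([1.0, 1.0, 1.0])    # a non-trivial assignment
frac = frac_approx_satisfied(eqs, x, margin=0.1)
```

Here the all-ones assignment satisfies every equation exactly, so the fraction is 1; the hard instances separate this situation from one where no non-trivial assignment does well even with the margin.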

This paper has generated some excitement, since it represents *possible* progress towards proving the Unique Games Conjecture. The main missing piece to prove the UGC would be a reduction from three-variable equations to two-variable equations.

The result is highly technical, so it is unlikely that I will be able to present the full details in one afternoon. I will focus on one of the main tools: the development of linearity and dictatorship tests for functions f : R^n -> R.

[HRV]
Several years ago, Hastad, Impagliazzo, Levin, and Luby [HILL] provided a method to construct a pseudorandom generator from any one-way function.

From the perspective of cryptography, the [HILL] paper shows that a very powerful and useful cryptographic primitive (namely, a pseudorandom generator) can be constructed from the minimal assumption for complexity-based cryptography (namely, a one-way function).

From the perspective of pseudorandomness, the paper provides strong evidence that pseudorandom bits can be generated very efficiently, with smaller computational resources than the "distinguishers" to whom the bits should look random.

In this paper, the authors provide a more efficient construction, in the following sense: in the [HILL] construction the seed length is O(n^{10}), while in this paper it is O(n^4).

The key to this paper's construction is a new notion of next-block pseudoentropy, and the main construction proceeds in several steps.
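
As a much simpler warm-up to the kind of object being built (and emphatically not the HILL or HRV construction), recall the classical one-bit-stretch generator from a one-way *permutation* f via the Goldreich-Levin hardcore bit: G(s, r) = (f(s), r, <s, r> mod 2) maps 2n seed bits to 2n + 1 output bits. The permutation below is a toy stand-in and is of course not actually one-way.

```python
def toy_permutation(s_bits):
    # placeholder bijection on bit strings (cyclic shift); a real
    # instantiation would use a conjectured one-way permutation
    return s_bits[1:] + s_bits[:1]

def inner_product_bit(s_bits, r_bits):
    # Goldreich-Levin hardcore predicate: <s, r> mod 2
    return sum(a & b for a, b in zip(s_bits, r_bits)) % 2

def prg(seed_bits):
    # seed = (s, r), each of length n; output has 2n + 1 bits
    n = len(seed_bits) // 2
    s, r = seed_bits[:n], seed_bits[n:]
    return toy_permutation(s) + r + [inner_product_bit(s, r)]

out = prg([1, 0, 1, 1, 0, 1])   # n = 3, output length 7
```

The HILL and HRV constructions solve the much harder problem where f is an arbitrary one-way function, and the seed-length bounds quoted above measure the cost of that generality.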

[GolJer]
Among counting problems in #P under approximation-preserving (AP) reductions, three classes of interreducible problems have been well studied: 1) problems that admit an FPRAS; 2) problems AP-interreducible with #BIS; and 3) problems AP-interreducible with #SAT. Although not rigorously proved, these are believed to be distinct complexity classes.

In this paper, the authors consider the approximation complexity of computing the partition function (parameterized by q and \gamma) of the ferromagnetic q-state Potts model, which, viewed as a polynomial in q, is equivalent to the Tutte polynomial Tutte(q, \gamma).

The complexity of exactly computing Tutte(q, \gamma) has been completely classified for all complex q, \gamma, whereas in the approximate sense a wide range of (q, \gamma) had not been well studied. This paper gives the first evidence that Tutte(q, \gamma) is #BIS-hard in the region q > 2 and \gamma > 0. Thus we might say that Tutte(q, \gamma) in this region is hard to approximate, under the assumption that #BIS admits no FPRAS, which is stronger than RP \neq NP.
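
To make the object concrete, the random-cluster form of this partition function is Z(q, \gamma) = \sum_{A \subseteq E} q^{k(A)} \gamma^{|A|}, where k(A) is the number of connected components of (V, A). A brute-force evaluation (feasible only for tiny graphs, and purely illustrative of the definition, not of the paper's reductions) looks like this:

```python
from itertools import combinations

def num_components(n, edges):
    # union-find count of connected components of (V, edges)
    parent = list(range(n))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    for u, v in edges:
        parent[find(u)] = find(v)
    return len({find(u) for u in range(n)})

def tutte_partition(n, edges, q, gamma):
    # Z(q, gamma) = sum over edge subsets A of q^{k(A)} * gamma^{|A|}
    total = 0.0
    for r in range(len(edges) + 1):
        for A in combinations(edges, r):
            total += (q ** num_components(n, list(A))) * (gamma ** r)
    return total

# triangle graph, q = 2, gamma = 1
Z = tutte_partition(3, [(0, 1), (1, 2), (0, 2)], q=2.0, gamma=1.0)
```

For the triangle with q = 2 and \gamma = 1 this sums 8 + 12 + 6 + 2 = 28 over the subsets of sizes 0 through 3, matching the Ising/Potts partition function at the corresponding temperature.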

On the technical side, beyond standard AP-reduction techniques, the authors introduce a new one which exploits the first-order phase transition of the random cluster model, with new gadgets designed accordingly. The phase transition property is proved via stochastic domination results between different random graph models.