Wednesday, June the 28th
Complexity bounds for primal-dual methods minimizing the model of objective function

We provide the Frank-Wolfe (Conditional Gradients) method with a convergence analysis that allows one to approach a primal-dual solution of a convex optimization problem with a composite objective function. Additional properties of the complementary part of the objective (strong convexity) significantly accelerate the scheme. We also justify a new variant of this method, which can be seen as a trust-region scheme applying a linear model of the objective function. Our analysis also works for a quadratic model, allowing us to justify a global rate of convergence for a new second-order method. To the best of our knowledge, this is the first trust-region scheme supported by a worst-case complexity bound.
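To fix ideas, here is a minimal sketch of the classical Frank-Wolfe iteration on the probability simplex, including the Frank-Wolfe (duality) gap that certifies primal accuracy. The least-squares objective, the matrix `A`, and the vector `b` are illustrative assumptions, not from the talk.

```python
import numpy as np

def frank_wolfe(grad, x0, n_iters=2000):
    """Frank-Wolfe on the probability simplex with step size 2/(k+2).

    The linear minimization oracle over the simplex returns the vertex
    (a coordinate basis vector) with the smallest gradient entry. The
    Frank-Wolfe gap <grad(x), x - s> upper-bounds f(x) - f*.
    """
    x = x0.copy()
    gap = np.inf
    for k in range(n_iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0          # LMO: best simplex vertex
        gap = g @ (x - s)              # duality gap, always >= 0
        x += 2.0 / (k + 2) * (s - x)   # standard open-loop step size
    return x, gap

# Example: least squares restricted to the simplex (illustrative data).
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 5))
w = np.array([0.5, 0.2, 0.1, 0.1, 0.1])  # a point in the simplex
b = A @ w                                # so the optimum lies in the simplex
f_grad = lambda x: A.T @ (A @ x - b)
x0 = np.zeros(5); x0[0] = 1.0            # start at a vertex
x_star, gap = frank_wolfe(f_grad, x0)
```

Each iterate stays feasible because it is a convex combination of simplex vertices, and the gap gives a computable stopping criterion, which is the quantity the primal-dual analysis controls.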

Léon Bottou, Facebook

Thursday, June the 29th
Looking for a Missing Signal.

We know how to spot objects in images, but we must learn from more images than a human can see in a lifetime. We know how to translate text (somehow), but we must learn from more text than a human can read in a lifetime. We know how to learn to play Atari games, but we must learn by playing more games than any teenager can endure. The list is long. We can of course try to pin this inefficiency on some properties of our algorithms. However, we can also take the point of view that there is possibly a lot of signal in natural data that we simply do not exploit. I will report on two works in this direction. The first one establishes that something as simple as a collection of static images contains nontrivial information about the causal relations between the objects they represent. The second one, time permitting, shows how an attempt to discover structure in observational data led to a clear improvement of Generative Adversarial Networks.