Support. Accommodation: We provide accommodation for participants, but rooms are available on a first-come, first-served basis. Priority will be given to students. (Accommodation Facilities) Meals: We provide dinner on December 20th and breakfast on December 21st to all participants.

I will first survey the basics of lattice path enumeration. Then I will discuss several attempts at generalizing lattice paths, and I will finish the talk by presenting recent progress and results on generalizations of lattice paths.

There are several bijections between permutations and Laguerre histories, due to Foata-Zeilberger, Françon-Viennot, and others. These bijections yield continued fraction formulas for the generating functions of various permutation statistics. Using the continued fraction expansion, one can show that the sequence of coefficients of the Eulerian polynomial is symmetric and unimodal. Recently, Blanco and Petersen (arXiv:1206.0803v2) conjectured a $q$-analogue of the Eulerian polynomial as an expansion formula for inversions and excedances in the symmetric group. In this talk, we prove this conjecture.
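As a small numerical illustration of the symmetry and unimodality claim (not of the continued fraction proof itself), the rows of the Eulerian triangle can be computed from the standard recurrence $A(n,k) = (k+1)A(n-1,k) + (n-k)A(n-1,k-1)$; the function name below is my own.

```python
def eulerian_row(n):
    """Row n of the Eulerian triangle: A(n, 0..n-1), where A(n, k)
    counts permutations of {1, ..., n} with exactly k descents."""
    row = [1]  # n = 1: the single permutation has 0 descents
    for m in range(2, n + 1):
        # Standard recurrence A(m, k) = (k+1) A(m-1, k) + (m-k) A(m-1, k-1)
        row = [(k + 1) * (row[k] if k < len(row) else 0)
               + (m - k) * (row[k - 1] if k >= 1 else 0)
               for k in range(m)]
    return row
```

For instance:

```python
row = eulerian_row(4)          # [1, 11, 11, 1]
assert row == row[::-1]        # symmetric
assert all(row[i] <= row[i + 1] for i in range(len(row) // 2))  # unimodal
assert sum(row) == 24          # 4! permutations in total
```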

We also find a similar expansion for the Eulerian polynomials of fixed-point-free permutations of type $B$. This formula answers a conjecture of Mongelli (J. Combin. Theory Ser. A 120 (2013), no. 6, 1216–1234) about the unimodality of the coefficients of the corresponding Eulerian polynomials.

<Talk III> Kyomin Jung
"Tipping point analysis and influence maximization in social networks"

Diffusion of information, rumors, or epidemics via various social networks has been extensively studied for decades. In particular, Kempe, Kleinberg, and Tardos (KDD '03) proposed the general threshold model, a generalization of many mathematical models of diffusion on networks, based on the utility maximization of individuals in a game-theoretic setting. Despite its importance, analysis under the threshold model has concentrated on special cases such as submodular influence (Mossel-Roch (STOC '07)), homogeneous thresholds (Whitney (Phys. Rev. E '10)), and locally tree-like networks (Watts (PNAS '02)). We first consider the general threshold model with an arbitrary threshold distribution on arbitrary networks. We prove that if (essentially) all nodes have degree $\omega(\log n)$, then the final cascade size is highly concentrated around its mean with high probability for a large class of general threshold models, including the linear threshold model and the Katz-Shapiro pricing model. We also prove that in those cases, somewhat surprisingly, the expected cascade size is asymptotically independent of the network structure if the initial adopters are chosen by public advertisements, and we provide a formula to compute the cascade size. Our formula allows us to compute when a phase transition to a large spreading (a tipping point) happens. We then provide a novel algorithm for influence maximization that integrates new message-passing-based influence ranking and influence estimation methods in the independent cascade model.
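The linear threshold dynamics mentioned above can be sketched in a few lines. The unweighted fractional variant below (a node activates once the fraction of its active neighbors reaches its threshold) is one common simplification, and all names are illustrative; this is not the authors' implementation.

```python
def linear_threshold_cascade(adj, thresholds, seeds):
    """Iterate unweighted linear threshold dynamics to a fixed point.

    adj: dict mapping each node to a list of its neighbors.
    thresholds: dict mapping each node to a threshold in [0, 1].
    seeds: initial set of active nodes (the early adopters).
    Returns the final active set (the cascade)."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v, nbrs in adj.items():
            if v not in active and nbrs:
                frac = sum(u in active for u in nbrs) / len(nbrs)
                if frac >= thresholds[v]:  # enough neighbors adopted
                    active.add(v)
                    changed = True
    return active
```

On a path 0-1-2-3 with all thresholds 0.5 and seed {0}, the cascade sweeps the whole path; raising the thresholds to 0.6 stops it immediately, a toy version of a tipping point.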

<Talk IV> Choongbum Lee
"Resilience of random graphs"

A typical result in graph theory reads as follows: under certain conditions, a given graph $G$ has some property $\mathcal{P}$. For example, Turán's theorem says that every $n$-vertex graph with more than $(1-\frac{1}{r-1}){n \choose 2}$ edges contains $K_{r}$ as a subgraph. As another example, a classical theorem of Dirac asserts that every $n$-vertex graph of minimum degree at least $n/2$ contains a cycle passing through every vertex of the graph. Recently, there has been a trend in extremal graph theory of revisiting such classical results and attempting to see how strongly $G$ possesses the property $\mathcal{P}$. In this talk, we discuss recent progress on this topic, where we consider random subgraphs of $G$ and study the threshold probability at which the resulting graph has property $\mathcal{P}$ with high probability. These problems lie at the intersection of extremal and probabilistic combinatorics. We will mostly focus on results around the two theorems mentioned above.
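To make Dirac's theorem concrete on a very small scale, one can brute-force the Hamiltonicity check (exponential time, so only for tiny graphs; the helper name is my own and is purely illustrative):

```python
from itertools import permutations

def has_hamiltonian_cycle(n, edges):
    """Brute-force test for a Hamiltonian cycle on vertices 0..n-1.

    Fixes vertex 0 as the start and tries every ordering of the rest,
    so this is O((n-1)!) and only sensible for tiny n."""
    E = {frozenset(e) for e in edges}
    for perm in permutations(range(1, n)):
        cycle = (0,) + perm
        if all(frozenset((cycle[i], cycle[(i + 1) % n])) in E
               for i in range(n)):
            return True
    return False
```

For example, $K_4$ has minimum degree $3 \geq 4/2$ and indeed contains a Hamiltonian cycle, while the path on 4 vertices (minimum degree 1) does not; Dirac's condition is sufficient, not necessary.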

We are interested in characterizing when a $3$-coloring of a cycle in a $3$-colorable planar graph does not extend to a $3$-coloring of the entire graph. Grötzsch's Theorem states that every triangle-free planar graph $G$ is $3$-colorable. Moreover, every $3$-coloring of a $4$-cycle or a $5$-cycle in $G$ extends to all of $G$. For a $3$-coloring of a cycle $C$ of length at most $8$ in a triangle-free planar graph, it has been characterized when the $3$-coloring extends to the entire graph. We extend this result to cycles of length $9$. Namely, we characterize all situations where a $3$-coloring of a cycle of length $9$ in a triangle-free planar graph does not extend to a $3$-coloring of the whole graph.
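The extension question can be experimented with on small graphs by brute-force backtracking. The toy checker below (names mine, not the characterization from the talk) decides whether a partial $3$-coloring extends:

```python
def extends_to_3_coloring(adj, precolored):
    """Backtracking test: does the partial 3-coloring `precolored`
    (a dict vertex -> color in {0, 1, 2}) extend to all of `adj`?

    adj: dict mapping each vertex to a list of its neighbors."""
    color = dict(precolored)
    rest = [v for v in adj if v not in color]

    def ok(v, c):
        # Color c is usable at v if no neighbor already has it.
        return all(color.get(u) != c for u in adj[v])

    def solve(i):
        if i == len(rest):
            return True
        v = rest[i]
        for c in range(3):
            if ok(v, c):
                color[v] = c
                if solve(i + 1):
                    return True
                del color[v]  # undo and try the next color
        return False

    # The precoloring itself must be proper before we try to extend it.
    return all(ok(v, color[v]) for v in precolored) and solve(0)
```

For instance, on a $4$-cycle with a center vertex joined to all four cycle vertices, the cycle coloring $0,1,0,1$ extends (the center takes color $2$), while $0,1,2,1$ does not, since the center sees all three colors.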

Parameterized complexity deals with algorithms for decision problems whose instances consist of a pair $(x, k)$, where $k$ is a secondary measurement of the input known as the parameter. A major goal in parameterized complexity is to investigate whether a problem with parameter $k$ admits an algorithm with running time $f(k)\cdot |x|^{O(1)}$, where $f$ is a function depending only on the parameter and $|x|$ represents the input size. Parameterized problems that admit such algorithms are called fixed-parameter tractable and the class of all such problems is denoted FPT.
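A textbook example of such an $f(k)\cdot |x|^{O(1)}$ algorithm is the $O(2^k \cdot |E|)$ bounded search tree for Vertex Cover: any edge must be covered by one of its two endpoints, so branch on both choices to depth $k$. The sketch below is illustrative only, not taken from the talk.

```python
def vertex_cover_fpt(edges, k):
    """Decide whether the graph given by `edges` (a list of 2-tuples)
    has a vertex cover of size <= k.

    Bounded search tree: pick any uncovered edge (u, v); at least one
    endpoint is in the cover, giving two branches and depth <= k, so
    the total running time is O(2^k * |E|)."""
    if not edges:
        return True          # nothing left to cover
    if k == 0:
        return False         # edges remain but the budget is spent
    u, v = edges[0]
    for w in (u, v):
        # Put w into the cover: all edges touching w disappear.
        rest = [e for e in edges if w not in e]
        if vertex_cover_fpt(rest, k - 1):
            return True
    return False
```

On a triangle the answer is no for $k=1$ and yes for $k=2$, matching the fact that a triangle's minimum vertex cover has size 2.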

A closely related concept is that of kernelization. A kernelization algorithm, or just kernel, for a parameterized problem $\Pi$ takes an instance $(x, k)$ of the problem and, in time polynomial in $|x| + k$, outputs an instance $(x',k')$ such that $|x'|, k' \leq g(k)$ for some function $g$, and $(x,k)\in \Pi$ if and only if $(x',k')\in \Pi$. The function $g$ is called the size of the kernel and may be viewed as a measure of the “compressibility” of a problem using polynomial-time preprocessing rules. It is a folklore result that a decidable problem is in FPT if and only if it has a kernelization algorithm. However, the kernel that one obtains in this way is typically of size at least exponential in the parameter. A natural problem in this context is to find polynomial or linear kernels for problems that are in FPT.
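A classical concrete instance is Buss's quadratic kernel for Vertex Cover: a vertex of degree greater than $k$ must be in any size-$k$ cover (otherwise all of its more than $k$ neighbors would be), and after this rule is exhausted a YES-instance has at most $k^2$ edges. The following is a hedged sketch under my own naming, not code from the talk:

```python
def buss_kernel(edges, k):
    """Buss's kernelization for Vertex Cover.

    Repeatedly force every vertex of degree > k into the cover
    (deleting its edges and decrementing k).  If more than k^2 edges
    survive, the instance is a NO-instance (return None); otherwise
    return the residual instance (edges', k'), a kernel with at most
    k'^2 edges."""
    edges = list(edges)
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for u, v in edges:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:
                # v must be in the cover: remove it and spend budget.
                edges = [e for e in edges if v not in e]
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:
        return None          # provably a NO-instance
    return edges, k          # the kernel: at most k^2 edges remain
```

On a star $K_{1,5}$ with $k=1$ the center is forced in and an empty (trivially YES) kernel remains, while a triangle with $k=1$ is correctly rejected.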

During the last decade, a plethora of results emerged on linear kernels for graph-theoretic problems restricted to {\sl sparse} graph classes. A celebrated result by Alber \emph{et al}. for {\sc Dominating Set} on planar graphs prompted an explosion of research papers on linear kernels on planar graphs. Guo and Niedermeier designed a general framework and showed that problems satisfying a certain ``distance property'' have linear kernels on planar graphs. Bodlaender \emph{et al}. provided a meta-theorem for problems to have a linear kernel on graphs of bounded genus. Fomin \emph{et al}. extended these results to bidimensional problems on $H$-minor-free and apex-minor-free graphs. Recently, Kim \emph{et al}. further pushed the boundary of meta-kernelization to include $H$-topological-minor-free graphs.

In this talk, we shall trace the trail of ideas behind kernelization on sparse graphs, after warming up with the minimal background needed to appreciate the progress.