Probability in Biology and Computer Science
Org: David Brydges and Ed Perkins (UBC)

SHANKAR BHAMIDI, University of North Carolina, Chapel Hill
Flows, first passage percolation and random disorder in networks

Consider a connected network and suppose each edge in the network has a random positive weight. Understanding the structure and weight of the shortest path between nodes in the network is one of the most fundamental problems studied in modern probability theory and goes under the name {\it first passage percolation}. It arises as a fundamental building block in many interacting particle system models. In the modern context these problems take on additional significance, with the minimal weight measuring the cost of sending information and the number of edges on the optimal path (the hopcount) representing the actual time for messages to travel between vertices in the network.

The aim of this talk is to describe a heuristic based on continuous-time branching processes which easily yields a wide array of asymptotic results for random network models in terms of the Malthusian rate of growth and the stable age distribution of the associated branching process. These techniques allow us not only to solve first passage percolation problems rigorously but also to understand functionals such as the degree distribution of shortest path trees, congestion across edges, and asymptotics for ``betweenness centrality'', a concept of crucial interest in social networks, in terms of Cox processes and extreme value distributions. They also allow one to solve exactly models of ``weak disorder'' in the context of the stochastic mean-field model of distance.
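As a concrete illustration of the two quantities in play (minimal weight and hopcount), here is a minimal sketch, not the speaker's method: Dijkstra's algorithm on a small network with i.i.d. Exp(1) edge weights. The complete graph and the exponential distribution are illustrative choices only.

```python
import heapq
import random

def first_passage(adj, source):
    """Dijkstra on a graph whose edges carry i.i.d. Exp(1) weights.

    adj: dict mapping each node to a list of neighbours.
    Returns (cost, hops): minimal total weight and number of edges
    on the optimal path from `source` to every reachable node.
    """
    # Assign each undirected edge an independent Exp(1) weight.
    weight = {}
    for u in adj:
        for v in adj[u]:
            e = (min(u, v), max(u, v))
            if e not in weight:
                weight[e] = random.expovariate(1.0)

    cost = {source: 0.0}
    hops = {source: 0}
    queue = [(0.0, source)]
    while queue:
        c, u = heapq.heappop(queue)
        if c > cost[u]:
            continue  # stale queue entry
        for v in adj[u]:
            w = weight[(min(u, v), max(u, v))]
            if v not in cost or c + w < cost[v]:
                cost[v] = c + w
                hops[v] = hops[u] + 1
                heapq.heappush(queue, (c + w, v))
    return cost, hops

# Complete graph on 6 nodes: every pair of nodes is an edge.
adj = {u: [v for v in range(6) if v != u] for u in range(6)}
cost, hops = first_passage(adj, 0)
```

On the complete graph the hopcount to a typical node is strictly larger than 1 with high probability as the graph grows, which is exactly the phenomenon the branching-process heuristic quantifies.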

STEVE EVANS, University of California at Berkeley
Trickle-down processes and their boundaries

It is possible to represent a number of Markov chains that appear in applications of probability
to computer science as an evolving sequence of connected subsets of a directed acyclic graph that grow in the following way: initially, all vertices of the
graph are unoccupied, particles are fed in one by one
at a distinguished source vertex, successive particles proceed along
directed edges according to an appropriate stochastic mechanism,
and each particle comes to rest once it encounters
an unoccupied vertex. Examples include
the binary and digital search tree processes,
the random recursive tree process and generalizations of it
arising from nested instances of Pitman's two-parameter
Chinese restaurant process,
tree-growth models associated with Mallows' $\phi$ model of random permutations
and with Sch\"utzenberger's non-commutative $q$-binomial theorem, and a
construction due to Luczak and Winkler that grows uniform random binary trees
in a Markovian manner.
I will introduce a general framework that encompasses such Markov chains
and then characterize
their asymptotic behavior by analyzing in detail their Doob-Martin compactifications,
Poisson boundaries and tail $\sigma$-fields. This is joint work with Rudolf Gr\"ubel and
Anton Wakolbinger.
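The trickle-down mechanism can be sketched for one of the listed examples, the binary search tree process. This is an illustrative toy only: the BST comparison rule plays the role of the stochastic routing mechanism, and the randomness enters through the i.i.d. keys fed in at the root.

```python
import random

class Node:
    __slots__ = ("key", "left", "right")
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

def trickle_down_insert(root, key):
    """Insert a particle (key) at the source vertex (root) and let it
    trickle down: at each occupied vertex it follows the left or right
    directed edge (here: the BST comparison rule) until it reaches an
    unoccupied vertex, where it comes to rest."""
    if root is None:
        return Node(key)
    node = root
    while True:
        if key < node.key:
            if node.left is None:
                node.left = Node(key)
                return root
            node = node.left
        else:
            if node.right is None:
                node.right = Node(key)
                return root
            node = node.right

# Feed i.i.d. uniform keys one by one: the binary search tree process.
rng = random.Random(1)
root = None
for _ in range(100):
    root = trickle_down_insert(root, rng.random())
```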

CHRISTOPH HAUERT, University of British Columbia
Sanctioning institutions for governing the commons

Cooperation represents a key organizing principle in genetic and cultural
evolution. Yet cooperation is a conundrum because cooperators make a
sacrifice to benefit others. Despite the fact that groups
of cooperators outperform groups of defectors, both Darwinian selection and
utilitarian principles based on rational choice favor defectors.
Nevertheless, cooperation is ubiquitous in biological and social systems.
Indeed, cooperation can be stabilized by punishing defectors.
Punishment is also ubiquitous in nature - ranging from toxin-producing microorganisms to law
enforcement institutions. But how can initially
rare, costly punishment behavior gain a foothold in
a population? In nature, individuals carefully select their
interaction partners and refuse to participate in risky enterprises. Such
voluntary participation prevents deadlocks in states of mutual defection and
thus promotes cooperation - but fails to stabilize it.
However, the combined efforts of punishment and volunteering can change the odds in favor of cooperation.
Under the stochastic dynamics of finite populations the
freedom to withdraw leads to prosocial coercion. To date, theory and
experiments emphasize the role of such peer-punishment. At least in human societies,
peer-punishment has been largely superseded by sanctioning institutions, with
vigilantism deemed illegal. This can be modeled by introducing the opportunity for
pool-punishment, which represents a precursor of executive power and echoes
Elinor Ostrom's principles for 'Governing the
Commons'. Pool-punishment always incurs costs to those committed to it, even
if no one requires reprimanding. Interestingly, our model predicts that
individuals nevertheless trade the higher efficiency of peer-punishment for the
increased stability of pool-punishment to maintain cooperation.

Percolation is concerned with the existence of an infinite path in a (Bernoulli) random subgraph of the lattice $Z^D$. We can rephrase this as the existence of a Lipschitz embedding (or an injective graph homomorphism) of the infinite line $Z$ into the random subgraph. What happens if we replace the line $Z$ with another lattice $Z^d$? I'll answer this for all values of the two dimensions $d$ and $D$, and the Lipschitz constant. Based on joint works with Dirr, Dondl, Grimmett and Scheutzow.

Existing biometric identification systems, such as those used in trusted traveler programs, attempt to identify an individual from an enrollment database of $n$ people. The output is either the name of an enrolled person, or a rejection message indicating that no match was found.
Traditionally, no measure of confidence is given to the output; an individual is either granted or denied access.

In this presentation, we propose an extension to existing biometric systems by applying a calibration function to the $n$ matching scores. We introduce a computationally light calculation that can be applied either as a post-processing filter or embedded directly into an algorithm to yield perfectly calibrated probability-based scores. In addition to attaching a meaningful confidence measure to the output, the proposed methodology is also shown to improve the overall performance of a biometric system.

We apply our calibration theorem to an actual data set consisting of nearly 60,000 iris images. By comparing the detection error trade-off (DET) curves, we show that our score calibration post-processing filter reduces the area under the DET curve from $2.41$ to $0.17$, and reduces the equal error rate (EER) from $5.40\%$ to $2.84\%$.
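For readers unfamiliar with the EER, the sketch below shows how it is estimated from genuine (matching-pair) and impostor (non-matching-pair) score samples. The data here are synthetic Gaussian scores, not the iris data set, and this is not the authors' calibration method.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Estimate the equal error rate: the operating threshold at which
    the false accept rate (FAR, impostors scoring above threshold)
    equals the false reject rate (FRR, genuine pairs scoring below).
    Scores follow the 'higher = better match' convention."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])
    frr = np.array([(genuine < t).mean() for t in thresholds])
    i = np.argmin(np.abs(far - frr))  # closest approach of FAR and FRR
    return (far[i] + frr[i]) / 2

rng = np.random.default_rng(0)
genuine = rng.normal(2.0, 1.0, 1000)   # synthetic matching-pair scores
impostor = rng.normal(0.0, 1.0, 1000)  # synthetic non-matching scores
eer = equal_error_rate(genuine, impostor)
```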

This is joint work with Dmitry Gorodnichy at the Canada Border Services Agency.

We study models describing the evolution of a sexual (diploid) population with mutation and selection, in the specific scales of the biological framework of adaptive dynamics. We take into account the genetics of reproduction: each individual is characterized by two allelic traits at a specific locus and by the associated phenotype. The population size is assumed to be large and the mutation rate small. We prove that, under a suitable combination of the scales, the population process is approximated on a long time scale by a jump process describing the successive homozygote equilibria of the population. If the mutation steps are small, the process is further approximated by a deterministic differential equation generalizing the well-known canonical equation of adaptive dynamics derived for asexual populations in previous works.
This is joint work with Pierre Collet (Ecole Polytechnique) and Hans Metz (Leiden).

We consider the following random graph model, motivated by mobile wireless networks. At time 0, take a Poisson point process over $R^2$ with constant intensity to be the nodes of the graph, and let each node move independently according to Brownian motion. At any time $t$, we place an edge between every pair of nodes whose distance at time $t$ is at most $r$. We study three fundamental features of this model: detection (the time until a given target point---which may be either fixed or moving---is within distance $r$ of some node of the graph), coverage (the time until all points inside a finite box have been detected by the graph), and percolation (the time until a given node belongs to the infinite connected component of the graph). We obtain precise asymptotics for these features by combining ideas from stochastic geometry, coupling, and multi-scale analysis. (Joint work with Alistair Sinclair, Perla Sousi and Alexander Stauffer.)
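A simulation sketch of the detection feature, under illustrative parameters: the Poisson process is truncated to a finite box as a stand-in for $R^2$, so boundary effects make this a finite-volume approximation only.

```python
import numpy as np

def detection_time(intensity=1.0, box=10.0, r=0.5, dt=0.01,
                   t_max=20.0, seed=0):
    """Simulate the detection time of a fixed target at the origin.

    Nodes start as a Poisson process of the given intensity on
    [-box, box]^2 and move as independent Brownian motions (Euler
    steps of size dt).  The target is detected once some node is
    within distance r.  Returns the first such time, or None if it
    exceeds t_max.
    """
    rng = np.random.default_rng(seed)
    n = rng.poisson(intensity * (2 * box) ** 2)  # Poisson node count
    pos = rng.uniform(-box, box, size=(n, 2))    # uniform given count
    t = 0.0
    while t <= t_max:
        if n and (np.hypot(pos[:, 0], pos[:, 1]) <= r).any():
            return t
        # Independent Brownian increments over a time step of dt.
        pos += rng.normal(0.0, np.sqrt(dt), size=(n, 2))
        t += dt
    return None
```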

László (Laci) Lovász has made numerous seminal contributions to theoretical computer science, especially
to the development of algorithms, in recognition of which he was awarded the Kyoto Prize this year.
His Local Lemma is a simple, easily proved, but powerful tool for proving the existence of
objects with desired properties. It turns out, somewhat surprisingly, that it can also be used to
construct or find the desired objects. This survey talk will discuss the lemma and its algorithmic
applications.
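The algorithmic turn mentioned above is exemplified by the Moser-Tardos resampling algorithm: while some constraint is violated, resample the variables of a violated constraint uniformly at random. Below is a minimal sketch on a small, satisfiable 3-SAT instance; the instance itself is illustrative and not from the talk.

```python
import random

def moser_tardos(n_vars, clauses, seed=0):
    """Moser-Tardos algorithm for SAT: while some clause is violated,
    resample the variables of one violated clause at random.

    Each clause is a list of literals; literal +i / -i means variable
    i is required true / false.  Returns a satisfying assignment.
    """
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n_vars)]

    def violated(clause):
        # A clause is violated when every one of its literals is false.
        return all(assign[abs(l) - 1] != (l > 0) for l in clause)

    while True:
        bad = [c for c in clauses if violated(c)]
        if not bad:
            return assign
        for lit in rng.choice(bad):  # resample one violated clause
            assign[abs(lit) - 1] = rng.random() < 0.5

# A small 3-SAT instance over variables 1..6.  The Local Lemma
# guarantees fast termination when each clause shares variables
# with few enough other clauses.
clauses = [[1, 2, 3], [-1, 4, 5], [2, -4, 6], [-3, -5, -6]]
assignment = moser_tardos(6, clauses)
```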