Online program monitoring is an effective technique for detecting bugs and
security attacks in running applications. Extending these tools to monitor
parallel programs is challenging because the tools must account for
inter-thread dependences and relaxed memory consistency models. Existing
tools assume sequential consistency and often slow down the monitored
program by orders of magnitude. In this talk, we present a novel approach
that avoids these pitfalls by not relying on strong consistency models or
detailed inter-thread dependence tracking. Instead, we only assume that
events in the distant past on all threads have become visible; we make no
assumptions about the relative ordering of more recent events on other
threads. To overcome the potential state explosion of considering all the
possible orderings among recent events, we adapt two techniques from
static dataflow analysis, reaching definitions and reaching expressions,
to the domain of dynamic parallel monitoring and show how our adapted
analysis can be used in two popular memory and security tools. We propose
significant modifications to these techniques to ensure the correctness
and efficiency of our approach.
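As background for the static technique the talk adapts, a minimal sketch of the classic reaching-definitions fixed-point computation over a control-flow graph might look like the following. This is the standard sequential formulation only, not the talk's parallel adaptation; the CFG shape, block names, and definition ids are illustrative assumptions.

```python
def reaching_definitions(blocks, succs):
    """Classic forward 'may' dataflow analysis (illustrative sketch).

    blocks: {block_name: ordered list of (variable, def_id)}
    succs:  {block_name: [successor block names]}
    Returns (IN, OUT): sets of def_ids reaching each block's entry/exit.
    """
    # Collect every definition of each variable across the program.
    all_defs = {}
    for defs in blocks.values():
        for var, d in defs:
            all_defs.setdefault(var, set()).add(d)

    # GEN: last definition of each variable in the block.
    # KILL: every other definition of those variables.
    gen, kill = {}, {}
    for b, defs in blocks.items():
        last = {}
        for var, d in defs:
            last[var] = d  # later defs in the block overwrite earlier ones
        gen[b] = set(last.values())
        kill[b] = set().union(*(all_defs[v] - {last[v]} for v in last)) if last else set()

    # Invert the successor map to get predecessors.
    preds = {b: [] for b in blocks}
    for b, ss in succs.items():
        for s in ss:
            preds[s].append(b)

    # Iterate the transfer functions to a fixed point:
    #   IN[b]  = union of OUT[p] over predecessors p
    #   OUT[b] = GEN[b] | (IN[b] - KILL[b])
    IN = {b: set() for b in blocks}
    OUT = {b: set() for b in blocks}
    changed = True
    while changed:
        changed = False
        for b in blocks:
            new_in = set().union(*(OUT[p] for p in preds[b])) if preds[b] else set()
            new_out = gen[b] | (new_in - kill[b])
            if new_in != IN[b] or new_out != OUT[b]:
                IN[b], OUT[b] = new_in, new_out
                changed = True
    return IN, OUT
```

In the dynamic parallel setting the talk describes, the analogous computation would run over observed event windows rather than a static CFG, which is where the state-explosion concern and the talk's modifications come in.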

Our simulation study on a collection of Splash-2 and Parsec 2.0 benchmarks
demonstrates the benefits of trading a low false-positive rate for (i)
reduced overhead and (ii) the ability to run on relaxed consistency models.

Michelle Goodstein is a Ph.D. student in Computer Science at Carnegie Mellon University, advised by Todd Mowry. She is interested in dynamic parallel program analysis, and works in the Log-Based Architecture Group. She received dual B.S. degrees in Mathematics and Computer Science from the University of Washington. Michelle is a former recipient of the Clare Booth Luce Fellowship.