Computing is undergoing a major shift. Third-party applications hosted in
online software markets have become ubiquitous on all kinds of platforms:
mobile phones, Web browsers, gaming devices, even household robots. These
applications often include yet more third-party code for advertising,
analytics, etc. These trends have dramatically increased the amount
of bad code throughout the software stack: buggy code, malicious code, and
code that overcollects private information, intentionally or by accident.

Over the past three decades, approximation algorithm techniques have been successful in analyzing the approximability of many problems. However, many other classic problems remain poorly understood in terms of approximability. Long-standing gaps between the best known approximation ratios and hardness-of-approximation results challenge our algorithmic techniques. I will present two very different examples from my work to illustrate how a variety of techniques can be used to attack such long-standing gaps.

Narrative is a significant genre for both conventional and computational media. As we increase our computational understanding of narrative, we increase our ability to enable meaningful experiences in computational media that are created automatically, on demand and tailored to context. In this talk I'll present an overview of the research from my lab developing computational models of narrative that draw upon ideas from narrative theory, cognitive psychology and linguistic pragmatics.

In 2020-2022, Exascale systems will be put into service at multiple locations around the world. Studies and projections agree that these systems will suffer more frequent failures and data corruptions than current systems. The challenge is clear: how do we ensure that Exascale application executions complete and produce correct results? Finding solutions to this problem is not trivial; in particular, simply scaling existing solutions will not work.

I create and deploy interactive systems that use a combination of human and machine intelligence to operate robustly in real-world settings. Unlike prior work in human computation, my “Crowd Agent” model allows crowds of people to support continuous real-time interactive systems that require ongoing context.

The rise of Internet-scale geo-replicated services has led to considerable upheaval in the design of modern data management systems. Namely, given the availability, latency, and throughput penalties associated with classic mechanisms such as serializable transactions, a broad class of systems (e.g., "NoSQL") has sought weaker alternatives that reduce the use of expensive coordination during system operation, often at the cost of application integrity. When can we safely forgo this expensive coordination, and when must we pay the price?

Data centers run a range of important applications with ever
increasing performance demands, from cloud and server computing to Big
Data and eScience. However, the scaling of CPU frequency has stalled
in recent years, leading to hardware architectures that no longer
transparently scale software performance. Two trends stand out: 1)
Instead of frequency, hardware architectures increase the number of
CPU cores, leaving complex memory system performance and CPU
scheduling tradeoffs exposed to software. 2) Network and storage I/O

Epic makes healthcare software for mid-size and large medical groups, hospitals, and integrated healthcare organizations, where the stakes for securing and protecting information are incredibly high. In his role on the Security team at Epic, Noah is responsible for building security into enterprise applications. In this talk, he'll discuss the technical challenges of implementing something as simple as secure passwords, from hashing itself, to hash function conversions, to choosing the right encryption algorithms and protocols. He'll also explain emerging models for mobile device security.
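To make the "hashing itself" part of that challenge concrete, here is a minimal sketch of salted password hashing with a deliberately slow key-derivation function, using Python's standard-library PBKDF2. The iteration count and salt length are illustrative assumptions, not Epic's actual implementation; production systems should tune the cost factor and would often reach for a dedicated library (e.g., bcrypt or Argon2).

```python
import hashlib
import hmac
import os

# Illustrative cost factor: high enough that each guess is expensive
# for an attacker, low enough that login stays responsive.
ITERATIONS = 200_000

def hash_password(password, salt=None):
    """Return (salt, digest) for storage; a fresh random salt per password."""
    if salt is None:
        salt = os.urandom(16)  # unique salt defeats precomputed rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Note that the stored record must include the salt and (in practice) the iteration count alongside the digest, which is what makes the "hash function conversion" problem tractable: old records can be re-hashed under new parameters the next time each user logs in.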