Calendar

During the fall 2018 semester, the Computational Social Science (CSS) and Computational Sciences and Informatics (CSI) programs merged their seminar/colloquium series, in which students, faculty, and guest speakers present their latest research. These seminars are free and open to the public. The series takes place on Fridays from 3:00 to 4:30 p.m. in the Center for Social Complexity Suite, located on the third floor of Research Hall.

If you would like to join the seminar mailing list, please email Karen Underwood.

Online social systems, comprising social media services and platforms such as social networking (e.g., Facebook, LinkedIn), microblogging (e.g., Twitter, Sina Weibo), and crowdsourcing (e.g., Wikipedia, OpenStreetMap) applications, continue to gain traction among an ever-growing global user base. The growing reliance on online social systems to augment individuals' daily workflows, and the resulting interdependence between human and technical systems, provides sufficient evidence to classify them as socio-technical systems. These interdependencies are complex in nature and are best described from a complex adaptive systems (CAS) perspective.

It is through a CAS lens that this dissertation examines two types of adaptation in online social systems using an array of Computational Social Science (CSS) tools. In the first type of adaptation, human actors are no longer the sole participants in online social systems, since social bots, or automated software mimicking humans, have emerged as potential threats to stifle or amplify certain online conversation narratives. This section of the dissertation addresses adaptation to these new types of actors by presenting a novel social bot analysis framework designed to determine the pervasiveness and relative importance of social bots within various online conversations. In the second form of adaptation, individual citizens and government entities modify their behaviors in relation to each other through censorship circumvention or detection. This section of the dissertation investigates the rise of digital censorship in online social systems, creating a new agent-based model inspired by the findings from an evaluation of a Turkish digital censorship campaign.
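The citizen decision cycle at the heart of the second form of adaptation, whether to risk punishment in order to circumvent censorship, can be caricatured as a simple agent rule. The sketch below is purely illustrative; the weights, thresholds, and population parameters are invented and do not come from the dissertation's model:

```python
import random

def citizen_posts(desire_to_speak, perceived_risk, punishment_severity, rng):
    """Toy decision rule: a citizen circumvents censorship and posts when the
    desire to speak outweighs the expected punishment (risk times severity),
    plus a small random hesitation term. All parameters are hypothetical."""
    expected_cost = perceived_risk * punishment_severity
    return desire_to_speak - expected_cost > rng.uniform(0, 0.2)

rng = random.Random(42)
# Hypothetical population: each citizen has a (desire, perceived risk) pair.
population = [(rng.random(), rng.random()) for _ in range(1000)]
severity = 0.5  # government's punishment severity, an exogenous knob here
posters = sum(citizen_posts(d, r, severity, rng) for d, r in population)
rate = posters / len(population)
```

Sweeping `severity` in a loop would reproduce, in miniature, the adaptive back-and-forth between state policy and citizen behavior that the full agent-based model explores.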

The social bot analysis framework consistently showed that while accounts identified as social bots comprised only a small portion of the overall research corpus, they accounted for a disproportionately large share of the prominent centrality rankings across all observed online conversations. Furthermore, bot classification results from multiple bot detection platforms exhibited minimal overlap, suggesting that different bot detection algorithms target different types of bots. Finally, the evaluation of the Turkish digital censorship campaign showed that it was only marginally effective, as some Turkish citizens circumvented the censorship policies, highlighting an individual decision cycle in which citizens weigh the risk of punishment against engaging in online activities. This citizen decision cycle served as the basis for the adaptation-to-digital-censorship model, which used empirical evidence to construct a stylized simulation environment for censorship.
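The centrality finding can be illustrated with a toy conversation network. Everything below is invented for illustration (the edge list, the bot labels, and the use of degree centrality as the ranking measure); it is not the dissertation's data or framework:

```python
from collections import defaultdict

# Hypothetical retweet network: (source, target) edges. Names are invented.
edges = [
    ("bot_1", "user_a"), ("bot_1", "user_b"), ("bot_1", "user_c"),
    ("bot_1", "user_d"), ("bot_2", "user_a"), ("bot_2", "user_e"),
    ("bot_2", "user_f"), ("user_a", "user_b"), ("user_c", "user_d"),
    ("user_e", "user_f"), ("user_g", "user_a"),
]
bots = {"bot_1", "bot_2"}  # labels a bot-detection platform might assign

# Degree centrality: each node's degree divided by (n - 1).
degree = defaultdict(int)
for src, dst in edges:
    degree[src] += 1
    degree[dst] += 1
n = len(degree)
centrality = {node: deg / (n - 1) for node, deg in degree.items()}

# Rank all accounts and compare the bots' share of accounts overall
# with their share of the top-3 centrality rankings.
top3 = sorted(centrality, key=centrality.get, reverse=True)[:3]
bot_share_of_accounts = len(bots) / n          # 2 of 9 accounts
bot_share_of_top3 = sum(node in bots for node in top3) / 3
```

Here the bots are roughly a fifth of the accounts but two-thirds of the top-3 most central nodes, the qualitative pattern the results describe.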

This dissertation examines the integration of complexity theory and computational tools into U.S. foreign policy. It identifies ways to improve the Department of Defense's main analytic framework to ensure a more accurate reflection of complex systems, and it provides a holistic assessment of the integration of computational tools into Joint campaigns. Based on this analysis, the dissertation advocates incorporating Agent-Based Models (ABMs) as simulations to support both analysis and foreign policy development at all levels of the foreign policy enterprise. To aid this integration, two Mesa-based ABM libraries are provided: (1) Multi-level Mesa, the first Python-based multi-level ABM library, which facilitates the integration and evolution of layered adaptive networks and goes beyond existing multi-level libraries by providing greater user flexibility and allowing for the integration and adaptation of more complex networks; and (2) Distributed Space Mesa, a first attempt at a distributed Mesa meta-library, which provides modest runtime improvements for spatial Mesa ABMs and critical lessons for the continued development of a suite of distributed Mesa libraries.
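The core idea behind a multi-level ABM library, base-level agents that can be gathered into groups which themselves act as agents at the next level up, can be sketched in plain Python. This toy is not Multi-level Mesa's actual API; all class names and behaviors are invented for illustration:

```python
import random

class Agent:
    """A base-level agent with a single scalar state (wealth)."""
    def __init__(self, uid):
        self.uid = uid
        self.wealth = 1.0

    def step(self):
        # Each agent's wealth drifts randomly each step.
        self.wealth += random.uniform(-0.1, 0.1)

class MetaAgent:
    """A group of agents that acts as one agent at the next level up."""
    def __init__(self, members):
        self.members = members

    @property
    def wealth(self):
        return sum(a.wealth for a in self.members)

    def step(self):
        # Step every member, then impose a group-level behavior:
        # redistribute wealth evenly within the group.
        for a in self.members:
            a.step()
        share = self.wealth / len(self.members)
        for a in self.members:
            a.wealth = share

# Two levels: six agents grouped into two meta-agents.
random.seed(0)
agents = [Agent(i) for i in range(6)]
groups = [MetaAgent(agents[:3]), MetaAgent(agents[3:])]
for _ in range(10):
    for g in groups:
        g.step()
```

The key design point is that `MetaAgent` exposes the same `step()`/`wealth` interface as `Agent`, so higher levels can treat groups and individuals interchangeably, which is what lets layered networks form and evolve.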

As the world becomes denser, more connected, and more complex, it is increasingly difficult to answer "what-if" questions about our cities and populations. Most modeling and simulation tools struggle with scale and connectivity. We present a new method for creating digital-twin simulations of city infrastructure and populations from open-source and commercial data. We transform cellular location data into activity patterns for synthetic agents and use geospatial data to create the infrastructure and world in which these agents interact. We then leverage technologies and techniques intended for massive online gaming to create 1:1-scale simulations that answer these "what-if" questions about the future.
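A common first step in turning cellular location data into agent activity patterns is inferring anchor locations (home, work) from time of day. The sketch below is a generic heuristic, not the FutureScape pipeline; the ping data, hour cutoffs, and labels are all invented for illustration:

```python
from collections import Counter

# Hypothetical pings for one device: (hour_of_day, cell_id).
pings = [
    (1, "cell_A"), (2, "cell_A"), (3, "cell_A"), (23, "cell_A"),
    (10, "cell_B"), (11, "cell_B"), (14, "cell_B"), (16, "cell_B"),
    (18, "cell_C"),
]

def infer_anchors(pings):
    """Crude heuristic: the most-visited nighttime cell becomes 'home'
    and the most-visited working-hours cell becomes 'work'."""
    night = Counter(c for h, c in pings if h < 6 or h >= 22)
    day = Counter(c for h, c in pings if 9 <= h < 17)
    return {
        "home": night.most_common(1)[0][0] if night else None,
        "work": day.most_common(1)[0][0] if day else None,
    }

anchors = infer_anchors(pings)
```

From anchors like these, a synthetic agent can be assigned a daily home-work-home schedule and then placed into the geospatially derived world for simulation.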

Bios:

Ben Intoy is a full-stack developer at Deloitte Consulting LLP. He received his PhD in Physics from Virginia Tech in 2015, where he used high-throughput computing simulations to study the stability properties of cyclically competing species in varying spatial dimensions. Ben then went to the University of Minnesota, Twin Cities, as a postdoctoral research associate, where he used the tools from his PhD to study, in the abstract, the origin of life on Earth and the probability of finding life elsewhere in the universe. In fall 2018, Ben joined the Deloitte Arlington, VA office to work on the FutureScape project (www.futurescape.ai).

Dan Baeder is a data scientist at Deloitte Consulting LLP and has been on the FutureScape project since joining the firm last year. While at Deloitte, Dan has focused on using cellular phone geolocation data to develop synthetic traffic models, as well as applying geospatial analysis techniques to human behavior modeling. He is a noted R-phile in a sea of Python users. Dan received an MS in Public Policy and Management with a focus on data analytics from Carnegie Mellon University in 2018.

Coastal flooding is the most expensive type of natural disaster in the United States. Policy initiatives to mitigate the effects of these events depend upon understanding flood victims' responses at the individual and municipal levels. Agent-Based Modeling (ABM) is an effective tool for analyzing community-wide responses to natural disasters, but the quality of an ABM's performance is often difficult to determine. This paper discusses the complexity of the Protective Action Decision Model (PADM) and Protection Motivation Theory (PMT) for human decision making regarding hazard mitigation. A combined PADM/PMT model is developed and integrated into the MASON modeling framework. The ABM implements a hindcast of Hurricane Sandy's damage to Sea Bright, NJ, and homeowners' post-flood reconstruction decisions. It is validated against damage assessments and post-storm surveys. The contribution of socio-economic factors and the built environment to model performance is also addressed, and the results suggest that mitigation for townhouse communities will be challenging.

Bio:

Kim McEligot is a PhD candidate in the Department of Systems Engineering and Operations Research at George Mason University. His research interests include the federation of computational fluid dynamics coastal flood modeling with geospatial agent-based modeling for individual- and community-level flood mitigation policy analysis. He holds an M.S. in Systems Engineering from Johns Hopkins University and an M.A. in National Security and Military Affairs from the U.S. Naval War College. His email address is kmceligo@masonlive.gmu.edu.