University of California Complexity Events

Streaming videos of past talks, along with wikis and documents from the complexity summer school events, are open for further discussion on our wiki discussion pages.

2007-2008

The Human Sciences and Complexity Seminars began in 2005-2006, meeting biweekly through intercampus interactive video-meetings, and are now in their third year. The seminar wiki, http://intersci.ss.uci.edu/wiki, lists a diverse and multidisciplinary roster of speakers, with emphases on empirical, simulation, network-dynamical, and mathematical modeling of complex processes across the full range of human sciences. In 2008-2009 the UC Davis CSC joins UCSD, UCI, and UCLA, along with participants from other complexity research centers.

Field/Subfield: Public Policy and Criminology/Game Theory and Simulation.

Topic: "The Dynamics of Deterrence".

Cosponsored by the UCLA Human Complex Systems Program.

Abstract: Because punishment is scarce, costly, and painful, optimal enforcement strategies will minimize the amount of actual punishment required to effectuate deterrence. If potential offenders are deterrable, increasing the conditional probability of punishment (given violation) can reduce the amount of punishment actually inflicted, by "tipping" a situation from its high-violation equilibrium to its low-violation equilibrium. Compared to random or "equal opportunity" enforcement, dynamically concentrated sanctions can reduce the punishment level necessary to tip the system. Game theory and some simple and robust Monte Carlo simulations demonstrate these results, which, in addition to their potential for reducing both crime and incarceration, may have implications for both management and regulation.
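The tipping logic can be sketched in a few lines of simulation. Below is a minimal, illustrative Monte Carlo model, not the speaker's actual model: the population size, capacities, and threshold rule are all my assumptions. Agents violate when the punishment probability implied by last round's violation count falls below their personal threshold, and a brief enforcement surge tips the system from the high-violation equilibrium to the zero-violation one, where it stays at the original capacity.

```python
# Minimal, illustrative sketch of the tipping dynamic (all parameters
# and the threshold rule are assumptions, not the speaker's model).

N = 1000                    # potential offenders
BASE_CAPACITY = 150         # sanctions deliverable per round
# Heterogeneous deterrability: agent i violates iff the punishment
# probability falls below threshold (i + 0.5) / N.
thresholds = [(i + 0.5) / N for i in range(N)]

def next_violations(v_prev, capacity):
    """Violation count this round, given last round's count: the
    implied punishment probability is capacity / violations, capped
    at 1, and each agent compares it to a personal threshold."""
    p_punish = min(1.0, capacity / v_prev) if v_prev else 1.0
    return sum(1 for t in thresholds if p_punish < t)

v = 900                     # start near the high-violation equilibrium
history = []
for round_ in range(30):
    # A three-round enforcement surge "tips" the system.
    capacity = 600 if 10 <= round_ < 13 else BASE_CAPACITY
    v = next_violations(v, capacity)
    history.append(v)

print(history[9], history[-1])  # high-violation equilibrium, then zero
```

Once violations collapse, the baseline capacity suffices to punish every violation with certainty, so the low-violation state sustains itself with far less actual punishment than the high-violation state required.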

This presentation explores the potential benefits of a "multiple models" approach to understanding the myriad phenomena of Burning Man, an annual event involving the emergence and deconstruction of a city of nearly 50,000 people in the Nevada desert. The presentation is based upon participant fieldwork.

Burning Man is a unique object of study for the bottom-up emergence of complex behavior because, unlike other festivals, it is heavily participant-driven and includes many features of a functioning city, such as food dispensation and repair shops, but operates without money, corporate involvement, or modern telecommunications.

I will show photographs of Burning Man. If you are a member of Facebook, you can find a selection of photographs here: [1]

A system like this has many qualities to consider when crafting a simulation. For example, how does information diffuse quickly through such a large population? Who makes decisions and how? What organizational features promote its successful functioning as a gift economy? And many other questions.
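As a toy illustration of the first question (how information can diffuse quickly through a large population), here is a minimal rumor-spreading sketch on a random acquaintance network. The population size, contact counts, and spreading rule are hypothetical simplifications for illustration, not findings about Burning Man.

```python
import random

random.seed(1)

# Hypothetical scaling: the ~50,000-person city is scaled down to
# 5,000 agents; contact counts and the spreading rule are invented.
N, CONTACTS = 5000, 8

# Sparse random acquaintance graph.
neighbors = [set() for _ in range(N)]
for a in range(N):
    for b in random.sample(range(N), CONTACTS):
        if a != b:
            neighbors[a].add(b)
            neighbors[b].add(a)

# Rumor spread: each "day", every informed agent tells acquaintances.
informed = {0}
days = 0
while len(informed) < 0.99 * N and days < 50:
    informed |= {b for a in informed for b in neighbors[a]}
    days += 1

print(days, len(informed))  # the rumor saturates in only a few days
```

Even with only a handful of acquaintances per agent, word-of-mouth reaches essentially everyone in a number of days logarithmic in the population, which is one candidate mechanism for rapid diffusion in a city without telecommunications.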

These questions are interesting in terms of real-world application. Just as few people foresaw non-profit organizations as viable alternatives to shareholder-driven business, so too may Burning Man hold lessons and offer challenges to today's assumptions about urban culture.

To develop a simulation that addresses these many questions, more than one modeling/analytical technique is needed. I will briefly demonstrate how three modeling tools, drawn from a wide toolbox of options, are particularly helpful. I will not go into great detail about these tools (Richard Dawkins's theory of memes, Michael Thompson's Cultural Theory, and of course basic principles of multi-agent modeling), but I will give brief overviews and demonstrate their use through the example of Burning Man.

Besides exploring Burning Man as a challenging simulation problem, I wish to convey two other lessons. One is the general advantage of utilizing multiple analytical techniques. Using one technique gives us one view; using several gives us more views, and the result of multiple views is more than the sum of the individual takes on a problem. The other lesson is that this approach works very well in our Human Complex Systems Minor (http://www.registrar.ucla.edu/Catalog/catalog07-08-336.html, course M100). By learning about and using multiple analytical techniques, students gain both appreciation and skill in examining complex phenomena.

This presentation does not include a computer simulation because I am still in the research and exploration phase of the project. The presentation acts as a basis for future fieldwork and a computer model at a later date.

Dynamical game theory offers an approach to modeling complex networks. This research focuses on the dynamics of the formation process of a mutual-consent network in a game-theoretic setting: the Co-Author Model, in which there is limited time for multiple collaborations. An assumption of limited observation (restricted to those with whom one is connected) is applied and analytical results are derived. Two parameters are then varied: the number of individuals in the network and the probability of links in the network's initial state. The simulation results are consistent with the analytical results for one equilibrium state, while also revealing a second possible equilibrium. Further, the simulation results exhibit more predictable behavior of the network as its size increases, contradicting the notion that complexity increases with the number of elements. Existing empirical findings are presented and integrated to explain, from the model, real-world formation and sustainability processes of entrepreneurial organizations in terms of networks. The model reproduces behaviors of members of entrepreneurial organizations that run contrary to their self-benefiting motives, including the existence of cohesive "old boys" clubs, the formation of new "young Turks" organizations as a countermovement, and information diffusion despite the competitive nature of the network.
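For readers unfamiliar with the underlying game, the Co-Author Model here follows Jackson and Wolinsky's formulation, in which a collaboration with j is worth 1/n_i + 1/n_j + 1/(n_i * n_j) to player i, where n_k is the number of k's collaborations. The sketch below is my own illustration, not the paper's simulation: the population size, meeting schedule, and myopic dynamics are arbitrary choices. Links form only by mutual consent and may be cut unilaterally, and each agent evaluates a link using only locally visible degrees.

```python
import random

random.seed(2)

N = 12                               # researchers; size chosen for speed
adj = {i: set() for i in range(N)}   # mutual-consent collaboration links

def utility(i, net):
    """Jackson-Wolinsky co-author payoff: a link with j is worth
    1/n_i + 1/n_j + 1/(n_i * n_j), where n_k counts k's collaborations
    (each researcher's time is split among collaborations)."""
    ni = len(net[i])
    return sum(1 / ni + 1 / len(net[j]) + 1 / (ni * len(net[j]))
               for j in net[i])

def toggle(i, j, add):
    if add:
        adj[i].add(j); adj[j].add(i)
    else:
        adj[i].discard(j); adj[j].discard(i)

def gain(i, j, add):
    """Change in i's utility from adding (add=True) or cutting (i, j)."""
    before = utility(i, adj)
    toggle(i, j, add)
    after = utility(i, adj)
    toggle(i, j, not add)            # undo the trial move
    return after - before

# Myopic pairwise dynamics: only the two agents who meet evaluate the link.
for _ in range(3000):
    i, j = random.sample(range(N), 2)
    if j in adj[i]:
        # either side may cut unilaterally
        if gain(i, j, add=False) > 0 or gain(j, i, add=False) > 0:
            toggle(i, j, add=False)
    else:
        # forming a link requires mutual consent
        if gain(i, j, add=True) >= 0 and gain(j, i, add=True) >= 0:
            toggle(i, j, add=True)

print(sorted(len(adj[i]) for i in range(N)))   # final degree profile
```

Under these payoffs agents accept collaborations that dilute their partners' time, so the dynamics tend to over-connect relative to the efficient network, which is the kind of tension between individual motives and network outcomes the abstract describes.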

SHORT ABSTRACT: Transdisciplinary complexity-science approaches provide new methods and opportunities for comprehensively understanding the coupled dynamics of human and ecological systems. As we face environment-related threats to human welfare, health, and security in the coming century, a holistic approach to sustainable management across a range of spatial and temporal scales is important both for scientific progress and for political decision-making and development processes.

LONG ABSTRACT: The relationships between human and natural ecological systems are extremely complex. Understanding such complicated interactions and predicting or altering future dynamics from the standpoint of a single scientific discipline or from a reductionist position is almost certain to miss crucial mechanisms and dynamics. A holistic, complex systems approach is needed.

Systems approaches in ecological studies have made considerable progress in understanding the fundamental ecological interactions in relatively "pristine" ecosystems, especially through the establishment of systems ecology and ecosystems theory. Human systems – also regarded as complex – have also been analyzed from a systems perspective. There are many models on basic economic and social processes, as well as practical applications in urban and regional planning, transportation system design, etc.

Recent environmental, economic, and political demands require better understanding of the linkage between ecological and human social systems, especially in the context of developing management strategies for a sustainable world. We are facing several environment-related threats to human welfare, health, and security in the coming century. The questions raised by a holistic approach to such dynamic human-environmental interactions are extremely important, both for scientific progress and for political decision-making and development processes in the face of inevitable environmental change.

New methods and approaches in the science of complexity have provided us with a new opportunity to comprehensively understand the coupled dynamics of human and ecological systems, and the prospects of sustainable management across a range of spatial and temporal scales. New complex systems approaches also shed new light on co-evolution trajectories and the resilience of coupled social and ecological systems. In this talk, I will outline the latest developments and research results in this emerging and exciting research area, along with some of my own work.

Abstract We present four illustrative projects in which we have been invited to intervene on a large ecological scale in the aftermath of massive resource extractions from diverse environments whose sustainability is profoundly compromised. Each work offers a response and a new proposal for one of four large systems at risk: the North American Rain Forest, the Endangered Meadows of Europe, the Green Heart of Holland, and the water sources of Peninsula Europe in relation to global warming and glacial meltdown. Includes discourse, presentation of readings & imagery, and discussion.

Saturday was chosen to allow more undergrads to attend (the afternoon segment may be extended if more speakers are added to the schedule). Our building is the Anteater Instructional & Research Bldg (formerly the 18C parking lot) on East Peltason Drive, between Gabrielino Drive and Los Trancos Drive. A new 4-story parking structure stands right next to us, and adjacent to our building is a sign marked Henry Samueli School of Engineering. Once inside, take the elevator to the 3rd floor and turn left as you step out. The entrance to the video teleconference center is door #3030. Lunch is at 12:20 in room 3010, Anteater I&R.

During and after our catered lunch (Asian Delight sweet and sour chicken) we will have three discussions:

Discussion of Materiality and Cognition

Larri Li's suggestion that we coordinate, across the four campuses, a proposal to the "Ten plus Ten" alliance with China for enhanced social science, environment, and sustainable development work, being coordinated through UCOP and UCR.

Short Abstract. Simulations in evolutionary computation, artificial life, and artificial culture create emergent global patterns of behavior not evident from their underlying local rules. However, emergence rarely climbs higher than one level. Higher levels are reached in natural systems by the capture of emergences from their native media into new media with different material and physical properties. This is intermediation, a process lacking in most simulations.

Long Abstract. The quest at the center of evolutionary computation has been to understand the emergence of emergence in nature and how to implement it in simulations. It is not difficult to simulate the emergence of global patterns of behavior from local primitive rules of causation, and thus understand and explain the emergence of one level of complexity, the first order of emergence. But how can we build simulations which then capture the emergent patterns of behavior at one level and use them as the local primitives to build yet a higher level of global patterning, a second order of emergence? This is the problem which has been the focus of recent workshops on "Dynamical Hierarchies" in Australia, "Dynamic Ontology" in Italy, and "Computational Synthesis" in California. It is, as stated in the invitation to the meeting on "Levels of Reality," "possibly the single most relevant and unresolved problem in science and philosophy."

Significant progress is likely to result from the realization that the "capture" of emergences often takes place by the transference of information from one medium to another, a process I refer to as intermediation. In the process of representing reality in computer code, materiality is often lost. Thus intermediation is often overlooked in computer simulations of social and cultural multiagent (multicausal) models.

Information is always instantiated on a marker, and the materiality of that marker changes the character of the information it conveys. Each marker, information carrier, or medium, consequently has a life of its own: its own audience and capacity, its own costs of inscription, maintenance and presentation. Each has its own vagueness and ambiguity, its own accuracy, precision and repeatability, and ultimately its own robustness, durability, failure and decay.
The world is re-presented in the medium of mind as concepts and ideas, re-presented as speech or written text, and embodied in a myriad of material contrivances, signs, symbols and suggestions of behavior, in a world of technological artifacts and ways of doing things - a material cultural environment that is, in many ways, more immersive, durable and compelling than the ideas themselves.

It is these artifacts of technology which constitute distributed material cultural cognition, the processes which mediate our daily lives. Intermediation, in simulation and in nature, is often conceptualized as hierarchical, with level building upon level of complexity. However, emergence takes place not only upward, but laterally and downward, with significant feedback taking place through heterarchical circuits. We must also take into account at least two temporally different meanings of levels of emergence, which might best be described as evolutionary and instantaneous. Not only do we see complexity arise through evolutionary time, we see it maintained at every instant. If the causal regularities that govern the world of physics should change instantaneously, all higher-level processes would stop as quickly. With regard to computer models of social and other processes, we must begin to represent the materiality of the real world in the medium of simulation. In other words, we must imbue our simulations with simulations of the media of materiality within the media of computation.

As an aside, this question is insightfully pursued by Greg Egan in his science-fiction novel Permutation City, with the entailments discussed in his description of the relative robustness of the "Lambertians" versus the "Copies."

Abstract. Conceptual blends are ways in which networks of relations, used as metaphors, generate and map our thought. In this talk I will explore some ways that systems of human cultural practice bring simple perceptual abilities into coordination with the social and material world to produce complex cognitive accomplishments.

(A new paper by Murray Leaf on empirical formalism is nicely complementary to these two talks; e.g., p. 9: abandoning "the traditional view that minds are the essentially passive contemplators of independently existing objects," and arguing "that objects are constructs in which the activity of minds plays an essential part" (Jones 1975: xx).)

IMBS UCI Only January 18 4-5

Abstract This paper examines the problem of designing mechanisms with learning properties that help guide agents to play desired equilibrium strategies. I introduce the concept of supermodular implementation, where mechanisms are constructed to induce supermodular games, i.e., games with strategic complementarities. These supermodular mechanisms inherit the valuable characteristics of supermodular games, such as their learning properties. A social choice function (scf) is supermodular implementable if it is implementable with a supermodular mechanism. In quasilinear environments, I prove that if a scf can be implemented by a mechanism that generates bounded strategic substitutes - as opposed to strategic complementarities - then this mechanism can be converted into a supermodular mechanism that implements the scf. If the scf also satisfies some efficiency criterion, then I show that it is supermodular implementable with budget-balancing transfers. I then address the multiple equilibrium problem, providing general sufficient conditions for a scf to be implementable with a supermodular mechanism whose equilibria are contained in the smallest interval among all supermodular mechanisms, as well as conditions for supermodular implementability in unique equilibrium. Finally, the paper deals with general preferences by providing a Supermodular Revelation Principle.
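For readers outside mechanism design, "strategic complementarities" has a standard formal meaning (this is textbook background, not material from the paper): a game with lattice strategy sets is supermodular when each player's payoff has increasing differences in own and opponents' strategies,

```latex
% For all players $i$, all $s_i' \geq s_i$, and all $s_{-i}' \geq s_{-i}$:
u_i(s_i', s_{-i}') - u_i(s_i, s_{-i}')
  \;\geq\;
u_i(s_i', s_{-i}) - u_i(s_i, s_{-i}).
```

That is, raising opponents' strategies raises the marginal return to one's own strategy, so best replies are monotone. By Milgrom and Roberts' results, adaptive learning dynamics in such games converge to the interval spanned by the smallest and largest Nash equilibria, which is the learning property the abstract exploits.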

Marschak at UCLA Only January 18 1-3 READ

DWIGHT READ, Professor, Department of Anthropology and Department of Statistics, Chair of the UCLA Interdisciplinary Program in Human Complex Systems (Field/Subfield: Anthropology/Cultural Anthropology, Biological Anthropology, Archaeology) will be presenting a Marschak Colloquium on the topic:

THE CLASSIFICATION OF THINGS: BEYOND QUANTITATIVE METHODS IN THE NATURAL AND SOCIAL SCIENCES

in the UCLA Anderson Gold Hall, School of Management, Entrepreneurs Hall, 3rd. Floor, Room C-301 on Friday January 18 from 1 to 3 p.m. This presentation is cosponsored by the UCLA Human Complex Systems Program and the UCLA Department of Anthropology. Professor Read's abstract and biography are below, followed by a summary of the talks scheduled so far for the Marschak Colloquium for the rest of 2007-2008. Please note that we have recently added the June 6 talk by Mat McCubbins of UCSD. All are welcome to attend.

Abstract Whether done formally or informally, classification is ubiquitous as a way to construct knowledge systems, with the goal of making our interaction with artifacts (or things), whether material or conceptual, more predictable. We can distinguish two broad approaches to classification. One approach stems from a dogmatic philosophical tradition in which order is formally a priori and imposed on the phenomena of interest; the other from a skeptical philosophical tradition in which order is uncovered through empirical observation and the identification of structuring processes. Statistical methods, with their goal of expressing patterning “in the aggregate,” easily take on the character of imposed order through the device of a “universal experiment” that conceptually enables the assumptions of statistical methods to be satisfied for any well-defined population. Order is thereby imposed, rather than uncovered. Classifications based on the methods of numerical taxonomy illustrate the very real risk, though, of ending up with discordance between imposed order and the order generated by structuring processes for the phenomena of interest. A skeptical approach requires, instead, that we first make evident what constitutes those structuring processes and the kind of order they produce. Analytical methods need to be formulated in accordance with those structuring processes, with the goal of making evident the patterning in the phenomena of interest. Examples from archaeology will be used to illustrate the two approaches to classification and the contradictory results they yield, and to demonstrate the insights obtained through a skeptical approach based on making evident the patterning in the data of interest. The conflict between the two approaches becomes particularly acute in the classification of human-produced artifacts (both material and conceptual), where the ordering processes are cultural and not known a priori, thus methodologically producing what I have called the “double bind” of needing the classification first to determine the ordering processes, yet needing the ordering processes first to do the classification. The “double bind” problem does not have a statistical solution, due to the homogeneity assumptions underlying statistical methods; instead, satisfactory classification methods depend on a recursive approach to resolve the “double bind” problem.

Abstract A simulated analytical game-theoretic model provides a theoretical framework and integrates existing empirical findings to explain the emergence and sustainability processes of entrepreneurial organizations in networks. It reproduces behaviors of entrepreneurial organizations, such as cohesive conglomerations, the emergence of resultant rival organizations, and information diffusion, that run contrary to the competitive nature of the network given their members' self-benefiting motives. Furthermore, the results generate a phase evolution of the network, not unlike that concluded from empirical findings, and exhibit more predictable behavior of the network as its size increases, contradicting the notion that complexity increases with the number of elements.

Model dynamics focus on the formation of mutual-consent networks in the game-theoretic Co-Author Model, in which there is limited time for multiple collaborations. An assumption of limited observation (restricted to those with whom one is connected) is applied, and analytical results are derived from variation in two parameters: the number of individuals in the network and the probability of links in the network's initial state.

News: Crutchfield, one of the most productive six-year resident faculty of SFI, moved a few years ago to UC Davis, where he founded the Complexity Sciences Center (CSC). He and some of his center's postdocs and faculty will be connecting through the UC Davis videoconference center for talks later this year and next. He is also exploring the possibilities for an intercampus graduate program in complexity sciences, and we have both had discussions with the UC Office of the President, where there is a plan for just such programs in the future.

Other News: Co-Director Rick Riolo, at Cosma's former CSCS home at Michigan, has contacted us and our UCSD experts to explore what video and teleconference equipment they should buy in order to connect with our program and with SFI (Santa Fe Institute).

Abstract We present a stochastic model for networks with arbitrary degree distributions and a specified average clustering coefficient. Many descriptions of networks rest solely on their computed degree distribution and clustering coefficient, so we propose a statistical model based on these two characterizations; it generalizes models based on the degree distribution alone. We present alternative parameterizations of the model, each interpretable and tunable, along with a simple Markov chain Monte Carlo (MCMC) algorithm to generate networks with the specified characteristics. We also provide an MCMC-based algorithm to infer the network properties from network data, and develop statistical inference for the model. The model generalizes to include mixing based on attributes and other complex social structure.
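The generation step can be illustrated with a generic Metropolis sampler over edge toggles. This is a simplified stand-in for the paper's actual MCMC algorithm: the energy function, targets, and temperature below are my own illustrative choices. The sampler proposes flipping one dyad at a time and favors moves that bring the graph's mean degree and global clustering closer to specified targets.

```python
import math
import random

random.seed(3)

N = 20                                 # nodes (illustrative)
TARGET_DEG, TARGET_CC = 4.0, 0.3       # target mean degree / clustering
BETA = 50.0                            # inverse temperature

adj = {i: set() for i in range(N)}     # start from the empty graph

def stats():
    """Mean degree and global clustering (3 * triangles / triples)."""
    degs = [len(adj[i]) for i in range(N)]
    triples = sum(d * (d - 1) // 2 for d in degs)
    triangles = sum(1 for i in range(N)
                      for j in adj[i] if j > i
                      for k in adj[j] if k > j and k in adj[i])
    return sum(degs) / N, (3 * triangles / triples if triples else 0.0)

def energy():
    d, c = stats()
    return (d - TARGET_DEG) ** 2 + (c - TARGET_CC) ** 2

def toggle(i, j):
    if j in adj[i]:
        adj[i].discard(j); adj[j].discard(i)
    else:
        adj[i].add(j); adj[j].add(i)

# Metropolis sampling over single-edge toggles.
e = energy()
for _ in range(3000):
    i, j = random.sample(range(N), 2)
    toggle(i, j)
    e_new = energy()
    if e_new <= e or random.random() < math.exp(BETA * (e - e_new)):
        e = e_new          # accept the move
    else:
        toggle(i, j)       # reject: undo the move

print(stats())             # should sit near the chosen targets
```

The same move-and-accept skeleton carries over to richer specifications; one would swap in the model's actual sufficient statistics and parameterization in place of this toy quadratic energy.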

Bio:
Michael Kearns has been a professor in the Department of Computer and Information Science at the University of Pennsylvania since 2002, where he holds the National Center Chair in Resource Management and Technology. He also has a secondary appointment in the Operations and Information Management (OPIM) department of the Wharton School, and until July 2006 was the co-director of Penn's interdisciplinary Institute for Research in Cognitive Science. His primary research interests are in machine learning, probabilistic artificial intelligence, computational game theory and economics, and computational finance. Kearns often blends problems from these areas with methods from theoretical computer science and related disciplines. While the majority of his work is mathematical in nature, Kearns has also participated in a variety of systems and experimental work, including spoken dialogue systems, software agents, and human-subject experiments in strategic interaction.

Abstract:
We have been conducting behavioral experiments in which human subjects attempt to solve challenging graph-theoretic optimization problems through only local interactions and incentives. The primary goal is to shed light on the relationships between network structure and the behavioral and computational difficulty of different problem types. To date, we have conducted experiments in which subjects are incentivized to solve problems of graph coloring, consensus, independent set, and an exchange-economy game. I will report on thought-provoking findings at both the collective and individual behavioral levels, and contrast them with theories from theoretical computer science, sociology, and economics. This talk discusses joint work with Stephen Judd, Sid Suri, and Nick Montfort.
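As a point of contrast with the human subjects, the distributed coloring task itself has a simple local algorithm when enough colors are available. This sketch is my illustration, not the experimental protocol: on any graph, a conflicted agent that re-picks a color unused by its neighbors always has one available when there are max-degree + 1 colors, so purely local repair must terminate.

```python
import random

random.seed(4)

# Illustrative instance, not the experimenters' setup: a small random
# graph; each agent sees only its neighbors' current colors.
N = 30
edges = {frozenset(pair) for pair in
         (random.sample(range(N), 2) for _ in range(60))}
adj = {i: set() for i in range(N)}
for e in edges:
    i, j = tuple(e)
    adj[i].add(j); adj[j].add(i)

# With max-degree + 1 colors, a conflicted agent can always switch to a
# color no neighbor is using.
colors = list(range(max(len(adj[i]) for i in range(N)) + 1))
color = {i: random.choice(colors) for i in range(N)}

def conflicted():
    return [i for i in range(N) if any(color[j] == color[i] for j in adj[i])]

steps = 0
while (bad := conflicted()) and steps < 10_000:
    agent = random.choice(bad)
    free = [c for c in colors
            if all(color[j] != c for j in adj[agent])]
    color[agent] = random.choice(free)   # local, conflict-free repair
    steps += 1

print(steps, len(conflicted()))
```

Each repair removes every conflict at the acting agent and creates none, so the number of conflicting edges strictly decreases and the process halts quickly; the experimental difficulty arises when fewer colors are available and such guaranteed local moves no longer exist.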

Abstract for Statistical Methods for Complex Systems. This talk surveys the tools needed to study complex systems, covering statistical learning and data mining, time series analysis, cellular automata, agent-based models, evaluation techniques and simulation, and information theory and complexity measures.
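As one concrete example from the information-theory toolbox mentioned above, a block-entropy estimate of a sequence's entropy rate distinguishes structured from random data. The estimator and the test series here are standard illustrations, not material from the talk.

```python
import math
import random
from collections import Counter

random.seed(5)

def block_entropy(seq, k):
    """Shannon entropy (bits) of the empirical distribution of
    length-k blocks in the sequence."""
    blocks = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k + 1))
    n = sum(blocks.values())
    return -sum((c / n) * math.log2(c / n) for c in blocks.values())

def entropy_rate(seq, k=4):
    """Estimated entropy rate: h ~ H(k) - H(k-1), the new information
    per symbol once k-1 symbols of context are known."""
    return block_entropy(seq, k) - block_entropy(seq, k - 1)

periodic = [0, 1] * 2000                             # perfectly predictable
noisy = [random.randint(0, 1) for _ in range(4000)]  # fair coin flips

print(round(entropy_rate(periodic), 2),  # near 0 bits/symbol
      round(entropy_rate(noisy), 2))     # near 1 bit/symbol
```

A periodic series carries no new information per symbol once its pattern is known, while independent coin flips carry a full bit each; intermediate values on empirical series are one simple way to quantify where a system sits between order and randomness.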