Blog by James Millington, PhD

Tag: Social

When you haven’t done something for a while it’s often best not to rush straight back in at the intensity you were at before. So here’s a nice easy blog to get me going again (not that I was blogging intensely before!).

I didn’t blog about it at the time (unsurprisingly), but back in late June 2013 I went to visit a colleague of mine in Madrid, Dr Francisco Seijo. Francisco and I met back at something I did blog about, the 2009 US-IALE conference in Snowbird. Since then we’ve been discussing how we can use the idea of coupled-human and natural systems to investigate Mediterranean landscapes.

Example of Traditional Fire Knowledge. The ‘pile-burning’ technique involves raking, piling and igniting leaves. This contrasts with ‘a manta’ broadcast burning in which leaves and ground litter are burned across larger areas. Photos by the authors of the paper.

After a brief field visit by me, an interview campaign by Francisco, collection of secondary data from other sources (aerial photography and official fire statistics) and some desk analysis, we recently published our first paper on the work. Entitled Forgetting fire: Traditional fire knowledge in two chestnut forest ecosystems of the Iberian Peninsula and its implications for European fire management policy, and published in the journal Land Use Policy, the article presents the results of our mixed-methods and interdisciplinary approach. Building on Francisco’s previous examination of ‘pre-industrial anthropogenic fire regimes’, we set out to investigate differences between the fire regimes and management approaches of chestnut forest ecosystems in two municipalities in central Spain. In the paper we also consider ideas of Traditional Ecological Knowledge (TEK) and the related idea of Traditional Fire Knowledge (TFK), discussing them in light of contemporary fire management approaches in Europe.

The full abstract is below with links to the paper. I’ll stop here now as this rate of blogging is making me quite dizzy (but hopefully I’ll be back for more soon).

Human beings have used fire as an ecosystem management tool for thousands of years. In the context of the scientific and policy debate surrounding potential climate change adaptation and mitigation strategies, the importance of the impact of relatively recent state fire exclusion policies on fire regimes has been debated. To provide empirical evidence to this ongoing debate we examine the impacts of state fire exclusion policies in the chestnut forest ecosystems of two geographically neighbouring municipalities in central Spain, Casillas and Rozas de Puerto Real. Extending the concept of ‘Traditional Ecological Knowledge’ to include the use of fire as a management tool as ‘Traditional Fire Knowledge’ (TFK), we take a mixed-methods and interdisciplinary approach to argue that currently observed differences between the municipalities are useful for considering the characteristics of “pre-industrial anthropogenic fire regimes” and their impact on chestnut forest ecosystems. We do this by examining how responses from interviews and questionnaire surveys of local inhabitants about TFK in the past and present correspond to the current biophysical landscape state and recent fire activity (based on data from dendrochronological analysis, aerial photography and official fire statistics). We then discuss the broader implications of TFK decline for future fire management policies across Europe particularly in light of the published results of the EU sponsored FIRE PARADOX research project. In locations where TFK-based “pre-industrial anthropogenic fire regimes” still exist, ecosystem management strategies for adaptation and mitigation to climate change could be conceivably implemented at a minimal economic and political cost to the state by local communities that have both the TFK and the adequate social, economic and cultural incentives to use it.

One of the interesting things we show with the model, which was not readily apparent at the outset of our investigation, is that parent agents with above average but not very high spatial mobility fail to get their child into their preferred school more frequently than other parents – including those with lower mobility. This is partly due to the differing aspirations of parents to move house to ensure they live in appropriate neighbourhoods, given the use of distance (from home to school) to ration places at popular schools. In future, when better informed by individual-level data and used in combination with scenarios of different education policies, our modelling approach will allow us to more rigorously investigate the consequences of education policy for inequalities in access to education.

I’ve pasted the abstract below and because JASSS is freely available online you’ll be able to read the entire paper in a few months when it’s officially published. Any questions before then, just zap me an email.

Millington, J.D.A., Butler, T. and Hamnett, C. (forthcoming) Aspiration, Attainment and Success: An agent-based model of distance-based school allocation. Journal of Artificial Societies and Social Simulation.

Abstract

In recent years, UK governments have implemented policies that emphasise the ability of parents to choose which school they wish their child to attend. Inherently spatial school-place allocation rules in many areas have produced a geography of inequality between parents that succeed and fail to get their child into preferred schools based upon where they live. We present an agent-based simulation model developed to investigate the implications of distance-based school-place allocation policies. We show how a simple, abstract model can generate patterns of school popularity, performance and spatial distribution of pupils which are similar to those observed in local education authorities in London, UK. The model represents ‘school’ and ‘parent’ agents. Parental ‘aspiration’ to send their child to the best performing school (as opposed to other criteria) is a primary parent agent attribute in the model. This aspiration attribute is used as a means to constrain the location and movement of parent agents within the modelled environment. Results indicate that these location and movement constraints are needed to generate empirical patterns, and that patterns are generated most closely and consistently when school agents differ in their ability to increase pupil attainment. Analysis of model output for simulations using these mechanisms shows how parent agents with above-average – but not very high – aspiration fail to get their child a place at their preferred school more frequently than other parent agents. We highlight the kinds of alternative school-place allocation rules and education system policies the model can be used to investigate.

This week I visited one of my former PhD advisors, Prof John Wainwright, at Durham University. We’ve been working on a manuscript together for a while now and as it’s stalled recently we thought it time we met up to re-inject some energy into it. The manuscript is a discussion piece about how agent-based modelling (ABM) can contribute to understanding and explanation in geography. We started talking about the idea in Pittsburgh in 2011 at a conference on the Epistemology of Modeling and Simulation. I searched through this blog to see where I’d mentioned the conference and manuscript before, but to my surprise, before this post I hadn’t.

In our discussion of what we can learn through using ABM, John highlighted the work of Kurt Gödel and his incompleteness theorems. Not knowing all that much about that stuff, I’ve been ploughing my way through Douglas Hofstadter’s tome ‘Gödel, Escher, Bach: An Eternal Golden Braid’ – heavy going in places but very interesting. In particular, his discussion of the concept of recursion has caught my attention, as it’s something I’ve been identifying elsewhere.

The general concept of recursion involves nesting: like Russian dolls, stories within stories (as in Don Quixote) and images within images.

Computer programmers take advantage of recursion in their code, calling a given procedure from within that same procedure (hence their love of recursive acronyms like PHP [PHP: Hypertext Preprocessor]). An example of how this works is Saura and Martínez-Millán’s modified random clusters method for generating land cover patterns with given properties. I used this method in the simulation model I developed during my PhD and have re-coded the original algorithm for use in NetLogo [available online here]. In the code (below) the grow-cover_cluster procedure is called from within itself, allowing clusters of pixels to ‘grow themselves’.
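To illustrate the recursive pattern in a more widely-read language, here is a minimal Python sketch of cluster growing (not the NetLogo original or the full modified random clusters algorithm; the grid size, spread probability and function name `grow_cluster` are all my illustrative choices):

```python
import random

SIZE = 20        # grid dimension (illustrative)
P_SPREAD = 0.5   # chance the cluster spreads to an empty neighbour

def grow_cluster(grid, x, y, cover):
    """Recursively grow a cluster of 'cover' cells: the procedure
    calls itself for each neighbour it spreads to."""
    grid[x][y] = cover
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < SIZE and 0 <= ny < SIZE and grid[nx][ny] is None:
            if random.random() < P_SPREAD:
                grow_cluster(grid, nx, ny, cover)  # the recursive call

grid = [[None] * SIZE for _ in range(SIZE)]
grow_cluster(grid, SIZE // 2, SIZE // 2, cover=1)
n_cells = sum(row.count(1) for row in grid)  # how far the cluster grew
```

Each call may spawn further calls before returning, so the cluster ‘grows itself’ outwards from the seed cell – the same self-calling structure as the NetLogo procedure.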

However, rather than get into the details of the use of recursion in programming, I want to highlight two other ways in which recursion is important in social activity and its simulation.

The first is in how society (and social phenomena) has a recursive relationship with the people (and their activities) composing it. For example, Anthony Giddens’s theory of structuration argues that the social structures (i.e., rules and resources) that constrain or prompt individuals’ actions are also ultimately the result of those actions. Hence, there is a duality of structure which is:

“the essential recursiveness of social life, as constituted in social practices: structure is both medium and outcome of reproduction of practices. Structure enters simultaneously into the constitution of the agent and social practices, and ‘exists’ in the generating moments of this constitution”. (p.5 Giddens 1979)

Another example comes from Andrew Sayer in his latest book ‘Why Things Matter to People’ which I’m also progressing through currently. One of Sayer’s arguments is that we humans are “evaluative beings: we don’t just think and interact but evaluate things”. For Sayer, these day-to-day evaluations have a recursive relationship with the broader values that individuals hold, values being ‘sedimented’ valuations, “based on repeated particular experiences and valuations of actions, but [which also tend], recursively, to shape subsequent particular valuations of people and their actions”. (p.26 Sayer 2011)

However, while recursion is often used in computer programming and has been suggested as playing a role in different social processes (like those above), its examination in social simulation and ABM has not been so prominent to date. This was a point made by Paul Thagard at the Pittsburgh epistemology conference. Here, it seems, is an opportunity for those seeking to use simulation methods to better understand social patterns and phenomena. For example, in an ABM how do the interactions between individual agents combine to produce structures which in turn influence future interactions between agents?

Second, it seems to me that there are potentially recursive processes surrounding any single simulation model. For if those we simulate should encounter the model in which they are represented (e.g., through participatory evaluation of the model), and if that encounter influences their future actions, do we not then need to account for such interactions between model and modelee (i.e., the person being modelled) in the model itself? This is a point I raised in the chapter I helped John Wainwright and Dr Mark Mulligan re-write for the second edition of their edited book “Environmental Modelling: Finding Simplicity in Complexity”:

“At the outset of this chapter we highlighted the inherent unpredictability of human behaviour and several of the examples we have presented may have done little to persuade you that current models of decision-making can make accurate forecasts about the future. A major reason for this unpredictability is because socio-economic systems are ‘open’ and have a propensity to structural changes in the very relationships that we hope to model. By open, we mean that the systems have flows of mass, energy, information and values into and out of them that may cause changes in political, economic, social and cultural meanings, processes and states. As a result, the behaviour and relationships of components are open to modification by events and phenomena from outside the system of study. This modification can even apply to us as modellers because of what economist George Soros has termed the ‘human uncertainty principle’ (Soros 2003). Soros draws parallels between his principle and the Heisenberg uncertainty principle in quantum mechanics. However, a more appropriate way to think about this problem might be by considering the distinction Ian Hacking makes between the classification of ‘indifferent’ and ‘interactive’ kinds (Hacking, 1999; also see Hoggart et al., 2002). Indifferent kinds – such as trees, rocks, or fish – are not aware that they are being classified by an observer. In contrast humans are ‘interactive kinds’ because they are aware and can respond to how they are being classified (including how modellers classify different kinds of agent behaviour in their models). Whereas indifferent kinds do not modify their behaviour because of their classification, an interactive kind might. This situation has the potential to invalidate a model of interactive kinds before it has even been used. For example, even if a modeller has correctly classified risk-takers vs. risk avoiders initially, a person in the system being modelled may modify their behaviour (e.g., their evaluation of certain risks) on seeing the results of that behaviour in the model. Although the initial structure of the model was appropriate, the model may potentially later lead to its own invalidity!” (p. 304, Millington et al. 2013)

The new edition was just published this week and will continue to be a great resource for teaching at upper levels (I used the first edition in the Systems Modeling and Simulation course I taught at MSU, for example).

More recently, I discussed these ideas about how models interact with their subjects with Peter McBurney, Professor in Informatics here at KCL. Peter has written a great article entitled ‘What are Models For?’, although it’s somewhat hidden away in the proceedings of a conference. In a similar manner to Epstein, Peter lists the various possible uses for simulation models (other than prediction, which is only one of many) and also discusses two uses in more detail – mensatic and epideictic. The former relates to how models can bring people around a metaphorical table for discussion (e.g., for identifying and potentially deciding about policy trade-offs). The other, epideictic, relates to how ideas and arguments are presented, and leads Peter to argue that representing real world systems in a simulation model can force people to “engage in structured and rigorous thinking about [their problem] domain”.

John and I will be touching on these ideas about the mensatic and epideictic functions of models in our manuscript. However, beyond this discussion, and of relevance here, Peter discusses meta-models. That is, models of models. The purpose here, and continuing from the passage from my book chapter above, is to produce a model (B) of another model (A) to better understand the relationships between Model A and the real intelligent entities inside the domain that Model A represents:

“As with any model, constructing the meta-model M will allow us to explore “What if?” questions, such as alternative policies regarding the release of information arising from model A to the intelligent entities inside domain X. Indeed, we could even explore the consequences of allowing the entities inside X to have access to our meta-model M.” (p.185, McBurney 2012)

Thus, the models are nested with a hope of better understanding the recursive relationship between models and their subjects. Constructing such meta-models will likely not be trivial, but we’re thinking about it. Hopefully the manuscript John and I are working on will help further these ideas, as does writing blog posts like this.

This week on the SIMSOC listserv was a request from Annie Waldherr & Nanda Wijermans for modellers of social systems to complete a short questionnaire on the sort of criticism they receive. The questionnaire is only two short questions, one asking what field you are in and the other asking you to ‘Describe the criticism you receive. For instance, recall the questions or objections you got during a talk you gave. Feel free to address several points.’

Here was my quick response to the second question:

1) Too many ‘parameters’ in agent-based models (ABM) make them difficult to analyse rigorously and fully appreciate the uncertainty of (although I think this kind of statement highlights the misunderstanding some have of how ABM can be structured – often models of this type are more reliant on rules of interactions between agents than on individual parameters).

2) The results of models are seen as being driven more by the assumptions of the modeller than by the state of the real world. That is, modellers may learn a lot about their models but not much about the real world (see the similar point made by Grimm [1999] in Ecological Modelling 115).

I think it would have been nice to have a third question offering an opportunity to suggest how we can, or should, respond to these criticisms. Here’s what I would have written if that third question was there:

To address 1) we need to make sure that:

i) we fully document our models;

ii) we show that the model parameter space has been widely explored (e.g., via use of techniques like Latin hypercube sampling).

To address 2) we need to make sure that:

iii) when documenting our models (see i) we fully justify the rationale of our models, hopefully with reference to real world data;

iv) we acknowledge and emphasise that, in their current state, ABMs can usually be no more than metaphors or sophisticated analogies for the real world, but that they are useful for providing alternative means to think about social phenomena (i.e., they have heuristic properties).
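On the point about exploring parameter space, the idea behind Latin hypercube sampling is simple enough to sketch in a few lines of Python (a minimal, hypothetical illustration; the parameter ranges are invented): split each parameter’s range into N equal strata, draw one value from every stratum, then shuffle each parameter’s draws independently so that samples pair strata at random across dimensions.

```python
import random

def latin_hypercube(n_samples, param_ranges, rng=random):
    """Return n_samples points, each parameter range covered with
    exactly one draw per equal-width stratum (basic LHS)."""
    samples = [[] for _ in range(n_samples)]
    for lo, hi in param_ranges:
        width = (hi - lo) / n_samples
        # one uniform draw inside each of the n strata for this parameter
        draws = [lo + (i + rng.random()) * width for i in range(n_samples)]
        rng.shuffle(draws)  # decouple strata across parameters
        for sample, value in zip(samples, draws):
            sample.append(value)
    return samples

# e.g. 10 samples over two hypothetical ABM parameters
points = latin_hypercube(10, [(0.0, 1.0), (5.0, 50.0)])
```

Unlike simple random sampling, every stratum of every parameter is guaranteed to be visited, which is why the technique gives more even coverage for the same number of model runs.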

If you’re working in this area, go and share your thoughts by completing the short questionnaire, or leave a comment below.

A couple of weeks ago I visited King’s Department of Education to give a seminar entitled Agent-based simulation for distance-based school allocation policy analysis. The aim was to introduce agent-based modelling to those unaware of it and hopefully open a debate on how it might be used in future education research. This all came about as I’ve been working on modelling the drivers and consequences of school choice with Profs Chris Hamnett and Tim Butler here in King’s Geography Department.

In their recent research, Chris and Tim looked at the role geography plays in educational inequalities in East London. Many UK local education authorities (LEAs) use spatial distance as a key criterion in their policy for allocating school places: people that live closer to a school get allocated to it before those that live farther away. This is necessary because it’s often the case that more people want to send their children to a school than there are places available at it. For example, you can read about the criteria the Hackney LEA uses in their brochure for 2012.

Using data from several LEAs, Chris and Tim showed empirically how this distance criterion is related to school popularity. School popularity is indicated for example by the ratio of school applicants to the number of places available at the school (A:P) – some schools have very high ratios (e.g. up to 8 applications per place) and others very low (e.g. down to around one application per place). Furthermore, this spatial allocation criterion is an important influence on parents’ strategies for school applications, dependent on the location of their home relative to schools and their ability to move home.

These allocation rules, combined with parents’ strategies, produce patterns and relationships between schools’ GCSE achievement levels, A:P ratio and the maximum distance that allocated pupils live from the school. In Barking, for example, we see in the figure below that more popular schools have higher percentages of pupils achieving five GCSEs at grades A*–C, and that these same popular schools also have the smallest maximum distances (i.e. pupils generally live very close to the school).

This spatial pattern can also be seen when we look at maps of the locations of successful and unsuccessful applicants to popular and less popular schools in Hackney. For example, looking at the figure below (found in Hamnett and Butler 2011) we can see how successful applicants to The Bridge Academy (a popular school) are more tightly clustered around it than those for Clapton Girls’ Technology College (not such a popular school).

The geography of this school allocation policy, combined with differences in parents’ circumstances, suggests this issue is a prime candidate for study using agent-based modelling. Agent-based simulation modelling might be useful here because it provides a means to represent interactions between individual actors with different attributes (in this case schools and parents) across space and time. Once the simulation model structure (e.g. rules of interactions between agents) has been established, it can then be used to examine the potential effects of things like opening or closing schools (i.e. changes in external conditions) or changes in school allocation policy rules or parents’ application strategies (i.e. internal model relationships and rules).
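A toy version of such a distance-based allocation rule might look like this in Python (purely illustrative, not the model described in this post; real oversubscription criteria involve more than distance, and all the names and numbers here are mine):

```python
import math

def allocate_by_distance(schools, parents):
    """Each school fills its places with its nearest applicants.
    schools: name -> (x, y, n_places); each parent applies to one school."""
    placed = {}
    for name, (sx, sy, n_places) in schools.items():
        applicants = [p for p in parents if p["choice"] == name]
        # rank applicants by straight-line home-to-school distance
        applicants.sort(key=lambda p: math.hypot(p["x"] - sx, p["y"] - sy))
        for p in applicants[:n_places]:
            placed[p["id"]] = name  # inside the distance cut-off
    return placed

schools = {"A": (0.0, 0.0, 2)}  # one popular school with 2 places
parents = [
    {"id": 1, "x": 0.1, "y": 0.0, "choice": "A"},
    {"id": 2, "x": 0.5, "y": 0.0, "choice": "A"},
    {"id": 3, "x": 2.0, "y": 0.0, "choice": "A"},  # lives too far away
]
placed = allocate_by_distance(schools, parents)
```

Even this toy version reproduces the basic mechanism discussed above: oversubscription plus a distance criterion means the furthest applicant misses out, which is what makes parents’ home locations (and ability to move) so consequential.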

I developed an initial ‘model’ as a proof of concept and which you can try out yourself. Things have progressed from that proof of concept model, and the model now represents changes in cohorts of school applicants and pupils through time, including the potential for parents to move house to be more likely to get their child into a desired school.

In the seminar with the Department of Education guys I presented some output from the recent modelling. I showed how the abstract model with relatively few and simple assumptions can start from random conditions and reproduce empirical spatial patterns in school applications and attainment outcomes like those described above (see the figure below).

I also presented early results from using the simulation model to explore implications of potential policy alternatives (such as closing failing schools). These ideas were generally welcomed in the seminar but there were some interesting questions about what the model assumptions might entail for maintaining existing policy assumptions and intentions (what we might term the rhetoric of modelling).

I’m exploring some of these questions now, including for example issues of how we define a ‘good’ school and how parents’ school application strategies might change as allocation rules change. These will feed into a research manuscript that I’ll continue to work on with Chris and Tim.

These last few days I’ve been up in Edinburgh visiting folks at the Forestry Commission’s Northern Research Station to discuss the socio-ecological modelling of potential woodland creation I’ve been working on recently. I also got to talk with Derek Robinson at the University of Edinburgh about some of these issues. Everyone seemed interested in what I’ve been doing, particularly with the ideas I’ve been bouncing around relating to the work Burton and Wilson have been doing on post-productivist farmer self-identities, how these self-identities might change, how they might influence adoption of woodland planting and how we might model that. For example, I think an agent-based simulation approach might be particularly useful for exploring what Burton and Wilson term the “‘temporal discordance’ in the transition towards a post-productivist agricultural regime”. And I also think there’s potential to tie it in with work like my former CSIS colleague Xiaodong Chen has been doing using agent-based approaches to model the effects of social norms on enrollment in payments for ecosystem services (such as woodland creation).

I was away on holiday for a couple of weeks after the RGS. On returning, I’ve been preparing for King’s Geography tutorials with the incoming first year undergraduates. The small groups we’ll be working in will allow us to discuss and explore critical thinking and techniques for issues and questions in physical geography. Looking forward to a busy autumn term!

The Leverhulme Trust makes awards in support of research and education with special emphasis on original and significant research that aims to remove barriers between traditional disciplines. Their Early Career Fellowships are awarded across all disciplines and in 2010 approximately 70 were expected to be awarded to individuals to hold at universities in the UK. Given the emphasis on original, significant and cross-disciplinary research made by the Trust I looked for something that matched my research skills in coupled human and natural systems modelling but that pushed work in that area in a new direction. I thought back to the ideas about model narratives I have previously explored with David O’Sullivan and George Perry (but have not worked on since then) and Bill Cronon’s plenary address at the Royal Geographical Society in 2006 on the need for ‘sustainable narratives’. With that in mind, and given the UK Forestry and Climate change report I had been reading, I decided to make a pitch for a project that would explore how narratives from the use of models could help individuals identify how local actions transcend scales to mitigate global climate change in the context of the anticipated woodland planting that will be ongoing in the UK in future years. It proved to be a successful pitch!

I’m sure I will blog plenty more about the project in the future, so for now I will just leave you with the proposal rationale (below). I’m looking forward to getting to work on this when I get back to London, but before that there’s plenty more things to get done on the Michigan forest landscape ecological-economic modelling.

Model narratives for climate change mitigation

The abstract, vast, and systemic narratives that dominate the issue of global climate change do little to illustrate to individuals and groups how their actions might contribute to mitigate the effects of what is often framed as a global problem (Cronon 2006). Ways to improve the ability of individuals and groups to identify how their local actions transcend scales to mitigate global climate change are needed. In this research I will explore how narratives produced from computer simulation models that represent individuals’ actions can provide people with insights into how their behaviour affects system properties at a larger scale. Although the narrative properties of simulation models have been highlighted (O’Sullivan 2004), the use of models to develop localised narratives of climate change which emphasise individual agency has yet to be explored. Confronting individuals with these narratives will also help researchers reveal important underlying, and possibly implicitly held, assumptions that influence choices and behaviour.

This research will address the following general questions:

How can computer simulation models be better used to reveal to individuals how their local actions can contribute to global environmental issues such as Climate Change Mitigation (CCM)?

What are the narrative properties of simulation models and how can they be exploited to help individuals find meaning about their actions as they relate to global climate change?

By using simulation tools to spur reflection, what can we learn about the factors influencing individuals’ choices and behaviour with regard to CCM options?

Answering these questions will require a uniquely interdisciplinary research approach that spans the physical sciences, social sciences and humanities. Such ground-breaking, boundary-crossing work is necessary if we are to re-connect the physical sciences with the publics they intend to benefit and find solutions to large-scale and pressing environmental problems. For example, one of the key findings from a recent report by the National Assessment of UK Forestry and Climate Change Steering Group (Read et al. 2009) was that “[t]he extent to which the potential for additional [greenhouse gas] emissions abatement through tree planting is realized … will be determined in large part by economic forces and society’s attitudes rather than by scientific and technical issues alone” (p.xvii). The report also argued the need “to better understand and consider the role of different influences affecting choices and behaviour. Without the appropriate emotional, cultural or psychological disposition, information will make no difference.” (p.210). Narratives based on scientific understanding which portray how individuals can make a difference to large-scale, diffuse environmental issues will be important for fostering such a disposition. Simulation models – quantitative representations of reality which provide a means to logically examine how high-level and large-scale patterns are generated by lower-level and smaller-scale processes and events – have the potential to contribute to the construction of these narratives.