A Measure of Science – Sciblogs (sciblogs.co.nz)

Scientists need to hold policy-makers to account
30 September 2014

Over on Public Address last week, New Zealand Association of Scientists President, Dr Nicola Gaston, wrote a very important post on Science and Democracy. When politicians ignore scientific advice, or special interests seek to undermine such advice, how should scientists react? Dr Gaston considers the guidance on offer for scientists in such circumstances in the […]

Dr Gaston’s post is very timely. In Nicky Hager’s Dirty Politics it was alleged that ex-MP Katherine Rich arranged for the posting of “hits” on a popular blog in order to undermine scientific advice about the health risks of consuming alcohol or fatty, sugary foods. While Rich, who is Chief Executive of the Food and Grocery Council, is no longer a politician, the blog also appears to have been used by current members of the government to leak information for political gain. By leaking to a blogger who operated outside the ethical constraints that bind mainstream journalists, the government lent the blog an authority that was then exploited to mount attacks on the credibility of New Zealand scientists.

In such an environment, it is increasingly difficult for scientists to remain neutral. Indeed, a group of prominent public health researchers recently called publicly for Rich to resign from the board of the Health Promotion Agency, a government-funded sponsor of public health programmes.

On Monday, Sir Peter Gluckman weighed in, drawing on his impressions of the recent Science Advice to Governments conference. In his blog post, and in the recent report on Science and Society, A Nation of Curious Minds, Sir Peter says that the Royal Society of New Zealand (RSNZ) should “develop a code of practice for scientists on public engagement” [1]. Sir Peter cites the Code of Conduct developed by the Science Council of Japan as the best example of such a code: it “outlines not only the responsible conduct of research but also the social responsibility of science organisations and scientists to engage with the public and policy makers based on their expert knowledge” [1].

Holding policymakers to account

For the most part the Japanese code covers very similar ground to the Royal Society’s code of ethics, but specifically includes a section entitled Science and Society that deals with engagement with the public and with policy-makers. This section was added to the code in 2013 after the Fukushima Dai-ichi nuclear disaster, which highlighted a “need for scientists to re-examine whether they had truly responded to the trust and mandate given to them by society” [2].

The revision of the code followed an independent enquiry into the disaster that found “fault in nuclear regulators for not paying sufficient attention to improvements in nuclear safety standards, as recommended by the International Atomic Energy Agency” and found that “the Japanese Nuclear Industrial and Safety Agency had been promoting nuclear energy without being open about the inherent risks” [3].

Most importantly, there is a clause in the Japanese code that goes well beyond anything that is in our own:

In the event that a policy decision is made that diverges from the advice of the scientific community, scientists shall request, as necessary, accountability to society from the policy planner and/or decision maker.

In Japan, it seems that the nuclear industry and the government were guilty of ignoring available scientific evidence about the safety of the Fukushima Dai-ichi power plant [4]:

“… research on the Jogan tsunami of 869 AD has shown that such heights should not be considered “unanticipated” along the part of the Japanese coast that includes the Fukushima nuclear complex. However these probabilities were ultimately dismissed through the internal discussion of the division on the grounds that they were ‘academic’.”

Under their new code, Japanese scientists would have had a mandate to publicly challenge their government to raise safety standards and to fully explain the risks as well as the benefits of nuclear power. The new code empowers Japanese scientists to hold policymakers publicly to account when they ignore or dismiss the scientific advice they receive. It is a call to arms for the Japanese science community.

The critic and conscience

There is no equivalent clause in our code, although it does state that members should have the aim of “fostering of informed critical responses to issues relating to science, technology, or the humanities”. Rather, the intent of the Japanese code seems much closer to that of the New Zealand Education Act, which states that universities must act as the “critic and conscience of society”. Indeed, academics in New Zealand often turn to this description of their role when confronting the difficult decision of when to speak out on controversial issues.

But while New Zealand academics are empowered in this way, scientists in our Crown Research Institutes are not. And as Dr Gaston points out in her blog, even academic scientists have come under attack when trying to give science advice on difficult matters.

Could New Zealanders be put at risk by officials who ignore scientific evidence? If you don’t think it could happen here, I suggest you read the independent report into last year’s Fonterra botulism scare: some of the findings of this report echo those of the Fukushima Dai-ichi enquiry. We got away with it in the end; sadly, the Japanese didn’t.

Most scientists accept that scientific knowledge is just one of the factors that politicians and policy-makers must take into account when making decisions. In return, scientists expect their advice to be weighed seriously by politicians, whether or not that advice is ultimately followed. Indeed, John Key was the first New Zealand Prime Minister to appoint a Chief Science Advisor, but as he says “we don’t always like his advice and we don’t always listen to him.”

However, there is a difference between weighing scientific advice alongside other concerns, and the undermining or outright rejection of that advice as being without merit. When politicians neglect to weigh the scientific evidence adequately, lives and livelihoods are put at risk. It is chilling to compare the dismissal of the risks of tsunami by Japan’s nuclear industry (‘academic’) to Key’s put-down on BBC Hardtalk of Dr Mike Joy, the Massey University scientist who has drawn attention to the deteriorating quality of our rivers and lakes:

He’s one academic, and like lawyers, I can provide you with another one that will give you a counterview.

Dr Gaston and Sir Peter have kicked off a very timely discussion of how best to ensure that the voice of the scientific community is not muzzled or ignored by politicians or policy-makers, as so tragically occurred in Japan. We have an opportunity to strengthen our Code of Ethics* in a way that would protect the ability of both academic and CRI scientists to speak out on difficult issues. The public’s confidence and trust in the scientific community rests on our ability and willingness to stand up when the public interest is threatened.

* Although a change to the Code of Ethics would not supersede the employment relationships between the CRIs and their scientists, the Crown Research Institutes Act requires that CRIs “comply with any applicable ethical standards”, which would presumably include relevant sections of the Royal Society’s code.

Science and its privilege in the policy arena
25 August 2014

Scientific evidence is held in high regard by New Zealand’s government and its public officials, and frequently plays a significant role in the policy arena. As the late Sir Paul Callaghan said, “Science is the compass on the voyage we must all make into the twenty-first century.” But as government moves to appoint science advisors across its Ministries, it is worth reflecting on why science should be valued so highly. Why should scientific evidence be privileged over other inputs into the policy-making process?

“Scientists should stick to the facts,” concluded the Vancouver Sun after an interview with the New Zealand Prime Minister’s Chief Science Advisor, Sir Peter Gluckman, on the role of scientists in policy. Sir Peter has argued that scientists must act as brokers of knowledge – not advocates – when providing advice to policy-makers. This world-view, one that is held by many scientists, rests on the assumption that science itself is value-free, providing a source of fact that is uncontaminated by society’s prejudice: the value of science stems from its very lack of values.

Science is committed to two ideals: (1) the world is intelligible; and (2) acquiring knowledge is hard. To understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests.

The scientific value-system prizes openness, evidence-based debate and acceptance of human fallibility. It is these values, in fact, that make science so useful in the policy process.

If one accepts that science comes with values, then one must also accept that these will not always align with the values of those that employ their services. In Canada, the ability of government scientists to comment openly on scientific issues has been tightly constrained. And closer to home, the commercial and political interests of the Crown Research Institutes have not always been reconciled easily with the principles of open debate that underpin the scientific method. In the worst case, the scientific values that underpin good scientific advice can be undermined or become distorted.

In these circumstances, the portrayal of scientific advice as impartial and free of interests can be problematic. Consider the clash of interests of a government scientist, whose job it is to test water quality, and that of a community that suspects its water supply may be unsafe. The scientist may place greater weight on a test that minimises false positives, especially if they are employed to undertake many such tests. The community would likely prefer that the scientist administer a test that minimises false negatives, to ensure their health is not inadvertently put at risk.
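The trade-off described above can be made concrete with a small calculation. The sensitivity, specificity, and contamination-rate figures below are entirely hypothetical, chosen only to illustrate how the choice of test shifts the burden of errors between the tester and the community:

```python
# Illustrative only: how tuning a test shifts errors between false alarms
# (which burden the tester) and missed contaminations (which endanger the
# community). All numbers are hypothetical, not from any real protocol.

def expected_errors(n_samples, contamination_rate, sensitivity, specificity):
    """Expected false negatives (missed contaminations) and false
    positives (false alarms) for a batch of water tests."""
    contaminated = n_samples * contamination_rate
    clean = n_samples - contaminated
    false_negatives = contaminated * (1 - sensitivity)  # contaminated but passed
    false_positives = clean * (1 - specificity)         # clean but flagged
    return false_negatives, false_positives

# A "strict" test tuned to minimise false alarms (high specificity)...
fn_strict, fp_strict = expected_errors(1000, 0.02, 0.80, 0.99)
# ...versus a "cautious" test tuned to minimise missed contamination.
fn_cautious, fp_cautious = expected_errors(1000, 0.02, 0.99, 0.90)

print(f"strict:   {fn_strict:.0f} missed contaminations, {fp_strict:.0f} false alarms")
print(f"cautious: {fn_cautious:.0f} missed contaminations, {fp_cautious:.0f} false alarms")
```

Neither tuning is "correct" in a value-free sense; which errors matter more is exactly the kind of question that must be settled with the community, not for it.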

The scientist cannot meet the community’s needs by acting merely as a knowledge broker. The scientist can succeed only through engagement with the community: by helping the community consider a range of evidence, by participating in open dialogue, and by developing an understanding of the interests of all. The philosopher Richard Rorty wrote that science is privileged because it encompasses “tolerance, respect for the opinions of others, a willingness to listen, reliance on persuasion rather than force”. The voices of scientists should be privileged because they bring both the knowledge and the values of science to the policy arena.

Should Hamilton fluoridate its water supply? The science seems clear: the fluoridation of water supplies is a safe and cost-effective way for communities to improve dental health. The puzzle for many scientists is why society debates these things at all. Part of the answer lies in confirmation bias: those who are concerned about the modern chemical industry are more likely to seek out studies that are critical of fluoridation, while ignoring those (the majority) that are not. The responsibility of the science advisor here extends well beyond the provision of evidence; the science advisor must also take responsibility for how the community uses the evidence.

Again, the job of the science advisor is not just to deliver the facts, but to engage democratically to assist the community to weigh the full breadth of evidence. They must be prepared to listen to and learn from the community. They must understand the values of the communities they serve. The successful science advisor must be an advocate for scientific values and not simply a broker of fact.

The authority of the scientific voice does not derive from its lack of values, but from the strength of the values on which it is based.

This article was published last month in Public Sector 37(2), 24 (2014).

Misogyny in science
9 July 2014

I had to wait until fourth form for my first lesson about feminism. I went to an all-boys school in provincial New Zealand, where classes on contemporary political issues were few and far between. So I sat up straight when my maths teacher told us that feminists believed that “All men are rapists.” For his fourteen-year-old audience, he felt the need to clarify: “What they mean is that all men are capable of rape.” But he had thought long and hard about this: “Remember boys, if you ever meet a feminist, just tell her that all women are prostitutes.”

What a dick.

Today I work in physics at the University of Auckland, where it is my turn to teach the next generation of young scientists, technologists and entrepreneurs. It is universally acknowledged that, as a country, we have a shortage of skilled people in these areas. When I look out at the faces that fill my lecture theatre, one source of this shortage is clear: far too few young women are prepared to consider careers in science and engineering (there is some data on this at Wiki New Zealand).

I am very proud to call some of these women my colleagues. Dr Michelle Dickinson, a nanotechnologist at the University of Auckland and the MacDiarmid Institute, is one of New Zealand’s leading science communicators. She tweets, blogs and appears regularly on TV to make science accessible to a broad range of people. So when I see things like this [an image embedded in the original post], it makes me angry and ashamed.

In case you think this is not misogyny, I, like Michelle, am an active science communicator and I have never been subject to this sort of condescending hostility. And in case you think this is a one-off occurrence, female academics who actively communicate seem to be on the receiving end of this sort of thing all the time. That the ability of a female scientist should be questioned by a male colleague in this way is appalling. That this man, like my fourth form maths teacher, seems to suffer no fear of the consequences of such unprofessional behaviour is chilling.

Evidence-based science policy
11 June 2014

In May last year, the New Zealand Herald ran an editorial in which it declared:

Science has been a black hole for taxpayers’ money. Governments of all stripes agree that science is something they should fund without knowing very much about it.

Ironically, the editorial went on to praise the virtues of the National Science Challenges (NSCs), which had been announced a few weeks earlier. Ironic, because the NSCs are shaping up to be one of the biggest black holes that science has sent the taxpayer’s way in a long time.

In this post I want to introduce the emerging science of “science policy”, which represents a new approach to evaluating the outcomes of science and innovation investment and offers hope for rescuing science funding from the Herald’s black hole.

The dark arts

Many governments invest considerable amounts in science and innovation, and it is generally agreed that this investment is of great benefit to society. But what types of investment work best? Would it be better to provide R&D tax credits for all firms, or should we fund specific projects in strategically important industries? Should we fund blue skies research at our universities or pour money into product development at Callaghan Innovation?

For the most part, we are not yet able to answer these questions with any rigor. Science and innovation policy as it is practised today is a dark art.

As with many empirical questions in social sciences, a good deal of the difficulty lies in distinguishing correlation from causation. When the government funds a science project, it will typically choose to fund the research groups with the best track records or the best ideas, preferably both – yet these are also the teams that are the most likely to succeed without government funding. Good research groups may press on regardless, or may find other means to support their work. Even if the projects that the government funds are successful, it can’t be sure whether the funding it provided was necessary for this success.

This prevents the government from putting a value on the research it funds, and makes it very hard to assess how good the decision-making processes it uses to allocate this funding might be.

Medicine was a similarly murky affair once upon a time, but the invention of the randomised double-blind controlled trial sixty years ago means that when your doctor prescribes a new drug, she can be confident of its effectiveness. In such a trial, some patients in a group are randomly selected to receive a treatment, while the remainder receive a placebo. Only at the end of the trial, once the data is in, is it revealed which patients received the treatment and which didn’t. Researchers can learn whether the drug caused any effect, because they can compare, without bias, the outcomes of those patients who received the drug with those who didn’t (the control).
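The causal logic of randomisation can be shown with a toy simulation. The effect size, sample size, and outcome distribution below are invented purely for illustration:

```python
import random

random.seed(1)

def run_trial(n_patients, treatment_effect):
    """Toy randomised trial: randomly assign each patient to drug or
    placebo, then return the difference in mean outcomes between arms.
    All numbers are invented for illustration."""
    drug, placebo = [], []
    for _ in range(n_patients):
        outcome = random.gauss(50, 10)   # baseline variation between patients
        if random.random() < 0.5:        # random assignment removes selection bias
            drug.append(outcome + treatment_effect)  # true (hidden) drug effect
        else:
            placebo.append(outcome)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(drug) - mean(placebo)

# Because assignment is random, the observed gap between the two arms
# estimates the causal effect of the drug (here, a true effect of 5.0):
print(f"estimated effect: {run_trial(10_000, 5.0):.1f}")
```

The difficulty with science funding, as described above, is precisely that the "treatment" (a grant) is not assigned at random: it goes to the groups most likely to succeed anyway.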

Could this approach be used to evaluate the value of investments in science and innovation?

An unfortunate experiment

In mid-2012, the newly formed Ministry of Business, Innovation and Employment (MBIE) inadvertently conducted such a trial. When it assessed the quality of the funding proposals it had received, the Ministry failed to ensure that each proposal received an equal number of external peer reviews: some proposals received just a single peer review, while others received up to four.

As I wrote in a post last year, this exposed their funding decisions to a potential bias. Even if two proposals were of equal merit, the proposal that by chance received more reviews would also be more likely to have at least one negative review. A cautious Ministry might be reluctant to fund proposals that received a negative review, even if all others were positive. Proposals that received more reviews would then be less likely to be funded than equally good proposals that, by chance, received fewer.

Indeed, more than a third of the proposals that only received one review were funded, while only one quarter of those that received two or more were successful. Was MBIE too conservative in its funding decisions?

To answer this question, we need to know how likely it is that this could have been generated by chance in the absence of bias. It turns out that without bias, one in every twelve funding rounds would produce such a skewed result, so while one might be suspicious, the data does not allow us to draw a solid statistical conclusion. Nevertheless, this example illustrates how we might use randomness to evaluate the effectiveness of our decision-making processes.
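A permutation test is one way to arrive at a figure like "one round in twelve". The funded/unfunded counts below are hypothetical stand-ins (the post gives only the fractions "more than a third" and "one quarter", not the raw numbers), so the computed probability is illustrative rather than a reproduction of the actual analysis:

```python
import random

# Hypothetical counts standing in for the MBIE round described above;
# the true numbers are not given in the post.
one_review   = [1] * 12 + [0] * 21   # 12 of 33 single-review proposals funded (>1/3)
multi_review = [1] * 15 + [0] * 45   # 15 of 60 multi-review proposals funded (1/4)

observed_gap = sum(one_review) / len(one_review) - sum(multi_review) / len(multi_review)

# Permutation test: if the number of reviews had no effect on funding,
# any split of the funded/unfunded labels between the groups is equally likely.
pooled = one_review + multi_review
random.seed(0)
n_trials, extreme = 20_000, 0
for _ in range(n_trials):
    random.shuffle(pooled)
    a, b = pooled[:len(one_review)], pooled[len(one_review):]
    if sum(a) / len(a) - sum(b) / len(b) >= observed_gap:
        extreme += 1

print(f"P(gap at least this large by chance) ~ {extreme / n_trials:.2f}")
```

If the resulting probability is not small, the skew is consistent with pure chance, which is exactly the "suspicious but not conclusive" situation described above.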

From an art to a science

While unintentional experiments such as this can reveal interesting information about the quality of decision-making by funding agencies, it would be better to undertake such studies purposefully, rather than by accident.

There are methods for studying the effectiveness of our investments in science that are fairer than randomly allocating the number of external reviews. It is these new approaches, which make use of the big data sets that are increasingly becoming available, that are driving the science of science policy.

Such an approach was recently used to test the quality of decision-making by the US National Institutes of Health (NIH), which invests billions of dollars every year in medical research. The conclusion? Projects rated poorly by the NIH, but funded nonetheless, produced just as much impact as those that were rated the best. This suggests that the NIH funding panels are choosing to support some proposals that turn out to have low impact, while rejecting other proposals that would have delivered higher impact. This is valuable information for an organisation that spends more than US$100 million each year on evaluating proposals.

Closer to home, Adam Jaffe, Director of Wellington-based Motu Economic and Public Policy Research, is currently undertaking a similar study of Marsden-funded projects. Using the regression discontinuity method, Jaffe is comparing the subsequent academic performance of those who just made it over the threshold for funding with that of those who just missed out*, on the assumption that differences in the quality of the proposals and teams being compared will be small. Proposals that just missed out on funding are effectively being used as a control group for those that just made it.
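The core of that comparison can be sketched in a few lines. The scores, threshold, bandwidth, and citation counts below are invented for illustration and bear no relation to actual Marsden data:

```python
# Sketch of a regression-discontinuity comparison, in the spirit of the
# Marsden study described above. All data and parameters are invented.

def rd_estimate(proposals, threshold, bandwidth):
    """Compare mean outcomes (e.g. later citations) of proposals scored
    just above the funding threshold with those scored just below it.
    `proposals` is a list of (panel_score, outcome) pairs."""
    just_funded = [y for s, y in proposals if threshold <= s < threshold + bandwidth]
    just_missed = [y for s, y in proposals if threshold - bandwidth <= s < threshold]
    mean = lambda xs: sum(xs) / len(xs)
    # Near the cut-off, proposal quality is assumed comparable, so the
    # difference in mean outcomes estimates the effect of funding itself.
    return mean(just_funded) - mean(just_missed)

# Invented data: (panel score, citations five years later), with a 6.5 cut-off.
data = [(6.2, 40), (6.4, 44), (6.6, 55), (6.8, 52),
        (7.1, 58), (7.5, 70), (5.1, 20), (5.5, 24)]
effect = rd_estimate(data, threshold=6.5, bandwidth=0.5)
print(f"Estimated effect of funding near the cut-off: {effect:+.1f} citations")
```

Proposals well above or below the cut-off are deliberately excluded from the comparison, since those differ in quality as well as in funding status.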

Once the study is complete, Jaffe will be in a position to estimate the scientific impact that a Marsden grant generates. If he finds that the Marsden allocation process suffers from the same problems as that of the NIH, the fund may be able to take steps to improve this process and thereby increase its impact.

So far Jaffe’s study only considers publications and their citations, but with access to more data it should also be possible to assess some of the less tangible social and economic benefits that come from Marsden-funded research. The Marsden fund may eventually be able to determine whether the PhD students it supports go on to have more successful careers or found more companies than students funded by other scholarships. Evidence like this is the sort of thing that would persuade Treasury to put more money into blue skies research (or less, if the results are negative).

Keeping score

Jaffe is able to do this for the Marsden fund because it has been operating for 20 years. Over that time it has kept high-quality records of its decision-making processes: these records detail what was funded, what wasn’t funded, and why. Yet the Marsden fund represents less than 5% of New Zealand’s public spending on science and innovation, and unfortunately good records of the processes used to allocate the remaining 95% have not been, and are not being, kept.

It is even difficult to establish what it is that the government chose to fund, let alone what it chose not to.

This loss of information can be partly attributed to the volatility in the way science is funded in New Zealand, including the regular restructuring of funding agencies themselves (MoRST, FRST, MSI, MBIE, Callaghan Innovation, …) and the churn in the funding schemes they administer (PGSF, NERF, Research for Industry, Smart Ideas, …). In contrast, the Marsden fund has been managed continuously by the Royal Society of New Zealand using a relatively stable process for the last two decades.

There also seems to be a bureaucratic reluctance by the government agencies that administer these funds to collect and curate the sort of data that might be useful for evaluation. In response to a recent query from New Zealand Association of Scientists President, Dr Nicola Gaston, concerning possible gender bias in its grant allocation processes, MBIE responded that

Gender information is not necessary for the function of allocating research funding

Unfortunately, international evidence suggests that women researchers in many countries tend to receive less funding than men. By not collecting data on gender, MBIE cannot know whether similar biases exist here. It may well be missing an opportunity both to increase the impact of the research it funds** and to remove one of the barriers that impede the careers of women scientists.

Even if it has no immediate use for it, MBIE should be collecting data where reasonable and practicable to enable future studies of impacts and funding processes.

Escaping the black hole

With new methodologies available such as regression discontinuity and a better understanding of the need to collect data, one would hope that within a few years we will be in a position to rigorously evaluate the impact of our newest funding mechanism, the National Science Challenges.

Sadly, this is unlikely to be possible.

The problem lies in the difficulty of identifying a control group for the NSCs: the way that they have been selected and established makes it very difficult to establish what the world would look like without them. Would the science proposed have been carried out anyway? Was the panel that chose the NSCs subject to bias? We will never know, because the processes used to choose the ten challenges and assemble the challenge teams have not been transparent. We have no records of challenges that weren’t chosen or team members that weren’t named on the challenge proposals.

For each individual challenge, MBIE notes***:

Because of the focus on ‘best teams’ an effective outcome for each Challenge will be to generate a single proposal – there can only be one ‘best team’

In other words, the NSC process has made it impossible to establish a control group by design. And unless those that are putting together the NSCs can outperform the NIH, it is very unlikely that the teams for each challenge will be the ‘best’.

The NSCs do represent a significant increase in funding for science in New Zealand, and there is a school of thought that we should just get on and make them work as best we can. I have much sympathy for this point of view, and have indeed rolled up my sleeves, together with a number of my colleagues, to put together a proposal for the “Science for Technological Innovation” challenge.

Yet at the same time I am aware that the design and implementation of the NSCs represents a wasted opportunity. Sir Peter Gluckman, the Prime Minister’s Chief Science Advisor, has called for the greater use of scientific evidence in government policy-making. I agree; it’s well past time that we started using evidence in making science policy.

* OK, it’s a bit more complicated but this gives you the basic idea.

** It is worth noting that the Marsden fund collects gender information and finds no bias in its allocation process.

Timing is everything
17 February 2014

Today, I will be reflecting on the importance of good science communication at the University of Waikato’s International Symposium on “Transforming Engagement on Controversial Science and Technology”. There is a lot to say, and a lot that has been said, about science communication. In this post, however, I want to reflect on an aspect of science communication that is often overlooked.

Sir Peter Gluckman, in his role as the Prime Minister’s Chief Science Advisor, wrote last year about scientists, the media and society. In his essay, he warns scientists of the dangers of becoming advocates for a particular cause, instead arguing that scientists need to act as knowledge brokers for society. Sir Peter’s article is well worth reading, but I think it neglects an important aspect of science communication – namely, that of first response.

Scientists as first responders

Both the 2011 and 2013 winners of the Prime Minister’s Science Media Communication Prize have distinguished themselves by their willingness to step forward during a crisis.

On 4 September 2010, Dr Mark Quigley from the University of Canterbury was woken at 4.35am by a 7.1 magnitude earthquake … and in its aftermath became the spokesperson for the New Zealand science community. Mark was not chosen for this role by the Royal Society of New Zealand or the Ministry of Civil Defence. Rather, in the midst of this crisis, he stepped up – reacting quickly, calmly and knowledgeably to the unfolding events. Over the coming months, Mark’s face became familiar to many of us as he explained the science behind what Canterbury was experiencing and the subsequent risks it faced.

Mark was in the right place at the right time to act, and he was prepared. He had been blogging about his research at DrQuigs.com for a number of years prior to the 2010-11 earthquake sequence. When the first earthquake struck, his blog provided a fast, reliable communication platform for getting his science out. After the quakes, the readership of his blog sky-rocketed as the public turned to it for information on unfamiliar phenomena like liquefaction and the risks of aftershocks.

When the 22 February 2011 earthquake struck, it was noticeable how much better the geoscience community was prepared. Mark’s efforts after the September quake had set an important example for the science community of the need for an effective first response. After a major disaster, we have learned that the public and the media have an immediate need for scientific information and analysis to help them understand what is happening and to allow them to make good decisions.

This is something that can only come from well-prepared, articulate scientists, who can think on their feet and who are comfortable using modern social media. Such a response is not something that a corporate communications team or a national academy can provide.

Fonterra gets the bot

If Mark showed us the value of good science communication in a crisis, Fonterra’s recent contamination scare illustrated the costs when science is communicated poorly. In August 2013, New Zealand’s Ministry for Primary Industries ordered a recall of products containing whey protein from several batches produced by Fonterra in 2012, due to the possibility of contamination by Clostridium botulinum. This quickly became international news, as some of the products affected by the recall included infant formula sold around the world.

This was a major crisis for New Zealand that saw our dollar plummet by US$0.03 in just a few days. As the story unfolded, the New Zealand public once again expected timely information and analysis from the science community, just as it had during the Canterbury earthquakes.

One of the few scientists who did step up was Dr Siouxsie Wiles, a microbiologist at the University of Auckland. [An honourable mention should also go to Prof John Brooks of AUT (see one of his blog posts here).] Like Mark Quigley, Siouxsie is an active blogger, but unlike Mark, her expertise did not directly align with the science behind the unfolding crisis. Dismayed by the lack of expert comment, Siouxsie blogged to debunk misinformation and explain the science behind the tests that had been used to detect the contamination.

Sacred cows

The need for the science community to respond to the public need for facts in a crisis undermines one of the ‘sacred cows’ of science communication: that scientists should only speak to the media on areas of their expertise. The problem with this is that real-world crises inevitably stretch the limit of any one scientist’s expertise. Climate change, for instance, is such a complex issue that no one scientist has the in-depth knowledge to cover every angle.

In the past, we had the luxury of specialist science reporters who were able to talk to a range of scientists to deal with this complexity. Today, few journalists have the time, expertise or network of scientific contacts to do this well. While the Science Media Centre plays an important role in connecting media with scientists, the onus now falls much more on these scientists to provide context for their science. This will often require stepping outside the bounds of their expertise.

And when stories develop rapidly, it is even more important to be prepared to push the boundaries of expertise. This is not an easy thing to do, especially in a short time frame, so it is not surprising that scientists are often reluctant to do it. Yet the public need scientific information in such crises, and any scientist who steps up will almost certainly be better than none.

Being prepared

Last October, the report of an external inquiry into the contamination scare (commissioned by Fonterra’s Board) was critical of Fonterra’s external communications. One of the inquiry’s key recommendations was that the organisation adopt greater openness and transparency in its crisis communication.

This is good advice not just for Fonterra, but for New Zealand’s entire science community. We must ensure that there are more than just a handful of scientists who are prepared to inform the public in a crisis. These scientists must be comfortable with the new forms of media, including social media like blogging and Twitter. They must understand the pressures that the traditional media face in dealing with complex scientific issues on short deadlines.

And above all, they need to be given proper credit for their work. Science communication takes considerable effort, but despite its obvious importance to society, it often receives little academic recognition. Rewarding individual scientists like Mark and Siouxsie for their efforts after the fact is all very well, but if we want to be ready for the next crisis, we need to ensure that we prepare as a community.

Marsden 2013: Big increase in funding lifts success rate (24 January 2014)

This post is late, very late! I have a long list of excuses, many of which involve moving to Auckland and writing a Centre of Research Excellence Proposal. But with the 2014 Marsden round almost upon us, it is well past time to look at the numbers from 2013.

2013 saw a big increase in the funds handed out. In fact, the $68m awarded was the largest sum ever in nominal terms*, surpassed only by the 2009 round ($65m) once you adjust for inflation. In real terms, the Marsden Fund has handed out about 18% more each year over the period 2008-2013 than it did over the preceding decade. The average funding awarded to each successful proposal (fast-start and standard) continues to hover just below $600k.

If the total investment was high in 2013, while the funding per proposal remained static, then the number of projects that were funded must have risen. This was indeed the case, yet at the same time the number of proposals received by the Royal Society continued to climb. There were a record 1157 first round proposals submitted in 2013, compared to an average of 800 proposals per year over the period 1998-2007. This means that although a record-equalling 109 proposals were funded, the overall success rate of 9.4% remained below its long run average of 10%.
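As a sanity check, the headline success rate follows directly from the proposal counts quoted above:

```python
# Checking the 2013 Marsden success rate from the counts quoted in the post.
proposals = 1157      # first-round proposals received in 2013
funded = 109          # proposals ultimately funded

rate = funded / proposals
print(f"2013 success rate: {rate:.1%}")   # 9.4%, below the long-run ~10%
```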

The growth in the proportion of funds awarded to fast-start grants for early career researchers (available to researchers within seven years of completing their PhDs) has continued, but it remains smaller than the fast-start share of applications: in 2013, 22% of the funds Marsden awarded went to fast-start grants, while 28% of applicants wrote fast-start proposals**. Would it perhaps be fair for the share of funding allocated to fast-starts to grow to match the proportion of fast-start applicants?

Fast-start proposals have had a success rate of just below 13% since they were created, slightly higher than the 9% for standard proposals. Interestingly, the success rates of fast-start and standard applicants are only weakly correlated. As I noted last year, the fast-start scheme plays an important role in early career development now that the FRST post-doctoral fellowship scheme and the International Mobility Fund are gone. The Rutherford Discovery Fellowships also contribute to early career development but are relatively few in number.

There was a comment on my 2012 Marsden post that the >1000 proposals rejected annually represent a huge opportunity cost. However, the worth of a rejected proposal is not zero. I always tell myself that it is a chance to plan my research several years in advance, and – if you make it through to the second round – a chance to get feedback from international experts in the field. Nonetheless, the significant growth in rejected proposals over the last few years suggests that the opportunity cost of the Marsden Fund may be increasing.

*NB: The figures released by the Marsden Fund in 2013 did not include GST.

** My thanks go to Jason Gush for filling in some holes in my data on fast-starts.

The Physics of Santa (24 December 2013)

At this time of year, many parents worry about the risks posed to their children from exposure to Santa Claus. We know very little about the science of Santa because the government refuses to fund research into Christmas, as it cannot be linked to direct economic benefit.

Many New Zealanders want to know how fast Santa has to travel to deliver all his gifts.

We estimate that Santa has a bit over 30 hours to do this, assuming that he starts around midnight on New Zealand’s side of the international date line, and finishes before people wake up on the other side.

We are less sure about how many children he has to visit: there are about 2 billion children in the world, and when you ask them whether they have been naughty or nice, they inevitably claim they have been nice. As scientists – and some of us were once children ourselves – we just don’t buy this. We think that it is more likely that only about half the children in the world, roughly one billion, have managed to be nice all year.

To deliver presents to these children, Santa has to visit about 5000 homes per second, assuming that there are 2.2 nice kids per household.

When Santa visits New Zealand, however, he has to deal with the fact that Kiwi kids are generally regarded as pretty nice, and this means he has to visit almost all of them**. If he is to reach the children on his list, Santa has only three minutes to deliver his presents to the 800,000 Kiwi kids who were not naughty this year.

To fly from Cape Reinga to Rakiura, Santa and his reindeer must travel about 1600km in that three minutes. This works out to be a speed of around 32,000 km/h – 320 times the open road speed limit or about the same speed the space shuttle travels when it orbits the Earth. This is pretty fast.

But how does he power his reindeer? We think that Santa must be sharing the Xmas mince pies and glasses of sherry that are left out for him with his reindeer to keep their energy levels up.

While he travels over New Zealand, his reindeer have to pull him, his sleigh and about 400 metric tonnes of gifts. To achieve their need for speed, the reindeer must supply kinetic energy of about 16 terajoules. If each child in New Zealand leaves Santa one Xmas mince pie (say, 400 kilojoules) to share with his reindeer, this will only supply around 2% of their energy needs.

Santa’s energy shortfall works out to be roughly the amount of chemical potential energy stored in 2,500 barrels of oil. Hmmm.
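For anyone who wants to check the arithmetic, the whole back-of-envelope calculation fits in a few lines of Python. The inputs are this post’s own round assumptions, plus a conventional figure of roughly 6 GJ of chemical energy per barrel of oil:

```python
# Back-of-envelope Santa physics, using the assumptions stated in the post.
homes_per_s = (1_000_000_000 / 2.2) / (30 * 3600)   # nice kids / household / delivery window
print(f"homes per second worldwide: {homes_per_s:,.0f}")   # ~4,200 - call it 5,000

distance_km = 1600                  # Cape Reinga to Rakiura
time_h = 3 / 60                     # three minutes over New Zealand
speed_kmh = distance_km / time_h
print(f"speed over NZ: {speed_kmh:,.0f} km/h")      # 32,000 km/h

mass_kg = 400_000                   # about 400 tonnes of gifts
v_ms = speed_kmh / 3.6              # convert to metres per second
kinetic_j = 0.5 * mass_kg * v_ms ** 2
print(f"kinetic energy: {kinetic_j:.1e} J")         # ~1.6e13 J, about 16 terajoules

pie_supply_j = 800_000 * 400_000    # one 400 kJ mince pie per nice Kiwi kid
print(f"pies cover {pie_supply_j / kinetic_j:.1%} of the bill")   # around 2%

barrel_j = 6.1e9                    # approximate chemical energy in a barrel of oil
shortfall_barrels = (kinetic_j - pie_supply_j) / barrel_j
print(f"shortfall: {shortfall_barrels:,.0f} barrels of oil")      # ~2,500
```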

More research is clearly needed. We would recommend however that Santa give consideration to delivering his presents in the day time so that he can take advantage of recent advances in solar cell technologies. This would also make life easier for scientists who currently have to write lengthy applications for permission to stay up past their bed time in order to study this mysterious phenomenon.

Declaration of interests: I have been informed that I was bad this year for wasting taxpayers’ money, so Santa is unlikely to be leaving a present for me under the tree. You can rest assured that this has in no way influenced the conclusions drawn in this blog post.

* Radio NZ apparently has no editorial policy on whether the gentleman in question should be referred to as Santa, Santa Claus or Father Christmas. We’ll stick to Santa.

** Anecdotally, my younger sister reports that her two older brothers, despite being Kiwis, were not always nice. This just goes to show how unwise it is to rely on hearsay evidence.

Pounamu returns Thursday Aug 29 (25 August 2013)

This coming Thursday (Aug 29) from midday we will be running Pounamu again for 24 hours. This is a free, online game set in a future world where all of us can use science as easily as they can use a computer now. We ran the game for the first time last year, in conjunction with the Transit of Venus forum and boy was it addictive. Sciblogger Michael Edmonds wrote a post about his experiences last year. Like Michael, I found it to be one of the most stimulating and exciting forms of science communication I had ever engaged in – I learnt a lot.

You play by posting micro-forecasts (concise ideas – 140 characters, like twitter) of future possibilities, or build on and reshape other players’ ideas. Here’s a micro-forecast from last time by our very own Peter Griffin:

This provocative statement started a conversation that went in several directions:

You gain points and move up the game leader-board by posting ideas that create more discussion, contributing interesting ideas to the game and winning awards. This year Auckland University Press are offering copies of Get Off the Grass as prizes. Peter would have scored some points for his forecast, but so would those who built on Peter’s initial card.

You can play for five minutes and share one idea, or play for the whole game and post hundreds of possible futures. Anyone with an internet connection and a browser can play – players can register here in advance. There will be some public playing hubs in libraries, museums and other places where you can drop in, get the hang of playing and share the experience with others. I will be playing at Te Papa on Level 4 at one of our public hubs.

The conversation which produced the most discussion last year concerned the teaching of science in te Reo from a Maori perspective. What impressed me most was that the subsequent discussion appeared to change many people’s minds about this idea. The conversation tree is shown below – click here to view the tree on prezi or just click on the image to download:

Getting Off the Grass (6 August 2013)

Fonterra’s discovery of the bacterium that causes botulism in a batch of whey protein concentrate has alarmed many. As the whey protein is an ingredient in popular infant formulas, many parents will be worried that they have inadvertently exposed their children to potentially fatal bacteria. Hopefully, the recall of products that use Fonterra’s whey ingredient will prevent any illness, even if it appears that these recalls may have been tardy. One would also hope that Fonterra learns from the experience, because when Fonterra stumbles, so does the rest of the country.

This latest incident illustrates once again how important it is that New Zealand diversify its economy. In fact, this is the subject that I address in my upcoming book, Get Off the Grass, co-authored with the late Sir Paul Callaghan. I’ll be launching the book with a public talk at Victoria University of Wellington on August 15th (register here if you would like to come along), with the paperback hitting bookstores the following day.

In Get Off the Grass, Sir Paul and I investigate why New Zealanders work harder and earn less than most other people in the developed world. In Sir Paul’s previous book, Wool to Weta, this was framed as a choice: we choose to be poor because the types of industries that we prioritise, such as farming and tourism, earn us relatively little per hour worked. In Get Off the Grass, we use ideas from economic geography and the study of complex systems to investigate why it has been so hard to innovate our way out of these low productivity industries.

To illustrate just how specialised New Zealand’s economy is, I have borrowed a figure from Dr Helen Anderson, which compares the diversity of our exports with those of Denmark. Like New Zealand, Denmark has a strong agricultural base. Unlike New Zealand, Denmark has made concerted efforts to diversify its economy over the last few decades. We are constantly told that New Zealand is too small to do everything, yet Denmark, a country with a population of only 5.5 million people, manages to do a heck of a lot more.

With a title like Get Off the Grass, it won’t surprise you that we argue that New Zealand can and should look to do an awful lot more than just agriculture. Some of the points we make in the book are:

There is a deep flaw in our reliance on the 100% Pure brand. We need the edge our clean, green brand gives us to sell our agricultural commodities at good prices, yet the production of these commodities actually damages the environment. See this piece I wrote for Unlimited magazine last year.

Economic diversity is crucial for long-term economic stability, and this in turn is crucial for growth. The fluctuations in our dollar caused by the contamination of one of our major exports illustrate why. The volatility caused by such crises in turn hurts other export sectors, making it even harder to get off the grass.

Diversity is regarded as a crucial ingredient for innovation, so our strong focus on agricultural research actually makes us less innovative as a nation, whether in agriculture or otherwise. Physics and chemistry have contributed an awful lot to agriculture, but agricultural science has not returned the favour.

Specialisation in a single industry is just not a good long term strategy. No industry stays on top forever, and if your favoured industry becomes too important to fail, it will prevent you moving into other industries before it’s too late.

Detroit, with its dependence on car manufacturing, is a classic example. Although Detroit’s car industry has vast scale, with the three biggest car makers in the US based there, the city is now a basket case because its mono-cultural manufacturing sector has failed to reinvent itself in the face of strong competition from overseas manufacturers.

As I said when the National Science Challenges were announced, our dependence on the primary sector leaves our economy perilously exposed to volatile commodity markets. Jacqueline Rowarth told Radio New Zealand that previous attempts to diversify our economy had failed. Get off the grass – we’ve yet to make a serious attempt!

Complexity, emergence and networks (11 July 2013)

What do magnets, stock markets, and Facebook all have in common? With Get Off the Grass off to the printers, I now have some time to ponder such important questions. So tonight at 8.40pm, I’ll be back talking to Bryan Crump on Radio NZ Nights about what it is that these things share: namely, complexity. (You can listen to the interview here.)

It’s complicated

We are surrounded by complicated things. It seems obvious that both the behaviour of the stock market, which is a result of many individual investment decisions made by thousands of investors, and the behaviour of a magnet, which is the aggregate of the magnetic properties of a very, very large number of individual atoms, are complicated.

What is much less obvious is that the stock market and a magnet should behave anything like each other. Yet this is what scientists have found: in certain circumstances, complicated systems that consist of many entities that interact with each other often exhibit similar patterns of collective behaviour.

What are these similarities? It turns out that statistically, the ups and downs of the stock market are similar to the microscopic fluctuations of the strength of a magnet like iron. Because individual atoms will occasionally flip the orientation of their own magnetic field, the net magnetic field of a collection of magnetic atoms will fluctuate. These fluctuations tend to be small, because if an individual atom flips, it will then experience a magnetic force from the other atoms that will eventually make it flip back again to line up with all the others.

If you heat the magnet up though, each atom in the magnet will jiggle more and is more likely to flip. The hotter the magnet becomes, the more the strength of its magnetic field will fluctuate. But if the magnet becomes too hot, it can actually lose its magnetic field altogether, because the flipping becomes so random that the tiny magnetic fields of each of the atoms cancel each other out.
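The post doesn’t name it, but what is being described here is essentially the Ising model of magnetism. A deliberately minimal one-dimensional sketch (my own illustration, not from the post; real magnets are three-dimensional, and the 1D chain has no true phase transition) shows the fluctuation behaviour: atoms prefer to align with their neighbours, while temperature makes random flips more likely.

```python
import random
from math import exp

def magnetisation(temperature, n=200, steps=20000, seed=1):
    """Average |magnetisation| of a ring of n spins under Metropolis dynamics."""
    rng = random.Random(seed)
    spins = [1] * n                       # start fully aligned, like a cold magnet
    total, samples = 0.0, 0
    for step in range(steps):
        i = rng.randrange(n)
        # Energy cost of flipping spin i: its neighbours prefer alignment.
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
        if dE <= 0 or rng.random() < exp(-dE / temperature):
            spins[i] = -spins[i]          # flips are more likely when it's hot
        if step > steps // 2:             # sample after the system settles
            total += abs(sum(spins)) / n
            samples += 1
    return total / samples

print("cold magnet |M|:", round(magnetisation(0.2), 2))  # stays close to 1
print("hot magnet  |M|:", round(magnetisation(5.0), 2))  # fluctuates near 0
```

At low temperature the spins stay locked together and the net field barely fluctuates; heat the ring up and the flipping becomes so frequent that the tiny fields cancel out, just as described above.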

Sell-offs, seagulls and sand

What has this got to do with the stock market? It turns out that investors can behave a little bit like atoms. Most of the time, investors tend to invest independently of each other. They make their own decisions to buy and sell stocks based on the prospects and performance of individual companies, without worrying too much about what everyone else is doing.

Just prior to a stock market crash, this behaviour changes. If you buy a stock at the point where everyone else is selling, then you will soon see the price of that stock drop below what you have just paid for it. This can look like a strong incentive to sell your stock before its price drops any further. If this starts happening across too many stocks, then investors will see the value of their stock portfolios falling and this can trigger an even larger sell-off. The value of the market plummets.

When the stock market behaves normally, investors act independently, just like atoms in a piece of iron that is too hot to have a magnetic field. When it crashes, investors all start doing the same thing – selling – just like the atoms in a magnet that have all aligned their magnetic fields. You can also make mathematical analogies with flocks of birds, when they all fly together in the same direction. Even the avalanches that occur on the slopes of sand dunes have things in common with the movement of stock prices during a crash.

Systems that are normally so complicated that we might think of them as nearly random can, on occasion, start to act collectively. Stock markets can plummet in minutes, birds of a feather flock together, while atoms can align to produce powerful magnets. When systems start to behave coherently, scientists see complexity, not just complication. In other words, complexity is what results when the components of a complicated system start to behave in a collective, self-organised fashion. And remarkably, these complicated systems exhibit very similar behaviour when they self-organise.

Breaking symmetry

Despite examples like these, complexity remains a tricky concept to nail down. You know it when you see it, but it’s hard to come up with a single definition that encompasses all aspects of complexity that we see in nature and human society. Nonetheless, complexity has become an increasingly important concept in science over the last few decades.

One of the seminal articles in the field was written by theoretical physicist Philip W Anderson in 1972. Anderson noted that surprising behaviour can arise in systems that contain many interacting components, like the atoms in a magnet or investors in the market. He pointed out that we can’t always understand such complex systems by focussing on their individual components.

When New Zealanders travel to Europe or North America, they often find themselves bumping into other people when they walk down a busy footpath. Kiwis tend to pass people on the left, while people overseas often pass on the right. At least for the first few days, this means we are constantly walking into people. It probably has something to do with the side of the road that we drive on, but this is not universal. In my experience, when the British are on foot, they seem to want to pass each other on the right despite the fact they drive on the left.

This is an example of what physicists call spontaneous symmetry breaking. It’s really only possible to walk down a busy street if we all agree on the way in which we’ll pass each other. Kiwis have made one choice, while people in other countries have made others, yet there is nothing in particular about any of us individually that says it has to be the left or the right – we just have to agree with those we walk past on a daily basis.
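A hypothetical toy simulation makes the point (the model and its parameters are my own illustration, not from the post): start walkers with random preferences and let each drift toward the majority, and the population locks in to one convention, yet nothing in the rules prefers left over right.

```python
import random

def settle_convention(n=101, rounds=60, seed=None):
    """Walkers drift toward the majority side; return the side they settle on."""
    rng = random.Random(seed)
    sides = [rng.choice(["left", "right"]) for _ in range(n)]
    for _ in range(rounds):
        majority = max(set(sides), key=sides.count)
        # Each round, every walker switches to the majority side half the time.
        sides = [majority if rng.random() < 0.5 else s for s in sides]
    return max(set(sides), key=sides.count)

# Neither side is intrinsically better; which one wins varies with the seed.
print([settle_convention(seed=s) for s in range(8)])
```

The symmetric rules always break symmetry: every run ends with (almost) everyone on the same side, but re-run it with different seeds and you get different winners.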

Something very similar happens in biology. Bio-molecules that are mirror images of each other are said to be left- or right-handed, by analogy with the way your left hand becomes your right when you look at yourself in the mirror. The chemistry of left or right-handed molecules is identical, at least when those molecules are in isolation or interact with chemicals that don’t have a handedness. However, in much the same way as you would find it hard to shake someone’s left hand with your right, left-handed molecules are not always able to react with right-handed molecules.

So biology has to make a choice. If life is going to work properly, it needs to stick to either left- or right-handed molecules, neither of which is preferred by the chemistry of the individual molecule. On Earth at least, life chose to be left-handed. So despite the fact that the building blocks of biology are chemicals, biology is not just applied biochemistry.

Complex societies

In other words, complex systems cannot be completely understood by studying their components in isolation. Understanding how one investor or molecule behaves in isolation won’t necessarily tell you why stock markets crash or how life works. The properties of complex systems, like the biosphere or the stock market, only emerge when the components of the system have to interact with each other.

Networks matter more than ever in our increasingly connected world. If you have read this far, it won’t surprise you that the networks that underpin both society and the economy also show complex, emergent behaviour. In recent times, studies of social networks like Facebook have led to some of the biggest advances in understanding complexity, but as with other complex systems, it is impossible to understand a network by considering just a single person within it.

We have a lot more to learn about our society and the economy, but the lesson we should take from the study of complex systems is that we are not just a collection of individuals. Society is more than just the sum of its parts.