PLoS ONE Alerts: Science Policy
Publisher: PLOS (webmaster@plos.org), http://www.plosone.org/
Description: Publishing science
Category feed: info:doi/10.1371/feed.pone?categories=Science policy
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 License.
Last updated: 2015-03-31T20:57:04Z

Simple Messages Help Set the Record Straight about Scientific Agreement on Human-Caused Climate Change: The Results of Two Experiments
info:doi/10.1371/journal.pone.0120985 (published 2015-03-26)
<p>by Teresa A. Myers, Edward Maibach, Ellen Peters, Anthony Leiserowitz</p>
Human-caused climate change is happening; nearly all climate scientists are convinced of this basic fact, according to surveys of experts and reviews of the peer-reviewed literature. Yet, among the American public, there is widespread misunderstanding of this scientific consensus. In this paper, we report results from two experiments, conducted with national samples of American adults, that tested messages designed to convey the high level of agreement in the climate science community about human-caused climate change. The first experiment tested hypotheses about providing numeric versus non-numeric assertions concerning the level of scientific agreement. We found that numeric statements resulted in higher estimates of the scientific agreement. The second experiment tested the effect of eliciting respondents’ estimates of scientific agreement prior to presenting them with a statement about the level of scientific agreement. Participants who estimated the level of agreement before being shown the corrective statement gave higher estimates of the scientific consensus than respondents who were not asked to estimate in advance, indicating that incorporating an “estimation and reveal” technique into public communication about scientific consensus may be effective. We also tested the interaction of the messages with political ideology and found that the messages were approximately equally effective among liberals and conservatives. Implications for theory and practice are discussed.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0120985

Progress in Human Embryonic Stem Cell Research in the United States between 2001 and 2010
info:doi/10.1371/journal.pone.0120052 (published 2015-03-26)
<p>by Keyvan Vakili, Anita M. McGahan, Rahim Rezaie, Will Mitchell, Abdallah S. Daar</p>
On August 9th, 2001, the federal government of the United States announced a policy restricting federal funds available for research on human embryonic stem cells (hESCs) out of concern for the “vast ethical mine fields” associated with the creation of embryos for research purposes. Until the policy was repealed on March 9th, 2009, no U.S. federal funds were available for research on hESCs extracted after August 9, 2001, and only limited federal funds were available for research on a subset of hESC lines that had previously been extracted. This paper analyzes how the 2001 U.S. federal funding restrictions influenced the quantity and geography of peer-reviewed journal publications on hESCs. The primary finding is that the 2001 policy did not have a significant aggregate effect on hESC research in the U.S. After a brief lag in the early 2000s, U.S. hESC research kept pace with other areas of stem cell and genetic research. The policy had several other consequences. First, it was tied to increased hESC research funding within the U.S. at the state level, leading to a concentration of related activities in a relatively small number of states. Second, it stimulated increased collaborative research between US-based scientists and those in countries with flexible policies toward hESC research (including Canada, the U.K., Israel, China, Spain, and South Korea). Third, it encouraged independent hESC research in countries without restrictions.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0120052

A Game Theoretic Framework for Analyzing Re-Identification Risk
info:doi/10.1371/journal.pone.0120592 (published 2015-03-25)
<p>by Zhiyu Wan, Yevgeniy Vorobeychik, Weiyi Xia, Ellen Wright Clayton, Murat Kantarcioglu, Ranjit Ganta, Raymond Heatherly, Bradley A. Malin</p>
Given the wealth of insights that large databases of personal data can provide, many organizations aim to share de-identified data while protecting privacy, but are concerned because various demonstrations show that such data can be re-identified. Yet these investigations focus on how attacks can be perpetrated, not on the likelihood that they will be realized. This paper introduces a game theoretic framework that enables a publisher to balance re-identification risk with the value of sharing data, leveraging a natural assumption that a recipient only attempts re-identification if its potential gains outweigh the costs. We apply the framework to a real case study, where the value of the data to the publisher is the actual grant funding dollar amounts from a national sponsor and the re-identification gain of the recipient is the fine paid to a regulator for violation of federal privacy rules. There are three notable findings: 1) it is possible to achieve zero risk, in that the recipient never gains from re-identification, while sharing almost as much data as the optimal solution that allows for a small amount of risk; 2) the zero-risk solution enables sharing much more data than a commonly invoked de-identification policy of the U.S. Health Insurance Portability and Accountability Act (HIPAA); and 3) a sensitivity analysis demonstrates these findings are robust to order-of-magnitude changes in player losses and gains.
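The recipient's attack calculus described above can be sketched as a simple payoff comparison. The linear payoff form and all numbers below are illustrative assumptions, not the paper's actual model:

```python
def recipient_payoff(p_reid, gain, cost):
    """Expected payoff to a data recipient who attempts re-identification:
    success probability times gain, minus the cost of mounting the attack."""
    return p_reid * gain - cost

def best_zero_risk_release(levels, gain, cost):
    """Among candidate releases (utility, p_reid), pick the most useful one
    for which a rational recipient would never attack (payoff <= 0)."""
    safe = [lv for lv in levels if recipient_payoff(lv[1], gain, cost) <= 0]
    return max(safe, key=lambda lv: lv[0]) if safe else None

# Hypothetical releases: more shared detail -> more utility, higher re-identification odds.
levels = [(10, 0.00), (50, 0.02), (80, 0.05), (100, 0.20)]
print(best_zero_risk_release(levels, gain=1000, cost=60))  # (80, 0.05)
```

Under these toy numbers the publisher foregoes only a little utility (80 vs. 100) to make attack irrational, mirroring finding 1.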
In combination, these findings suggest that such a framework can enable pragmatic policy decisions about sharing de-identified data.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0120592

Shared Investment Projects and Forecasting Errors: Setting Framework Conditions for Coordination and Sequencing Data Quality Activities
info:doi/10.1371/journal.pone.0121362 (published 2015-03-24)
<p>by Stephan Leitner, Alexander Brauneis, Alexandra Rausch</p>
In this paper, we investigate the impact of inaccurate forecasting on the coordination of distributed investment decisions. In particular, by setting up a computational multi-agent model of a stylized firm, we investigate the case of investment opportunities that are mutually carried out by organizational departments. The forecasts of concern pertain to the initial amount of money necessary to launch and operate an investment opportunity, to the expected intertemporal distribution of cash flows, and to the departments’ efficiency in operating the investment opportunity at hand. We propose a budget allocation mechanism for coordinating such distributed decisions. The paper provides guidance on how to set framework conditions, in terms of the number of investment opportunities considered in one round of funding and the number of departments operating one investment opportunity, so that the coordination mechanism is highly robust to forecasting errors. Furthermore, we show that, in some setups, a certain extent of misforecasting is desirable from the firm’s point of view as it supports the achievement of the corporate objective of value maximization. We then address the question of how to improve forecasting quality in the best possible way, and provide policy advice on how to sequence activities for improving forecasting quality so that the robustness of the coordination mechanism to errors increases in the best possible way. At the same time, we show that wrong decisions regarding the sequencing can lead to a decrease in robustness.
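As a toy illustration of how forecasting errors can degrade such a coordination mechanism, the sketch below has a central unit fund the k highest-forecast projects, with departmental forecasts modeled as true values plus Gaussian noise. The mechanism's form and all numbers are illustrative assumptions, not the authors' model:

```python
import random

def funding_hit_rate(true_values, noise_sd, k, trials=2000, seed=1):
    """Fraction of trials in which the truly best project is among the k funded,
    when headquarters ranks projects by noisy departmental value forecasts."""
    rng = random.Random(seed)
    best = max(range(len(true_values)), key=true_values.__getitem__)
    hits = 0
    for _ in range(trials):
        forecasts = [v + rng.gauss(0, noise_sd) for v in true_values]
        funded = sorted(range(len(forecasts)), key=forecasts.__getitem__, reverse=True)[:k]
        hits += best in funded
    return hits / trials

values = [10.0, 20.0, 30.0, 40.0]
print(funding_hit_rate(values, noise_sd=0.0, k=1))   # 1.0 with perfect forecasts
print(funding_hit_rate(values, noise_sd=25.0, k=1))  # degrades with noisy forecasts
print(funding_hit_rate(values, noise_sd=25.0, k=2))  # funding more projects per round restores robustness
```

Varying `k` (opportunities funded per round) against `noise_sd` is a crude analogue of the "framework conditions" the paper tunes for robustness to forecasting errors.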
Finally, we conduct a comprehensive sensitivity analysis and show that, in particular for relatively good forecasters, most of our results are robust to changes in the parameters of our multi-agent simulation model.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0121362

On the Automaticity of the Evaluative Priming Effect in the Valent/Non-Valent Categorization Task
info:doi/10.1371/journal.pone.0121564 (published 2015-03-24)
<p>by Adriaan Spruyt, Helen Tibboel</p>
It has previously been argued (a) that automatic evaluative stimulus processing is critically dependent upon feature-specific attention allocation and (b) that evaluative priming effects can arise in the absence of dimensional overlap between the prime set and the response set. In line with both claims, research conducted at our lab revealed that the evaluative priming effect replicates in the valent/non-valent categorization task. This research was criticized, however, because non-automatic, strategic processes may have contributed to the emergence of this effect. We now report the results of a replication study in which the operation of non-automatic, strategic processes was controlled for. A clear-cut evaluative priming effect emerged, thus supporting initial claims concerning feature-specific attention allocation and dimensional overlap.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0121564

When Data Sharing Gets Close to 100%: What Human Paleogenetics Can Teach the Open Science Movement
info:doi/10.1371/journal.pone.0121409 (published 2015-03-23)
<p>by Paolo Anagnostou, Marco Capocasa, Nicola Milia, Emanuele Sanna, Cinzia Battaggia, Daniela Luzi, Giovanni Destro Bisol</p>
This study analyzes data sharing regarding mitochondrial, Y chromosomal and autosomal polymorphisms in a total of 162 papers on ancient human DNA published between 1988 and 2013. The estimated sharing rate was close to complete (97.6% ± 2.1%) and substantially higher than observed in other fields of genetic research (evolutionary, medical and forensic genetics). Both a questionnaire-based survey and an examination of journals’ editorial policies suggest that this high sharing rate cannot be explained simply by the need to comply with stakeholders’ requests. Most data were made available through the body text, but the use of primary databases increased with the introduction of complete mitochondrial and next-generation sequencing methods. Our study highlights three important aspects. First, our results imply that researchers’ awareness of the importance of openness and transparency for scientific progress may complement stakeholders’ policies in achieving very high sharing rates. Second, widespread data sharing does not necessarily coincide with a prevalent use of practices that maximize data findability, accessibility, usability and preservation. A detailed look at the different ways in which data are released can be very useful for detecting failures to adopt the best sharing modalities and understanding how to correct them.
Third and finally, the case of human paleogenetics tells us that a widespread awareness of the importance of Open Science may be important to build reliable scientific practices even in the presence of complex experimental challenges.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0121409

Increasing Patient Engagement in Rehabilitation Exercises Using Computer-Based Citizen Science
info:doi/10.1371/journal.pone.0117013 (published 2015-03-20)
<p>by Jeffrey Laut, Francesco Cappa, Oded Nov, Maurizio Porfiri</p>
Patient motivation is an important factor to consider when developing rehabilitation programs. Here, we explore the effectiveness of active participation in web-based citizen science activities as a means of increasing participant engagement in rehabilitation exercises, through the use of a low-cost haptic joystick interfaced with a laptop computer. Using the joystick, patients navigate a virtual environment representing the site of a citizen science project situated in a polluted canal. Participants are tasked with following a path on a laptop screen representing the canal. The experiment consists of two conditions: in one condition, a citizen science component where participants classify images from the canal is included; in the other, the citizen science component is absent. Both conditions are tested on a group of young patients undergoing rehabilitation treatments and a group of healthy subjects. A survey administered at the end of both tasks reveals that participants prefer performing the scientific task, and are more likely to choose to repeat it, even at the cost of increasing the time of their rehabilitation exercise. Furthermore, performance indices based on data collected from the joystick indicate significant differences in the trajectories created by patients and healthy subjects, suggesting that the low-cost device can be used in a rehabilitation setting for gauging patient recovery.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0117013

Predictors of Negotiated NIH Indirect Rates at US Institutions
info:doi/10.1371/journal.pone.0121273 (published 2015-03-20)
<p>by S. Claiborne Johnston, Susan Desmond-Hellmann, Stewart Hauser, Eric Vermillion, Nilo Mia</p>
Background
<p>The United States (US) Department of Health and Human Services and the Office of Naval Research negotiate institutional rates for payment of the overhead costs associated with administration and space usage, commonly known as indirect rates. Such payments account for a large proportion of spending by the National Institutes of Health (NIH). Little has been published about differences in rates and their predictors.</p>
Methods
<p>Negotiated indirect rates for on-campus research grants were requested from the Council on Governmental Relations for the 100 institutions with the greatest NIH funding in 2010. NIH funding, cost of living (ACCRA Index for 2008), public vs. private status, negotiating governmental organization (Department of Health and Human Services or Office of Naval Research), US Census Region, and year were assessed as predictors of institutional indirect rates using generalized estimating equations with all variables included in the model.</p>
Results
<p>Overall, 72 institutions participated, with 207 reported indirect rates for the years 2006, 2008, and 2010. Indirect rates ranged from 36.3% to 78%, with an average of 54.5%. Mean rates increased from 53.6% in 2006 to 55.4% in 2010 (p<0.001). In multivariable models, private institutions had 6.2% (95% CI 3.7%-8.7%; p<0.001) higher indirect rates than public institutions. Rates in the Northeast were highest (Midwest 4.0% lower; West 4.9% lower; South 5.2% lower). Greater NIH funding (p = 0.025) and cost of living (p = 0.034) also predicted indirect rates, while negotiating governmental organization did not (p = 0.414).</p>
Conclusions
<p>Negotiated indirect rates for governmental research grants to academic centers vary widely.
Although the association between indirect rates and cost of living may be justified, the cause of variation in rates by region, public-private status, and NIH funding levels is unclear.</p>
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0121273

Science Classroom Inquiry (SCI) Simulations: A Novel Method to Scaffold Science Learning
info:doi/10.1371/journal.pone.0120638 (published 2015-03-18)
<p>by Melanie E. Peffer, Matthew L. Beckler, Christian Schunn, Maggie Renken, Amanda Revak</p>
Science education is progressively more focused on employing inquiry-based learning methods in the classroom and increasing scientific literacy among students. However, due to time and resource constraints, many classroom science activities and laboratory experiments focus on simple inquiry, with a step-by-step approach to reach predetermined outcomes. The science classroom inquiry (SCI) simulations were designed to give students real-life, authentic science experiences within the confines of a typical classroom. The SCI simulations allow students to engage with a science problem in a meaningful, inquiry-based manner. Three discrete SCI simulations were created as website applications for use with middle school and high school students. For each simulation, students were tasked with solving a scientific problem through investigation and hypothesis testing. After completion of the simulation, 67% of students reported a change in how they perceived authentic science practices, specifically related to the complex and dynamic nature of scientific research and how scientists approach problems. Moreover, 80% of the students who did not report a change in how they viewed the practice of science indicated that the simulation confirmed or strengthened their prior understanding. Additionally, we found a statistically significant positive correlation between students’ self-reported changes in understanding of authentic science practices and the degree to which each simulation benefited learning.
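A correlation of the kind reported above between two sets of survey ratings reduces to a standard Pearson coefficient. The ratings below are made-up placeholder data, not the study's responses:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 5-point ratings: self-reported change in understanding
# vs. perceived learning benefit, one pair per student.
understanding = [1, 2, 2, 3, 4, 4, 5, 5]
benefit = [2, 1, 3, 3, 3, 5, 4, 5]
print(round(pearson(understanding, benefit), 3))  # a clearly positive coefficient
```

In practice one would also report a p-value (e.g. via a t-test on the coefficient), which this sketch omits.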
Since SCI simulations were effective in promoting both student learning and student understanding of authentic science practices with both middle and high school students, we propose that SCI simulations are a valuable and versatile technology that can be used to educate and inspire a wide range of science students on the real-world complexities inherent in scientific study.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0120638

3D-Assisted Quantitative Assessment of Orbital Volume Using an Open-Source Software Platform in a Taiwanese Population
info:doi/10.1371/journal.pone.0119589 (published 2015-03-16)
<p>by Victor Bong-Hang Shyu, Chung-En Hsu, Chih-hao Chen, Chien-Tzung Chen</p>
Orbital volume evaluation is an important part of pre-operative assessment in orbital trauma and congenital deformity patients. The availability of OsiriX, an affordable, open-source software package, as a preoperative planning tool has increased the popularity of radiological assessment by surgeons. A volume calculation method based on 3D volume rendering-assisted region-of-interest computation was used to determine the normal orbital volume in Taiwanese patients after reorientation to the Frankfurt plane. Method one utilized 3D points for intuitive orbital rim outlining. The mean normal orbital volume for left and right orbits was 24.3±1.51 ml and 24.7±1.17 ml in male and 21.0±1.21 ml and 21.1±1.30 ml in female subjects. Another method (method two), based on the bilateral orbital lateral rim, was also used to calculate orbital volume and compared with method one. The mean normal orbital volume for left and right orbits was 19.0±1.68 ml and 19.1±1.45 ml in male and 16.0±1.01 ml and 16.1±0.92 ml in female subjects. The inter-rater reliability and intra-rater measurement accuracy between users for both methods were found to be acceptable for orbital volume calculations. 3D-assisted quantification of orbital volume is a feasible technique for orbital volume assessment. The normal orbital volume can be used as a control in cases of unilateral orbital reconstruction, with a mean size discrepancy of less than 3.1±2.03% in females and 2.7±1.32% in males.
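Region-of-interest volume computation of the kind described reduces to summing per-slice ROI areas times the slice spacing. The sketch below uses hypothetical areas and spacing and is not OsiriX's internal code:

```python
def volume_from_rois(slice_areas_mm2, slice_spacing_mm):
    """Approximate volume in ml from per-slice ROI areas (mm^2) and CT slice spacing (mm)."""
    volume_mm3 = sum(slice_areas_mm2) * slice_spacing_mm
    return volume_mm3 / 1000.0  # 1 ml = 1000 mm^3

def size_discrepancy_pct(vol_a, vol_b):
    """Percentage size discrepancy between paired orbits, relative to their mean."""
    return abs(vol_a - vol_b) / ((vol_a + vol_b) / 2) * 100.0

# Hypothetical orbit: 20 CT slices of 600 mm^2 each at 2 mm spacing -> 24.0 ml.
left = volume_from_rois([600.0] * 20, 2.0)
right = volume_from_rois([580.0] * 20, 2.0)
print(left)                                      # 24.0
print(round(size_discrepancy_pct(left, right), 2))
```

Real measurements would of course use per-slice areas traced on the CT series rather than a constant area.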
The OsiriX software can be used reliably by the individual surgeon as a comprehensive preoperative planning and imaging tool for orbital volume measurement and computed tomography reorientation.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0119589

Students’ Perceptions of Peer-Organized Extra-Curricular Research Course during Medical School: A Qualitative Study
info:doi/10.1371/journal.pone.0119375 (published 2015-03-12)
<p>by Bassel Nazha, Rony H. Salloum, Akl C. Fahed, Mona Nabulsi</p>
Early integration of research education into medical curricula is crucial for evidence-based practice. Yet many medical students graduate with no research experience due to the lack of such integration in their medical school programs. The purpose of this study was to explore the impact of a peer-organized, extra-curricular research methodology course on the attitudes of medical students towards research and future academic careers. Twenty-one medical students who participated in a peer-organized research course were enrolled in three focus group discussions to explore their experiences, perceptions and attitudes towards research after the course. Discussions were conducted using a semi-structured interview guide, and were transcribed and thematically analyzed to identify major and minor themes. Our findings indicate that students’ perceptions of research changed after the course, from seeing research as difficult to seeing it as feasible. Participants felt that their research skills and critical thinking were enhanced and that they could successfully develop research proposals and abstracts. Students praised the peer-assisted teaching approach as successful in enhancing the learning environment and filling the curricular gap. In conclusion, peer-organized extra-curricular research courses may be a useful option for promoting the research interest and skills of medical students where gaps in research education exist in medical curricula.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0119375

To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits
info:doi/10.1371/journal.pone.0118494 (published 2015-03-04)
<p>by Ted von Hippel, Courtney von Hippel</p>
We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was unrelated to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants, and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0118494

Research Activity and the Association with Mortality
info:doi/10.1371/journal.pone.0118253 (published 2015-02-26)
<p>by Baris A. Ozdemir, Alan Karthikesalingam, Sidhartha Sinha, Jan D. Poloniecki, Robert J. Hinchliffe, Matt M. Thompson, Jonathan D. Gower, Annette Boaz, Peter J. E. Holt</p>
Introduction
<p>The aims of this study were to describe the key features of acute NHS Trusts with different levels of research activity and to investigate associations between research activity and clinical outcomes.</p>
Methods
<p>National Institute for Health Research (NIHR) Comprehensive Clinical Research Network (CCRN) funding and the number of patients recruited to NIHR Clinical Research Network (CRN) portfolio studies for each NHS Trust were used as markers of research activity. Patient-level data for adult non-elective admissions were extracted from the English Hospital Episode Statistics (2005-10). Risk-adjusted mortality associations between Trust structures, research activity and clinical outcomes were investigated.</p>
Results
<p>Low mortality Trusts received greater levels of funding and recruited more patients, adjusted for size of Trust (n = 35, 2,349 £/bed [95% CI 1,855–2,843], 5.9 patients/bed [2.7–9.0]), than Trusts with expected (n = 63, 1,110 £/bed [864–1,357] p<0.0001, 2.6 patients/bed [1.7–3.5] p<0.0169) or high (n = 42, 930 £/bed [683–1,177] p = 0.0001, 1.8 patients/bed [1.4–2.1] p<0.0005) mortality rates. The most research active Trusts were those with more doctors, nurses, critical care beds and operating theatres, and those that made greater use of radiology. Multifactorial analysis demonstrated better survival in the top funding and patient recruitment tertiles (lowest vs. highest, odds ratio [95% CI]: funding 1.050 [1.033–1.068] p<0.0001, recruitment 1.069 [1.052–1.086] p<0.0001; middle vs. highest: funding 1.040 [1.024–1.055] p<0.0001, recruitment 1.085 [1.070–1.100] p<0.0001).</p>
Conclusions
<p>Research active Trusts appear to differ in composition from less research active Trusts.
Research active Trusts had lower risk-adjusted mortality for acute admissions, which persisted after adjustment for staffing and other structural factors.</p>
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0118253

The Scientific Consensus on Climate Change as a Gateway Belief: Experimental Evidence
info:doi/10.1371/journal.pone.0118489 (published 2015-02-25)
<p>by Sander L. van der Linden, Anthony A. Leiserowitz, Geoffrey D. Feinberg, Edward W. Maibach</p>
There is currently widespread public misunderstanding about the degree of scientific consensus on human-caused climate change, both in the US and internationally. Moreover, previous research has identified important associations between public perceptions of the scientific consensus, belief in climate change and support for climate policy. This paper extends this line of research by advancing and providing experimental evidence for a “gateway belief model” (GBM). Using national data (N = 1104) from a consensus-message experiment, we find that increasing public perceptions of the scientific consensus is significantly and causally associated with an increase in the belief that climate change is happening, human-caused and a worrisome threat. In turn, changes in these key beliefs are predictive of increased support for public action. In short, we find that perceived scientific agreement is an important gateway belief, ultimately influencing public responses to climate change.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0118489

What Drives Academic Data Sharing?
info:doi/10.1371/journal.pone.0118053 (published 2015-02-25)
<p>by Benedikt Fecher, Sascha Friesike, Marcel Hebing</p>
Despite widespread support from policy makers, funding agencies, and scientific journals, academic researchers rarely make their research data available to others. At the same time, data sharing is credited with vast potential for scientific progress: it allows the reproduction of study results and the reuse of <i>old</i> data for <i>new</i> research questions. Based on a systematic review of 98 scholarly papers and an empirical survey of 603 secondary data users, we develop a conceptual framework that explains the process of data sharing from the primary researcher’s point of view. We show that this process can be divided into six descriptive categories: <i>data donor, research organization, research community, norms, data infrastructure</i>, and <i>data recipients</i>. Drawing from our findings, we discuss theoretical implications regarding knowledge creation and dissemination as well as research policy measures to foster academic collaboration. We conclude that research data cannot be regarded as a knowledge commons, and that research policies that better incentivise data sharing are needed to improve the quality of research results and foster scientific progress.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0118053

Acknowledging Individual Responsibility while Emphasizing Social Determinants in Narratives to Promote Obesity-Reducing Public Policy: A Randomized Experiment
info:doi/10.1371/journal.pone.0117565 (published 2015-02-23)
<p>by Jeff Niederdeppe, Sungjong Roh, Michael A. Shapiro</p>
This study tests whether policy narratives designed to increase support for obesity-reducing public policies should explicitly acknowledge individual responsibility while emphasizing social, physical, and economic (social) determinants of obesity. We use a web-based, randomized experiment with a nationally representative sample of American adults (n = 718) to test hypotheses derived from theory and research on narrative persuasion. Respondents exposed to narratives that acknowledged individual responsibility while emphasizing obesity’s social determinants were less likely to engage in counterargument and felt more empathy for the story’s main character than those exposed to a message that did not acknowledge individual responsibility. Counterarguing and affective empathy fully mediated the relationship between message condition and support for policies to reduce rates of obesity. Failure to acknowledge individual responsibility in narratives emphasizing social determinants of obesity may undermine the persuasiveness of policy narratives. Omitting information about individual responsibility, a strongly held American value, invites the public to engage in counterargument about the narratives and reduces feelings of empathy for a character who experiences the challenges and benefits of social determinants of obesity.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0117565

Is There a Relationship between Research Sponsorship and Publication Impact? An Analysis of Funding Acknowledgments in Nanotechnology Papers
info:doi/10.1371/journal.pone.0117727 (published 2015-02-19)
<p>by Jue Wang, Philip Shapira</p>
This study analyzes funding acknowledgments in scientific papers to investigate relationships between research sponsorship and publication impact. We identify acknowledgments to research sponsors in nanotechnology papers published in the Web of Science during a one-year sample period. We examine the citations accrued by these papers and the impact factors of the journals in which they appeared. The results show that publications from grant-sponsored research exhibit higher impact, in terms of both journal ranking and citation counts, than research that is not grant sponsored. We discuss the method and models used, and the insights provided by this approach as well as its limitations.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0117727

Benford’s Law: Textbook Exercises and Multiple-Choice Testbanks
info:doi/10.1371/journal.pone.0117972 (published 2015-02-17)
<p>by Aaron D. Slepkov, Kevin B. Ironside, David DiBattista</p>
Benford’s Law describes the finding that the distribution of leading (or leftmost) digits of innumerable datasets follows a well-defined logarithmic trend, rather than an intuitive uniformity. In practice this means that the most common leading digit is 1, with an expected frequency of 30.1%, and the least common is 9, with an expected frequency of 4.6%. Currently, the most common application of Benford’s Law is in detecting number invention and tampering such as is found in accounting, tax, and voter fraud. We demonstrate that answers to end-of-chapter exercises in physics and chemistry textbooks conform to Benford’s Law. Subsequently, we investigate whether this fact can be used to gain an advantage over random guessing in multiple-choice tests, and find that while testbank answers in introductory physics closely conform to Benford’s Law, the testbank is nonetheless secure against such a Benford’s attack for banal reasons.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/AxScM1MpgjA" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0117972IQM: An Extensible and Portable Open Source Application for Image and Signal Analysis in JavaPhilipp Kainz et al.info:doi/10.1371/journal.pone.01163292015-01-22T22:00:00Z2015-01-22T22:00:00Z<p>by Philipp Kainz, Michael Mayrhofer-Reinhartshuber, Helmut Ahammer</p>
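The 30.1% and 4.6% figures quoted in the Benford's Law abstract above follow directly from the logarithmic distribution: the expected frequency of leading digit d is log10(1 + 1/d). A minimal sketch:

```python
import math

def benford_probability(d):
    """Expected frequency of leading digit d (1-9) under Benford's Law."""
    return math.log10(1 + 1 / d)

# The full leading-digit distribution, d = 1..9; the probabilities sum to 1
# because the logarithms telescope to log10(10).
distribution = {d: benford_probability(d) for d in range(1, 10)}
```

For example, benford_probability(1) is about 0.301 and benford_probability(9) is about 0.046, matching the 30.1% and 4.6% expected frequencies in the abstract.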
Image and signal analysis applications are essential in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, neither variant can cover all possible use cases, and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along with the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM’s image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and aims to complement, rather than compete with, existing open source software. 
Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/e39xYK2TZN8" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0116329Costs of Eliminating Malaria and the Impact of the Global Fund in 34 CountriesBrittany Zelman et al.info:doi/10.1371/journal.pone.01157142014-12-31T22:00:00Z2014-12-31T22:00:00Z<p>by Brittany Zelman, Anthony Kiszewski, Chris Cotter, Jenny Liu</p>
Background <p>International financing for malaria increased more than 18-fold between 2000 and 2011; the largest source came from The Global Fund to Fight AIDS, Tuberculosis and Malaria (Global Fund). Countries have made substantial progress, but achieving elimination requires sustained finances to interrupt transmission and prevent reintroduction. Since 2011, global financing for malaria has declined, fueling concerns that further progress will be impeded, especially for current malaria-eliminating countries that may face resurgent malaria if programs are disrupted.</p> Objectives <p>This study aims to 1) assess past total and Global Fund funding to the 34 current malaria-eliminating countries, and 2) estimate their future funding needs to achieve malaria elimination and prevent reintroduction through 2030.</p> Methods <p>Historical funding is assessed against trends in country-level malaria annual parasite incidences (APIs) and income per capita. Following Kiszewski et al. (2007), program costs to eliminate malaria and prevent reintroduction through 2030 are estimated using a deterministic model. The cost parameters are tailored to a package of interventions aimed at malaria elimination and prevention of reintroduction.</p> Results <p>In the majority of Global Fund-supported countries, increases in total funding from 2005 to 2010 coincided with reductions in malaria APIs and with overall average annual growth in GNI per capita. The total amount of projected funding needed for the current malaria-eliminating countries to achieve elimination and prevent reintroduction through 2030 is approximately US$8.5 billion, or about $1.84 per person at risk per year (PPY) (ranging from $2.51 PPY in 2014 to $1.43 PPY in 2030).</p> Conclusions <p>Although external donor funding, particularly from the Global Fund, has been key for many malaria-eliminating countries, sustained and sufficient financing is critical for furthering global malaria elimination. 
Projected cost estimates for elimination provide policymakers with an indication of the level of financial resources that should be mobilized to achieve malaria elimination goals.</p><img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/gjf9hhrBT7g" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0115714Symmetry Breaking on Density in Escaping Ants: Experiment and Alarm Pheromone ModelGeng Li et al.info:doi/10.1371/journal.pone.01145172014-12-31T22:00:00Z2014-12-31T22:00:00Z<p>by Geng Li, Di Huan, Bertrand Roehner, Yijuan Xu, Ling Zeng, Zengru Di, Zhangang Han</p>
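As a rough consistency check on the malaria cost figures above: treating the $1.84 PPY figure as a flat annual average over the 17 years from 2014 through 2030 (it is actually the average of a declining series, so this is only approximate), the US$8.5 billion total implies a population at risk on the order of 270 million. The function name below is ours, not part of the paper's model:

```python
def implied_population_at_risk(total_cost, cost_ppy, years):
    """Back out the population at risk implied by a total cost figure,
    assuming a flat per-person-per-year (PPY) cost over the period."""
    return total_cost / (cost_ppy * years)

# US$8.5 billion at $1.84 PPY over 2014-2030 inclusive (17 years):
population = implied_population_at_risk(8.5e9, 1.84, 17)
```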
The symmetry breaking observed in nature is fascinating. It occurs in both human crowds and ant colonies: when escaping from a closed space with two symmetrically located exits, one exit is used more often than the other. Group size and density have been reported as having no significant impact on symmetry breaking, and alignment rules have been used to model it. Density usually plays an important role in collective behavior, yet it has not been well studied in symmetry breaking; this gap forms the major basis of this paper. Our experiment on an ant colony shows that symmetry breaking first increases and then decreases with ant density. This result suggests that a Vicsek-like model with an alignment rule may not be the correct model for escaping ants. Based on the biological fact that ants communicate through pheromones rather than by watching how other individuals move, we propose a simple yet effective alarm pheromone model whose results agree well with the experimental outcomes. As a measure, this paper redefines symmetry breaking as the collective asymmetry remaining after random fluctuations are deducted. This research indicates that ants deposit and respond to the alarm pheromone, and that the accumulation of this biased information sharing leads to symmetry breaking, suggesting fundamental rules of collective escape behavior in ants.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/BMhSaxtQQAM" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0114517Voting Behavior, Coalitions and Government Strength through a Complex Network AnalysisCarlo Dal Maso et al.info:doi/10.1371/journal.pone.01160462014-12-30T22:00:00Z2014-12-30T22:00:00Z<p>by Carlo Dal Maso, Gabriele Pompa, Michelangelo Puliga, Gianni Riotta, Alessandro Chessa</p>
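The alarm pheromone mechanism in the ant-escape abstract above can be caricatured in a few lines: each escaping ant chooses an exit with probability weighted by the pheromone already deposited there, so early random imbalances are amplified into collective asymmetry. This toy model (all parameter names and values are ours) ignores space, evaporation, and density effects:

```python
import random

def escape_simulation(n_ants, deposit=1.0, seed=0):
    """Each ant picks one of two exits with probability proportional to
    1 + the alarm pheromone already deposited at that exit, then deposits
    more pheromone there, reinforcing that choice for later ants."""
    rng = random.Random(seed)
    pheromone = [0.0, 0.0]
    counts = [0, 0]
    for _ in range(n_ants):
        w0 = 1.0 + pheromone[0]
        w1 = 1.0 + pheromone[1]
        exit_chosen = 0 if rng.random() < w0 / (w0 + w1) else 1
        counts[exit_chosen] += 1
        pheromone[exit_chosen] += deposit
    return counts

def asymmetry(counts):
    """Collective asymmetry: 0 for perfectly even exit use, 1 for one exit only."""
    return abs(counts[0] - counts[1]) / sum(counts)
```

The paper's measure additionally subtracts the asymmetry expected from random fluctuations alone, which this sketch does not do.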
We analyze the network of relations between parliament members according to their voting behavior. In particular, we examine the emergent community structure with respect to political coalitions and government alliances. We rely on tools developed in the Complex Network literature to explore the core of these communities and use their topological features to develop new metrics for party polarization, internal coalition cohesiveness and government strength. As a case study, we focus on the Chamber of Deputies of the Italian Parliament, for which we are able to characterize the heterogeneity of the ruling coalition as well as parties’ specific contributions to the stability of the government over time. We find sharp polarization in the political debate which, surprisingly, does not translate into a community structure organized along established party lines. We take a closer look at changes in the community structure after parties split up and their effect on the position of single deputies within communities. Finally, we introduce a way to track the stability of the government coalition over time that is able to discern the contribution of each member along with the impact of its possible defection. While our case study relies on the Italian parliament, whose relevance has come into the international spotlight in the present economic downturn, the methods developed here are entirely general and can therefore be applied to a multitude of other scenarios.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/iMjYi_wqGVc" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0116046Scholarly Context Not Found: One in Five Articles Suffers from Reference RotMartin Klein et al.info:doi/10.1371/journal.pone.01152532014-12-26T22:00:00Z2014-12-26T22:00:00Z<p>by Martin Klein, Herbert Van de Sompel, Robert Sanderson, Harihar Shankar, Lyudmila Balakireva, Ke Zhou, Richard Tobin</p>
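A stripped-down illustration of the voting-network idea in the parliament abstract above: build an agreement graph between members from their roll-call votes and read off communities as connected components. The actual study uses proper community detection on weighted networks; thresholded components are only a caricature, and the member names and threshold here are invented:

```python
from collections import deque

def agreement(votes_a, votes_b):
    """Fraction of roll calls on which two members cast the same vote."""
    same = sum(a == b for a, b in zip(votes_a, votes_b))
    return same / len(votes_a)

def communities(votes, threshold=0.7):
    """Link members whose vote agreement meets the threshold, then return
    the connected components of the resulting agreement graph."""
    names = list(votes)
    adjacency = {n: set() for n in names}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if agreement(votes[a], votes[b]) >= threshold:
                adjacency[a].add(b)
                adjacency[b].add(a)
    seen, components = set(), []
    for name in names:
        if name in seen:
            continue
        component, queue = set(), deque([name])
        while queue:  # breadth-first traversal of one component
            member = queue.popleft()
            if member in component:
                continue
            component.add(member)
            queue.extend(adjacency[member] - component)
        seen |= component
        components.append(component)
    return components
```

With two blocks voting in lockstep against each other, e.g. {"A1": [1, 1, 0, 1], "A2": [1, 1, 0, 1], "B1": [0, 0, 1, 0], "B2": [0, 0, 1, 0]}, the two blocks come back as two separate communities.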
The emergence of the web has fundamentally affected most aspects of information communication, including scholarly communication. The immediacy that characterizes publishing information to the web, as well as accessing it, allows for a dramatic increase in the speed of dissemination of scholarly knowledge. But the transition from a paper-based to a web-based scholarly communication system also poses challenges. In this paper, we focus on reference rot, the combination of link rot and content drift to which references to web resources included in Science, Technology, and Medicine (STM) articles are subject. We investigate the extent to which reference rot impacts the ability to revisit the web context that surrounds STM articles some time after their publication. We do so on the basis of a vast collection of articles from three corpora that span publication years 1997 to 2012. For over one million references to web resources extracted from over 3.5 million articles, we determine whether the HTTP URI is still responsive on the live web and whether web archives contain an archived snapshot representative of the state the referenced resource had at the time it was referenced. We observe that the fraction of articles containing references to web resources is growing steadily over time. We find that one out of five STM articles suffers from reference rot, meaning it is impossible to revisit the web context that surrounded these articles some time after their publication. When only considering STM articles that contain references to web resources, this fraction increases to seven out of ten. We suggest that, in order to safeguard the long-term integrity of the web-based scholarly record, robust solutions to combat the reference rot problem are required. 
In conclusion, we provide a brief insight into the directions that are explored in this regard in the context of the Hiberlink project.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/nTEQedH2W-w" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0115253Combating Obesity through Healthy Eating Behavior: A Call for System Dynamics OptimizationNorhaslinda Zainal Abidin et al.info:doi/10.1371/journal.pone.01141352014-12-15T22:00:00Z2014-12-15T22:00:00Z<p>by Norhaslinda Zainal Abidin, Mustafa Mamat, Brian Dangerfield, Jafri Haji Zulkepli, Md. Azizul Baten, Antoni Wibowo</p>
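The link rot / content drift terminology in the reference-rot abstract above suggests classifying each cited web resource by two observable facts: whether the original URI still responds, and whether a representative archived snapshot exists. A hypothetical helper (the labels and function name are ours, not the paper's taxonomy):

```python
def reference_status(live_ok, snapshot_ok):
    """Classify a cited web resource.
    live_ok: the original HTTP URI still responds on the live web.
    snapshot_ok: a web archive holds a snapshot representative of the
    resource's state at citation time."""
    if live_ok and snapshot_ok:
        return "intact, archived"
    if live_ok:
        return "intact, unarchived (at risk of content drift)"
    if snapshot_ok:
        return "link rot, recoverable from archive"
    return "reference rot: scholarly context lost"
```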
Poor eating behavior has been identified as one of the core contributory factors of the childhood obesity epidemic. The consequences of obesity on numerous aspects of life are thoroughly explored in the existing literature. For instance, evidence shows that obesity is linked to incidences of diseases such as heart disease, type-2 diabetes, and some cancers, as well as psychosocial problems. To respond to the increasing trends in the UK, in 2008 the government set a target to reverse the prevalence of obesity (POB) back to 2000 levels by 2020. This paper outlines the application of system dynamics (SD) optimization to simulate the effect of changes in the eating behavior of British children (aged 2 to 15 years) on weight and obesity, and identifies how long it will take to achieve the government’s target. We propose a simulation model called Intervention Childhood Obesity Dynamics (ICOD), focusing on the interrelations between various strands of knowledge in one complex human weight regulation system. The model offers distinct insights into the dynamics by capturing the complex interdependencies from the causal loop and feedback structure, with the intention to better understand how eating behaviors influence children’s weight, body mass index (BMI), and POB measurement. We propose a set of equations revised from the original (baseline) equations. The new functions are constructed using a RAMP function of linear decrement in the portion-size and number-of-meals variables from 2013 until 2020 in order to achieve the 2020 desired target. Findings from the optimization analysis revealed that the 2020 target will not be achieved until 2026 at the earliest, six years late. 
Thus, the model suggested that a longer period may be needed to significantly reduce obesity in this population.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/KCzt2Puw0XI" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0114135Master Settlement Agreement (MSA) Spending and Tobacco Control EffortsJayani Jayawardhana et al.info:doi/10.1371/journal.pone.01147062014-12-15T22:00:00Z2014-12-15T22:00:00Z<p>by Jayani Jayawardhana, W. David Bradford, Walter Jones, Paul J. Nietert, Gerard Silvestri</p>
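The RAMP function mentioned in the obesity-model abstract above is a standard system dynamics building block: zero before a start time, linear with a given slope until an end time, constant afterwards. A sketch with illustrative numbers (the 20% reduction and unit baseline are invented, not the paper's calibration):

```python
def ramp(slope, start, end, t):
    """System-dynamics-style RAMP: 0 before start, linear with the given
    slope between start and end, then held constant after end."""
    if t <= start:
        return 0.0
    return slope * (min(t, end) - start)

def portion_size(t, baseline=1.0, target_reduction=0.2):
    """Linear decrement of portion size between 2013 and 2020, as an
    illustration of the abstract's RAMP-based intervention variables."""
    slope = -target_reduction / (2020 - 2013)
    return baseline + ramp(slope, 2013, 2020, t)
```

With these assumed numbers, portion_size is 1.0 in 2013, falls linearly to 0.8 by 2020, and stays at 0.8 thereafter.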
We investigate whether the distributions to the states from the Tobacco Master Settlement Agreement (MSA) in 1998 are associated with stronger tobacco control efforts. We use state level data from 50 states and the District of Columbia from four time periods post MSA (1999, 2002, 2004, and 2006) for the analysis. Using fixed effect regression models, we estimate the relationship between MSA disbursements and a new aggregate measure of strength of state tobacco control known as the Strength of Tobacco Control (SoTC) Index. Results show an increase of $1 in the annual per capita MSA disbursement to a state is associated with a decrease of 0.316 in the mean SoTC value, indicating higher MSA payments were associated with weaker tobacco control measures within states. In order to achieve the initial objectives of the MSA payments, policy makers should focus on directing MSA payments strictly to tobacco control activities across states.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/lHtXdrR1kb4" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0114706Biomedical Science Ph.D. Career Interest Patterns by Race/Ethnicity and GenderKenneth D. Gibbs et al.info:doi/10.1371/journal.pone.01147362014-12-10T22:00:00Z2014-12-10T22:00:00Z<p>by Kenneth D. Gibbs, John McGready, Jessica C. Bennett, Kimberly Griffin</p>
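The fixed-effect regression in the MSA abstract above controls for stable state-level differences; its core idea is the within transformation, demeaning each state's observations before pooling. A minimal one-regressor sketch on noise-free synthetic data (the entity names and the reuse of a −0.316 slope are purely illustrative, not the paper's dataset or full model):

```python
def within_ols(panel):
    """Fixed-effects slope via the within transformation: demean x and y
    inside each entity, then run pooled OLS on the demeaned data, which
    removes any time-invariant entity effect.
    panel maps entity -> list of (x, y) observations."""
    num = den = 0.0
    for observations in panel.values():
        x_mean = sum(x for x, _ in observations) / len(observations)
        y_mean = sum(y for _, y in observations) / len(observations)
        for x, y in observations:
            num += (x - x_mean) * (y - y_mean)
            den += (x - x_mean) ** 2
    return num / den
```

On exact data y = alpha_state + beta * x, the estimator recovers beta regardless of how different the state intercepts are.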
Increasing biomedical workforce diversity remains a persistent challenge. Recent reports have shown that biomedical sciences (BMS) graduate students become less interested in faculty careers as training progresses; however, it is unclear whether or how the career preferences of women and underrepresented minority (URM) scientists change in manners distinct from their better-represented peers. We report results from a survey of 1500 recent American BMS Ph.D. graduates (including 276 URMs) that examined career preferences over the course of their graduate training experiences. On average, scientists from all social backgrounds showed significantly decreased interest in faculty careers at research universities, and significantly increased interest in non-research careers at Ph.D. completion relative to entry. However, group differences emerged in overall levels of interest (at Ph.D. entry and completion), and the magnitude of change in interest in these careers. Multiple logistic regression showed that when controlling for career pathway interest at Ph.D. entry, first-author publication rate, faculty support, research self-efficacy, and graduate training experiences, differences in career pathway interest between social identity groups persisted. All groups were less likely than men from well-represented (WR) racial/ethnic backgrounds to report high interest in faculty careers at research-intensive universities (URM men: OR 0.60, 95% CI: 0.36–0.98, p = 0.04; WR women: OR: 0.64, 95% CI: 0.47–0.89, p = 0.008; URM women: OR: 0.46, 95% CI: 0.30–0.71, p<0.001), and URM women were more likely than all other groups to report high interest in non-research careers (OR: 1.93, 95% CI: 1.28–2.90, p = 0.002). The persistence of disparities in the career interests of Ph.D. recipients suggests that a supply-side (or “pipeline”) framing of biomedical workforce diversity challenges may limit the effectiveness of efforts to attract and retain the best and most diverse workforce. 
We propose incorporation of an ecological perspective of career development when considering strategies to enhance the biomedical workforce and professoriate through diversity.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/uu17oBInHlY" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0114736Using Health Care Utilization and Publication Patterns to Characterize the Research Portfolio and to Plan Future Research InvestmentsLuba Katz et al.info:doi/10.1371/journal.pone.01148732014-12-10T22:00:00Z2014-12-10T22:00:00Z<p>by Luba Katz, Rebecca V. Fink, Samuel R. Bozeman, Barbara J. McNeil</p>
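The odds ratios and confidence intervals reported in the Ph.D. career-interest abstract above come from exponentiating logistic-regression coefficients and their standard-error bounds. A small helper (the example coefficient and SE in the usage note are illustrative, not taken from the paper):

```python
import math

def odds_ratio(beta, se, z=1.96):
    """Turn a logistic-regression coefficient and its standard error into
    an odds ratio with an approximate 95% confidence interval."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))
```

For instance, a coefficient of 0.0 with SE 0.25 gives OR 1.0 with a CI of roughly (0.61, 1.63); an OR whose CI excludes 1 corresponds to a statistically significant coefficient.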
Objective <p>Government funders of biomedical research are under increasing pressure to demonstrate societal benefits of their investments. A number of published studies attempted to correlate research funding levels with the societal burden for various diseases, with mixed results. We examined whether research funded by the Department of Veterans Affairs (VA) is well aligned with current and projected veterans’ health needs. The organizational structure of the VA makes it a particularly suitable setting for examining these questions.</p> Methods <p>We used the publication patterns and dollar expenditures of VA-funded researchers to characterize the VA research portfolio by disease. We used health care utilization data from the VA for the same diseases to define veterans’ health needs. We then measured the level of correlation between the two and identified disease groups that were under- or over-represented in the research portfolio relative to disease expenditures. Finally, we used historic health care utilization trends combined with demographic projections to identify diseases and conditions that are increasing in costs and/or patient volume and consequently represent potential targets for future research investments.</p> Results <p>We found a significant correlation between research volume/expenditures and health utilization. Some disease groups were slightly under- or over-represented, but these deviations were relatively small. Diseases and conditions with the increasing utilization trend at the VA included hypertension, hypercholesterolemia, diabetes, hearing loss, sleeping disorders, complications of pregnancy, and several mental disorders.</p> Conclusions <p>Research investments at the VA are well aligned with veteran health needs. The VA can continue to meet these needs by supporting research on the diseases and conditions with a growing number of patients, costs of care, or both. 
Our approach can be used by other funders of disease research to characterize their portfolios and to plan research investments.</p><img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/7sPlTZSrsFE" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0114873To Crowdfund Research, Scientists Must Build an Audience for Their WorkJarrett E. K. Byrnes et al.info:doi/10.1371/journal.pone.01103292014-12-10T22:00:00Z2014-12-10T22:00:00Z<p>by Jarrett E. K. Byrnes, Jai Ranganathan, Barbara L. E. Walker, Zen Faulkes</p>
As funding rates from traditional sources of scientific support decline, scientists have become increasingly interested in crowdfunding as a means of bringing in new money for research. In fields where crowdfunding has become a major fundraising venue, such as the arts and technology, building an audience for one's work is key to successful crowdfunding. For science, to what extent does audience building, via engagement and outreach, increase a scientist's ability to bring in money via crowdfunding? Here we report on an analysis of the #SciFund Challenge, a crowdfunding experiment in which 159 scientists attempted to crowdfund their research. Using data gathered from a survey of participants, internet metrics, and logs of project donations, we find that public engagement is the key to crowdfunding success. Building an audience or “fanbase” and actively engaging with that audience, as well as seeking to broaden the reach of one's audience, indirectly increases levels of funding. Audience size and effort interact to bring in more people to view a scientist's project proposal, leading to funding. We discuss how projects capable of raising levels of funds commensurate with traditional funding agencies will need to incorporate direct involvement of the public with science. We suggest that if scientists and research institutions wish to tap this new source of funds, they will need to encourage and reward activities that allow scientists to engage with the public.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/pngl_Thinvo" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0110329Research Data Management and Libraries: Relationships, Activities, Drivers and InfluencesStephen Pinfield et al.info:doi/10.1371/journal.pone.01147342014-12-08T22:00:00Z2014-12-08T22:00:00Z<p>by Stephen Pinfield, Andrew M. Cox, Jen Smith</p>
The management of research data is now a major challenge for research organisations. Vast quantities of born-digital data are being produced in a wide variety of forms at a rapid rate in universities. This paper analyses the contribution of academic libraries to research data management (RDM) in the wider institutional context. In particular it: examines the roles and relationships involved in RDM, identifies the main components of an RDM programme, evaluates the major drivers for RDM activities, and analyses the key factors influencing the shape of RDM developments. The study is written from the perspective of library professionals, analysing data from 26 semi-structured interviews of library staff from different UK institutions. This is an early qualitative contribution to the topic complementing existing quantitative and case study approaches. Results show that although libraries are playing a significant role in RDM, there is uncertainty and variation in the relationship with other stakeholders such as IT services and research support offices. Current emphases in RDM programmes are on developments of policies and guidelines, with some early work on technology infrastructures and support services. Drivers for developments include storage, security, quality, compliance, preservation, and sharing with libraries associated most closely with the last three. The paper also highlights a ‘jurisdictional’ driver in which libraries are claiming a role in this space. A wide range of factors, including governance, resourcing and skills, are identified as influencing ongoing developments. From the analysis, a model is constructed designed to capture the main aspects of an institutional RDM programme. This model helps to clarify the different issues involved in RDM, identifying layers of activity, multiple stakeholders and drivers, and a large number of factors influencing the implementation of any initiative. 
Institutions may usefully benchmark their activities against the data and model in order to inform ongoing RDM activity.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/26ej7mSgT50" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0114734Understanding and Using the Brief Implicit Association Test: Recommended Scoring ProceduresBrian A. Nosek et al.info:doi/10.1371/journal.pone.01109382014-12-08T22:00:00Z2014-12-08T22:00:00Z<p>by Brian A. Nosek, Yoav Bar-Anan, N. Sriram, Jordan Axt, Anthony G. Greenwald</p>
A brief version of the Implicit Association Test (BIAT) has been introduced. The present research identified analytical best practices for overall psychometric performance of the BIAT. In 7 studies and multiple replications, we investigated analytic practices with several evaluation criteria: sensitivity to detecting known effects and group differences, internal consistency, relations with implicit measures of the same topic, relations with explicit measures of the same topic and other criterion variables, and resistance to an extraneous influence of average response time. The <i>D</i> data transformation algorithms outperformed other approaches. This replicates and extends the strong prior performance of <i>D</i> compared to conventional analytic techniques. We conclude with recommended analytic practices for standard use of the BIAT.<img src="//feeds.feedburner.com/~r/plosone/Sciencepolicy/~4/zE4nKPROceQ" height="1" width="1" alt=""/>http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0110938
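At its core, the <i>D</i> transformation favored in the BIAT abstract above is a mean latency difference scaled by the standard deviation of all latencies pooled across both block types. The recommended scoring procedures also involve error-trial handling and latency trimming not shown here; this sketches only the central step:

```python
from statistics import mean, pstdev

def d_score(compatible, incompatible):
    """Core of the IAT-family D transformation: the mean latency difference
    between incompatible and compatible blocks, divided by the standard
    deviation of all latencies pooled across both blocks."""
    pooled = list(compatible) + list(incompatible)
    return (mean(incompatible) - mean(compatible)) / pstdev(pooled)
```

Scaling by the respondent's own latency variability, rather than using the raw millisecond difference, is what gives <i>D</i> its resistance to extraneous influences of overall response speed.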