Mid-Summer Recap

July 11, 2016

Garret Christensen, BITSS Project Scientist

BITSS and I have been busy lately. Here’s a quick rundown:

If you want to learn or teach reproducible programming, Software Carpentry is one of the best organizations out there. Check out their lesson materials and see if they're offering a workshop near you. I got connected with them through my association with the Berkeley Institute for Data Science, and was certified as an instructor last month.

We held our annual Summer Institute June 8-10. The faculty, as well as the students, change every year based on availability, so I always learn new things. This year the faculty covered different approaches to meta-analysis (Tom Stanley on meta-regression, Leif Nelson on the p-curve) and data anonymization (Simson Garfinkel did a great job of presenting this accessibly, without the difficult algorithms of differential privacy that sometimes lose people, myself included). All the materials are online here.
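As a rough illustration of the core idea behind the p-curve (my own sketch, not part of the institute's materials): under a true null, p-values are uniform, so significant p-values (below .05) should fall below .025 about half the time; with a real effect, they pile up near zero. The helper name and example values below are hypothetical.

```python
# Simplified p-curve intuition check (illustrative only).
# Under a true null, significant p-values are uniform on (0, .05),
# so about half of them land below .025; a real effect skews them
# toward zero, pushing that share above one half.
def share_below_025(pvals):
    """Among significant p-values (< .05), what share falls below .025?"""
    sig = [p for p in pvals if p < 0.05]
    return sum(p < 0.025 for p in sig) / len(sig)

# Hypothetical example inputs:
null_ps = [i / 1000 for i in range(1, 50)]           # uniform on (0, .05)
effect_ps = [0.001, 0.003, 0.004, 0.008, 0.012, 0.02, 0.03]

print(share_below_025(null_ps))    # close to 0.5, as expected under the null
print(share_below_025(effect_ps))  # above 0.5, suggesting evidential value
```

The real p-curve test is more involved (it uses the full distribution of significant p-values, not just one cut point), but this captures the skewness logic it relies on.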

I presented a literature review of research transparency in economics at the Western Economics Association International meeting June 30 in Portland, OR. Slides are here.

I gave a two-day mini course on research transparency at the University of Michigan as part of the ICPSR summer program July 5-6. All the materials are on GitHub here. I hope to make the time to give these materials a friendly front-end website so that people can more easily re-use them. (I understand GitHub can scare off the unfamiliar, but I promise you it's the most useful reproducibility tool there is.) If you have any suggestions, there's nothing I'd love more than to get a pull request!

Part of the lesson was a new Shiny web tool that lets you practice your p-hacking skills (tongue firmly in cheek) and then run a suite of tests (p-curve, test for excess significance, misreporting, etc.) on them. I think it's one of the best, and perhaps most practically useful, research transparency web visualizations I've seen.
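To see why practicing p-hacking is so instructive, here is a small simulation of my own (a sketch, not the Shiny tool itself): if you test many outcomes and report only the "best" one, your false-positive rate climbs far above the nominal 5%, even when every null is true.

```python
import math
import random

def two_sample_p(a, b):
    """Approximate two-sided p-value from a z-test on two sample means."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    z = (ma - mb) / math.sqrt(va / na + vb / nb)
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value

random.seed(1)
n_sims, n_outcomes, alpha = 2000, 10, 0.05
false_pos = 0
for _ in range(n_sims):
    # The null is true: treatment and control are drawn identically.
    pvals = []
    for _ in range(n_outcomes):
        treat = [random.gauss(0, 1) for _ in range(30)]
        ctrl = [random.gauss(0, 1) for _ in range(30)]
        pvals.append(two_sample_p(treat, ctrl))
    # The p-hack: report only the most significant of the ten outcomes.
    if min(pvals) < alpha:
        false_pos += 1

rate = false_pos / n_sims
print(f"Nominal alpha: {alpha}, realized false-positive rate: {rate:.2f}")
```

With ten independent outcomes, you'd expect a "significant" result in roughly 1 - 0.95^10, about 40%, of studies with no true effect, which is exactly the kind of distortion tools like the p-curve are designed to detect.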

Support for data citation: In reference sections of documents submitted to us (e.g. manuscripts, grant proposals, tenure dossiers), data and software will increasingly be required to be cited in the same way as publications are at present.

Support for monitoring data sharing policies: The participating organizations will expand their efforts to monitor compliance with their respective data sharing policies and with open science principles more broadly. For example, the group will encourage the wider adoption of data availability statements in grant proposals and papers and will highlight best practices for sharing research.

Support for developing interlinking policy and infrastructure: In order to further these goals, we will stimulate the creation of policy-related infrastructure, e.g. tools that complement text versions of sharing policies or sharing plans so that machines can assist with compliance monitoring or facilitate the discoverability of shared resources.

If you have done, or are working on, a reproducible research project in the social sciences, my BIDS colleagues and I are collecting a volume of case studies. We're still looking for submissions. You can read all about the project, and submit your case study, here.