Online Platform Aims to Facilitate Replication Studies

A volunteer-run database called StudySwap, which launched in beta last month, is starting to gain momentum.

By Dalmeet Singh Chawla | April 7, 2017

StudySwap, a platform hosted on the Center for Open Science’s Open Science Framework site, enables users to post their research “haves” and “needs,” with the goal of helping scientists exchange resources and find suitable collaborators for replication studies. After launching in beta late last month, StudySwap has already catalyzed a potential collaboration.

Initially, the platform was intended to broker pre-publication independent replications, explained Christopher Chartier, a psychologist at Ashland University in Ohio who helped create the resource. Quickly, however, Chartier and colleagues realized that StudySwap might also be useful for other types of collaborations. For instance, scientists with spare research participants might post an offer to share the surplus of study volunteers under “haves.”

According to Chartier, StudySwap was also inspired in part by a pre-publication independent replication endeavor, in which psychologist Martin Schweinsberg and colleagues requested replication experiments from 25 teams before submitting their results for a journal’s consideration. Schweinsberg, an assistant professor of organizational behavior at the European School of Management and Technology in Berlin, is also part of StudySwap’s first collaboration, which was agreed to last week.

Schweinsberg posted a hypothesis on StudySwap, voicing a need for study participants. Soon after, he heard from Michael Bernstein, a psychologist at Penn State Abington. Schweinsberg “had an interesting hypothesis and a need for participants,” Bernstein told The Scientist. “I have participants and thought his idea was worth testing.”

Previous research has shown that negotiators are routinely advised to make extreme first offers to push the final price in their desired direction, said Schweinsberg. But such offers can also potentially reduce knowledge exchange and trust, so Schweinsberg would like to test their effects on negotiators. According to Chartier, Bernstein is currently seeking ethics approval to replicate Schweinsberg’s study.

Currently, StudySwap users must manually scan posts to familiarize themselves with others’ “haves” and “needs,” and revisit the site to view any updates. The platform is a “labor of love” for Chartier, co-creator Randy McCarthy of Northern Illinois University, and a few volunteers. The group plans to apply for funding in the summer. The team also aims to monitor the site’s first 100 collaborations and publish a paper outlining the platform’s progress.

One possible drawback of replication studies facilitated through StudySwap is that researchers may feel obligated to provide positive feedback on others’ work. But this is a possibility in any collaboration, said Chartier. The hope, he added, is that pre-registering studies online, sharing data, and other open practices will counteract potential subjectivity. Although none of these practices are mandated by StudySwap, Chartier said users can agree to specifics with each other, through informal “exchange agreements,” before collaborations proceed.

Matt Hodgkinson, head of research integrity at the U.K.-based open-access publisher Hindawi, suggested that the platform may be useful for researchers in resource-poor settings. Yet, “getting a critical mass of researchers to match up those with resources to those in need may be tricky,” he said.

Overall, StudySwap “has excellent potential to include more people in the research process, and improve the robustness of conducted research,” said Brian Nosek, executive director of the Center for Open Science. “Perhaps it could become a Craigslist for researchers.”