A new report finds that providing school choice data to parents does not equalize educational opportunity, but rather replicates and perpetuates existing inequalities.

Data & Society Researcher Dr. Claire Fontaine and Research Assistant Kinjal Dave performed a qualitative, semi-structured, interview-based study with a socio-economically, racially, and geographically diverse group of 30 New York City parents and guardians between May and November 2017. Interviews focused on experiences of school choice and on data and information sources.

“These moves from Uber and Lyft seem to align with their gig-economy model of employment, which structures work as an individual pursuit and individual liability. But even this sell is misleading. While, for many drivers, the idea of being independent at work is very appealing, their ability to make entrepreneurial decisions is consistently constrained by the ride-hail apps’ nudges and other algorithmic management, rules, external costs, and wage cuts.”

This paper is a response to calls for explainable machines by Data & Society Postdoctoral Scholar Andrew Selbst and Affiliate Solon Barocas.

“We argue that calls for explainable machines have failed to recognize the connection between intuition and evaluation and the limitations of such an approach. A belief in the value of explanation for justification assumes that if only a model is explained, problems will reveal themselves intuitively. Machine learning, however, can uncover relationships that are both non-intuitive and legitimate, frustrating this mode of normative assessment. If justification requires understanding why the model’s rules are what they are, we should seek explanations of the process behind a model’s development and use, not just explanations of the model itself.”

“So why should we be worried about rules that require caregivers to provide an electronic verification of the labor provided to clients? Because without careful controls and ethical design thinking, surveillance of caregiver labor is also functionally surveillance of care recipients, especially when family members are employed as caregivers.”

Fairness in Precision Medicine is the first report to deeply examine the potential for biased and discriminatory outcomes in the emerging field of precision medicine: “the effort to collect, integrate and analyze multiple sources of data in order to develop individualized insights about health and disease.”

The Precision Medicine National Actor Map is the first visualization of the three major national precision medicine projects (All of Us Research Program, My Research Legacy, and Project Baseline) and the network of institutions connected to them as grantees and sub-grantees.

What is Precision Medicine? is a general audience white paper by Dr. Kadija Ferryman and Mikaela Pitcan that introduces and outlines the emerging field of precision medicine: the effort to collect, integrate and analyze multiple sources of data in order to develop individualized insights about health and disease.

“In that moment, I realized that this community of Evangelical Christians were engaged in media literacy, but used a set of reading practices secular thinkers might be unfamiliar with. I’ve seen hundreds of Conservative Evangelicals apply the same critique they use for the Bible, arguably a postmodern method of unpacking a text, to mainstream media — favoring their own research on topics rather than trusting media authorities.”

“The problem with amplified speech online is that something like this crisis-actor narrative gets a lot of reach and attention, then the story becomes about that, and not the shooting or what these students are doing. I would suggest that media only mentions these narratives to say that this is wrong and that students need to be believed.”

This report responds to the “fake news” problem by evaluating the successes and failures of recent media literacy efforts while pointing towards next steps for educators, legislators, technologists, and philanthropists.

“Amazon’s peculiar culture notwithstanding, the wristbands in many ways don’t offer anything new, technologically or conceptually. What has changed is workers’ ability to challenge this kind of surveillance.”

“This type of analysis sheds light on how organizational contexts are embedded into algorithms, which can then become embedded within other organizational and individual practices. By investigating technical practices as organizational and bureaucratic, discussions about accountability and decision-making can be reframed.”

As data becomes more prevalent in the health world, Data & Society Postdoctoral Scholar Kadija Ferryman urges us to consider how we will regulate its collection and usage.

“As precision medicine rushes on in the US, how can we understand where there might be tensions between fast-paced technological advancement and regulation and oversight? What regulatory problems might emerge? Are our policies and institutions ready to meet these challenges?”

“As genetic risk and other health data become more widely available, insights from research and early clinical adoption will expand the growing and data-centric field of precision medicine. However, just like previous forms of medical intervention, precision medicine aims to enhance life, decrease risk of disease, improve treatment, and though data plays a big role, the success of the field depends heavily upon clinician and patient interactions.”

In this essay, D&S Fellow Taeyoon Choi interrogates technology designed for those with disabilities.

“Even with the most advanced technology, disability cannot—and sometimes should not—disappear from people. There are disabled people whose relationship with their own bodily functions and psychological capabilities cannot be considered in a linear movement from causation to result, where narratives of technology as cure override the real varieties in people’s needs and conditions and falsely construct binary states—one or the other, abled or disabled—shadowing everything between or outside of those options.”

In the essay, Data & Society INFRA Lead Ingrid Burrington grounds technological development in the environment.

“While the aforementioned narratives are strategic in their own worlds, they tend to maintain the premise that the environmental cost of technology is still orthogonal or an externality to the more diffuse, less obviously material societal implications of living in an Information Age. The politics of a modern world increasingly defined by data mining may only exist because of literal open-pit mining, but the open pit is more often treated as a plot pivot than a natural through-line: Sure, you feel bad about a social media site being creepy, but behold, the hidden environmental devastation wrought by your iPhone—doesn’t that make you feel even worse?”

Data & Society Media Manipulation Lead Joan Donovan investigates the development of InterOccupy, a virtual organization operated by participants in the Occupy Movement.

“InterOccupy took infrastructure building as a political strategy to ensure the movement endured beyond the police raids on the encampments. I conclude that NSMs create virtual organizations when there are routine and insurmountable failures in the communication milieu, where the future of the movement is at stake. My research follows the Occupy Movement ethnographically to understand what happens after the keyword.”

What are internet trolls? In this video, Above the Noise explains where internet trolls come from and encourages viewers to read Data & Society’s report “Online Harassment, Digital Abuse, and Cyberstalking in America.”

Artificial intelligence is increasingly being used across multiple sectors and people often refer to its function as “magic.” In this blogpost, D&S researcher Madeleine Clare Elish points out how there’s nothing magical about AI and reminds us that the human labor involved in making AI systems work is often rendered invisible.

“From one perspective, this makes sense: Working like magic implies impressive and seamless functionality and the means by which the effect was achieved is hidden from view or even irrelevant. Yet, from another perspective, implying something works like magic focuses attention on the end result, denying an accounting of the means by which that end result was reached.”

D&S founder and president danah boyd praises Virginia Eubanks’s new book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.

“This book should be mandatory for anyone who works in social services, government, or the technology sector because it forces you to really think about what algorithmic decision-making tools are doing to our public sector, and the costs that this has on the people that are supposedly being served. It’s also essential reading for taxpayers and voters who need to understand why technology is not the panacea that it’s often purported to be. Or rather, how capitalizing on the benefits of technology will require serious investment and a deep commitment to improving the quality of social services, rather than a tax cut.”

In the gig economy, management by algorithm means employment relationships grow more remote and distributed across the network. Alex Rosenblat explains how workers navigate this by creating their own forums.

“Online forums aren’t just helping drivers like Cole navigate the challenges of their work, and helping those of us who use and study these platforms grasp those challenges too. They show how as employment relationships grow more remote and distributed across the network, workers can adapt, using technology to forge their own workplace culture.”

The informational, economic, and political influence of the dominant tech platforms — Google, Facebook, and Amazon in particular — has become a central topic of debate. In this talk, K. Sabeel Rahman argues that these firms are best understood as the core infrastructure of our 21st century economy and public sphere. The infrastructural power of these firms raises a range of policy questions. What exactly about these firms (e.g., their accumulation of data, their gatekeeping functions, their control over vital public and economic functions like retail delivery or online speech) is “infrastructural?” How should these infrastructural functions be governed and regulated, in light of both their economic and political influence?

Professor Rahman sketches some tentative answers to these questions, drawing on the intellectual history of early 20th century “public utility regulation,” where reformers developed a compelling approach to diagnosing and remedying the problem of private power over the essential infrastructure of the industrial economy, from railroads to finance.

This history suggests some design principles and opens up some novel implications for addressing the problem of platform power in the digital economy. The talk explores more contemporary analogies and applications in the context of our current debates over informational platforms, big data, AI, and algorithms, in order to sketch out some principles for what a public utility-style regulatory approach to Internet platforms would look like.

K. Sabeel Rahman is a Visiting Professor of Law at Harvard Law School, an Assistant Professor of Law at Brooklyn Law School, and a Fellow at the Roosevelt Institute. Rahman earned his AB at Harvard College summa cum laude in Social Studies and returned to Harvard for his JD at Harvard Law School and his PhD in the Harvard Government Department. He also has degrees in Economics and Sociolegal Studies from Oxford, where he was a Rhodes Scholar.

Miranda Katz of WIRED interviews D&S founder and president danah boyd on the evolving public discourse around disinformation and how the tech industry can help rebuild American society.

“It’s actually really clear: How do you reknit society? Society is produced by the social connections that are knit together. The stronger those networks, the stronger the society. We have to make a concerted effort to create social ties, social relationships, social networks in the classic sense that allow for strategic bridges across the polis so that people can see themselves as one.”

“On the one hand, it is banally predictable that the consequences of machine-learning-enabled surveillance will fall disproportionately on demographic minorities. On the other hand, queer folks hardly need data scientists scrutinizing their jawlines and hairstyles to warn them about this. They have always known this.”

“The justices will surely understand that without any alternatives for accessing online services, vulnerable (and over-policed) populations will be unable to make meaningful choices to protect their privacy, amplifying the disadvantages they already face.”

Is Facebook a platform or a media company? NBC News THINK asks D&S researcher Robyn Caplan to comment on the recent tech hearings.

“Facebook thinks of itself as a neutral platform where everyone can come and share ideas…They’re basically saying that they’re the neutral public sphere. That they are the marketplace of ideas, instead of being the marketers of ideas.”

This is a transcript of Data & Society founder and president danah boyd’s recent lightning talk at The People’s Disruption: Platform Co-Ops for Global Challenges.

“But as many of you know, power corrupts. And the same geek masculinities that were once rejuvenating have spiraled out of control. Today, we’re watching as diversity becomes a wedge issue that can be used to radicalize disaffected young men in tech. The gendered nature of tech is getting ugly.”

“So for the past six months, I’ve been asking local parents about the data they used to choose among the system’s 1700 or so schools…Beyond the usual considerations like test scores and art programs, they also consider the logistics of commuting from the Bronx to the East Village with two children in tow, whether the school can accommodate parents and children who are still learning English, and how much money the parent-teacher association raises to supplement the school’s budget.

But for some families, the choice process begins and ends with the question: Is the building fully accessible?”

D&S researcher Monica Bulger co-authored an article on how children are engaging with technology across nations.

“Beyond revealing pressing and sizeable gaps in knowledge, this cross-national review also reveals the importance of understanding local values and practices regarding the use of technologies. This leads us to stress that future researchers must take into account local contexts and existing inequalities and must share best practices internationally so that children can navigate the balance between risks and opportunities.”

“Critical commentary on data science has converged on a worrisome idea: that data scientists do not recognize their power and, thus, wield it carelessly. These criticisms channel legitimate concerns about data science into doubts about the ethical awareness of its practitioners. For these critics, carelessness and indifference explains much of the problem—to which only they can offer a solution.”

D&S researcher Alex Rosenblat co-authored an article on power dynamics in the sharing economy.

“Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value but raises concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power.”

D&S INFRA Lead Ingrid Burrington investigates community network infrastructures in times of disaster.

“By design, resilient network infrastructure prioritizes interdependence and cooperation over self-sufficiency — without strong underlying social ties, there is no localized network infrastructure. The technical complexities of building a network are a lot easier to overcome than the political complexities of building community, political agency, and governance.”

D&S researcher Alex Rosenblat explores when incentives in the gig economy become deceptive.

“While charging for work opportunities is reminiscent of multi-level marketing, like Mary Kay or Amway, this is different because Uber controls so much of the labor process, like dispatch, and competing promotional pay, in addition to setting the base rates at which drivers earn their income. In other words, drivers can use their labor as collateral on their down payment now in exchange for earning a premium on their labor later, but Uber ultimately controls whether or not the promotion is worthwhile.”