“Without comprehensively accounting for the strengths and weaknesses of technical practices, the work of ethics—which includes weighing the risks and benefits and potential consequences of an AI system—will be incomplete.”

How do people decide what to trust? Data & Society Postdoctoral Scholar Francesca Tripodi shares insights from her research into conservative news practices.

“While not all Christians are conservative nor all conservatives religious, there is a clear connection between how the process of scriptural inference trickles down into conservative methods of inquiry. Favoring the original text of the Constitution is closely tied to the practices of ‘constitutional conservatism,’ and currently members in all three branches of the U.S. government rely on practices of scriptural inference to make important political decisions.”

“The CEOs’ comments reveal a paradox of democratic values: tolerance of opposing viewpoints is a foundational value of democracy, but tolerating intolerant views can threaten democratic principles of inclusion and equality. And some who adopt this rhetoric — including online influencers like Dave Rubin — have exploited this paradox of democratic values, shifting the goalposts of diversity to delegitimize any inclusion efforts except those that tolerate discriminatory viewpoints.”

“On the one hand, it is banally predictable that the consequences of machine-learning-enabled surveillance will fall disproportionately on demographic minorities. On the other hand, queer folks hardly need data scientists scrutinizing their jawlines and hairstyles to warn them about this. They have always known this.”

D&S researcher Alex Rosenblat explores when incentives in the gig economy become deceptive.

“While charging for work opportunities is reminiscent of multi-level marketing, like Mary Kay or Amway, this is different because Uber controls so much of the labor process, like dispatch, and competing promotional pay, in addition to setting the base rates at which drivers earn their income. In other words, drivers can use their labor as collateral on their down payment now in exchange for earning a premium on their labor later, but Uber ultimately controls whether or not the promotion is worthwhile.”

In this blogpost, Zachary Gold dives into the implications of the Children’s Internet Protection Act (CIPA), launched in 2000.

“Many parents certainly worry about their children getting access to inappropriate material online, and CIPA may have been a reasonable way to address that concern when it was passed. The devices we use, and the way we use the internet, have changed drastically since then. Updating CIPA, or replacing it to govern these new devices and connections being used by students, could do more harm than good. Keeping pornography out of students’ schoolrooms is important, but filtering and monitoring students’ internet activity around town and at home blurs the role of school administrators.”

Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.

If this sounds complicated and scary, that’s because it is. But confronted with this matrix of vulnerabilities, the library—with its longstanding commitment to patron privacy—also offers an impressive plan of action.

D&S affiliate Desmond Patton breaks down how social media can lead to gun violence in this piece in The Trace.

Social media doesn’t allow for the opportunity to physically de-escalate an argument. Instead, it offers myriad ways to exacerbate a brewing conflict as opposing gangs or crews and friends and family take turns weighing in.

D&S artist-in-residence Ingrid Burrington, with Josh Begley and Seth Freed Wessler, created an installation on display at the Ace Hotel.

Building on Wessler’s journalistic investigation into privately run immigrant-only federal prisons, Burrington and Begley present seventy-five individual lenticular prints of satellite imagery capturing these sites and government documents pertaining to them. Together the images, arranged into three distinct grids, explore and abstract “the terrain of U.S. immigration and carceral policy and the human stories usually conspicuously absent in the aerial perspective.”

D&S researcher danah boyd discusses the problem with asking companies like Facebook and Google to ‘solve’ fake news — boyd insists that this problematic solutionism ignores the context of the complex social problems behind it.

Although a lot of the emphasis in the “fake news” discussion focuses on content that is widely spread and downright insane, much of the most insidious content out there isn’t in your face. It’s not spread widely, and certainly not by people who are forwarding it to object. It’s subtle content that is factually accurate, biased in presentation and framing, and encouraging folks to make dangerous conclusions that are not explicitly spelled out in the content itself.

So, fast forward to our refund situation: now I no longer feel like I have any moral high ground to demand a formal close out — in my mind, I was complicit in the shadiness when I was cool with fooling the apartment building. How is that any different than agreeing to sidestep the Airbnb platform rules?

As the White House continues to issue executive orders on issues like immigration that hit tech companies directly, and as issues like transgender rights — which are outside the pocketbook interests but may intersect with a company or community’s values — come up, it feels as though companies are going to continue to be under pressure to take public stands.

Worse, we’ve lost the ability to discern that a short-term benefit for some users that’s subsidized by an unsustainable investment model will lead to terrible long-term consequences for society. We’re hooked on the temporary infusion of venture capital dollars into vulnerable markets that we know are about to be remade by technological transformation and automation. The only social force empowered to anticipate or prevent these disruptions is policymakers, who are often too illiterate to understand how these technologies work, and who too desperately want the halo of appearing to be associated with “high tech”, the secular religion of America.

D&S founder danah boyd responds to remarks by the Trump administration stating that their opposition is the media.

And now many of the actors most set on undermining institutionalized information intermediaries are in the most powerful office in the land. They are waging war on the media and the media doesn’t know what to do other than to report on it.

The impulse to pair a technology associated with automated extralegal killing of American citizens alongside “culture and the arts” is weird, but not entirely surprising—the vantage point of drones affords a particular aesthetic in addition to plausible deniability. The aerial perspective has appealed to artists for as long as it has appealed to generals and kings. That distant, presumed-objective view from nowhere, whether achieved via hot air balloon or low-orbit satellite, suggests a totality, a kind of coherence in defiance of the often-incoherent groundtruth of everyday life. For generals, coherence offers the possibility of tactical advantage. For artists (or at least good artists), it’s something to interrogate and take apart.

D&S advisor Ethan Zuckerman provides a transcript of his recent speech about journalism and civics.

One final thing: we have this tendency in journalism right now to feel very sorry for ourselves. This is a field that we are all enormously proud to be part of. This is a field that is harder and harder to make a living in, and I see more and more news organizations essentially saying, “You’re going to miss us. We’re going away. I just want to warn you.”

This year, I’ll be traveling the US talking to people in scrappy communities who are building fiber on their own. They’re fed up with waiting for enormous incumbent communications companies to decide it’s in their corporate interests to invest in 21st-century communications capacity for Americans. These communities have run the numbers and looked at their economic development needs — as well as the possibilities for advanced healthcare, world-class educations, effective governance, energy management, and public safety that publicly controlled wholesale “street grids” of fiber make real — and they’ve come to the conclusion that if they hang back, they’ll become irrelevant.

As we begin a new year and a new political administration takes office in the US, let’s take some time to consider some pressing issues that exist at the nexus of technology and social justice—and think about how we as social justice advocates can address them most effectively. Even amid so many unknowns, we can be certain that these issues are among those that will shape 2017 and the years and decades beyond it. And they will be central to the work of building a free, open, and transparent future.

It’s unrealistic and unfair to ignore all that work – to myself, and others. Citing luck and serendipity gives the impression that people in positions of influence will somehow magically find out about you and your interests and reach out to you – they (probably) won’t. It implies that if you’re doing this right, opportunities to work on things you want to be working on will just pop up out of the blue.

D&S advisor Ethan Zuckerman reflects on today’s political atmosphere and FDR’s speech on the four freedoms.

This is a scary moment, a time where it looks like the progress we’ve made around the world might reverse, where we go from a world that’s gotten much bigger to one that shrinks. The good news is that we get to decide how big a world we want to live in. We get to decide how to speak, how to listen and how to stand together against fear.

Here in China, even well-educated and progressive friends have sincerely asked me about some pretty niche conspiracies. Did Hillary really assassinate someone? (No.) Didn’t Trump win 90% of the vote? (No.) Yesterday, someone even mentioned that they really liked a poem he wrote about his vision for America’s future. (What.)

I didn’t want to see a Trump presidency, and the rise of insurrectionism to the highest levels of the American government scares the crap out of me. But scarier is the endless blame game I hear my allies engaged in, figuring out whether we blame the media, the FBI or anyone other than ourselves for this loss. We have a brief opportunity to figure out how to make social change in an age of high mistrust and widespread insurrectionism. It would be a shame if Donald Trump figured out how to harness this power and the progressives lined up against him failed to do so.

Recognising the political importance of our technical decisions is within reach, leading ultimately to reclaiming power and control of our activism in the digital sphere as well as in the offline world.

D&S advisor Tarleton Gillespie, with several scholars, continues sharing essays on the U.S. election and its implications for scholarship.

As we said yesterday, we know the scholars in this community cannot address every issue that’s likely on the horizon, but we think our work touches a surprising number of them. The kinds of questions that motivate our scholarship — from fairness and equity, to labor and precarity, to harassment and misogyny, to globalism and fear, to systems and control, to journalism and ignorance — all of these seem so much more pressing now than they did just a few days ago. If we’re going to have four years of a Trump presidency and all that goes with it, and one way we can respond is through our scholarship, we want to start today in stridently moving that forward.

D&S advisor Tarleton Gillespie joins other scholars in considering the place of research and scholarship in a post-Trump USA.

But as scholars, we do a disservice to allow for simple or single explanations. “Perfect storm” has become a cliche, but I can see a set of elements that had to all be true, that came together, to produce the election we just witnessed: Globalization, economic precarity, and fundamentalist reactionary responses; the rise of the conservative right and its targeted tactics, especially against the Clintons; backlashes to multiculturalism, diversity, and the election of President Obama; the undoing of the workings and cultural authority of journalism; the alt-right and the undercurrents of social media; the residual fear and anxiety in America after 9/11. It is all of these things, and they were all already connected, before candidate Trump emerged.

But the work to be done in fighting Donald Trump is not unprecedented. All of us who are targets of his rhetorical attacks and his proposed policies can look back at history and see times when we’ve faced down similar threats—and won. It is only because progress has been made that we feel so gutted by this loss. And this is not, as some would say, the last gasp of old oppressions, it is simply another dark milestone in a fight against injustice that will never end.

This is the two-dimensional world that data-driven campaigning is optimized for. It gets results at the margins, and tomorrow many Democratic professionals will congratulate themselves for another job well done. If the people at Trump’s rallies, and at Sanders’, taught us anything this year, it’s that being a name in a file isn’t enough. A political party can’t just be a vehicle for gathering votes. And the work of politics isn’t over when a campaign ends. It’s just beginning.

D&S artist-in-residence Heather Dewey-Hagborg’s work was recently profiled on Blumhouse.

Undoubtedly the most shocking (but fascinating) of Dewey-Hagborg’s projects is “Stranger Visions,” which she launched in 2013 to much acclaim. The work not only stunned the art community, but sparked the curiosity of the scientific world… for disturbing reasons I’ll get into shortly.

D&S advisor Susan Crawford discusses the possibility and implications of an AT&T and Time Warner merger.

The high-speed internet access market in America is entirely stuck on an expensive plateau of uncompetitive mediocrity, with only city fiber networks providing a public option or, indeed, any alternative at all. The AT&T/TWX deal will not prompt a drop of additional competition in that market. Nor will it mean that the entertainment industry will see more competition or new entrants — just that one player will get an unfair distribution advantage. It’s hard to think of a single positive thing this merger will accomplish, other than shining a bright light on just how awful the picture is for data transmission in this nation.

This deal should be dead on arrival. In fact, AT&T should spare us by dropping the idea now. This merger must not happen.

The net result of this batshit crazy election cycle is a Distributed Denial of Service attack on democracy. Like a webserver brought to its figurative knees by an endless flood of malformed requests, we are beginning to melt down under the avalanche of craziness. We’re left with the impression that this is an election between the possibly shady but unfairly attacked versus the truly unhinged… or between the thoroughly corrupt insider who’s managed to undermine both government and the media versus the rough, offensive and often outrageous outsider who’s the only man she couldn’t bring down. We can’t move beyond those impressions because we are drowning in controversies and conspiracies, with very little help in understanding which matter and which we should take seriously.

Without the directional signs in place, suddenly huge numbers of sites couldn’t be found. Who knew the Internet of Things could have such a big effect on our daily lives?

Actually, a lot of people knew. IoT is very big business these days.

While we’re patching those insecure home DVRs, routers, and webcams, let’s back up and talk about the implications of IoT for public values generally. Because it’s not just websites that could be affected by unrestrained Internet of Things deployments. We’re not just using IoT in our homes. We’re also going to be using it, in a big way, in the places where 80 percent of Americans live, work, and play: in cities.

D&S researcher Claire Fontaine looks at how school performance data can lead to segregation.

In our technocratic society, we are predisposed toward privileging the quantitative. So, we need to find ways to highlight what is truly helpful in the data, but also insert an element of creative distrust. We need to encourage data consumers to think deeply about their values, rather than using data to reify knee-jerk prejudicial attitudes. Data scientists and engineers are in the position to help shift the conversation around data as truth.

In the next economy, the most important skills may be difficult to quantify or commodify—but optimizing for human welfare demands that the people driving the innovation economy take them seriously. Care work requires workers to build trust and practice kindness. It is “emotional labor” that demands skills such as calmness, empathy and interpersonal creativity. Given this outlook, the greatest victory of our tech industry could be in turning away from systems that incentivize efficiency and profit and toward designing systems that optimize workers’ and consumers’ dignity, sustenance and welfare.

D&S research analyst Mikaela Pitcan discusses how missing data can impact students with mental health conditions.

The areas in which data are lacking communicate priorities. However, without concrete data to show a need to prioritize the issue of mental health in schools, there is little incentive to make this issue a priority. Is the failure to account for students with mental illness in a detailed manner the result of stigma? Is it the result of a broader culture that idealizes childhood and is unable to integrate the idea of children struggling with mental illness into our collective consciousness? How might big data be used to identify children in need of mental health treatment in schools to target intervention while protecting students’ privacy? In an age where incredibly detailed information is collected, some students’ needs remain invisible. How can we use the data we have to address the need for the data that is missing?

D&S advisor Ethan Zuckerman responds to criticism for donating to the North Carolina GOP office’s reconstruction.

It’s also possible that kindness is the single most important and powerful thing you can do to make change in the world. Consider the story of Derek Black, who inherited a leadership role in the White Nationalist movement from his father, the founder of the Stormfront message board community. A fellow student at New College in Sarasota, Florida reached out to Black, inviting him to an interfaith shabbat dinner, not to confront him about his beliefs, but simply to reach out and include him. This kindness proved transformative — at great cost to his relationships with his family, Black has forsaken white nationalism.

D&S advisor Susan Crawford discusses how streetlights are becoming a part of the Internet of Things.

But the third step was the charm: This past summer, Santa Monica adopted an ordinance requiring that wireless carriers get access to Santa Monica’s streetlights and traffic signal poles only on a neutral basis. It also sets design requirements for these rights-of-way assets, emphasizing the need for nice-looking poles that conceal gear. But the important thing is that carriers will not be able, in the words of former Santa Monica CIO Jory Wolf, to “delay or preclude” competition. The desired result: no one can lock up these poles.

D&S researcher Bonnie Tijerina discusses the development of a “hands-on professional training program on data and privacy literacy in hopes of showing how this knowledge can positively impact their service to library patrons.”

I want to live in the world where the second chances Donald Trump has received thousands of times are redistributed to others who deserve and would do more with such chances. For with all his second chances, what has he done for others? But that black teenager could become a legitimate business owner. That Syrian refugee could create great art. That migrant worker could revolutionize our education system. And the millions currently incarcerated, largely for non violent offenses, could return to their communities as assets.

Although a step in the right direction, the Partnership on AI does highlight a certain conundrum — what exactly is it that we want from Silicon Valley’s tech giants? Do we want a seat at their table? Or are we asking for a deeper and more sustaining type of participation? Or perhaps, more disturbingly, is it too late for any truly inclusive and meaningful participation in the development of future AI technologies?

D&S advisor Anil Dash examines how to change the tech industry for good.

Some of the most novel critiques about technology and Silicon Valley are coming from women and underrepresented minorities, but their work is seldom recognized in traditional critical venues. As a result, readers may miss much of the critical discourse about technology if they focus only on the work of a few, outspoken intellectuals.

It’s pretty clear that Smart Cities 1.0 was always going to take cities in a bad direction — and it’s why I wrote my book. Cities have clearly responded, and the city-led Smart Cities 2.0 model is clearly ascendant — most clearly reflected in the proliferation of smart city campaigns, visions and digital master plans (see my 2015 paper with Stephen Lorimer, who is now on the Smart London team at the Greater London Authority, where we compare the content, planning process, and implementation approach of 8 cities’ digital plans: “Digital Master Planning: An Emerging Strategic Practice in Global Cities”).

They’re three people from wildly different backgrounds, each working on a spate of complex issues. But what they have in common is that they rejected cynicism, they saw that they could have a role in leading change, and they’ve worked to make it happen. In our discussion, we’ll find out how they took that leap, and maybe learn how each of us can focus on the issues we care about and maybe even think about how we can “change the world”.

D&S fellow Zara Rahman explores the need for access to information and open data, i.e. the right to know.

Given these growing threats, combined with our increased knowledge of government secrecy and surveillance, and new possibilities through widespread technologies, it feels like we should be focusing more than ever on strengthening our right to information. This means directing funding towards it, supporting the established RTI community, and directing resources towards exercising our right to information when we can.

The civic tech gang for some reason — probably because it is their target and they are nibbling off what they think they can actually achieve in the short run — hasn’t really articulated the fault lines in its interactions with city governments. When I listen to those exchanges it seems a little too cozy, as if the civic tech players are just waiting to be brought into government to drive the change from within.

In this blogpost, Jade E. Davis argues against the myth of bootstrapping in education equality.

“Those who are imagined as less American might see a single generational gain with successive generations seeing some benefit. But overall, educational attainment does not free a person from the larger cultural forces that shape and limit their experiences as potential. Additionally, the assumptions of identity color who we imagine being capable of success globally.”

Traditional news media has a lot of say in what it publishes. This is one of the major things that distinguishes it from social media, which propagates the fears and anxieties of the public. And yet, time and time again, news media shows itself to be irresponsible, motivated more by the attention and money that it can obtain by stoking people’s fears than by a moral responsibility to help ground an anxious public.

In this blogpost, Audrey Watters challenges accountability processes in the current public school system.

“So when we think about “what counts” and who’s held to account under public education’s accountability regime, it’s still worth asking if accountability can co-exist with “response-ability” — accountable to whom, how and to what ends; responsible to whom, how, and to what ends.”

Anil Dash analyzes how ‘teaching kids to code’ does not address the wider inequities that prevent diversity in tech.

Many tech companies are still terrible at inclusion in their hiring, a weakness which is even more unacceptable given the diversity of the younger generations we’re educating today. Many of the biggest, most prominent companies in Silicon Valley—including giants like Apple and Google—have illegally colluded against their employees to depress wages, so even employees who do get past the exclusionary hiring processes won’t necessarily end up in an environment where they’ll be paid fairly or have equal opportunity to advance. If the effort to educate many more programmers succeeds, simple math tells us that a massive increase in the number of people qualified to work on technology would only drive down today’s high wages and outrageously generous benefits. (Say goodbye to the free massages!)

D&S affiliate Kate Crawford wrote a compelling piece, sparked by Facebook’s censorship of “The Terror of War” photograph, on the social impacts of artificial intelligence.

The core issue here isn’t that AI is worse than the existing human-led processes that serve to make predictions and assign rankings. Indeed, there’s much hope that AI can be used to provide more objective assessments than humans, reducing bias and leading to better outcomes. The key concern is that AI systems are being integrated into key social institutions, even though their accuracy, and their social and economic effects, have not been rigorously studied or validated.

Achieving maximum efficiency and increasing the use of renewable energy to power data centers are two significant components within a greater web of environmental actions that will be needed to responsibly address the rise of cloud computing in all its visible and invisible instantiations. However, reducing the environmental impacts of the cloud will require a comprehensive approach that looks at each of its connecting pieces, far beyond the data centers that serve as its central hub.

D&S fellow Zara Rahman wrote a piece on GenderIt discussing what it means for the internet to become feminist.

A feminist internet can mean many things, it means that everyone has affordable, unconditional, open, meaningful and equal access to the internet; it means acknowledging that attacks, threats, intimidation, and policing experienced by women and queers is real, harmful, and alarming, and is the responsibility of corporates but also our collective responsibility; it means that the right to free expression includes the right of women and queers to sexual expression and gender expression; and it includes principles on access, movements, public participation, resistance, free and open source software, anonymity, agency and so on. But even as we’re working towards and fighting for the foundations of a feminist internet, there are tools that are being built upon the infrastructure that we have today which support feminist ideas.

In “Disentangling the real and potential risks of advertising in schools” D&S research analyst Alexandra Mateescu teases apart the difference between real and hypothetical risks of advertising in schools. Mateescu discusses how traditional forms of advertising in schools are shifting due to technology and highlights factors that are contributing to the increasing complexity of understanding advertising in schools.

D&S researcher Monica Bulger discusses her participation in a meeting of experts held by the National Academy of Education. She concludes with a series of observations, including:

The main problem is that researchers have done a poor job explaining the value of collecting and using student data for educational research. Many of the contributions of research to practice are invisible, especially when they are successful. There are no signposts to flag how early education becomes a priority, or a school starts serving breakfast, or why early-career teachers are paired with veteran mentors. But if there were, the signs might say this is brought to you by research from xyz, or student data from over 100,000 kids in 45 school districts over a 5-year period have informed this new practice.

In “Class or Race: The Factor that Matters More for Equity,” D&S research analyst Mikaela Pitcan tackles the question of whether racial or class diversity should be considered the most important indicator of equity. Pitcan argues that researchers should instead embrace the complexity inherent within discussions of equity by looking at how the intersections between race and class impact outcomes.

This also contributes to spreading tech’s well-known shortcomings around inclusion and diversity into new fields. Today, companies described as tech startups are doing everything from making mayonnaise to preparing grilled cheese sandwiches to delivering pizza. But given that companies ranging from AirBNB to Uber have relied on their status as “tech companies” to systematically shirk inconvenient laws in each new city they enter, we can expect that at least one of these food companies entering the market as part of the “tech industry” is going to similarly find the rules around sanitation and inspection too onerous and use its tech status to evade health regulations.

Anil Dash wrote a piece describing the evolution of blogs. He compares past and current capabilities of features, such as searches, comments, and following. He concludes with:

Ultimately, though, I think most of these ideas were good ideas the first time around and will remain good ideas in whatever modern incarnation revives them for a new generation. I have no doubt there’s a billion-dollar company waiting to be founded based on revisiting one of the concepts outlined here.

This setup essentially disincentivizes drivers from retrieving the wages they’re owed. An analogy I think of, by comparison, is how cell phone companies can cram small fees into customer bills. Only some percentage of customers are actively tracking their bills, and some percentage of those are willing to spend an hour on the phone with a well-meaning but ineffective customer service agent to get back their small fee.

D&S researcher Claire Fontaine writes about how data is speeding up school segregation.

Data is great at masking its own embedded bias, and school performance data allows privileged parents to reinforce educational inequality. The best interests of some individuals are optimized at the expense of the society. Accountability programs, particularly when coupled with school choice models, serve to keep middle and upper middle class families invested in public schools, but in an uneven and patterned way, causing segregated school environments to persist despite racial and socioeconomic residential diversity.

Anil Dash provided background details to his spoken tribute to Prince at the EyeO Festival in Minneapolis, Minnesota.

This talk was designed as a one-time tribute to Prince’s artistry, courage and unique spirit. Collected here are sources, footnotes and related links designed to add context and clarity to the video of the talk, as well as errata covering my errors and omissions along the way.

D&S advisor Nick Grossman wrote a detailed piece discussing how small businesses can utilize data and available connectivity.

Critical to that goal is the mindset that data is an asset. The more open we are to new kinds of business operating, the more we have the opportunity to see data that comes from activities, the more we can learn from the data, and then iterate on policy. Thinking about policy and regulation this way therefore biases us towards encouraging activity to happen, rather than stopping it from happening.

D&S advisor Susan Crawford discussed the limits of what net neutrality can do to protect consumers. Crawford details loopholes that providers can exploit, and have exploited.

Look, I am just as hard core about net neutrality as anyone. I celebrate that the long battle to make the principle into law seems finally over, in both the US and Europe. But it’s only a beginning, and stopping here would be a mistake. It’s a temporizing approach to a deeper, structural dilemma: how much power to give the private providers of what should be a utility service. Where those providers effectively have monopoly control they have a thousand ways to avoid these rules.

D&S affiliate Anthony Townsend writes about his research in data and city charters.

Now, there’s a number of organizations that are working hard on pushing cities up this maturity hill. CityMart is figuring out how to help cities overhaul their innovation process from within. Bloomberg Philanthropies is driving hard to get city governments to focus on achieving measurable innovation. But it’s all too much within the existing framework of governance systems that are usually fundamentally dysfunctional, structurally incapable of delivering. Digital maturity seems to want to engage a larger conversation about the transformation of governance that is missing. No one seems to be willing to go out on a limb — with the exception of the radical political movements like Podemos and Syriza (but they haven’t engaged the smart city meme in any real way yet) — and call the whole incremental update campaign into question. (n.b. while the Pirate Party has engaged ‘smart’ in a legitimate way, they don’t represent a coherent political movement in my opinion).

D&S researcher Alex Rosenblat discusses safety and surveillance of Uber and Lyft drivers in Medium. From neighborhood discrimination to threats of violence, drivers describe safety issues while disclosing how they feel Uber and Lyft are surveilling them, such as through suspected camera spying.

When drivers discuss the dangers of their job, they usually reference a passenger who made them uncomfortable, or, more commonly, specific neighborhoods they avoid, such as by logging out when they’re nearby so they don’t get a ride request. Most drivers know it’s taboo to explicitly discriminate based on destination, and they generally express a willingness to accommodate passenger requests, but sometimes perceptions about dangerous neighborhoods become a factor in their risk assessment. (One of the big selling points for ridehail services is that they go where cabs refuse to venture, particularly to low-income, minority neighborhoods).

There are lots of reasons drivers might opt to disguise or promote their work as ridehail drivers. To help passengers locate them on a busy street, trade dress can be helpful, but not all drivers want to be identified explicitly as Uber or Lyft drivers.

D&S advisor Anil Dash details what the New York City tech community has done right.

Put simply: New York City is unique in that its tech community is grounded in principles of social and civic responsibility. It’s an important distinction, one that we’ve got to work hard to protect and nurture. And just like New York-style pizza, I’m hoping lots of people in other cities think that what we’re making here is good enough that they try to emulate it in their own communities.

D&S affiliate Anthony Townsend writes more on city charters and big data.

The point is… what we now think of as ‘hidebound obsolete bureaucracy’ was not so long ago the cutting edge analytics and evidence-based administrative technology of its day. It’s outlived its usefulness for sure, but these zombie public organizations will shamble on for a long time without a better vision that can plot a transition path within the reform process that’s required by law.

Student data privacy conversations center on concerns regarding the safeguarding of data collected in educational settings. These concerns oftentimes revolve around a fear of student information being sold to third parties for targeted advertising purposes. Other concerns include the hyper-surveillance of vulnerable groups of students, schools selling students’ personal details to marketing companies, and schools using data to make potentially damaging decisions about children.

D&S researcher Alex Rosenblat wrote this piece narrating her many interviews with Uber drivers around the country. In this article, Rosenblat highlights many aspects of Uber drivers’ work and lives, including working in different regional contexts, anxieties around information privacy, and learning English on the job.

Just because software is universally deployable, though, doesn’t mean that work is experienced the same way everywhere, for everyone. The app works pretty much the same way in different places, and produces a workforce that behaves relatively homogeneously to give passengers a reliable experience — it’s easy to come away with the impression that the work experience is standardized, too.

Cultural perceptions of the role of humans in automated and robotic systems need to be updated in order to protect against new forms of consumer and worker harms. The symptoms of moral crumple zones (at the risk of mixing metaphors) are some of the phenomena that human factors researchers have been studying for years, such as deskilling, skill atrophy, and impossible cognitive workloads. One of the consequences is that the risks and rewards of technological development do not necessarily develop in the broader public interest. As with previous transitions in the history of automation, new technologies do not so much do away with the human but rather obscure the ways in which human labor and social relations are reconfigured.

D&S Researcher Alex Rosenblat on the fallout of Austin’s transportation showdown with Uber and Lyft:

Uber allied with Lyft in Austin to lobby against an ordinance passed by the city council which requires ridehail drivers to undergo fingerprint-based background checks. The two companies spent $8.1 million combined to encourage (i.e. bombard with robo-texts) Austin voters to oppose the ordinance in a referendum vote called Proposition 1. If local cities take a stand against Uber or Lyft’s demand about background checks, and they prevail, that could produce a ripple effect in other cities that have regulatory demands. The local impact on Austin is a secondary concern to the global and national ambitions of imperial Uber and parochial Lyft. When they lost the vote on Prop. 1, they followed through on their threats to withdraw their services.

D&S Advisor Tarleton Gillespie responds to Gizmodo’s recent piece alleging bias in Facebook’s Trending Topics list. He argues that information algorithms like the ones used to identify “trends” on Facebook do not and cannot work alone, “in so many ways that we must simply discard the fantasy that they do, or ever will.”

People are in the algorithm because how could they not be? People produce the Facebook activity being measured, people design the algorithms and set their evaluative criteria, people decide what counts as a trend, people name and summarize them, and people look to game the algorithm with their next posts.

Trending algorithms are undeniably becoming part of the cultural landscape, and revelations like Gizmodo’s are helpful steps in helping us shed the easy notions of what they are and how they work, notions the platforms have fostered. Social media platforms must come to fully realize that they are newsmakers and gatekeepers, whether they intend to be or not, whether they want to be or not. And while algorithms can chew on a lot of data, it is still a substantial, significant, and human process to turn that data into claims about importance that get fed back to millions of users. This is not a realization that they will ever reach on their own — which suggests to me that they need the two countervailing forces that journalism has: a structural commitment to the public, imposed if not inherent, and competition to force them to take such obligations seriously.

Reflections from D&S Affiliate Solon Barocas and Advisors Edward W. Felten and Joel Reidenberg on the recent “Unlocking the Black Box” Conference held on April 2 at Yale Law School:

Our work on accountable algorithms shows that transparency alone is not enough: we must have transparency of the right information about how a system works. Both transparency and the evaluation of computer systems as inscrutable black boxes, against which we can only test the relationship of inputs and outputs, fail on their own to effect even the most basic procedural safeguards for automated decision making. And without a notion of procedural regularity on which to base analysis, it is fruitless to inquire as to a computer system’s fairness or compliance with norms of law, politics, or social acceptability. Fortunately, the tools of computer science provide the necessary means to build computer systems that are fully accountable. Both transparency and black-box testing play a part, but if we are to have accountable algorithms, we must design for this goal from the ground up.

D&S Research Analyst Mikaela Pitcan gives us a round-up of news events from January through March 2016 addressing data and equity in schools, with a focus on efforts to combat bias in data in New York’s Specialized High Schools.

In opening a door to the Internet, Facebook doesn’t need to be a gatekeeper. The good news, though, is that Facebook could quite easily fix its two core flaws and move forward with a program that is effective, widely supported, and consistent with Internet ideals and good public policy.

Rather than mandating an application process, vetting supplicants, and maintaining and making happy a list of approved service providers, Facebook could simply enforce all of its service restrictions through code. Entirely consistent with principles of network neutrality, Facebook could provide a stripped-down browser that only renders, for example, mobile-optimized websites built in HTML, but not Javascript, iframes, video files, flash applets, images over a certain size, etc. Facebook can publish the technical specs for its low-bandwidth browser; ideally, those specs would map directly to existing open web standards and best practices for mobile web pages and other services. When the user wants to go to a site or service, the browser makes the request and the target server delivers its response — if the browser can render what the server sends, it does; if it can’t, it tells the user as much. As the operators of websites and online services notice a surge in users with these kinds of Free Basics browsers, they will work to ensure their mobile web offering renders the way they want it to.

In this gatekeeper-less model, neither the user nor the online service has to ask Facebook’s permission to connect with each other. And that’s what makes all the difference. Rather than referring to an approved set of ~300 companies, the word “Basics” in Free Basics would denote any site or service anywhere in the world that provides a standards-compliant, low-bandwidth, mobile-optimized version.
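The gatekeeper-less model described above can be sketched in a few lines: instead of checking requests against an approved-provider list, the browser accepts any response that conforms to published technical specs. This is only an illustration; the allowed content types and the size cap below are hypothetical stand-ins for whatever specs Facebook would actually publish:

```python
# Sketch of "enforcement through code": any server anywhere
# qualifies for Free Basics by conforming to published specs,
# with no application or approval process.
# ALLOWED_TYPES and MAX_BYTES are hypothetical examples.
ALLOWED_TYPES = {"text/html", "text/css", "image/png"}
MAX_BYTES = 200_000  # hypothetical low-bandwidth size cap

def renderable(content_type: str, body: bytes) -> bool:
    """Return True if a server's response meets the published specs."""
    return content_type in ALLOWED_TYPES and len(body) <= MAX_BYTES

# A standards-compliant mobile page renders; a flash applet does not.
print(renderable("text/html", b"<p>hello</p>"))          # True
print(renderable("application/x-shockwave-flash", b""))  # False
```

The design choice the piece advocates is visible in the code: the filter inspects only the response itself, never the identity of the sender, so neither users nor services need Facebook’s permission to reach each other.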

D&S Advisor Susan Crawford argues that the President is on shaky legal ground in the FBI vs. Apple showdown:

The problem for the president is that when it comes to the specific battle going on right now between Apple and the FBI, the law is clear: twenty years ago, Congress passed a statute, the Communications Assistance for Law Enforcement Act (CALEA) that does not allow the government to tell manufacturers how to design or configure a phone or software used by that phone — including security software used by that phone.

CALEA was the subject of intense negotiation — a deal, in other words. The government won an extensive, specific list of wiretapping assistance requirements in connection with digital communications. But in exchange, in Section 1002 of that act, the Feds gave up authority to “require any specific design of equipment, facilities, services, features or system configurations” from any phone manufacturer. The government can’t require companies that build phones to come to it for clearance in advance of launching a new device. Nor can the authorities ask a manufacturer to design something new — like a back door — once that device is out.

D&S Board Member Anil Dash contrasts two recent approaches to making internet connectivity more widely available. Comparing the efforts to build consensus behind Facebook’s Free Basics initiative to LinkNYC, the recently-launched program to bring free broadband wifi to New York City, Dash views each situation as a compelling example of who gets heard, and when, any time a big institution tries to create a technology infrastructure to serve millions of people.

There’s one key lesson we can take from these two attempts to connect millions of people to the Internet: it’s about building trust. Technology infrastructure can be good or bad, extractive or supportive, a lifeline or a raw deal. Objections to new infrastructure are often dismissed by the people pushing them, but people’s concerns are seldom simply about advertising or being skeptical of corporations. There are often very good reasons to look a gift horse in the mouth.

Whether we believe in the positive potential of getting connected simply boils down to whether we feel the people providing that infrastructure have truly listened to us. The good news is, we have clear examples of how to do exactly that.

D&S Fellow Mimi Onuoha tries to figure out where her electricity comes from…and runs into a few roadblocks.

To know where your electricity comes from is to know all the points it travels through: the generators that produce it, substations that route and distribute it, transmission lines that transport it, transformers that raise and lower its voltage, and the service that directs it into your home. But lest you think the process is as straightforward as I have described, I should mention that for each step there are further caveats and complications. Feeders are divided into primary and secondary; there are upwards of eight transformers in each substation; service boxes are also connected to manhole vaults that serve as access points to equipment; and power plants go by a plethora of other names (generators, power stations, powerhouses). Each step of the process could prompt its own exploration.

D&S Fellow Mark Latonero considers the digital infrastructure for movement of refugees — the social media platforms, mobile apps, online maps, instant messaging, translation websites, wire money transfers, cell phone charging stations, and Wi-Fi hotspots — that is accelerating the massive flow of people from places like Syria, Iraq, and Afghanistan to Greece, Germany, and Norway. He argues that while the tools that underpin this passage provide many benefits, they are also used to exploit refugees and raise serious questions about surveillance.

Refugees are among the world’s most vulnerable people. Studies have shown that undue surveillance towards marginalized populations can drive them off the grid. Both perceived and real fears around data collection may result in refugees seeking unauthorized routes to European destinations. This avoidance strategy can make them invisible to officials and more susceptible to criminal enterprises. Data collection on refugees should balance security and public safety with the need to preserve human dignity and rights. Governments and refugee agencies need to establish trust when collecting data from refugees. Technology companies should acknowledge their platforms are used by refugees and smugglers alike and create better user safety measures. As governments and leaders coordinate a response to the crisis, appropriate safeguards around data and technology need to be put in place to ensure the digital passage is safe and secure.

It is well known that gang violence is a serious public health problem, particularly in Chicago, Illinois, where it has accounted for over 400 murders and 2,674 shooting victims so far this year.

What is less well known is that gang-involved youth rely on social networking sites like Twitter to communicate with friends and rival gangs.

D&S affiliate Desmond Patton is currently researching the use of social media by gang-involved youth in Chicago. In this post on Medium, he discusses some of the preliminary findings from this research, which reveal the ways that online messages of violence escalate into real-world violence. Find out more at Medium.

In one of her “Digitized Brainstorms,” D&S fellow Wilneida Negrón provokes readers to consider a new approach to education: one that considers the effects technology has not only on the world around us but on the way our brains are wired. Read more and contribute your thoughts to the discussion here.

D&S researcher Monica Bulger analyzes a recent OECD report on the benefits and drawbacks of using computers and technology to aid children’s learning. While the report finds that computer use yields negligible improvements in PISA scores, Bulger ends the note by suggesting more effective ways for researchers and parents to measure technological impact.

Ask the children. As part of the demographic data collected by PISA tests, adding questions about use of technology in the classroom would be a stronger and more accurate measure. School administrators might feel pressured to report on ideal use or expected use rather than actual use, but asking an entire cohort of students about their use will likely result in more accurate averages. Further, given the number of classes and teachers at any given school, expecting principals to be aware of specific practices might be unreasonable.

In this blog post for Ethical Resolve, researcher Jacob Metcalf discusses A/B testing and research ethics and argues that:

…data scientists need to earn the social trust that is the foundation of ethical research in any field. Ultimately, the foundations of ethical research are about trusting social relationships, not our assumptions about how experiments are constituted. This is a critical moment for data-driven enterprises to get creative and thoughtful about building such trust.