Tuesday, December 20, 2016

The instructions "do not fold, spindle or mutilate" used to accompany
IBM punch cards, a ubiquitous technology for capturing and storing data
for computational purposes up until the late 1980s.

As
colleges and universities began to computerize their student records,
many people felt they had become just a number, just
some data in a big machine. Of course, many people - whole demographic
groups - had long been familiar with this phenomenon. Some of our worst
moments in history include government/business alliances that used "data
on people" for a variety of harmful reasons. A single century provides
examples: the passbook requirements for Blacks in apartheid South
Africa, the stars forced on Jews in Nazi Germany, the internment camps for
Japanese Americans in World War Two, and the government files kept on American
citizens during the McCarthy era and the civil rights movement.

Data
on people can be used for good (improving health care,
educational opportunities, tracking environmental refugees,
enfranchisement, targeted advertising) or evil (discrimination,
elimination, disenfranchisement, targeted advertising).

The
Free Speech movement of the 1960s co-opted the instructions to "not
fold, spindle or mutilate" to apply to the humans captured in the data,
not just the punch cards.

Nowadays,
we are (or should be) aware that both businesses and governments are
collecting data on us in ways so pervasive and passive as to make punch
cards seem quaint. We also know that we have been complicit in making
our data available freely - often in exchange for search functionality,
social media connections, retail discounts, or two-day free shipping.

Given
this knowledge, people who are preparing to work with data - in any
capacity - need to think about the ethics of what they're doing. This
last week saw the rise of the NeverTech manifesto, in which tech company employees from across the spectrum vowed not to help build Donald Trump's Muslim registry (#NeverAgain.Tech). Other tech executives are signing on to commitments to civil liberties. These statements are important, but, really, they are more of a floor than an aspirational ceiling.
Refusing to participate in building tools that facilitate discrimination -
discrimination that defies the very principles of religious liberty on which the U.S.
was founded more than 200 years ago - hardly lives up to technologists' self-image
as disruptive, risk-taking creators of the future.

The generation of digital tools on which we now depend - social media,
search, mobile - is neither innovative nor groundbreaking so long as those
tools continue to destroy our ability to speak freely, to assemble
peaceably, and to learn, think and act privately. They are lazy
first-generation solutions, avoiding the tough issues of personal
agency, liberty, privacy, and civil rights.

We
the people who are the digital data, who are excited about its
possibilities, and who are dedicated to taking advantage of it must be
the ones to dismantle liberty-destroying, pervasively surveilled networks
and unaccountable third-party land grabs over our digital selves. We must be
the ones to fight for encryption as a fundamental bulwark of civil
society, to take on the difficult engineering tasks that encode and
protect personal privacy in pursuit of public benefit, and to invent
digital systems that align with and extend humanity's highest
aspirations for life and liberty.

We need bold action now to make the digital realm align with the
principles of justice, freedom, individual action and collective good
that centuries of humans have fought to codify in our most principled
democracies. To give up on the former is to destroy the latter.

To
defer to decades-old business models, special-interest influenced
governance protocols, or difficult engineering challenges is to default
on the opportunities we face, to walk away from enticing computing
challenges and disruptive possibility, and to choose business as usual.
Focusing our best minds and our creative capital on digital tools that
destroy civil liberties and threaten employment opportunities, while
ignoring those that would conserve our natural resources and enhance
human dignity, is to hasten our demise as free, peaceful people.

All of us - creators and users of digital tools - need to get out from behind our willful blindness and acknowledge that how we use digital data
is as important as what we do with it. Our digital lives depend on the
ethical choices we bring to - and that we demand of - the digital spaces
that are substructural to our daily actions. We must now take to the
streets, to the classrooms, to our open plan workspaces, to our
lawmakers, and to the board rooms to protect our digital rights and
enhance our humanity.

People need to protect themselves and to demand protections in the products they use and from the companies they buy from.

We
need to insist on government action that aligns with the founding
principles of democracy and doesn't toss them aside in favor of cowardly
falsehoods about national security or economic competition.

Organizations and individuals need to use their market power to demand digital products that they can use without compromising their social missions.

Tech
companies, hardware and software makers, telecommunication firms, and app
designers need to lead - and be rewarded for - person-protecting consent,
privacy, and security practices, transparency, and auditability.

Business
people need to stop resting on incumbent exploitative revenue models.
Now is the chance for true innovators to demonstrate an ability to
produce economic value in line with human and democratic values.

We, and only we, can
lead us into an era in which our human, civil and democratic rights are protected in digital spaces by
design and by default.

Sunday, December 18, 2016

This is a metaphysical question but one that may help you think about using digital data safely, ethically, and effectively at your organization. I've been mulling over this question for a while, and it seems there are many ways to conceive of the value and role of digital data to you and your organization:

As resources, like time or money

As assets (and liabilities)

As relationships

As a context or place

As a lifecycle

As a multiplier or expansion strategy

As ones and zeros, a binary language of representation

?

I'm bingeing again on the Raw Data podcast (Season 2!) and several of the episodes - plus the reflection on season one - make it clear that there are lots of ways to think about digital data.

Different people think about digital data in different ways. Someone involved in fundraising may see the digital data held in the organization's CRM system as evidence of the relationships they manage. The IT staff may see digital data as a cycle of responsibilities and vulnerabilities. Communications experts may think of online as a place or a context. Program staff may wonder how data can be used for greater reach or deeper insights. (I'm not sure how these different roles line up with these different mental maps - it might be an interesting thing to ask your colleagues.)

How you think about digital data (and how your colleagues do) can inform who needs to do what when you're thinking about your foundation's or nonprofit's data management and governance responsibilities.

This year's Blueprint includes several worksheets you can adapt to your organizational needs - to think about what data you have, what skills you need, and how data can help, or hinder, your pursuit of mission. Check out the worksheets here.

Saturday, December 03, 2016

The U.S. nonprofit sector often thinks of itself as being independent from government and markets. This self-image is held widely enough that one of the major trade organizations even calls itself Independent Sector. But independence in the digital age is...well...complicated. Almost all of the infrastructure used to transmit digital data is owned and monitored by the government and/or commercial firms that sell internet access, cloud storage, cell phones and mobile data plans, or that provide search functionality or social media by selling your data to advertisers, or that do all of the above. So if you're communicating key messages via social media, storing your donor and beneficiary files online, and using commercial software to send text alerts or work collaboratively on your program evaluations, just how independent are you, really?

Since the Presidential election on November 8, there have been a few impressive actions that recognize the dependence of nonprofits and foundations on digital systems owned and monitored by the government and/or commercial firms.

Libraries - a living example of nonprofits that manage digital data within a code of ethical practice - are taking action.

Nonprofits and foundations in far-flung places looking for help with digital operations and governance questions? Ask your local librarian.

The San Francisco Foundation launched a Rapid Response Fund to support movement builders. The Kairos Fellowship, for digital campaigners of color, is going strong (and applications are open).

I've written two previous posts on the threats to free assembly, expression, and privacy on which the President-elect campaigned, why we should believe those campaign statements, and what to do in the reality they represent.

Even including the actions in the above bulleted list, I've been underwhelmed by the philanthropic and nonprofit community's response to our dependent digital state. With a few exceptions, most foundations and nonprofits - even those expressing real concern about their issues - are going about their business as if nothing fundamental has changed. They don't seem to get just how "un-independent" they long ago became and what that dependence means now and for the next few (?) years.

Nonprofits and foundations work on a lot of issues. Many will tell you they work on behalf of vulnerable people - children, the elderly, the sick, the poor. Others cherish and work on behalf of people specifically targeted by the President-elect's campaign and its supporters, such as immigrants, Muslims, LGBT people, and women. The digital data that these organizations use every day - emails, funding information, text messages for outreach, photos, videos, web sites, program data, beneficiary information - is the lifeblood of their work. And every bit of it may be of interest to a government intent on "radical change" - which includes building registries, deporting people, "law and order," and building walls.

If your nonprofit or foundation works with or for vulnerable people, you should not make them more vulnerable. This was true on November 7. It's more true now. The incoming administration touts its plans to register Muslims. It banned selected reporters throughout the campaign. "Long memories" about political adversaries are proudly brought up by advisors to the administration. These are not normal actions or statements, and they don't bode well for the idea of either an independent press or an independent nonprofit sector.

Your organizational ability to manage digital data safely, ethically, and effectively is not an optional concern. It is a core operational and governance capacity. You cannot be an effective nonprofit or foundation unless you are attending to your digital assets with the same integrity, alignment to mission, and dedicated expertise that you depend on your lawyers, accountants, and financial advisors to provide regarding your human resources and financial systems.

This isn't just about the effectiveness of your organization (though that's a fine place to start). It's about the independence of, the nature and role of, and the future of independent organizations and independent civil society. Such a sector is based on the real practice of free assembly, expression and privacy, not just a presumption of their conceptual existence. That practice begins with you and your organization. You may not be able to create a copy of yourself in Canada. But the question remains...what are you going to do?

Social justice advocates, reproductive rights activists, racial equity leaders, librarians, civil liberties protectors, and journalists have been doing the hard work of protecting our rights for a long time. They have been in the forefront of protecting themselves (and us) in digital civil society against precisely the concerns being raised across the U.S. nonprofit, philanthropic, and activist communities.

Since 1990 and the founding of the Electronic Frontier Foundation (or maybe 1985 and the founding of the FSF), many have been warning that these same protections are needed in the digital age.

The newly elected U.S. president boasts of putting legal limits on the press and continues to show a deft hand at manipulating it. He's hired a white supremacist to work alongside him in the White House. He ran on a campaign of xenophobia, misogyny, and bigotry. We should take him at his word.

Civil society needs to stand up. This means ALL nonprofits and foundations. At the very least, these organizations need to stand by the activists who will be standing up. This is not a message just for the organizations and people who voted against the president-elect. The threats he has made to a free press, peaceable assembly and privacy are threats to an independent civil society. They are threats to all independent action.

All our civic action - from philanthropy to protest, from petitions to polling - now takes place on a digital infrastructure. Every organization that is dedicated to helping the vulnerable, to free expression, or that understands it is simply an institutionalized form of our right to peaceable assembly and private action for public benefit should realize now that its existence depends on the rights now threatened. As civil society has closed elsewhere, so has it now been directly, overtly, and rather unabashedly threatened by the people elected to lead our government.

Audit and improve your organizational governance policies and practices - DigitalImpact.io. Find colleagues you can work with at the Future of Privacy Forum. Organizations that provide capacity building, consulting, governance training, and technology support need to address digital governance and practices. It is not optional; it's integral to running a safe and effective organization.

Second, realize that your organizational existence - to say nothing of your rights as a citizen - depends on free expression, freedom to associate, and the right to act privately. The laws that protect these rights are the bedrock upon which your organization exists. Fight for them. Nonprofit peers such as EFF, ACLU, Center for Democracy and Technology, EPIC, Public Knowledge - these organizations are on the front lines of the policy issues that matter to digital civil society.

Friday, November 11, 2016

That democracy depends on an independent civil society is a bedrock assumption in political theory. In the USA, we've just held an election that will test this theory against reality.

Like so many people, I've spent the last few days trying to reconcile my feelings, my fear, my skills, my political beliefs, my social commitments, and my morality with the immediate and longer-term future that millions of my countrymen just voted for.

I believe we have to take the winning campaign at its word. The intention of the incoming administration is to take the USA back in time in terms of economic policies, racial equity, social justice, and its interactions with the rest of the globe. That's what the "again" meant.

Accepting that this vision has been handed the reins of power is daunting, but the past provides some perspective. We know how these types of choices have played out in the past. We can learn from history, our own in the U.S. and others' around the globe. We can look to previous generations and contemporary societies.

We who disagree with all of the above intentions of the incoming administration need to fight against these plans at every level. We need to protect ourselves and our neighbors from already escalating street level violence while also working for structural change that could actually provide justice and opportunity.

Civil society in the U.S. will be tested in terms of its ability to hold the newly elected administration accountable, to stand for the rights of those who didn't support the election victors (in this case, the majority of voters), and to remain steadfast protectors of our individual and collective rights to free expression, free press, free assembly, and privacy. Again, there are things we can learn from and build with allies in the U.S. and abroad. What has happened here is not unique, it has unfortunate parallels and amplifiers in many places around the world, here and now.

But, there are elements of this moment that have no easy historical analogues. The role of cyber attacks and cross national government/NGO manipulation may have antecedents, but in today's versions we see the dangers of the scale, rapidity, and decentralized nature that are also our digital systems' great strengths.

We know that most NGOs and nonprofits and civic associations are not equipped to manage and govern their digital resources in safe, ethical, and effective ways - either to protect themselves and the people they serve or to prevent themselves from becoming puppets of forces they cannot see.

Civil society doesn't have the luxury of time. The structures of civil society have been upended by the digital age - and not in ways that position us well to take on the tasks at hand. We knew what the demands were for digital civil society - and of democracies in the digital age - on Monday. But back then, we mistakenly thought we had time to bring our institutions and legal practices closer in line with the nature of digital action. Today these demands are clearer to more people - and more pressing. And we've lost too much time already.

Monday, October 03, 2016

I'm delighted to have co-edited this new volume, Philanthropy in Democratic Societies. The book is a product of an unusual process, one of workshops and seminars designed to create a multi-author volume that forms a more coherent whole than most such collections.

The blog HistPhil is running a series of pieces by each of the volume's contributing authors. My chapter uses the development of the Digital Public Library of America as a case study of philanthropy and nonprofits seeking to fill the liminal space between markets and governments. This role is not new. But filling such space when the resources to be managed are digital, the founding leaders are dispersed, and the ideal of the decentralized internet holds strong as a governing metaphor is not only the DPLA's story but a model of enterprises yet to come.

My contribution to the HistPhil series can be found here. The book is available here. If you are in the Bay Area, please join several of the book's contributors and me for a book launch at Stanford on October 27. Information is here.

Thursday, September 01, 2016

The book is the result of an 18-month
workshop with the chapter authors in which we considered the question of
philanthropy's fit in democracy through the lenses of history,
institutional structures, and values. Authors include Jonathan Levy,
Olivier Zunz, Rob Reich, Aaron Horvath and Walter W. Powell, Paul Brest, Ray D. Madoff, Lucy Bernholz, Eric Beerbohm, Ryan Pevnick, and Chiara Cordelli.

Saturday, August 20, 2016

These are the dog days of summer (northern hemisphere). This post has nothing to do with philanthropy, other than I took the top two photos while I was in Australia, working on digital governance in philanthropy. I'll have more to write on what I learned when I stop procrastinating by looking at photos.

Monday, August 08, 2016

I've been in Australia for several weeks, leading workshops on digital civil society and data governance in nonprofits, meeting with philanthropists, corporate leaders, and various government bodies, and searching for potential scholarly collaborators. I'll be writing a reflection piece on this work soon.

But I've mostly been thinking about the census. In December 2015 the Australian department that manages the census announced it would be collecting and storing real names with the census data. Penalties would be levied against anyone who didn't file a form or who used a false name. The Christmas Eve announcement went largely unnoticed at the time, but the (mandatory) census date of August 9 brought this issue back to everyone's attention during my time in the country.

At dinner one night in Sydney I sat next to a woman who was telling me how she,
always law-abiding and even professionally dependent on the census findings,
found herself contemplating obfuscation as she reviewed the form. The fact that
her name would be attached and stored shed a new light on the questions being asked about religion, income, and family structure.

Filling out the census is mandatory, everyone is set to file on a single day, and the push is to get most Australians to file online. There are immediate penalties that accrue daily for not filing or for omitting your name (making lying on the form or boycotting it altogether another example of privacy becoming a luxury item). The more I imagined myself facing down such choices the more the psychological tradeoffs bounced about in my head.

So I wonder, will the new census approach reveal that Australia is now home to a
million “mickey mouses,” a million followers of the R2D2 faith, or several
million people who simply make up all kinds of information about themselves?

I wonder – in addition to
considering the utility, the ethics, and the security issues of attaching names
to census data - did the folks at the Australian Bureau of Statistics consider the psychological calculations inevitably being made by those filling out the forms? By compelling the citizens to file and name themselves has the government created a situation in which a last grasp for privacy (and dignity) outweighs a civic obligation for accuracy?

How do our psychological needs, our civic responsibilities, our political attitudes toward government (or nonprofits or corporations) interact with the digital demands for information we face constantly? Will digital data demands make liars of us all? How does your nonprofit or foundation take these variables into account when you ask people for their data?

Tuesday, July 05, 2016

The news business has been through quite a bit over the last two decades. A year or so ago, Facebook made itself into a key distribution channel for magazine, newspaper and broadcast outlets - creating tools like Instant Articles, inking deals with news organizations to use Facebook Live, jiggering its news feed and striking deals with major outlets. Nowadays, news organizations pretty much depend on Facebook to get their stories in front of readers.

By now, we should all realize - it's their platform, their algorithm, their rules. News organizations know this, but Facebook's reach is so great they clearly decided they needed to play in the company's sandbox, regardless of the rules.

Nonprofits shouldn't make the same mistake.

The same week that Facebook announced it was pulling the rug out from under news companies' content it announced it was luring nonprofits to the platform to do their fundraising. Again.

Just as it promised the news companies, Facebook's pitch to nonprofits is about scale. Facebook may have 1+ billion users, but that doesn't mean they're all going to suddenly care about your organization.

Shifting your fundraising over to the platform may get you a few dollars in the short term. It may make it easier for an especially eager volunteer to run a fundraising event for you.

But be wary. If someone came to you and offered: "Why not hold all your events in our house, we'll manage all your invitations, process all the gifts, follow up with everyone who attends" you'd ask yourself, "What's in it for them?" Ask yourself the same question of Facebook (or any tech platform that you don't control). The answer is easy - they get all of the data on who, what, when, and how much. They own your fundraising data. And if they decide to change the rules on how their tools work (or close the doors of the metaphorical house) they can. If history is any guide, they will.

Nonprofits give up a bit of their independence, a bit of their donors' and constituents' privacy, and a lot of control with these arrangements. It may raise a little money in the short term. But in the long term it sells out the sector. Just ask your local newspaper publisher. If you still have one.

Friday, June 03, 2016

Once upon a time, the codes that guided society were the province of a few. The word of God was read and interpreted by priests and men of the church who told the people what the book said, what the codes for a good life were. And the people did as they were told.

(The above is a deliberately oversimplified analysis of 15-18th century western European historical canon, minus all the power struggles, racism, sexism, extra-Christian turmoil, and colonialism. I'm trying to make a point.)

Once upon a time, software code drove devices used mostly by those who could read and write software code. Then these devices and the networks they powered were opened to most (not quite all). And all became dependent on software-powered gadgetry and digital networks. But the code remained the province of a few, even as some led movements for open source, open data, open algorithms, open governance. But still, the many had no understanding of the nature of the code, its limitations or bounds.

As this code and its disciples brought their tools, which were designed around the efficiency of market forces, into other realms of life - such as the household, political systems, and civil society - there was tension. The value of efficiency, coded into the software, didn't always fit smoothly with the values of the household (privacy) or those of the governing systems (participation and representation) or civil society (justice, equity, beauty).

And so there was a clash of values, a clash of codes. And the priests of software code and the priests of governance found themselves at odds. And the people - to whom open data was given - were not equipped to use it.

And some of the nonprofits and foundations and associations that constitute civil society interrogated some of the software code. Over and over again they pointed out ways in which the code was misaligned with the task to which it was being applied. Examples of racial discrimination. Of algorithmic bias. Of new divides and new versions of exclusionary practice. Civil society served one of its most important functions - checking and re-checking the power of governments and markets (and their digital tools).

Still, a more fundamental structural divide remained. Call it a linguistic divide. Between those "fluent" in digital and those "fluent" in democracy and civil society. A reformation in access and capacity and understanding was needed.

And this is where we are today.

Codes - software and legal - are sets of values,
written and enforced. When using digital technologies within democratic
systems or for democratic purposes, the values embodied in the codes
need to align. The codes need to be able to interrogate each other, and
the people need to understand what is meant, what is captured in the code,
and what is being promoted or enforced by the collective set of rules and
tools.

The common term for helping individuals - non-techies - understand digital data and systems, codes and algorithms, is data literacy.
This is not my favorite term, but let's use it for now, recognizing
that it is not a one-way street. People need to better understand how
digital systems and codes work, and software coders need to understand the
priorities and principles of democratic practice - both "literacies" are
needed.

All involved - not just the priests with the books
(not just the software coders) but the people, our agents, our elected
officials, our judges - need to be able to understand the code.

Algorithmic accountability, open data, machine learning must be designed by and with those who understand the principles of democratic governance. It is the job of those who understand these systems to teach those who understand algorithms and vice versa.

We need both codes - the codes of democracy and software code - to be written, used, held accountable, and procedurally applied and interrogated together.

One does not have the solutions for the other, they must build solutions
together. If for no other reason than we (in democratic societies at
least) are all dependent on both democracy and software. There is no
"they," we are we.

Thursday, May 26, 2016

Suing news outlets with which you don't agree is not philanthropy. Wealthy individuals litigating an agenda by themselves (and secretly) is different from "impact litigation" led by public interest groups (even when financed by a few individuals). (See below * on associational power.)

The arc of platform consolidation built on the back of personal data that has contributed to the collapse of independent journalism is a story line we may see repeated in the nonprofit sector writ large.

Community-governed, small, independent associations - which de Tocqueville noted as core to American democracy - are threatened by homogenizing pushes for scale, efficiency, short-term metrics, and earned revenue.

These associations are key to what scholars call social capital, political wonks call civic engagement, and neighbors recognize as community. We overlook these roles of nonprofits and associations at our peril.

They are bulwarks against both economic and political monoculturalism - otherwise known as inequality and tyranny.

Associations fill this role in at least two ways. First, they provide support for a diversity of views.
* Second, their governance structure is intended to involve multiple people as a form of public accountability and a mechanism by which power can be scrutinized. Toward this end, associations face transparency and public reporting requirements (which sit in tension with anonymity). We're fooling ourselves if we think concentrated wealth or power is any less threatening in a nonprofit or philanthropic guise.

Creative Commons, Wikipedia, Mozilla, Electronic Frontier Foundation and the Internet Archive are our first models of civil society organizations purpose built for the digital age. We all manage digital resources now. We need new institutional forms.

Monday, May 09, 2016

Digital data are everywhere. They are replicable, generative, storable, scalable, nonrival and nonexcludable. Digital data are different enough from time and money - the two resources around which most of our existing institutions are designed - that it's time to redesign those institutions.

It's time for institutional innovation.

Nonprofits and nongovernmental organizations are familiar corporate forms that manage private monies (and time) for public benefit. Their corporate structure, reporting, and governance requirements direct resources to the public mission and provide bulwarks against misuse of financial resources. There is nothing in their corporate code or governance structure that equips them to do the same with digital data.

We need a new type of organization to manage and protect digital data for public benefit, especially digital data that is voluntarily contributed by individuals or other organizations.

There are a lot of building blocks for something like this. We know a lot about governance, digital data, and organizations. We have lots of models from participatory development to community based data collection to trust forms. We have ethical scaffolding in biomedical research and digital data collection that we can draw from. There are legal experts, design thinkers, experienced digital data users, digital rights activists, research reports and people from vulnerable communities who can inform the design of new structures.

There are many driving forces and vested interests. A map like this one - for this issue - would be helpful.

It's time that we:

Assume digital resources are here to stay

Get past pilot projects and stop acting like using digital data is a one-off action

Develop systems and standards for using digital resources well and safely

The Digital Civil Society Lab at Stanford is hosting a workshop on the role of Community Focused Ethical Review Processes as one step. We'll look at how a variety of nonprofits and corporations are developing new mechanisms to inform how they collect and use digital data from their communities. We'll report out on it and use what we learn to inform an ongoing effort to imagine - and reinvent - the institutional forms we need.

Friday, May 06, 2016

I just finished teaching a continuing studies class at Stanford on Tech for Social Good. My colleague and co-teacher, Rob Reich and I assembled this list of free online sites to follow/ newsletters to read for the class. I thought I'd share it here as well. Enjoy! (and let me know what I'm missing)

Sunday, March 06, 2016

That said, the last 7 minutes or so include some thoughts from me about our relationships to our digital data, why we need new rules for this resource, and why it matters during life as well as after it.

In addition to being fun to record, the interview process prompted me to think hard about perpetuity, immortality, the law and digital data. This is exciting. It also ties in nicely with an event on Giving in Time that Stanford PACS is co-hosting with Boston College School of Law - public event on campus on April 4, 2016. Stay tuned for more details.

Cathy O'Neill on "ethical data science." She looks at the way that society's values, our assumptions and software code influence each other. They are mutualistic. And increasingly inseparable. Those who write the algorithms, those who use them, and those whose lives are affected by them - in other words, all of us - need to understand this, question it, and use data and tools to lend new insights, not reinforce existing power imbalances.

Neil Richards on the need to be able to regulate code - software code and those who create it - in many uses and forms in the digital age and his admonition that Apple's arguments about privacy are sound, while their arguments about free speech are problematic. Applying a free speech framework to software code will make it very difficult to monitor and regulate uses of code that discriminate or cause other harms. And, increasingly, we are going to recognize that our civil rights battles are being fought on digital turf.

All three articles focus on our need to assume software code is fundamental now - to how decisions get made in society, business, and policy making. Indeed, they argue that software code undergirds how we act as private citizens, associate with one another, and express ourselves. These rights, in turn, support civil society as we know it. Those of us focused on improving nonprofit or foundation action, on using digital tools for social outcomes, on building globally influential digital tools for social good need to take these lessons to heart.

Philanthropy and civil society now rest on software code - this is digital civil society.

Thursday, February 25, 2016

Last Friday I was part of the International Data Responsibility Group's second conference where I heard incredible examples of how the World Food Programme is using data to guide its work feeding people in conflict zones and thought long and hard about data collaboratives and the possibility of data philanthropy.

On Saturday I read about the ways Russia is targeting attacks on human rights and aid-related NGOs' digital systems in Syria.

On Wednesday, I heard Kevin Carey discuss his book, The End of College, which looks at the economic opportunities that digital tools bring to higher education. It only hints at how the digital data generated in those environments will become a key resource and point of contention between schools, students, employers, regulators, researchers, and teachers.

Today, I pushed forward in my attempts to convene scholars of crowd (sourcing and funding) - or what we're calling CrowdX - from across the Stanford campus, in disciplines as diverse as civil engineering and business, social algorithms and democracy theory.

I also read that the charges that Airbnb "cooked its data books" in releasing information to New York regulators hold up. It took intrepid journalists to demonstrate this to the public and to regulators - and to get the company to fess up.

And, of course, like many people I am absorbed in the legal, ethical, and democratic arguments unfolding between the FBI and Apple.

All of these disparate events raise concerns about privacy, publicness, and data. But the one issue that I think they all raise - that has actionable, policy-related implications for philanthropy and civil society - is this:

What data must be made public (and auditable) by platforms that facilitate public services (transportation, shelter, funding charitable or public goods)?

We require public businesses, nonprofits and foundations to report on their activities. This information provides some form of accountability, is helpful in fighting fraud, and - in the aggregate - provides a critical lens into the state of our economy, our social sector, and our democracy.

So, what data do we need access to, as a public, to understand, oversee, and, yes, audit platform companies that facilitate transactions that meet the same criteria of public interest? They may be charitable in nature, public-good supporting, or mutual-aid related.

Setting the bar at its lowest - shouldn't we at least have access to the same information from these platforms that we would gather were the transactions happening in some other way? Why would the platforms - even as proprietary as they are - be given a pass on reporting data on transactions that we'd otherwise publicly report? And, given their own self-touted role in a "big data" economy, setting the bar that low seems, well, practically analog and 20th century.

We're working on this question. We don't yet have answers (although here are some ideas). We do know the answer is not "nothing, trust us." And we do know that getting the answer right matters to - and depends on - the active participation of existing institutions in civil society and philanthropy.

Friday, February 12, 2016

One of the conversations I had during the event was initiated by someone asking me the question, "When will the lawsuits begin?"

The point was that organizations won't change until they really have to, so nonprofits won't start really digging into data governance practices and policies (what digitalIMPACT.io provides) until they're legally required to do so. Lawsuits that lead to regulatory or legislative change, this person was suggesting, are a step toward the same type of organizational behavior change we're trying to support with digitalIMPACT.

History bears out this "theory of change."

Bad behavior, lawsuit, lawsuit, lawsuit, regulatory change is a plot line (or subplot) through a great deal of social and political history. Environmental protection. Civil rights. Gun laws. Campaign funding. And, yes, the protection of civil liberties online.

Tonight, I was catching up on email and half-watching the news when I heard this sentence from the professionally-alarmed local newscaster:

"Big news for parents. Your child's private data is likely on its way to a nonprofit advocacy group."

A nonprofit that advocates for special education has sued California school districts as part of its efforts to make sure kids are getting appropriate services. The school districts are (allegedly) sending electronic files of all students with names, addresses, and social security numbers to the organization. The news story went on to describe the privacy risks for children's data and point viewers (parents) to an opt-out process.

How long will it be until there is a countersuit?

Now we have an answer to Tuesday's questions. Lawsuits between nonprofits and public agencies about data have begun. Give it a minute and it will be the nonprofits getting sued. (Of course, lawsuits over digital data started years ago, at least as early as 1990, when EFF was founded).

Legal challenges for regulatory change are a tried and true means of changing policy. They are not the only way. There are things nonprofits and foundations can do to treat digital data with integrity and respect, and perhaps avoid litigation. Check out digitalIMPACT.io.

But we have no standards and no accountability for how nonprofits and foundations collect, use, and protect our personal data - whether we are acting as donors, beneficiaries, volunteers, or fee-paying customers. When I interact with a nonprofit I do so as a private person, giving my money and my time (and increasingly my data, such as a phone number, email address, and credit card number) to them to accomplish some public-facing purpose. My trust in that organization is key - to use my money, time and data wisely and in line with their mission.

The Digital Civil Society Lab and Markets For Good Initiative at Stanford are working with several partners to run the digitalIMPACT.io project to help nonprofits and foundations think about these practices. But we - the people - will have to be the ones to set the standards by which we can trust these organizations and hold them accountable.

Tuesday, February 02, 2016

I was recently asked by the amazing folks at Worldview Stanford, who have included me in some of their regular campus programming, to do an interview for their new podcast. So first I wanted to listen to what they were up to. I was so tuned in that I missed my bus stop.

In partnership with the Cyber Initiative at Stanford the Worldview folks have put together a great series (Raw Data) on how digital data are reshaping our lives. The Uploaded episode does a better job of explaining the power of metadata than any news story I've read. I wish I'd heard the episode on "crowd work" before sending Blueprint 2016 to press. The story on bitcoin made the slightly-less-near term changes that digital data and infrastructure may have on global finance accessible, compelling, and real.

I listened to the episode on digital data and voting the day after the Iowa caucuses. I should mention I also binge watched Scandal, seasons 1 - whatever. Let me put it this way - Shonda Rhimes' plotline on rigged elections was pretty darn scary in its plausibility, and it's got nothing on what Mike Osborne and Leslie Chang (hosts of Raw Data) interrogate in their episode, "Life, Liberty and the Pursuit of Data."

Each episode weaves a story from research. This is not simply a good Malcolm Gladwell article told on the air. The hosts interview scholars from multiple disciplines, visit companies, and speak with policymakers in such a way that you don't realize you've just had your mind blown by researchers in economics, law, engineering and psychology - you're just listening to a really good story. And learning a lot.

Tuesday, January 12, 2016

As if maintaining this blog weren't enough, I've jumped on the Medium bandwagon. You can find my stories here, the Lab's posts here, and a publication called The Development Set here.

Here's a piece I wrote about how the discussions about the Chan Zuckerberg Initiative might have played out had the announcement been covered accurately by the press.

Follow up your read of that with Tom Watson's piece from The Chronicle of Philanthropy, and I think you'll agree that the social economy and all that it portends - as I've been writing about in the Blueprint series since 2010 - has arrived.

Sunday, January 10, 2016

As always, talking with smart people makes me smarter. Last week at lunch Henry Timms asked, "What is it, Lucy? What is the digital rights agenda for civil society?" Since it's Henry, I knew I needed a sharp, pithy answer (preferably tweetable).

And the Stanford Digital Civil Society Lab works to help activists, nonprofits, and foundations work in alignment with these principles and to connect digital rights with civil society policy and scholarship.

Thursday, January 07, 2016

How would you react if you had to include your social security number alongside your name and address on charitable donations you made so that the nonprofit could then report that information as part of its public filings?

Let me guess...you'd say no way. That can't be safe. It can't be a good idea either for me to transmit that info or to think the nonprofit could store it or transmit it to the IRS safely.*

This is a big deal and a "tip of the iceberg" moment for nonprofits and foundations and donors - in other words, all of us - to think hard about the massive amounts of digital data that flow through nonprofits.

Nonprofits can't be expected to manage information like that securely. They're under-resourced as it is; every time they turn around, someone else is yelling at them about the money they spend on administrative costs and not on mission. And, oh by the way, big companies and the US government can't keep that kind of data safe - you really think a small community organization can?

There are lots of other issues about data security and ethical use out there. I'm hoping this success - on which the sector stood together - will help bring digital governance issues to the forefront. The digitalIMPACT.io site is designed to help address them - check it out here and be in touch if you have resources to share.

*Focusing just on data security issues. Says nothing about those who give anonymously and hope to keep it that way.

Monday, January 04, 2016

This video from the Open Society Foundations sums up the promise and peril of digital data, infrastructure, and governance to civil society. It is no coincidence that threats to civil society are increasing as we become more digitally dependent.

"Solidarity between the online blogger and the gay rights activist, between the NGO that's getting shut down and the social movement that's turning out on the streets. Because although those actors might look and think that they're different from each other, what they have in common is they're all manifestations of our right to organize and mobilize..."

Danny Sriskandarajah
Secretary General and CEO
Civicus
4:45-5:08

Yes, we can use digital tools to expand free expression and assembly; yes, these tools can be used to expand the voices we hear and participation by many. But civil society actors - in the US specifically - are fooling themselves if they think that digital tools are innately and always democratizing.

Civil society actors - in the US this means nonprofits and foundations as well as social movements, protestors, and activists - must protect the right and capacity to organize online, to express oneself and assemble peaceably outside of government or corporate control in digital spaces, if we are to maintain that right and capacity offline.

About me

Why is this blog called Philanthropy 2173?

This is a blog about the future. The year 2173 seems sufficiently far in the future to give us some perspective. As sure as we are of ourselves now, talking about the future - and making philanthropic investments - requires that we keep a sense of modesty and humor about what we are doing. Philanthropy is for the long term - for the year 2173.