The Fourth Industrial Revolution — a global transformation characterized by the convergence of digital, physical, and biological technologies — is just beginning.

However, as the Fourth Industrial Revolution matures, it will have an unparalleled, disruptive impact on society, upending how people communicate, organizations create value and humanity understands itself. Building on the foundations of the digital revolution, emerging technologies in the Fourth Industrial Revolution scale up exponentially through digital interoperability, emerge physically in smart products and services, and embed themselves prolifically in society.

The Fourth Industrial Revolution has the potential, like the revolutions that went before it, to consolidate power asymmetries, increase inequalities, and advance technologies that fail to embody human-centred values.

Legal and social protection systems are in the uncomfortable position of trying to keep up as new technologies are introduced at an increasingly rapid pace. While emerging technologies like artificial intelligence and drones can add value to human rights work, basic service delivery and other functions of civil society, these same technologies are already being used against civic freedoms and human rights, often by governments and private-sector actors seeking to limit the impact of civil society's activities and the voices of its constituents.

Government deployment of surveillance software, internet shutdowns and “blogging taxes” targets specific populations within a country, often with the deliberate aim of disrupting civic expression and freedoms. Artificial intelligence, advanced robotics, drones and other technologies have the potential to be used for similarly repressive purposes to assert control and limit human rights.

The direct and indirect effects of technology also push civil society organizations — advocacy groups, humanitarian organizations, development organizations, trade unions and others — into difficult, often simultaneous positions in relation to emerging technologies:

- Advocating for rights and freedoms while threats from new technologies are still nascent, difficult to understand or lacking legal protections;

- Deploying new technologies to increase efficiency despite limited resources and space to operate;

- Self-preservation in the midst of technology-enabled government attacks on civic space; and

- Limited engagement, as the capacities needed for change (talent, technology, funding, etc.) prove too difficult to procure.

While governments and companies in democratic societies have the responsibility to take account of and mitigate the potential harms and effects of their activities, civil society and citizens aim to answer a range of questions about the decisions those governments and companies make: Can we be sure that the decisions are accurate? Do we understand the impact of these decisions? Can we trust that the decisions were fair and unbiased?

While the state’s legitimacy is bound by its accountability to its citizens, the private sector is not bound in the same way to consumers. However, increasingly across most industries, companies have become compelled — as a result of market dynamics, corporate interests and public image concerns — to act in ways that satisfy customers and benefit society as a whole (through corporate social responsibility). Although companies have largely followed these trends, clear gaps often exist where citizen interests and oversight come into conflict with shareholder interests.

The use of digital and emerging technologies by companies, governments and civil society groups has ushered in new challenges and entrenched existing ones associated with accountability, fairness, trust and transparency:

- Fairness: How can society ensure that individuals and groups are treated in the same way? Challenging areas include: the lack of inclusive or participative models; auditing disparate impact in algorithmic systems; and selection bias.

- Trust: How does the use of digital and emerging technologies promote, rather than decrease, trust within society? Challenging areas include: the lack of utility of digital data notice and consent mechanisms; limited dialogue and communication; and a power imbalance.

- Transparency: How does the use of digital and emerging technologies promote transparency? Challenging areas include: the lack of interpretable frameworks; harmful transparency from open data (causing group privacy violations); and information manipulation and opacity (e.g. in algorithmically curated systems recommending alternative or extremist versions of available information).

Turning the tide, together

Concerns about data protection, digital misinformation and ethical use of technologies highlight the need for greater participation and partnership in governing how these powerful, emerging technologies shape societies.

Civil society organizations and the broader ecosystem of social innovation stakeholders (businesspeople, academics, philanthropists, social entrepreneurs, etc.) need to help change the current trajectory.

Academia has long hosted several ongoing conversations about responsibility and ethics in data, algorithms and technology use. In the last decade, several coalitions and groups (e.g. related to responsible data, responsible/ethical AI, responsible drones, etc.) have emerged involving practitioners, data scientists, computer scientists and some members of civil society. In addition, several sets of principles and recommendations have emerged from multi-stakeholder gatherings, emphasizing interpretability, oversight controls and the identification of shared responsibility issues related to digital innovation, AI and other emerging technologies.

Civil society organizations have mainly been working within their own organizations and networks to draw insights on innovation and technology adoption.

However, there is a need to move faster, together: sharing insights and resources for foresight, strategy development, and shared responsibility among civil society and other stakeholders in social innovation. Key shared experiences and challenges among all types of nonprofit and civil society organizations warrant a broader platform to discuss strategies for avoiding hype, protecting against digital harms, and ensuring a different, fairer kind of industrial revolution.

Examples of shared experiences and challenges include, but are not limited to:

- Alignment challenges: How does our use of data and technology align with the needs of our organization, our beneficiaries and our users? How do we identify and define this?

- Capacity challenges: What parts of our current organizational capacities reinforce or catalyse responsible data and technology use? What’s missing and what internal guidance is needed? Are there trade-offs in our current approach (e.g. talent procurement and project scope)?

- Responsibility challenges: How do our projects and partnerships reflect our organization’s responsibilities to protect against digital and technology harms? What is our organization’s role in addressing shared responsibility challenges in shaping how data and technology impact society?

- Ecosystem challenges: How are we engaging other stakeholders in the broader ecosystem of digital and emerging technologies?

The Forum has recently launched its project on Preparing Civil Society for the Fourth Industrial Revolution. In collaboration with the Forum Civil Society community, this three-year initiative aims to accelerate knowledge and promote shared resources towards facilitating innovation within civil society organizations and the sector’s meaningful inclusion in the governance of digital and emerging technologies.

Written by

David Sangokoya, Knowledge Lead, Society and Innovation, World Economic Forum