The Need to Govern the Digital Layer

Governance of the digital layer addresses who decides what data is collected and shared, and for what uses. Governance is needed to ensure security, privacy, equity and competitiveness.

What makes a smart city smart is the ability to collect data, process and analyze the data, and act on the insights. In Helsinki, open data was used to develop BlindSquare, a mobile GPS app that helps blind and visually impaired people navigate the city. This kind of smart city application requires the collection and sharing of multiple sources of data across public and private organizations.

In order to collect and share data, a smart city adds a "digital layer" to existing city infrastructure. This is similar to how an operating system adds a digital layer of software on top of the physical hardware of a computer. The difference is that for a city, the digital layer is not a single thing, but a network of many connected things. The digital layer includes:

Physical sensors to collect data;

Digital services that use data to control other infrastructure, e.g. traffic lights;

Databases that store the data on servers;

Algorithms and code to protect, manage, process, analyze and transform the data;

Maps, visualizations and models that organize and project the data into more useful forms.
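The components above can be pictured as stages in a simple data pipeline: sensors collect, databases store, algorithms process, and visualizations project the results into more useful forms. The sketch below is purely illustrative; the class and function names are hypothetical and not part of any real smart city platform.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str   # physical sensor that collected the data
    value: float     # e.g. vehicle count per minute at an intersection

class Database:
    """Stores readings on a server (here, simply in memory)."""
    def __init__(self):
        self.readings: list[SensorReading] = []

    def store(self, reading: SensorReading):
        self.readings.append(reading)

def average_flow(db: Database) -> float:
    """An 'algorithm' stage: process and analyze the stored data."""
    values = [r.value for r in db.readings]
    return sum(values) / len(values) if values else 0.0

def visualize(flow: float) -> str:
    """A 'visualization' stage: project the data into a more useful form."""
    return f"Average traffic flow: {flow:.1f} vehicles/min"

# A digital service could then act on this insight, e.g. retiming traffic lights.
db = Database()
db.store(SensorReading("intersection-12", 30.0))
db.store(SensorReading("intersection-12", 50.0))
print(visualize(average_flow(db)))  # Average traffic flow: 40.0 vehicles/min
```

The point of the sketch is that each layer is a separate component with its own governance questions: who operates the sensors, who controls the database, and who decides which algorithms act on the insights.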

There are four reasons governance of the digital layer is needed: protecting against data breaches and misuses; assuring individual and group privacy; advancing equitable distribution of value; and promoting economic competitiveness.

The Need for Security

Whenever information is collected and shared, there is the potential for data breaches. In 2018 alone, 1.1 billion Indian residents' personal information was compromised by a breach of Aadhaar; 500 million Marriott Starwood guests worldwide had their hotel reservation data stolen; and Under Armour, Quora, Facebook, Google, Cathay Pacific Airways, Saks and Lord & Taylor, and T-Mobile all suffered personal data breaches affecting millions of users. For smart city data to be shared responsibly, there needs to be an organization that protects the data from cyber attack.

There are also potential misuses of data shared for an intended beneficial purpose. For example, in 2013, the FBI asked the public to help gather evidence in the wake of the Boston Marathon bombing. Anonymous commentators on 4chan and Reddit quickly profiled high school student Salah Barhoum and dozens of others based purely on their clothes and appearance. Barhoum found himself being followed by strangers who were convinced he was responsible for the terrorist attack. Although reported instances of misuse of shared data are rare, mitigating the potential for misuse is an important reason governance of the digital layer is needed.

The Need for Privacy

Digital governance is also needed to protect privacy and to create a medium through which different privacy rights and obligations can be reconciled.

Privacy is recognized as a fundamental human right in the Universal Declaration of Human Rights. Although privacy is not expressly mentioned in the Canadian Charter of Rights and Freedoms, it has been recognized as an interest protected by section 8 of the Charter, which guards against unreasonable search and seizure. Privacy is also considered by some to underpin human dignity and the fundamental freedoms guaranteed by the Charter. Under this conception, privacy is not only an individual right, but also a societal good.

Part I Section 2 of the Canadian Charter of Rights and Freedoms guarantees four fundamental freedoms for everyone in Canada:

(a) freedom of conscience and religion;

(b) freedom of thought, belief, opinion and expression, including freedom of the press and other media of communication;

(c) freedom of peaceful assembly; and

(d) freedom of association.

The rights granted in the Charter are not absolute. They are subject to reasonable limits prescribed by law that can be justified in a free and democratic society (Section 1).

When discussing privacy, it is important to distinguish between different types of privacy interests. The Supreme Court of Canada has recognized three types of privacy interests: informational (information about a person), personal (one's body) and territorial (one's home or private spaces).

The types of privacy interests involved in civic digital trusts are primarily informational. Canada has federal and provincial laws that operate within the limits of the Charter to protect individual informational privacy interests while recognizing the needs of governments, public sector organizations and private sector organizations to collect, use and disclose personal information. For example, the federal Privacy Act protects the privacy of individuals with respect to personal information about themselves held by a government institution and provides individuals with a right of access to that information. In Ontario, the Freedom of Information and Protection of Privacy Act (FIPPA) fulfills a similar purpose for provincial government institutions, and the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) governs municipal corporations, such as the City of Toronto. There is also a Personal Health Information Protection Act in Ontario that applies primarily to hospitals and regulated health professionals when they collect, use and disclose personal health information for the purpose of providing healthcare to an individual.

Ontario (unlike Alberta, British Columbia and Quebec) does not have a private sector privacy law. Ontario also does not expressly recognize privacy as a fundamental human right (unlike Quebec). Instead, the federal Personal Information Protection and Electronic Documents Act contains the rules that govern organizations that collect, use or disclose personal information in the course of a commercial activity.

Under these laws, "personal information" is usually defined as information "about an identifiable individual". It is data that, on its own or combined with other pieces of data, can identify you as an individual. It can include information about your:

race, national or ethnic origin,

religion,

age, marital status,

medical, education or employment history,

financial information,

DNA,

identifying numbers such as your social insurance number or driver's licence.

The rules governing when personal information can be collected and how it can be used are not the same in the public sector and the private sector. In the public sector, public bodies can collect personal information without consent as long as it is authorized by a law that complies with the Charter and notice is provided to the individual. By contrast, in the private sector, express or implied consent of the individual is a requirement, with limited exceptions.

In a smart city, where sensors are embedded in roads, streetlights and public spaces, it becomes difficult, if not impossible, for private sector organizations to obtain meaningful consent from individuals. It can also be challenging for municipal governments and other public sector institutions to give meaningful notice to individuals.

There are also differences between how information that has been collected can be used. For private sector organizations, consent must be provided for specified uses. However, public sector organizations may be able to use information for uses that are consistent with the original purpose for collection. In the healthcare context, health information custodians generally, but not always, require consent for the collection, use and disclosure of personal health information.

In a smart city, data often has greater value for the many additional uses that may not have been foreseen at the time of collection. One way data differs from physical resources is that it becomes more useful the more it is used. Given the public-private partnerships that may be involved in a smart city, the differences in the privacy legislation applying to the different parties complicate the ability to collaborate on new uses.

Privacy by Design is a framework for safeguarding privacy developed by former Ontario Information and Privacy Commissioner Ann Cavoukian. It is now a global standard that has been adopted by the International Assembly of Privacy Commissioners and Data Protection Authorities.

The 7 Foundational Principles of Privacy by Design

1. Proactive not Reactive; Preventative not Remedial

The Privacy by Design (PbD) approach is characterized by proactive rather than reactive measures. It anticipates and prevents privacy invasive events before they happen. PbD does not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy infractions once they have occurred — it aims to prevent them from occurring. In short, Privacy by Design comes before-the-fact, not after.

2. Privacy as the Default Setting

We can all be certain of one thing — the default rules! Privacy by Design seeks to deliver the maximum degree of privacy by ensuring that personal data are automatically protected in any given IT system or business practice. If an individual does nothing, their privacy still remains intact. No action is required on the part of the individual to protect their privacy — it is built into the system, by default.

3. Privacy Embedded into Design

Privacy by Design is embedded into the design and architecture of IT systems and business practices. It is not bolted on as an add-on, after the fact. The result is that privacy becomes an essential component of the core functionality being delivered. Privacy is integral to the system, without diminishing functionality.

4. Full Functionality — Positive-Sum, not Zero-Sum

Privacy by Design seeks to accommodate all legitimate interests and objectives in a positive-sum "win-win" manner, not through a dated, zero-sum approach, where unnecessary trade-offs are made. Privacy by Design avoids the pretense of false dichotomies, such as privacy vs. security, demonstrating that it is possible to have both.

5. End-to-End Security — Full Lifecycle Protection

Privacy by Design, having been embedded into the system prior to the first element of information being collected, extends securely throughout the entire lifecycle of the data involved — strong security measures are essential to privacy, from start to finish. This ensures that all data are securely retained, and then securely destroyed at the end of the process, in a timely fashion. Thus, Privacy by Design ensures cradle to grave, secure lifecycle management of information, end-to-end.

6. Visibility and Transparency — Keep it Open

Privacy by Design seeks to assure all stakeholders that whatever the business practice or technology involved, it is in fact, operating according to the stated promises and objectives, subject to independent verification. Its component parts and operations remain visible and transparent, to users and providers alike. Remember, trust but verify.

7. Respect for User Privacy — Keep it User-Centric

Above all, Privacy by Design requires architects and operators to keep the interests of the individual uppermost by offering such measures as strong privacy defaults, appropriate notice, and empowering user-friendly options. Keep it user-centric.

To create a smart city of privacy rather than a smart city of surveillance, Ann Cavoukian believes the most important rule to enforce is de-identification of data at source. De-identification is the general term for the process of removing personal information from a record or data set. De-identification protects the privacy of individuals because once de-identified, a data set is considered to no longer contain personal information. If a data set does not contain personal information, its use or disclosure cannot violate the privacy of individuals. De-identification at source means de-identifying at the point of collection, before the data is stored in a database or shared. De-identified data creates a win-win, because it unlocks the value of data sharing for public benefit without violating individual privacy.
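De-identification at source can be illustrated with a minimal sketch: direct identifiers are dropped or hashed at the point of collection, before the record ever reaches a database or is shared. The field names and salting scheme below are illustrative assumptions, not a real collection protocol.

```python
import hashlib

# Illustrative list of direct identifiers to strip at the point of collection
DIRECT_IDENTIFIERS = {"name", "email", "licence_plate"}

def deidentify_at_source(record: dict, salt: str) -> dict:
    """Remove direct identifiers before a record is stored or shared.

    A salted hash of the device ID preserves the ability to link readings
    from the same source without revealing who that source is.
    """
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "device_id" in clean:
        digest = hashlib.sha256((salt + clean["device_id"]).encode()).hexdigest()
        clean["device_id"] = digest[:12]  # pseudonymous token
    return clean

raw = {"name": "Jane Doe", "device_id": "phone-123",
       "location": "Queen & Spadina", "timestamp": "2019-04-01T08:30"}
shared = deidentify_at_source(raw, salt="city-secret")
print(shared)  # no 'name'; 'device_id' replaced by a pseudonym
```

Note that salted hashing yields pseudonymous rather than fully anonymous data; a serious protocol would also generalize quasi-identifiers such as location and timestamp, for exactly the reasons discussed next.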

However, de-identified data can be subject to de-anonymization attacks. In 2006, Netflix launched an open innovation competition to improve the prediction of user ratings of films. Although the data sets it provided for the competition were de-identified, two researchers from the University of Texas at Austin were able to re-identify individual users by matching the data sets with film ratings on the Internet Movie Database. Privacy concerns contributed to Netflix cancelling a planned follow-up competition in 2010. A recent MIT study demonstrated that by combining two anonymized datasets of people in Singapore, one of mobile phone logs and the other of transit trips, 17% of users could be matched up with 95% accuracy.
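The re-identification technique in both studies is a linkage attack: records in a "de-identified" dataset are matched to records in a public dataset on shared quasi-identifiers. A toy sketch, with made-up data rather than the actual Netflix or Singapore datasets:

```python
# Toy linkage attack. Neither dataset alone reveals who rated what,
# but matching on shared quasi-identifiers (movie, rating, date) re-links them.

# "De-identified" dataset: user IDs replaced with opaque tokens
anonymized = [
    {"token": "u1", "movie": "Heat", "rating": 5, "date": "2005-03-01"},
    {"token": "u2", "movie": "Heat", "rating": 2, "date": "2005-07-19"},
]

# Public dataset: real names attached to publicly posted ratings
public = [
    {"name": "Alice", "movie": "Heat", "rating": 5, "date": "2005-03-01"},
]

def link(anon: list, pub: list) -> dict:
    """Match anonymized tokens to names via identical quasi-identifiers."""
    index = {(p["movie"], p["rating"], p["date"]): p["name"] for p in pub}
    matches = {}
    for a in anon:
        key = (a["movie"], a["rating"], a["date"])
        if key in index:
            matches[a["token"]] = index[key]
    return matches

print(link(anonymized, public))  # {'u1': 'Alice'}
```

Even this trivial example shows why removing names alone is insufficient: any combination of attributes that is unique to one person acts as an identifier once a second dataset is available.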

In an interview, Ann Cavoukian recommended strong de-identification protocols in combination with a risk of re-identification framework. The framework would assess the risk and only give the green light for collecting data if the risk of re-identification was very low, perhaps only 1 chance out of 200.
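Such a risk framework amounts to a simple gate: estimate the probability of re-identification for a proposed collection and approve it only below a threshold. A minimal sketch using the 1-in-200 figure from the interview; the risk estimate is treated here as a given input, since real frameworks derive it from statistical models of quasi-identifier uniqueness.

```python
RISK_THRESHOLD = 1 / 200  # 0.005: approve only below this re-identification risk

def approve_collection(estimated_reidentification_risk: float) -> bool:
    """Gate a proposed data collection on its estimated re-identification risk."""
    return estimated_reidentification_risk < RISK_THRESHOLD

# e.g. a dataset where roughly 1 in 1,000 records could plausibly be re-identified
print(approve_collection(0.001))  # True: below threshold, collection approved
print(approve_collection(0.05))   # False: risk too high, collection denied
```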

A second risk associated with de-identified data is that groups of people can still be analyzed, influenced and discriminated against using it. There is currently no legal protection for groups until it can be demonstrated that an individual is impacted. While there is an emerging literature exploring the challenging topic of group privacy, there is not yet a consensus on how to safeguard group rights.

The Need for Equity

A third reason to govern the digital layer is that large pools of data and the algorithms that process the data are a source of power. The vast amount of data that is owned or controlled by the technology giants Facebook, Apple, Amazon, Netflix and Google is a significant factor in their market valuations. If data is going to be aggregated from the daily movements of citizens to create and capture new value, we need to ask: are the benefits distributed fairly? How do the citizens, governments, and corporations that have together built and populated the digital layer share this new source of wealth?

One approach to these questions is to make all of the data open and free. Another approach is to assign all of the IP to a governing body like the trust. A third option is to create an IP sharing agreement.

Intellectual Property (IP) law exists to determine how to share value. According to Anthony de Fazekas, when parties agree to pool data so that it can be used by others, there are five forms of IP that need to be addressed. In non-technical terms, these are:

Background IP: the Intellectual Property of a party that is owned or controlled by that party before the agreement, or created by a party outside the scope of the agreement

Foreground IP: the Intellectual Property generated within an agreement

Latent IP: Applications of Intellectual Property other than the specific purpose for which the Intellectual Property was created

Arising IP: Intellectual Property that arises from a collaboration that was not expected at the time of the agreement

Shared IP: Intellectual Property that is jointly owned or pooled between two or more parties

An IP and data collaboration framework could specify who owns the data, who can access it, how the value of IP is assessed, and how royalties are distributed to the owners of the IP. In Canada, the five superclusters are creating "frictionless" IP sharing agreements that could be a useful model for a data trust.
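As a concrete, and entirely hypothetical, illustration of what such a framework could specify, its terms could be recorded in machine-readable form so that access decisions and royalty splits can be checked automatically. The parties, field names, and percentages below are invented for the sketch.

```python
# Hypothetical encoding of an IP and data collaboration agreement.
# Parties, field names and shares are illustrative, not from any real agreement.

agreement = {
    "shared_ip": {
        "owner": "civic-data-trust",                       # who owns the pooled data
        "access": ["city", "startup-a", "university-b"],   # who can use it
    },
    "royalty_split": {                                     # how value flows back
        "city": 0.50,
        "startup-a": 0.30,
        "university-b": 0.20,
    },
}

def can_access(party: str) -> bool:
    """Check whether a party is licensed to use the shared IP."""
    return party in agreement["shared_ip"]["access"]

def royalties(revenue: float) -> dict:
    """Distribute revenue from a new use according to the agreed split."""
    return {party: revenue * share
            for party, share in agreement["royalty_split"].items()}

print(can_access("startup-a"))  # True
print(royalties(1000.0))        # revenue split per the agreed shares
```

A real agreement would also need clauses for latent and arising IP, whose uses by definition cannot be enumerated in advance; this is one reason a standing governing body, rather than a static contract, is attractive.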

The Need for Competitive Markets

Data gathered in cities can drive innovation and be a source of new economic opportunity. If sharing data unlocks solutions to major problems, the companies that help solve them will generate revenue and jobs. These solutions could make life better here in Canada and then be exported globally. Conversely, if Canada lags other jurisdictions in finding ways to responsibly share smart city data, the hubs of this new industry will emerge in other regions, and Canada will become a consumer of solutions, rather than a producer.

Data is also a source of competitive advantage for individual firms. If one company holds proprietary access to large data sets that other companies can't access, this could effectively create monopolies with prohibitive barriers to market entry. If as a society we decide to instrument our cities to collect data and solve problems, we need governance to avoid anti-competitive practices. Sharing smart city data so that many companies have access is better than having the data under the control of a single company. However, governance of shared data should also consider the proprietary data companies already hold, which in combination with the shared data may create an even bigger asymmetrical advantage. If this is not considered, the trust could end up over-regulating a startup and under-regulating a tech giant.

Summary

In order to protect against data breaches and misuses; assure individual and group privacy; advance equitable distribution of value; and promote economic competitiveness, a governing body is needed to steward the digital layer in the public interest. The governing body would make decisions about what data can be collected; what rules must be followed to protect privacy and security through the full data lifecycle; and what uses of data are permitted. The governing body would need to include channels for public participation and stakeholder consultation to ensure that everyone affected by its decisions has an opportunity to have their voice heard. There would also need to be an accountability framework to hold the governing body accountable for the decisions it makes in the public interest.