Past TILT seminars

2018

11th December 2018 – Mireille van Eechoud

Topic: “The limits of lawmaking: Why a new intellectual property right won't save the media”

Abstract

In the current political climate, the key role media play as public watchdogs is both undisputed and under siege. Legacy print media struggle to make the shift to digital because the internet and the rise of social media have profoundly changed the landscape for news production and consumption. In response to these challenges, the EU plans to introduce a new exclusive right in information. The proposals for a press publishers right have drawn intense criticism, including from academics across Europe (see e.g. http://www.ivir.nl/nl/academics-against-press-publishers-right/). Decision-making is at a critical stage. In this lecture we take a closer look at the problems facing the media, at the different proposals on the table, and at the effects they are likely to produce.

Bio

Mireille van Eechoud is professor of information law at the University of Amsterdam, IViR. Her key areas of research are access to information, intellectual property and open data policy-making. Van Eechoud directs the LLM programme Informatierecht. Among other things, she is a member of the Commissie Auteursrecht (the Ministry of Justice's advisory body on copyright policy) and of the Advisory Council of the Dutch Data Protection Authority. She is also vice-president of the Dutch Copyright Society (Vereniging voor Auteursrecht) and a member of the Executive Committee of ALAI, the international copyright association.

6th December 2018 – Miho Kamitsukue

Topic: “Privacy Law in Japan - From Torts View”

Abstract

In Japanese law, the first prominent case to draw public attention to privacy was decided in 1964.

Since then, the theory of privacy protection in Japan has developed largely on the basis of Prosser's theory.

In today's Japanese society, however, both technological developments and attitudes towards individuality have changed dramatically. These changes have raised many important questions about how to protect privacy.

These questions may lead to larger and more profound ones, such as "What is privacy, and what is protection?" Most countries face these questions.

Japan faces the same questions, but it also has distinctive characteristics that make the picture somewhat convoluted.

For example, the theory of privacy in Japan tends to be based on Prosser, while the protection of personal data is modelled not only on the U.S. approach but also on that of the E.U., such as the GDPR.

So what is the best way to protect privacy, and what is privacy?

To answer these fundamental questions, one has to analyze both effective measures for protecting privacy and the meaning of privacy itself.

Bio

Dr. Miho Kamitsukue is Professor at Sapporo University in Hokkaido, Japan. Her research focuses on privacy in tort law. She is also a member of both administrative and governmental committees on privacy protection and personal data protection.

Abstract

The UN General Assembly has characterized climate change as “a common concern of mankind”. It is alleged that innovation and transfer of environmentally sound technologies (ESTs), which form part of the solution to climate change, have not taken place fast enough to effectively mitigate climate change. To the extent that ESTs are protected by intellectual property rights (IPRs), there is a growing recognition that the WTO’s Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS Agreement) could play a more important role in mitigating climate change. This book seeks to make the WTO TRIPS Agreement a more efficient and effective instrument for facilitating innovation and transfer of ESTs, mainly through legal interpretative devices.

Two main research questions addressed in the book will be discussed in the session:

First, whether and, if so, to what extent do the minimum IPR standards established by the TRIPS Agreement facilitate innovation and transfer of ESTs?

Second, whether and, if so, to what extent can the TRIPS flexibilities be interpreted to facilitate innovation and transfer of ESTs so as to address global climate change?

Bio

Dr. Wei ZHUANG is Counsel with Praximondo, a Geneva-based firm. She advises governments, international organizations as well as companies on all aspects of international trade law, assists governments in WTO dispute settlement proceedings and assists companies in trade remedy investigations. Previously, she worked at the United Nations and the WTO. She was a Marie Curie Fellow with the DISSETTLE (Dispute Settlement in Trade: Training in Law and Economics) Programme, a Visiting Fellow at the University of Cambridge (Lauterpacht Centre for International Law), and a Research Fellow at the Max Planck Institute for Intellectual Property and Competition Law.

Wei holds a Ph.D. in international law and an LL.M in international dispute settlement (MIDS). She is the author of the book "Intellectual Property Rights and Climate Change: Interpreting the TRIPS Agreement for Environmentally Sound Technologies" (CUP, 2017).

23rd November 2018 – Prof. Patrick Parenteau

Topic: “Comparing Climate Litigation Strategies in the US and the Netherlands”

Abstract

This talk will explore the similarities and differences between cases being litigated in the Netherlands and the United States to address the climate crisis. The cases to be discussed include the recent decision of the Hague Court of Appeal in Urgenda v The Dutch Government and the pending litigation between Milieudefensie and Royal Dutch Shell. On the US side I will discuss the Juliana case asserting a constitutional right to a stable climate, and the cases brought by cities and counties across the country seeking to hold the major oil companies liable for damages caused by climate disruption. At the heart of these cases is the debate over the proper role of the courts in adjudicating rights and responsibilities associated with the causes and consequences of climate change.

Bio

Patrick A. Parenteau is Professor of Law and Senior Counsel in the Environmental and Natural Resources Law Clinic (ENRLC) at Vermont Law School. He previously served as Director of the Environmental Law Center at VLS from 1993-1999, and was the founding director of the ENRLC in 2004.

Professor Parenteau has an extensive background in environmental and natural resources law. His previous positions include Vice President for Conservation with the National Wildlife Federation in Washington, DC (1976-1984); Regional Counsel to the New England Regional Office of the EPA in Boston (1984-1987); Commissioner of the Vermont Department of Environmental Conservation (1987-1989); and Senior Counsel with the Perkins Coie law firm in Portland, Oregon (1989-1993).

Professor Parenteau has been involved in drafting, litigating, implementing, teaching, and writing about environmental law and policy for over four decades. His current focus is on confronting the profound challenges of climate change through his teaching, writing, public speaking and advocacy.

Professor Parenteau is a Fellow in the American College of Environmental Lawyers. He is the recipient of the Kerry Rydberg Award for excellence in public interest environmental law and the National Wildlife Federation’s Conservation Achievement Award.

Professor Parenteau holds a B.S. from Regis University, a J.D. from Creighton University, and an LLM in Environmental Law from the George Washington University.

13th November 2018 – Laurens Naudts

Topic: “The Articulation of Fairness in Machine Learning: Justice, Equality and Data Protection”

Abstract

Machine learning techniques guide decision-making processes in a varied but steadily growing number of contexts, ranging from everyday activities, e.g. shopping advice, to key societal ventures, e.g. law enforcement. These data-driven approaches, however, inherently differentiate between individuals and groups of individuals. Since they affect, among other things, the information, resources and opportunities granted to individuals, the fairness of algorithmic differentiation and its consequences in terms of equality have been questioned. Within this context, the seminar will discuss how theories of justice have informed, and remain important for informing, developments and research within the field of ‘fair’ machine learning. The presentation will consider current fair machine learning approaches and whether they adequately address the ethical concerns generally associated with machine learning. After this preliminary assessment, it will discuss how theories of justice could contribute to future fair machine learning efforts. In particular, the seminar will assess how they could facilitate the further articulation of fairness in an algorithmic environment, not only with regard to the design of machine learning techniques, but also with regard to the contextual changes such techniques might generate once deployed in an actual environment. Finally, a preliminary evaluation will be made of whether the substantive and procedural mechanisms of data protection law, through their focus on the underlying data processes, can provide starting points for a regulatory framework capable of mitigating possibly unfair outcomes of algorithmic differentiation.

Bio

Laurens Naudts is a doctoral researcher in law at the KU Leuven Centre for IT & IP Law. His research interests focus on the interrelationship between artificial intelligence, ethics, justice, fairness and the law. Laurens’ Ph.D. research reconsiders the concepts of equality and data protection within the context of machine learning and algorithmically guided decision-making. As a researcher, Laurens has also been actively involved in several national and EU-funded (FP7 and H2020) research projects, including inter alia iLINC, Preemptive and currently VICTORIA (Video Analysis for Investigation of Criminal and Terrorist Activities).

Laurens was formerly appointed as a researcher at the European University Institute (Centre for Media Pluralism and Media Freedom) where he performed scientific research for the project "Strengthening Journalism in Europe: Tools, Networking, Training". E-mail: laurens.naudts@kuleuven.be

16th October 2018 – Izabela Skoczen

Topic: “Trust and Implicatures in legal contexts”

Abstract

Classical theories of communication rely on the assumption that the purpose of communicating is information transmission. However, human communication seems a much more complex interplay of an array of social goals that are not necessarily connected to informing your interlocutor of some fact. I explore these ‘strategic’ goals and try to suggest a modification of Paul Grice’s theory of communication so as to give an explanatory account of the linguistic processes within the realm of law. I also explore the usefulness of modeling communication in legal contexts through the state-of-the-art rational speech act theory (RSA), based on game theory, Bayesian decision theory and machine learning.

Bio

Izabela Skoczen obtained her PhD from the Faculty of Law and Administration at the Jagiellonian University in Krakow. In 2017 she took up employment with the Krakow Faculty of Law and Administration. Since 2014 she has been teaching two subjects to first-year law and administration students (Introduction to jurisprudence and Logic for lawyers). She is the author of a number of papers on the intersection of legal theory and philosophy of language, some of which have been published in journals such as the International Journal for the Semiotics of Law, and an active participant in international conferences. Her book entitled 'Implicatures within legal language' is forthcoming in the Springer Law and Philosophy series. She is a member of the interdisciplinary Jagiellonian Centre for Law, Language and Philosophy.

Abstract

Legal articles on AI usually commence with a preamble describing its (purported) revolutionary impact on nearly all areas of commerce and social practice. We are, after all, in the middle of another “AI summer.” It is also commonplace to indiscriminately assume that technological progress must be accompanied by new laws and regulations. Unsurprisingly, advancements in AI result in recurring suggestions to emancipate the AI-based system - be it as a symbolic recognition of its sophistication (robot rights!?) or as a method of shielding its human operator from liability for its malfunctions and emergent behaviors. Taking a skeptical point of departure and analyzing the potential problems from the perspective of transaction automation, I ask a number of seemingly simple questions. Is there a threshold in technological advancement the passing of which triggers the need to separate the AI from its human operators and change the liability allocation for its deployment? What would be the criteria of such separation? Is it the complexity of the algorithm, the type of delegated task or the degree to which the original human decision becomes dissociated from the final output produced by the AI? Does the human operator remain liable if, after initiation, the AI need not receive any further human input? Could or should the AI ever be liable? Who really deserves protection – the operator of the AI or the person interacting with it? The presentation aims to solicit a constructive critique of some initial findings. The broad assumption is that, technological sophistication aside, any regulatory efforts must be preceded by a diagnosis and classification of the actual legal problems created by AI in specific commercial contexts. The next step should be an evaluation of whether such problems (if any) can be addressed by existing legal instruments and principles.

Bio

In parallel with a line of research focused on distributed ledger technologies and smart contracts, I am involved in multiple projects relating to the legal implications of automation, the deployment of ‘intelligent agents’ in retail environments, as well as the transactional imbalances created by the use of consumer-facing technologies, such as predictive analytics and AI. Before joining academia, I worked in-house in a number of software companies, Internet start-ups and telecommunication providers in Australia, Poland, Malaysia and the United Arab Emirates, advising on e-commerce, payment systems, software licensing and technology procurement.

19th June 2018 - Esther Keymolen

Topic: "Trust and Technology: The evolution of a hotel key.”

Abstract

How to deal with hotel guests who forget to return their key? Over the years, hotel owners facing this challenge have come up with several technical solutions to cope with this understandable yet irresponsible behaviour of their guests. The hotel key has been replaced by a keycard and most recently by a digital key that can be downloaded onto a smartphone. In her talk, Esther will argue that with every step in the innovation process, the trust relation between hotel owner and hotel guest is mediated in a distinct way. The networked nature of the digital key enables the collection of personal information, based on which the hotel can tailor its services to the wishes of its guests. While this may be convenient for guests, it also makes them vulnerable, as they have only limited control over the data and increasingly come to depend on the conduct of the hotel. The digital key is not merely a key to open a hotel door; it also unlocks the personal information of the guest.

Bio

Esther Keymolen (1982) is assistant professor and director of education at eLaw, the Center for Law and Digital Technologies at Leiden University. Esther has research interests in the philosophy of technology, online trust, digital governance, and ethics of technology. Her dissertation: Trust on the Line. A Philosophical Exploration of Trust in the Networked Era has been published by Wolf Legal Publishers. Currently, she is engaged in several case studies on trust and privacy in relation to networked technologies. Future research will focus on the development of an ethical framework to ground trust in data-driven environments.

30th May 2018 - Ugo Pagallo

Topic: "The Hard Cases of AI, Robotics, and their Legal Governance"

Abstract

The workshop will focus on today's state of the art in AI and robotics, taking into account cases of disagreement among scholars in the legal field, e.g. the current debate on the "electronic personhood" of robots. The aim is to address the hard cases of AI and robotics through both the primary and secondary rules of the law, that is, both the rules that aim to directly govern human behavior and the rules that allow the primary rules of the system to be changed. At times, in the field of AI and robotics, we should put secondary rules first.

Bio

A former lawyer and current professor of Jurisprudence at the Department of Law, University of Turin (Italy), he is Faculty Fellow at the Center for Transnational Legal Studies in London, U.K. and NEXA Fellow at the Center for Internet & Society at the Politecnico of Turin. Author of eleven monographs and numerous essays in scholarly journals and book chapters, he has been a member of many EU research projects, among which the RPAS Steering Group on drones, the Group of Experts for the Onlife Initiative set up by the European Commission, and Expert for the evaluation of proposals in the Horizon 2020 robotics programme. He is currently working with the European Institute for Science, Media, and Democracy (Atomium) to set up AI4People, the first global forum in Europe on the social impacts of artificial intelligence. His main interests are artificial intelligence & law, network and legal theory, and information technology law (especially data protection law and copyright).

15th May 2018 - Jorge L. Contreras

Topic: “Property Rules, Liability Rules and Genetic Information”

Abstract

Under the laws of most western countries, there is no property interest in mere facts or information. Accordingly, significant biomedical research is conducted around the world using genetic, phenotypic and clinical data gathered from patients and research subjects, typically with the informed consent of the subject. Today, however, an alarming trend has emerged in the United States. Fueled by popular depictions of individuals disadvantaged by an unsympathetic research establishment (e.g., Henrietta Lacks), both individual litigants and policy makers have begun to speak of an individual property interest in genetic data.

Propertizing health data has potentially catastrophic implications for biomedical research, particularly if individuals have the right to withdraw data from in-progress studies or, worse, cause that data to be destroyed. In prior work, we have proposed that the property-like treatment of genetic data be replaced by affirmative regulations of researcher conduct (liability rules per Calabresi and Melamed) to protect individuals from abusive research practices. This approach would shift the landscape from one in which data-based research cannot occur without the consent of individual research participants to one in which research is presumptively allowed, but researchers face liability for overstepping the bounds of permitted activity. We explore two case studies in Denmark and the U.S. in which liability-based approaches have successfully been used to protect individual privacy while at the same time ensuring the integrity of the research enterprise.

17th April 2018 - Maja Brkan

Topic: "Do Algorithms Rule the World? Algorithmic Decision-Making and the Right to Explanation in the GDPR"

Bio

Maja Brkan has been Assistant Professor of European Union Law at Maastricht University since 2013, where she is responsible for coordinating the core course on EU institutions and for supervising students researching data privacy aspects of Big Data and Artificial Intelligence. She is Associate Director of the Maastricht Centre for European Law, a member of the European Centre on Privacy and Cybersecurity (ECPC), holds a position of Associate Editor of the European Data Protection Law Review and regularly presents her work at international conferences in Europe and in the US. She has published widely on privacy and data protection in peer-reviewed journals. She is the author of an award-winning PhD thesis in EU law and holds a prestigious Diploma of the Academy of European Law from the European University Institute. Before moving to Maastricht, she worked as a legal advisor (référendaire) at the Court of Justice of the EU (2007-2013). Under her supervision, students achieved first and second place at the widely recognised European Law Moot Court Competition.

Abstract

The purpose of this paper is to analyse the rules of the General Data Protection Regulation and of the Directive on Data Protection in Criminal Matters on automated decision making in the age of Big Data, and to explore how to ensure the transparency of such decisions, in particular those taken with the help of algorithms. Both legal acts impose limitations on automated individual decision-making, including profiling. While these limitations on automated decisions might come across as a forceful fortress strongly protecting individuals and potentially even hampering the future development of AI in decision making, the relevant provisions nevertheless contain numerous exceptions allowing for such decisions. Moreover, in the case of automated decisions involving personal data of the data subject, the GDPR obliges the controller to provide the data subject with ‘meaningful information about the logic involved’ (Articles 13(2)(f) and 14(2)(g)), thus raising the much-debated question whether the data subject should be granted a right to explanation of the automated decision. While such a right would in principle fit well within the broader framework of the GDPR’s quest for a high level of transparency, it also raises several queries: What exactly needs to be revealed to the data subject? How can an algorithm-based decision be explained? Apart from technical obstacles, we also face intellectual property and implementation obstacles to this ‘algorithmic transparency’. The paper seeks ways to reconcile the potential recognition of the right to explanation with the transparency requirement.

27th March 2018 - Sean McDonald

Sean McDonald, who gave a keynote at TILTing in 2017, will present an analysis of the most commonly suggested/pursued legal approaches to data and digital governance. There are multiple models worldwide, each of which implies a different perspective on what data is for, what negative impacts it may have, and what it can achieve.

Sean is the CEO of FrontlineSMS and a fellow at Stanford’s Digital Civil Society Lab. He is author of Ebola: A Big Data Disaster and co-author of the recently published Do No Harm: A Taxonomy of the Harms and Challenges of Humanitarian Innovation.

20th March 2018 - Jens Prüfer

Topic: "Competing with Big Data"

Jens Prüfer is an economist at Tilburg University’s Department of Economics and a member of the Tilburg Law and Economics Center (TILEC). His research focuses on institutional and organizational questions, applying economic methodology to a broad set of disciplines, including law, management, political science, history, religious studies, and computer science. He studied Economics and Chinese studies in Tübingen (Germany) and Singapore and holds a PhD in Economics from Goethe University Frankfurt.

Abstract

We study competition in data-driven markets, where the cost of quality production decreases in the amount of machine-generated data about user preferences or characteristics. This gives rise to data-driven indirect network effects. We construct a dynamic model of R&D competition, where duopolists repeatedly determine innovation investments. Such markets tip under very mild conditions, moving towards monopoly. After tipping, innovation incentives both for the dominant firm and for competitors are small. We show when a dominant firm can leverage its dominance to a connected market, thereby initiating a domino effect. Market tipping can be avoided if competitors share their user information.

27th February 2018 - Gianclaudio Malgieri

Topic: "R.I.P.: Rest in Privacy or Rest in (Quasi-)Property? The protection of data of deceased data subjects between Data Protection Law and National solutions"

Gianclaudio Malgieri is a Doctoral Researcher at LSTS, VUB, where he works on an H2020 project (SUCCESS) for the development of security & privacy by design techniques for smart meters. He also researches algorithm regulation and data ownership / data markets. He is also a lecturer in Intellectual Property law at the University of Pisa and in Data Protection law at the Sant'Anna School of Advanced Studies in Pisa.

Abstract

The protection and management of the personal data of deceased persons is an open issue in both practical and theoretical terms. The fundamental rights to privacy and to personal data protection (as well as secondary legislation, such as the GDPR) seem inadequate to protect the data of deceased subjects. Accordingly, data controllers might be free to process data of deceased subjects without any guarantee. This might have an adverse effect not only on the memory and post-mortem autonomy of deceased people, but also on their living relatives.

Different theoretical solutions have been proposed: post-mortem privacy (based on post-mortem autonomy); analogical application of copyright law or of inheritance law (data as digital assets). The concept of “quasi-property” (from common law jurisprudence) might also prove interesting.

Some EU member states have already provided different solutions for the data of deceased people. In particular, we will compare two examples: the Italian Data Protection Law and the new French “Loi sur la République Numérique” (which adopts an approach similar to advance healthcare directives).

2017

Aline is a Post Doc researcher at the German Federal Institute for Occupational Safety and Health (BAuA), in the Division Hazardous Substances and Biological Agents. She is leading an interdisciplinary project on the establishment of governance networks, with representatives from academia, industry, and regulatory agencies, to explore their potential for contributing to the regulation of advanced materials such as nanomaterials. Aline received her PhD from the University of Twente in the Group Law and Regulation (Thesis title: Effective Regulation under Conditions of Scientific Uncertainty, 2015). She was a visiting scholar at the University of Padova (2014) and Northeastern University (2013). Aline received her Master of Science (Science and Technology Studies, 2011) and Bachelor of Arts (Arts and Culture, 2009) degrees from Maastricht University.

Abstract

In the regulation of technologies a well-known image is that of the hare (technology) and the tortoise (regulation). The latter moves slowly and is rather stiff whereas the hare is agile, moving quickly in different directions and is running ahead. In this talk I explore the opportunities and limits of a proposed solution to the problem that regulation cannot keep pace with technological change. Often regulators do not have the necessary information available to develop science-based rules for the mitigation of the risks of new technologies.

Most technological innovations in the area of chemicals, e.g. nanomaterials, are developed in startup enterprises and small research institutes. Typically, these enterprises do not have specialized expertise and knowledge related to chemical safety, and they require support in dealing with, and applying, the related regulatory obligations. The companies realize the need to learn about these aspects because potential human health and environmental risks are often still uncertain in the early phases of material development, while at the same time (potential) customers raise concerns about these issues.

I argue that engaging startups in governance networks can lead to the development of effective chemicals regulation. In these networks representatives from academia, industry, research institutions, and regulatory bodies can exchange and extend the knowledge on health and safety aspects of innovative materials. When the collaborators show strong trust in each other they exchange relevant knowledge which can provide the basis for the development of new scientific facts and (amendment of) science-based rules. However, ‘too much’ trust among collaborators bears the risk of capture and development of rules that merely fit the interests of industry. The nature of this ‘trust-based regulation’ will be explored more closely in future research with the idea to identify conditions under which this form of regulation can be applied.

On this basis a governance strategy will be developed for the regulation of new technologies under conditions of risk uncertainty.

Rather than providing clear cut research results, in this talk I want to present preliminary findings related to my postdoc project at the Federal Institute for Occupational Safety and Health (BAuA) in Germany. With this project BAuA starts engaging in the topic of regulatory governance. Ideally, in the future, new research projects in this area can be set up that allow us to strategically reflect on the impact of research and development activities related to chemicals as well as other themes such as robotics or the internet of things.

2nd of November 2017 - Alex Ingrams

Topic: "Governance tensions and big data: A framework for public values"

Alex Ingrams is an assistant professor in the School of Governance at Tilburg University, which he joined in January 2017. He earned his PhD in public administration from Rutgers University in the United States. His research interests focus on government transparency, accountability, and open government. His latest focus has been on open government reforms in the United States and the United Kingdom as well as challenges to public governance reform raised by digital trends such as open data and big data.

Abstract

New Public Governance literature gives substantial attention to governance dilemmas where trade-offs between core public values such as democracy and efficiency or hierarchy and networks present challenges for public agencies (Osborne, 2006). In a separate stream of literature, scholars herald a governance transformation that will result from the adoption of digital information systems reform as a tool of policymaking and service provision (Dunleavy et al., 2006).

One focal point of digital reform, big data, may present new public values dilemmas centred around traditional or novel types of trade-offs. For example, some scholars have already noticed that big data can provide formulas for decision-making that are impartial, while also consolidating systems of discrimination against service users (e.g., Janssen and Kuk, 2016). Given the seriousness of these trade-offs, as well as the relevance of the two streams of literature to contemporary governance theory, the two streams need to be brought into closer alignment so that public agencies are better prepared.

This paper undertakes such a synthesis by focusing on the role of organizational design in public information institutions that deal with these tensions, such as watchdogs, freedom of information systems, and parliamentary ombudsmen. A framework is developed to analyze the tensions, and cases of several countries’ public information agencies are explored. The merits and disadvantages of the cases are compared, and theoretical propositions are advanced to identify some of the best ways forward as well as the challenges that lie ahead.

24th of October 2017 – David Wall

Topic: "Conceptualising Cybercriminals Today"

David S. Wall, PhD is Professor of Criminology in the Centre for Criminal Justice studies, School of Law, University of Leeds, UK where he researches and teaches cybercrime, organised crime, policing and intellectual property crime. He has published a wide range of articles and books on these subjects. He also has a sustained track record of interdisciplinary funded research in these areas from the EU FP6, FP7, H2020, ESRC, EPSRC, AHRC & other funders, such as the Home Office and DSTL. David has been a member of various Governmental working groups, such as the Ministerial Working Group on Horizon Planning 2020-25, the Home Office Cybercrime Working Group (2014-2016) which looked at issues of policy, costs and harms of crime and technology to society, and the HMIC Digital Crime and Policing working group in 2015, plus other committees. He is an Academician of the Academy of Social Sciences (FAcSS) and a Fellow of the Royal Society of Arts (FRSA). He re-joined Leeds University in August 2015 from Durham where he was Professor of Criminology (2010-2015) and Head of the School of Applied Social Sciences (2011-2014). Prior to moving to Durham he was Head of the School of Law (2005-2007) and Director of the Centre for Criminal Justice (2000-2005) at the University of Leeds. His current research projects on internet technology, crime and policing include the following ‘Public Confidence in Policing and Regulating Crime in a Changing Cyberthreat Landscape’ (EPSRC/ESRC CeRes), ‘Identifying and Policing Cybercrime in the Cloud’ (EPSRC/ESRC CRITiCal), ‘Ransomware and Cybercrimes of Extortion’ (EPSRC/ ESRC EMPHASIS); ‘Policing Bitcoin and Cryptocurrencies’ (N8 Catalyst Fund); ‘Understanding Organised Crime and Terrorist Networks’ (Horizon 2020 TAKEDOWN); ‘Transnational Organised Crime in the 21st Century’ (ESRC TNOC Synthesis). His recently completed projects include: ‘The Implications of Economic Cybercrime’ (City Of London Police, w. Cardiff Univ.) 
and ‘The Infiltration of Legitimate Economies by Organised Crime Groups’ and ‘The Proceeds of Organised Crime in Europe’ (EU FP7 ARIEL & OCP).

Abstract

More than a quarter of a century has passed since the internet captured the imaginations of criminals and cybercrime became part of the threat landscape and an international concern. Yet how much clearer are we today about i) what these offenders are trying to achieve, ii) what motivates them, iii) who they are, iv) how they organize themselves, and v) what we should do with them? This paper will explore answers to these five questions. It draws upon my 20+ years of research into cybercrime, as well as early findings from my new interdisciplinary, international and collaborative research projects that explore cybercrime and organized crime online in today’s world of cloud technology and the Internet of Things.

11th of October 2017 - Nicolo Zingales

Topic: "The rise of infomediaries and the marketization of data protection enforcement"

Nicolo Zingales is a Lecturer at Sussex Law School, where he teaches antitrust, IP and data protection law. He is also an affiliated researcher at TILT, TILEC and the Stanford Center for Internet and Society. His research spans different areas of internet law and governance, with a particular focus on the roles and responsibilities of intermediaries in the online ecosystem.

Abstract

With the data economy, personal information has become a valuable asset not only for companies offering customized goods and services or whose business model depends on selling consumer data to third parties, but also, and increasingly, for consumers themselves. Consumer awareness of the value of the information they provide is on the rise, as various options to trade, compare and extract value from that information become available. The General Data Protection Regulation (GDPR) makes a substantial contribution to consumer empowerment by strengthening enforcement and granting new substantive rights such as the right to data portability, the right to be forgotten and the so-called right to “explanation” for automated decision-making. While the scope of those rules still needs to be fleshed out through interpretation, the incentives are clearly lined up for the rise of intermediary entities specializing in the enforcement of data subject rights. This creates the conditions for a perfect storm in data protection law, pitting data-driven businesses against this new powerful force of “infomediaries”, who are likely to alter the competitive dynamics in the online ecosystem by providing a centralized avenue for personal information management and collective empowerment. This presentation reflects on the opportunities and challenges associated with relying on third parties for the exercise of data protection rights, contextualizing it as part of the more general trend towards the “marketization” of enforcement initiated by the GDPR.

26th of September 2017 - Eleni Kosta

Topic: "Guidelines for ethical assessment of EU projects"

Prof. Dr. Eleni Kosta is full Professor of Technology Law and Human Rights at the Tilburg Institute for Law, Technology and Society (TILT, Tilburg University, the Netherlands). Eleni obtained her law degree at the University of Athens (Greece) in 2002 and a Masters degree in Public Law at the same University in 2004. In 2005 she completed an LL.M. in legal informatics at the University of Hannover (Germany) and in 2011 she was awarded the title of Doctor of Laws at the KU Leuven (Belgium) with a thesis on consent in data protection. Eleni is conducting research on privacy and data protection, specialising in electronic communications and new technologies, as well as on health law. She has been involved in numerous EU research projects and is teaching "Capita Selecta Privacy and Data Protection" at the LLM Law & Technology of the Tilburg Law School. In 2014 Eleni was awarded a personal research grant for research on privacy and surveillance by the Dutch Research Organisation (VENI/NWO), which she is currently working on. Eleni also collaborates as associate with timelex (www.timelex.eu).

Abstract

Regulation 1291/2013 establishing Horizon 2020, the Framework Programme for Research and Innovation (2014-2020), introduced new rules making ethics an integral part of research, from beginning to end, for all activities funded by the European Union. Since then, ethical compliance has been seen as pivotal to achieving real research excellence, and the European Commission has introduced new ethics assessment processes. This seminar will present the new rules on ethics assessments and will provide guidance on how to deal with ethics issues when preparing applications for EU funding.

14th of September 2017 - Tim Falkiner

Topic: "THE LAW AND SCIENCE - Cybernetics, the Meta-Grammar of Legislation"

Tim is a retired lawyer and town planner from Melbourne, Australia.

Tim was one of the last generalist lawyers, educated in every field of law and he had, largely through accident, an enviably wide practical experience as both a lawyer (humanity) and town planner (science). In the sixties and mid-seventies he worked as an articled clerk and then as a solicitor with a law firm. In 1976 he took a job as a town planner in the Town and Country Planning Board studying part-time to obtain a Diploma of Applied Science (Town Planning). He then worked as the legal officer at the Ministry for Planning. In the late 1980s, Tim was employed in Constitution, Legislation and Advisings at the Victorian Crown Solicitor’s Office. Then he left and joined the Victorian Bar where, as a barrister, he carried out a wide range of advocacy, advice and drafting work.

Tim thus had an unusually wide range of experience, particularly at the Ministry for Planning. There, he was advising on and assisting in the administration of a wide range of planning and environmental statutory schemes at a time of great change and he could get some overview of their operation.

In the sixties and seventies the law was less complex, less voluminous and more certain. In the town planning course Tim was taught some control systems theory by two great mentors. Tim was interested in computers and, as a member of the Micro Computer Club of Melbourne, he learnt computer programming. He studied micro-Prolog and attempted to write computer programs to solve legal issues.

In the early 1990s, Tim noticed parallels in a book written by a lawyer on legislative drafting and a book written by the project manager of the IBM 360 computer and operating system. Tim tried to write a book integrating the knowledge of writing large sets of instructions (control systems) gained by the legislative draftspersons and software engineers.

The breakthrough came when Tim found a book in the Melbourne library titled, “Cybernetics – Brain of the Firm” by an English scientist, Stafford Beer, which enabled Tim to at least partly understand the underlying nature of control systems and enabled him to write his 1992 book, “Scientific Legislation”.

Nothing happened for over twenty years. But a few years ago, Tim was contacted by an Austrian informatics academic, who very kindly wrote, “… your book is definitely the only book I know which connects cybernetics and legislation and hence is going to the root causes [of vagueness and complexity in law].” A lengthy correspondence followed, which rekindled Tim’s interest in the work. Tim has since given lectures at the Parliamentary Counsel offices in Victoria and Tasmania.

Tim believes it is important that lawyers learn to understand law from a scientific, a cybernetic, viewpoint. In the 1960s, our laws were far more certain than they are now. Our society is more complex and, unless we can handle that complexity, our social, economic and environmental systems will continue to behave erratically or worse.

Abstract

The object of this lecture is to bridge the widening gap between law as a humanity and the hard sciences. There are natural laws that apply to control systems and, since legislation is a control system, these natural laws apply to legislation too.

General Systems Theory

Recapping some of the basic principles of general systems theory: the importance of interaction of parts of things, the dynamic nature of the interaction and the arbitrariness of the definition of any system.

Control Systems Theory – Cybernetics

Plato and Ampère used the term cybernetics. The modern science of cybernetics was developed in Britain and the USA following World War II.

The basic laws of control systems are:

There are fundamental principles of control which apply to all large systems (e.g. the mind, computers, social and economic systems).

All large systems have a control mechanism (e.g. a forest, a motor car, a corporation, modern civilized society).

The control system is isomorphic with the system under control.

The control system is part of and grows with the system under control.

The rate of change of the control system matches that of the system under control.

The regulator must match the variety of the regulated system to assure control (Ashby’s Law of Requisite Variety).

Ashby’s law is further examined together with the viable systems model, the nesting of viable systems and the application of the viable systems model to government.

Designing laws

Some advice is given on the implications of cybernetic laws for the design of legislative schemes, with remarks drawn from the lecturer’s own experience.

The system one is working on must be defined.

The purpose of the system must be defined using Stafford Beer’s principle of POSIWID (the purpose of a system is what it does).

The regulating agency must be organized isomorphically with the system under control.

The variety of the controlling and controlled systems must be balanced using variety attenuators and variety amplifiers.

The model must be tested with a variety of inputs to ascertain the outputs.

Monitoring and feedback mechanisms must be set up.

Feedback systems must be in real time and they should be acted on.

The regulator must be able to accommodate external disturbances.

Levels of government should share the same isomorphic structure.

The book “Scientific Legislation” applies cybernetics and software engineering knowledge to legislation. It needs reviewing and re-writing, and contains the beginnings of an outline of a future legislative style based on cybernetic principles, which might overcome the problems of vagueness, complexity and volume, as well as problems with current legislative maintenance.

27th of May 2017 - Lennon Chang

Topic: "Co-Production of Cyber Security"

Dr Lennon Yao-chung Chang is Lecturer in Criminology in the School of Social Sciences at Monash University, Australia. He is also an associate investigator at the Australian Research Council Centre of Excellence in Policing and Security at the Australian National University and a member of the International Cybercrime Research Centre at Simon Fraser University. He is Vice-chairman of the Asia Pacific Association of Technology and Society, which he co-founded in 2014. He was previously Assistant Professor of Criminology in the Department of Applied Social Sciences at City University of Hong Kong (2011-2015).

Dr Chang is interested in researching cybercrime, the governance of cyberspace and co-production of cyber security, particularly in the Asia-Pacific region. His book Cybercrime in the Greater China Region: Regulatory Responses and Crime Prevention (Edward Elgar, 2012) is about the nature and range of responses to cybercrime between China and Taiwan. His research is highly topical and he has been invited by the governments of Canada, Taiwan, Korea, and Hong Kong to discuss his research findings with senior national security, foreign policy and policing staff.

He was awarded his PhD by the Australian National University in November 2010, and holds Master's in Criminology and Bachelor of Laws degrees from National Taipei University. In 2007 he received an Endeavour Asia Award, and in 2009 he was identified by Peking University and the Griffith Asia Institute as an Australia-China Emerging Leader. He was recently selected as a Global Emerging Voices Fellow and an Australia-China Youth Dialogue Fellow.

Abstract

Given the limited resources and capabilities of states to maintain cyber security, a variety of co-production efforts have been made by individuals or by collectives, of varying degrees of organization and coordination. This article identifies different forms of citizen co-production of cyber security and notes the risk of unintended consequences. Safeguards and principles are proposed in order to facilitate constructive citizen/netizen co-production of cyber security. Although co-production of security can contribute to social control, only those activities within the bounds of the law should be encouraged. Activities of private citizens/netizens that test the limits of legality should be closely circumscribed.

Damian Clifford is a doctoral researcher funded by Fonds Wetenschappelijk Onderzoek – Vlaanderen (FWO) at the KU Leuven Centre for IT and IP law (CiTiP). Before being awarded the FWO Aspirant Fellowship, he worked on FP7 and Horizon 2020 projects in the fields of critical infrastructure protection and cloud computing at CiTiP and also worked on a Flemish IWT-SBO project on personalised advertising at the University of Antwerp. Damian’s doctoral research is entitled “The Legal limits to the monetisation of online emotions” and will examine the legal issues surrounding the monetisation of online behaviour and emotions and the nudging or manipulation of internet users.

Abstract

This article explores the alignment of the consumer protection and data protection policy agendas. By exploring the historical roots and development of these respective policy agendas, the analysis aims to draw normative conclusions on the obstacles hindering more holistic citizen-consumer protection and call for the pursuit of a broader consumer protection policy agenda. The article argues that a continued focus on market integration in consumer policy will fail to appropriately balance fundamental rights issues and thus calls for a re-evaluation of Article 169(2)(b) as a potential basis through which consumer policy could be harnessed and mobilised to protect fundamental rights in the digital age. This line of argumentation builds on the recent criticism of the proposed Directive on contracts for the supply of Digital Content (which proposes to recognize data, including personal data, as counter-performance) and the subsequent European Data Protection Supervisor’s opinion on the proposed Directive indicating that Article 114 TFEU is the incorrect legislative basis for matters pertaining to the processing of personal data.

Dr Gregor Urbas is an Associate Professor of Law at the University of Canberra, where he teaches Criminal Law and Procedure, Cybercrime and Evidence Law. He has written extensively on criminal justice issues. He previously held positions at the Australian National University (ANU), the Australian Institute of Criminology (AIC) and the Law Council of Australia, and is an Adjunct Associate Professor at the ANU College of Law and at Simon Fraser University in Vancouver, Canada. Dr Urbas is currently a Visiting Fellow at the Tilburg Institute for Law, Technology and Society (TILT) at Tilburg University. He wrote the Australian country analysis for the Sweetie 2.0 Report published by Terre des Hommes in late 2016.

Abstract

The general legal prohibitions on the use of technologies for covert surveillance have exceptions for public authorities such as law enforcement agencies, whose actions are authorised prospectively through judicial warrants or similar authorisations. Private use of surveillance devices is less well regulated, though there are exemptions based on consent and the protection of lawful interests, which arise in some civil litigation. However, neither public nor private regulatory models appear to cover advanced evidence-gathering techniques such as the “Sweetie 2.0” chatbot developed by the Dutch organisation Terre des Hommes. This presentation explores the issues and potential future directions of surveillance law reform.

14th of March - Mark Patterson

Topic: "The Application of Competition Law to Information Goods"

Mark R. Patterson is Professor of Law at Fordham University School of Law. He received his B.S. and M.S. degrees from The Ohio State University, both in electrical engineering, and he performed engineering research, primarily in robotics, for ten years. He then obtained his J.D. degree from Stanford Law School and practiced law at Choate, Hall & Stewart in Boston.

Professor Patterson teaches and writes about antitrust law, patent law, Internet law, and contracts, and he has taught seminars on International Intellectual Property Licensing, Law and Scientific Research, Competition and Information, and Technology and Human Rights. Professor Patterson has been a visiting professor at Bocconi University in Milan and at the University of Navarra in Pamplona, and a visiting fellow at the European University Institute. His current research focuses on the antitrust treatment of informational issues. He is a registered patent attorney.

Daphne Keller is the Director of Intermediary Liability at the Stanford Center for Internet and Society. She was previously Associate General Counsel for Intermediary Liability and Free Speech issues at Google. In that role she focused primarily on legal and policy issues outside the U.S., including the E.U.’s evolving 'Right to Be Forgotten'.

Her earlier roles at Google included leading the core legal teams for Web Search, Copyright, and Open Source Software. Daphne has taught Internet law as a Lecturer at U.C. Berkeley’s School of Law, and has also taught courses at Berkeley’s School of Information and at Duke Law School.

She has done extensive public speaking in her field, including testifying before the UK’s Leveson Inquiry. Daphne practiced in the Litigation group at Munger, Tolles & Olson.

She is a graduate of Yale Law School and Brown University, and mother to some awesome kids in San Francisco.

Abstract

The "Right to Be Forgotten" under the GDPR may look very different from the one the CJEU established in Google Spain. Stanford's Daphne Keller will discuss the new notice and takedown requirements under the GDPR, and how existing European intermediary liability law may, or may not, work to prevent abuse and over-removal.

24th of January - Angela Daly

Topic: "Private Power Online: how does EU law fare?"

Abstract

The emergence of very large transnational private companies which provide critical Internet infrastructure and services has brought with it corresponding concerns about the power of these companies to control, surveil and otherwise influence our communications. Many of these companies also gather vast amounts of data by and about their users – a bank of data which has proved attractive to the public power of nation-states’ security and law enforcement agencies, which have accessed it in less than transparent and legitimate ways, as Edward Snowden’s revelations from 2013 attest.

Against this backdrop, and adopting a socio-legal methodology, this presentation considers some key topics, such as net neutrality, the Commission investigations into Google and the emergence of cloud computing, and considers how well existing EU legal and regulatory frameworks are able to protect individual Internet users’ interests vis-à-vis private power online.

This presentation is based on my book, Private Power, Online Information Flows and EU Law, which has just been published by Hart.

Bio

Dr Angela Daly is Vice Chancellor’s Research Fellow in QUT’s Faculty of Law and a research associate in the Tilburg Institute for Law, Technology and Society. She is a socio-legal scholar of technology and is the author of Socio-Legal Aspects of the 3D Printing Revolution (Palgrave 2016) and Mind The Gap: Private Power, Online Information Flows and EU Law (Hart 2016).

2016

5th of July 2016 - Angela Daly

Topic: "Don't believe the hype: 3D printing in law and society"

Abstract

Additive manufacturing or '3D printing' has emerged into the mainstream in the last few years, with much hype about its revolutionary potential as the latest 'disruptive technology' after the Internet to destroy existing business models, empower individuals and evade any kind of government control. This lecture will examine some of these themes from a socio-legal perspective, looking at how various areas of law interact with 3D printing theoretically and in practice, including intellectual property, product liability, gun laws, data privacy and fundamental/constitutional rights - and comparing this interaction to the Internet scenario. Despite rhetoric proclaiming 3D printing to usher in the end of government control and the end of corporate-enforced scarcity, 3D printing, especially consumer-oriented printers, may not be as disruptive to law and society as commonly believed. This is because other government and corporate actors, and not just empowered 'prosumers', have been investigating the potential of 3D printing for their own purposes, which may in the end just reinforce existing hierarchies and distributions of power.

Biography

Dr Angela Daly recently joined Queensland University of Technology's Faculty of Law as Vice Chancellor's Research Fellow. She is a socio-legal scholar of technology with expertise in intellectual property, human rights (privacy and free expression), and competition and regulation. Angela is the author of 'Socio-Legal Aspects of the 3D Printing Revolution' (Palgrave 2016), based on her postdoctoral research at the Swinburne Institute for Social Research, and 'Private Power, Online Information Flows and EU Law' (Hart 2017), based on her doctoral research at the European University Institute. Angela has degrees from Oxford University, Université de Paris 1 Panthéon-Sorbonne and the EUI, and has previously worked for Ofcom and the Electronic Frontier Foundation. She is also a research associate in the Tilburg Institute for Law, Technology and Society.

Abstract

Big Data analytics in national security, law enforcement and the fight against fraud can reap great benefits for states, but also carry new risks that require extra safeguards to protect citizens’ fundamental rights. Although most current government programmes still fall short of being ‘Big Data’, there is a clear trend towards the use of (big) data analytics in government policy. Moreover, there are inherent tensions between some fundamental principles underlying the current legal regime of privacy and data protection and some of the basic principles of Big Data in its ideal form. This requires a crucial shift from regulating data collection to regulating the analysis and use of Big Data. Dennis Broeders will present and discuss the key findings and recommendations of the recent WRR report ‘Big Data in a free and secure society’.

Bio

Dennis Broeders is professor of Technology and Society at the department of Public Administration and Sociology of the Erasmus University Rotterdam and a senior research fellow at the Netherlands Scientific Council for Government Policy (WRR), an advisory body to the Dutch government within the Prime Minister’s department. His research broadly focuses on the interaction between technology and policy, with specific areas of interest in cyber security governance, internet governance and Big Data. At the WRR he recently coordinated the project on Big Data, privacy and security that resulted in the 2016 report ‘Big Data in een vrije en veilige samenleving’ (‘Big Data in a free and secure society’).

21st of June 2016 - Stefania Milan

Topic: "Human rights in cybersecurity"

Bio

Stefania Milan is Assistant Professor of New Media and Digital Culture. Her research explores the interplay between technologies and participation, activism and social movements in particular, as well as cyberspace governance and data epistemologies. She is the founder of the Data J Lab (currently in transition to the UvA, and largely inactive) and the Principal Investigator of the DATACTIVE project, funded through a Starting Grant of the European Research Council (StG-2014-639379). Stefania holds a PhD in Political and Social Sciences from the European University Institute and a Master in Communication Sciences from the University of Padova, Italy. Prior to joining the University of Amsterdam, she worked at the Citizen Lab (University of Toronto), Tilburg University, Central European University, the University of Lucerne, Switzerland, and the Robert Schuman Center for Advanced Studies (European University Institute). Stefania is also a research associate at the Tilburg Institute for Law, Technology and Society (Tilburg University), at the Internet Policy Observatory of the Annenberg School of Communication (University of Pennsylvania), and at the Center for Media, Data and Society (Central European University).

19th of April 2016 - Sjaak van der Geest

Topic: "Privacy in different cultures: anthropological glimpses"

On 19 April professor emeritus Sjaak van der Geest visited TILT and gave a presentation during a seminar on: 'Privacy in different cultures: anthropological glimpses'. Sjaak van der Geest is emeritus professor of Medical Anthropology at the University of Amsterdam, the Netherlands. He conducted fieldwork in Ghana and Cameroon and published books and articles on marriage and kinship, perceptions and practices concerning birth control, witchcraft beliefs, anthropological field research, Ghanaian Highlife songs, missionaries and anthropologists, anthropology of the night, and various topics in medical anthropology, in particular the cultural context of pharmaceuticals in non-Western communities, hospital ethnography, perceptions of sanitation and waste management, and social and cultural meanings of care and old age in Ghana.
More info: sjaakvandergeest.nl

18th of April 2016 - Diana Carolina Valencia Tello

Topic: "E-Government and Accountability in Colombia"

Dr. Diana Valencia is a Post-Doctoral Researcher and Professor of State, Society and Technology at the Universidade Federal do Paraná, Brazil. Her research focuses on the extent to which public administration has changed due to new technology and the globalization process. Dr. Valencia is interested in opportunities for cooperation with TILT. In addition to giving a seminar for both TILT and TSPB/TSPPA staff members, Dr. Valencia gave a guest lecture for BA students in Media, ICT and Policy.

15th of March 2016 - Juraj Sajfert

Topic: "From point zero to adoption in six months – how the Police Directive was adopted"

Abstract

Shortly after political agreement was reached on the new EU data protection framework, the speaker will discuss why the European Parliament and the Council departed, in opposite directions, from the original Commission proposal for the Police Directive, how the institutions reached a compromise on two such different texts in less than two months, and the meaning and implications of the end result.

Bio

Juraj Sajfert is a lawyer at the European Commission, DG Justice, Fundamental Rights Directorate. In this capacity he has been closely involved in the process of drafting and negotiating the EU data protection reform, mostly on the new Data Protection Directive for Police and Criminal Justice Authorities. Juraj graduated from the Faculty of Political Science, Zagreb, and the Faculty of Law, Osijek. He received an LL.M. in Comparative Constitutional Law with a specialization in EU law from Central European University, Budapest. Juraj is admitted to the Croatian bar and briefly practiced law as an attorney. Prior to joining the Commission, Juraj was an intern at the European Court of Human Rights in 2010, worked for the European Parliament on the preparation of the Croatian version of the acquis communautaire and the EP's internal acts, and served as a case-processing lawyer at the European Court of Human Rights.

16th of February 2016 - Hans Lindahl

Topic: "Cyber-law and the Globalization of Inclusion and Exclusion"

Bio

Prof.dr. Hans Lindahl (Head of the Department of European and International Public Law at Tilburg Law School) holds the chair of legal philosophy at Tilburg University, the Netherlands. He obtained law and philosophy degrees at the Universidad Javeriana in Bogotá, Colombia, before taking a doctorate at the Higher Institute of Philosophy of the University of Louvain (Belgium) in 1994. He has since worked at Tilburg, first in the Philosophy Department and currently in the Law School. His primary areas of research are legal and political philosophy. His current research is primarily oriented to issues germane to globalization processes, such as the concept of legal order in a global setting; the relation of boundaries to freedom, justice, and security; and a politics of boundary-setting alternative to both cosmopolitanism and communitarianism. In dealing with these topics Lindahl draws on (post-)phenomenology and theories of collective action of analytical provenance, while also seeking to do justice to the nitty-gritty of positive law.

2015

8th of December 2015 – Nedap Healthcare

Nedap Healthcare provided a seminar about ‘EU legislation practice’. Two employees of Nedap Healthcare, André Foeken (responsible for the development of Nedap Healthcare) and Olaf van Zandwijk (security officer, responsible for the platform on which the Nedap Healthcare applications run), discussed the following issues.

Abstract

The upcoming privacy legislation brings quite some challenges. Big changes are ahead that have not yet fully taken shape; combined with the uncertainty concerning the time frame in which they will be introduced, this poses a great challenge. Business goals and privacy goals have to be drawn from the same pool of capacity, making good decision-making essential. Nedap Healthcare explicitly chooses to embrace the new legislation and to present its positive aspects for citizens to its customers. They briefly explained which organizational, technical and legal changes they have made and which steps they still have to take before the law comes into force.

17th of November 2015 – Douwe Groenevelt

(ASML)

Douwe Groenevelt provided a seminar on the dynamics of IP litigation in the high-tech sector, covering topics such as Non-Practising Entities, FRAND terms, patent law reform and settlement economics.

Bio

Douwe Groenevelt is Legal Director Technology at ASML. He was previously a lawyer in the Competition & Regulation department of De Brauw Blackstone Westbroek (the Netherlands), where his practice ranged from merger control cases and cartel investigations to abuse-of-dominance matters, with an industry focus on the technology sector. Building on his substantial experience in intellectual property law (cross-border patent and copyright litigation), Douwe has been particularly active on the IP/competition law interface, and he headed De Brauw’s Brussels office, which opened mid-2011. Douwe is also a lecturer at the law faculty of the VU University Amsterdam (Computer/Law Institute), where he teaches courses on the competition law aspects of technical standards, internet jurisdiction and the legal implications of online file sharing.

3rd of November 2015 – Students of the Delta Lloyd Clinic

Abstract

The students of the TILT Clinic in cooperation with Delta Lloyd (Ohra) presented the results of their assignment.

Ohra is piloting a new, innovative car-insurance product based on 'Pay How You Drive'. During the assignment the students worked on reviewing the conditions of this pilot insurance policy, with a specific focus on the privacy aspects.

The students also presented their results at the Delta Lloyd office on 23 October, in an interactive session that concluded with a presentation by the students, followed by a spirited discussion.

30th of June 2015 - Jelle Brands

(lecturer Criminology - Leiden University)

Abstract

Against a background of discourses that link the economic vitality of city-centres, consumption and safety to a greater need for surveillance and policing, my work takes particular interest in the city-centre night-time economy (NTE). Nightlife spaces can be considered emotionally charged spaces, offering many opportunities for transgression of social norms that are taken for granted during the daytime. This is demonstrated in the media and public debate, where the NTE has often come to be considered a source of crime and disorder. Local authorities have responded through significant increases in surveillance and policing in nightlife spaces. Looking up when out at night, we might spot one of many CCTV cameras attached to lampposts and run into special police teams that patrol nightlife areas. Such interventions are believed not only to reduce crime and disorder, but also to reduce fear of crime and to enhance experiences of safety among nightlife consumers. But whether they actually do so remains in many ways an unresolved question. On the one hand, this is because studies in human geography on first-hand experiences of safety and the triggers and extent of fear of crime have focused disproportionately on daytime experiences. On the other hand, little geographical research has considered the ways agents of surveillance and policing are experienced in the context of people’s everyday lives. From the perspective of the nightlife consumer, then, I have studied, and will speak about, experienced safety, surveillance and policing in the night-time economies of Utrecht and Rotterdam (the Netherlands). For various reasons, the results are not straightforwardly supportive of the common assumption that greater surveillance and more extensive policing will enhance experiences of safety among visitors of city-centre NTEs. Important differences in the capacities of police and CCTV surveillance to enhance experienced safety are evident.
At the same time, I will not only report on the efficacy of such interventions but also interpret these results against the background of nightlife consumers’ actual and embodied experiences when out at night. In doing so, I also reflect on the need for and desirability of surveillance and policing in urban nightlife spaces more generally.

2nd of June 2015 - Philip Boucher

(policy analyst at the European Commission’s Joint Research Centre)

Philip Boucher provided a seminar titled: "Domesticating the Drone".

Abstract

Drones are aircraft that operate without an on-board pilot (i.e., autonomously or by remote pilot). Civil drones are those used for non-military purposes. Drones vary substantially in their form, capability, quality and price, and their applications vary in their purpose, acceptability and legality. While the technology is ready for safe use in many application areas, regulatory barriers remain fairly strong. The Commission is currently preparing policy measures to support a Europe-wide approach to regulating civil drones. This would likely benefit the market, as a fragmented approach would make it difficult to compete with the more mature US and Israeli markets.

Concerns have been raised about public opposition to civil drones because of their association with military applications and with some potentially controversial applications, such as policing and border control. However, very little is understood about public reactions to the technology, and strategies to manage public acceptability have to date relied upon several untested assumptions. Here, I describe research undertaken at the Joint Research Centre on this subject during 2014. In particular, I describe the results of public engagement activities designed to explore public visions of civil drones. Recommendations are framed with reference to the responsible research and innovation agenda.

19th of May 2015 - Bibi van den Berg

(eLaw - Leiden University)

Abstract

“Two years ago a series of coincidences led my research endeavors into the field of cybersecurity. Having only superficial prior knowledge of this area of study, my first inclination was to ask ‘what is cybersecurity?’. Two years down the line I’m still asking the same question, and I have yet to find an answer. As often happens with buzzwords, cybersecurity is a topic of great interest to many, both within and outside of academia, yet also a source of utter conceptual confusion. One of the potential risks that always arises in the wake of conceptual murkiness is that researchers, policy makers and regulators don’t have enough overview of the breadth, width and depth of a phenomenon. In my view this certainly applies to current debates in cybersecurity. Scientists, legislators, opinion makers and the media almost all see the low-hanging fruit but miss the rest of the tree. Or in slightly more dramatic terms: they see the tip of the iceberg and are at risk of running the ship of cybersecurity into the ice that’s hidden underneath the water. In this talk I have detailed my explorations of the field of cybersecurity over the last two years, unravelling the conceptual muddle that I encountered and pointing out potential future lines of research for a more comprehensive approach to understanding cybersecurity vulnerabilities.”

7th of April 2015 - Annalisa Pelizza

(STePS- University of Twente)

Annalisa Pelizza provided a seminar titled: "Developing the Vectoral Glance. Contribution of Science and Technology Studies to the iGovernment Agenda".

Abstract

Integrating information systems has become a normative goal for governments worldwide. Systems of key registers are revitalizing early eGovernment efforts to streamline internal procedures. At the same time, much-needed concerns are arising about the risks for democratic accountability constituted by thickening governmental information infrastructures. Studies concerned with the political implications of interoperable information systems call for a new research agenda that looks at information flows, rather than at technical components. This paper aims at contributing to this endeavour by recalling that departing from technological determinism does not mean neglecting the technicalities of the infrastructures within which identities emerge. It does so by recovering the STS notion of ‘infrastructural inversion’, and by defining interoperability as a constitutive process of reordering that redefines institutional boundaries. The paper draws on a case study collected while working at governmental organizations in Italy. The analysis reveals a centralization of the power to certify data – traditionally retained by municipalities – taking place in a grey area at the intersection of juridical, technical and economic logics. I argue that to account for similar invisible shifts in institutional authority and accountability, it is paramount to develop a ‘vectorial glance’ that looks into scattered decisions happening at the micro-technical, operative level.

17th of March 2015 - Daniel Trottier

(Erasmus University & University of Westminster)

Abstract

This paper considers digital vigilantism as a user-led violation of privacy that not only transcends online/offline distinctions, but also complicates relations of visibility and control between police and the public. In 2013, Gary Cleary hanged himself in Leicestershire after being pursued by Letzgo Hunting, an online group that exposes suspected paedophiles. Likewise, in 2010 Mary Bale was subject to death threats when a video of her mistreating a cat in Coventry surfaced online. This practice, global in its reach, harms the lives of those who are targeted, with no clear legal or policy recourse. This research will develop a theoretically and empirically grounded understanding of digital vigilantism in order to advance ethical and policy guidelines.

Digital vigilantism (DV) is a process where citizens are collectively offended by other citizens’ activity, and coordinate retaliation on mobile devices and social platforms. The offending acts range from mild breaches of social protocol to terrorist acts and participation in riots. The vigilantism includes, but is not limited to, a ‘naming and shaming’ type of visibility, where the target’s home address, work details and other highly sensitive details are published on a public site (‘doxing’), followed by online and in-person harassment. The visibility produced through DV is unwanted (the target is typically not soliciting publicity), intense (content like text, photos and videos can circulate to millions of users within a few days), and enduring (the vigilantism campaign may be the top search item linked to the target, and even become a cultural reference).

In outlining a research agenda for studying DV, we may consider the following questions: In what ways does digital media culture foster DV? How does DV shape theoretical understandings of structural violence, state power and communication counter-power? How does DV shape theoretical understandings of an online private/public paradox? How does the news media represent DV? What are the roles, challenges and problems of state power in respect to DV? What are the social impacts of DV for targeted victims and participants? How can educators and policy makers minimize harm associated with DV?

24th of February 2015 - Dr. Nicolo Zingales

(TILEC)

Abstract

Should injunctive relief be available to the holder of a Standard-Essential Patent (SEP) which committed to license on fair, reasonable and non-discriminatory (FRAND) terms, in order to prevent a third-party implementer from practicing a standard reading on that SEP, when that implementer is willing to take a license but the parties disagree on the terms of the license?

This paper focuses on the peculiar European dimension of this debate. It examines how Directive 2004/48 on the enforcement of intellectual property rights, while topical, has been implemented and applied in diverging ways across leading Member States. EU competition law can be used to fill in that harmonization gap. The paper reviews the recent Motorola and Samsung decisions of the Commission, and sets out the issues in Huawei v. ZTE, now pending before the Court of Justice of the European Union (CJEU). The CJEU should be aware of the broader role and impact of EU competition law in these matters, and should seek to use its impending judgment to set the right presumptions for the application of competition law in SEP disputes involving FRAND commitments.

17th of February 2015 - Jo Pierson

(SMIT/VUB)

Abstract

In order to explore future challenges for privacy and data protection in social, mobile and ubiquitous media technologies, we need to take an interdisciplinary perspective. This means that findings from privacy and security engineering and the legal sciences need to be complemented with insights from research in the humanities and social sciences, in particular on how people socially engage with these media and how their communication is being (re)configured. In this way we are able to assess to what extent and how people are effectively empowered or rather disempowered in relation to their privacy and publicness. This forms the outset of the presentation, where we incorporate different socio-technological aspects, foremost building on the integration of Science and Technology Studies (STS) with Media and Communication Studies (MCS) (Pierson, 2014; Gillespie et al., 2014), in particular taking a critical stance on the co-construction of connective media and technological systems (van Dijck, 2013; Mansell, 2012; Feenberg, 1999).

For this we first take a macro perspective by discussing recent views in the humanities and social sciences on the changing landscape of internet and media technologies. We sketch the broader societal context, introducing the critical notions of 'culture of connectivity' and ‘digital seepage’. Next we take a micro perspective on the position of users in this culture of connectivity. We explain and situate user empowerment in relation to online privacy and how this is related to issues of vulnerability. The latter is typically subdivided into an external and an internal component, referring to exposure and coping. This division is then used for framing our research regarding (young) people and their privacy awareness, attitude, capabilities and practices. We illustrate and corroborate how empowerment and vulnerability take form in the everyday life and experiences of people. In this way our user-oriented research aims to safeguard a balance between strengthening the empowerment of users while at the same time unburdening the users with respect to the responsibility of protecting their privacy (responsibilisation).

Michael Nagenborg provided a seminar titled: “Anonymous, Anonymity, and the Ethics of the Mask”.

Abstract

In my paper I will not try to defend “Anonymous” on moral grounds, but I will point out some of the moral questions raised by “Anonymous” in various actions. In particular, I will argue that people acting behind the Guy Fawkes mask should not be seen as irresponsible, even if in the Western tradition of ethics responsibility is often linked with being visible as a person.

While hackers form only a small part of the people making use of and upholding the collective identity of “Anonymous”, “Anons” are quite skilled in making use of the negative stereotype of the “hacker” and of “internet users” in general. “Anonymity” plays a central role here, because it is often assumed – and argued in computer ethics – that the anonymous use of the internet undermines our moral character.

In my paper I will therefore focus on the ethics of the mask. The mask is an interesting device for resisting surveillance, because it allows a person to be present and yet makes it harder to identify that person. In this sense it is a powerful tool for returning the gaze while keeping one’s own face out of sight, preventing people not only from identifying the person but also from reading their face. It is important to note that the wearer of a mask does not hide, but remains visible. After all, a mask would be of little use if nobody were watching.

Following Agamben’s analysis of the link between the mask and the concept of personhood (Agamben 2009), I will explore the importance of the “ethical space” between the “persona” and the person. As Agamben and others have argued, modern technologies of identification threaten to close this reflective space. Therefore, Anonymous’ use of the Guy Fawkes mask can be understood as an attempt to open up this ethical space again.

Finally, the question will be raised whether members of “Anonymous” actually make use of this ethical space.

2014

Abstract

In this talk, he used the recent Google Spain judgment to discuss developments on the interface of data protection and other areas of law, in particular consumer protection, intermediary liability, and freedom of expression. In Europe, these developments illustrate significant convergence between these areas of law that could generally be characterized as the rise of data protection as a general ‘data policy’ framework. First, he clarified the underlying logic of convergence: why is convergence happening and why is data protection developing into an area of law with ever wider applicability? Since the Google Spain ruling of the European Court of Justice, it is clear that data protection (both the legislative framework and the fundamental right in the EU Charter) will shape the framework for intermediary liability and the protection of freedom of expression in Europe. There are still exceptions in data protection for media publishers themselves, but the applicability of data protection to their primary means of reaching an audience has made data protection important for publishers as well. There are several technical, economic, political and legal reasons that could be given for this development, some of which (but not all) are specific to Europe. One of those reasons is the rise of the paradigm (and myth) of ‘big data’ in policy and practice, and he argued that the breadth and scope of big data may best explain the rise of data protection as a general data-policy framework to ensure freedom, justice and fairness in our increasingly digital societies.

Second, he addressed specific challenges with respect to this convergence from a policy and conceptual perspective, including the potential interference of data protection with freedom of expression and the limitations related to the character of data protection itself. Considering the ever wider applicability and relevance of data protection, a crucial question is what data protection as a right and a legal framework really is. How does it relate to the right to privacy? What is at its core and what are its boundaries and goals? Of special interest is the question whether data protection is the right framework and has suitable instruments for setting substantive boundaries on the legality of data processing practices, specifically with respect to what can lawfully be made public and further disseminated. Perhaps some policy makers in Europe are expecting too much from a framework of such broad applicability? Is it possible that its general applicability, flexibility and scope have become detrimental to achieving proper policy outcomes in specific contexts?

25th of November 2014 - Andrew Murray

(London School of Economics)

Abstract

The talk represented work in progress on a book project on 'the objective self', dealing with a re-orientation of how law (should) think about people and technology. Murray talked about the trend of depersonalization, entailing that companies are not interested in us as individuals but only in small bits of us. These bits are (also) combined into profiles and used in decisions about individuals, which are made on the basis of digital representations of them without involving the individual. We are no longer the consumer; we are the product.
Murray argues that freedom of expression and the right to privacy are two sides of the same coin. They are related and share dignity, communication and democracy as their core elements. Copyright and the right to data protection are designed to give legal effect to these fundamental rights and form one dignity regime. Murray therefore states that proprietary rights should be extended to all aspects of the digital person: give individuals dignity by giving them property.

11th of November 2014 - Gianluca Monica

(Philips)

Gianluca Monica provided a seminar titled: "Challenges and opportunities in the Smart Home environment".

Abstract

Philips is a diversified technology company whose ambition is to grow into a global digital company, as witnessed by the recent launch of a number of smart connected products. Examples include the connected lighting platform Hue, the smart home cooker, baby monitor and air purifier. These propositions, together with products such as smart thermostats, door locks etc., will constitute the building blocks of the upcoming Smart Home environment.

While the technologies are available and ready, numerous challenges and concerns arise when considering the privacy aspects of Smart Home propositions. Challenges include:

Usability of the sensor data due to privacy concerns

Privacy of the service based on sensor and usage behaviour data

Practical difficulty in managing and sharing personal data

Lack of trust in the preservation of privacy of submitted personal details

In this seminar, we presented the current trends in the Smart Home market and the Philips propositions in this area. We also presented some scenarios involving personal data processing from Smart Home devices and engaged the audience in a discussion about privacy implications of the sketched scenarios.

Abstract

Technology is becoming an integral part of our daily lives. To ensure that this process unfolds with sufficient respect for important values such as privacy and freedom, in my Vidi project I will develop intelligent software that adapts to human norms and values. In this talk I will discuss the associated computational reasoning and interaction challenges as well as touch on ethical considerations.

23rd of September 2014 - Paolo Balboni

Paolo Balboni provided a seminar titled: "The Role of Standardisation in the Future of Global Personal Data Protection".

Abstract

Technology evolves at an increasingly fast pace and the traditional lawmaking process does not provide timely and convincing answers to actual needs. Some companies are in fact becoming more powerful than governments. Trade is naturally taking place on a geographically global scale. Top-down country-related regulation, together with a system based on increasing sanctions to encourage desirable conduct, may no longer be effective. Standardization can play a crucial role in this global and fast-changing scenario. In this lecture I discussed the work I have performed so far within the European Commission's Cloud Select Industry Group on the drafting of a Data Protection Code of Conduct for Cloud Service Providers (under Key Action 2: Safe and fair contract terms and conditions of the EU Cloud Strategy) and the work I am doing within the Cloud Security Alliance on the Privacy Level Agreement for Cloud Service Providers. These are very significant standardization efforts which demonstrate that the EU regulator and the market have acknowledged that cloud services urgently need privacy standards, in the form of privacy codes of conduct, and cannot wait for official regulations to be passed. By presenting this experience I showed the audience that the traditional way to regulate is seriously challenged by technology, by companies and by users' behavior, and that we need to increasingly look at flexible standards, accepted both by governments and industry, which work across nations and continents to guarantee users an effective right to privacy and data protection in the era of the personal data economy.