Topics in Internet & Society Interdisciplinary Studies 2016

How to apply

The course (3 credits) is open to students from all PhD programmes and to any interested scholar (all backgrounds of study are welcome).

PoliTO PhD students must include the course in their study plan ("carico didattico"); see information on the course code here.

Course description

It is commonly understood that the Internet now affects almost every aspect of society – from knowledge sharing to economic interactions – and of personal life (digital divide permitting). Understanding the relationship between the Internet and society is a complex matter, since its shape and evolution are not dictated by an immutable technological law, but are instead the consequence of specific choices, both private and public, that could – at least in principle – very well change over time.

This course – addressed to students of all Doctoral Programmes – aims to provide an interdisciplinary overview of a selection of Internet & Society topics currently addressed by scholars at the global level, all of which have tangible implications in many domains and may well suggest new lines of reasoning for the ongoing research of PhD candidates.

These topics include, but are not limited to:

knowledge creation and sharing in the digital environment, applied, e.g., to scientific publications, addressing legal and social aspects of Open Access publishing, and to software, discussing the principles and applications of Free Software;

the role of the Internet in the evolution of journalism, civic hacking, and civic engagement;

the evolution of the Internet architecture and protocols with respect to entrepreneurial models and the economic environment;

the future of Internet & Society research, also taking into account ongoing technological innovations such as the Internet of Things, discussing their opportunities and risks.

All topics will be addressed from an interdisciplinary point of view (i.e., considering at least two disciplines) and with broad use of examples.

Programme of the lectures

A lecture based on four chapters from my forthcoming book, Being Human in the 21st Century: How Social and Technological Tools are Reshaping Humanity (Cambridge 2017).

Many people complain that technologies dehumanize. It is difficult, however, to know when a line has been crossed, when the social engineering has gone too far, when something meaningful has been lost. Do we know when technology replaces or diminishes our humanity? Can we detect when this happens? To begin to answer these questions, we would have to know what constitutes our humanity, what makes us human in the first place. And that turns out to be an incredibly difficult question, one that has been debated without resolution for millennia. Given such difficulty at the most foundational level, it should not be surprising that we do not have a reliable method for identifying and evaluating technological impacts on our humanity. Of course, people often claim that technologies dehumanize, especially in recent decades with the widespread adoption of computers, the Internet, and more recently, smartphones. But the public generally takes such claims to be alarmist, and so the claims remain untested and ultimately drowned out by rampant enthusiasm for new technology. Yet techno-social engineering of humans exists on an unprecedented scale and scope, and it is only growing more pervasive as we embed networked sensors in our public and private spaces, our devices, our clothing and ourselves.

Being Human in the 21st Century moves beyond the impasse and engages these fundamental questions in a fresh, intellectually rigorous, and conceptually exhilarating manner. The book develops a method to examine the dynamic relationship between humans and the technologies we develop and use. At its core, the method depends on a fundamental and radical repurposing of the Turing test, which is the test that Alan Turing famously proposed to examine whether a machine can think. Turing’s test examined the line between humans and machines, focusing on the machine side of the line and giving rise to the fields of artificial intelligence and machine learning. This book examines the human side of the line. It uses machines as a baseline and asks when and how humans behave in a machine-like manner. It begins with different types of intelligence, but it does not end there. There is more to humanity than that. The method is extended to different capacities, such as the capacity to relate to others, as well as the core concepts of free will and autonomy. The radically repurposed Turing tests are plausible tests to employ, but more importantly, the tests serve as conceptual tools that enable a deeper consideration of what makes us human and how our humanity is reflected in and affected by the technologies we develop and use. Throughout the book, the concepts are grounded with familiar examples, such as call centers, the assembly line, public schools, online electronic contracting, wearable technologies, social media, and many others. The last two chapters directly focus on interconnected sensor networks, the Internet of Things, and (big) data enabled automation of systems around, about, on and in human beings.

There is a crisis of trust in media, and David Weinberger’s premonition that “transparency is the new objectivity” (2009) has become a recipe for overcoming this crisis. Weinberger was referring to the fact that objectivity – even if unattainable – served an important role in how we came to trust information, and in the economics of newspapers in the modern age. The problem with objectivity is that it tries to show what the world looks like from no particular point of view, which is like wondering what something looks like in the dark. With transparency as the new objectivity, what we used to believe because we thought the author was objective we now believe because we can see through the author’s writings to the sources and values that brought the author to that position. Transparency gives readers information with which they can undo some of the unintended effects of ever-present biases. Transparency brings us to reliability the way objectivity used to. This change is epochal…

Proprietary social media like Academia.edu and ResearchGate allow researchers to share papers and to connect to each other. They are, however, commercial services providing walled gardens, secluded from the open web and unable to ensure long-term archiving and preservation. But their very success shows that they do answer – and monetize – a need for connecting researchers and sharing ideas that is apparently unfulfilled both by our legacy scientific publishing system and by the bulk of open access repositories. It could be useful to deal with a couple of questions: 1. from a theoretical point of view, why is our current public use of reason in such a predicament? 2. from a practical point of view, what can (young) researchers do about it?

This lecture presents the preliminary results of ongoing research for the EU Horizon2020 FutureTDM project on the legal barriers to TDM in Europe. It describes the objects of protection of copyright, database and data protection law respectively, and the rights and obligations that are triggered when such materials or data are mined. Exceptions under copyright and database law are generally implemented in a fragmentary way, affecting TDM activities with cross-border aspects. Moreover, concepts such as ‘lawful user’, ‘non-commercial’ and ‘research’ leave room for interpretation, resulting in uncertainty as to the scope of these exceptions. Under data protection law, we find that the rules are very restrictive for TDM, and that exceptions do not provide much leeway, as they are limited in scope and vary widely in their national implementations. The policies were assessed as to the extent to which they permit or promote TDM, in particular on the following aspects: beneficiaries of the policy, object (category of materials, contents or data) covered under permitted use, and the uses permitted by the policies. The lecture discusses the policies of different categories of stakeholders and the merits of Open Access (OA) policies in general and for TDM in particular. We find that funders of research, governments, and research institutions and libraries generally advocate and apply OA; while OA policies for scientific publications are common and well developed, OA policies regarding research data are still in their infancy, due to challenges concerning the confidentiality of information and privacy. Where non-OA publishers permit TDM to be carried out on their collections of publications, this is commonly restricted to academic and non-commercial users and purposes.

Ten years after my paper "Attention, Diversity and symmetry in a Many-to-Many Information Society" (*), and building on the more detailed work conducted for my book "Sharing: Culture and the Economy in the Internet Age" (**), I will revisit two key questions for the social and cultural impact of the Internet:

- To what extent is the attention given to works concentrated on a few of them or spread across a wider variety of works, and on which factors does this depend?

- What balance can be established between the time allocated to expressing oneself and the time allocated to reading, viewing, or listening to the expressions of others?
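The first of these questions is empirical, and concentration of attention can be quantified with standard inequality measures. As an illustrative aside (not part of the lecture, and with purely hypothetical numbers), the sketch below computes the Gini coefficient of attention shares across a set of works, where 0 means attention is spread evenly and values approaching 1 mean it is concentrated on very few works:

```python
# Illustrative sketch: measuring how concentrated attention is across works
# with the Gini coefficient. All counts below are hypothetical.

def gini(values):
    """Gini coefficient of a list of non-negative numbers.

    0 = perfectly even distribution; approaches 1 as one item
    captures nearly all of the total.
    """
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Standard formula based on the rank-weighted sum of ordered values.
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n

even = [100, 100, 100, 100]   # attention spread evenly across four works
skewed = [970, 10, 10, 10]    # attention concentrated on a single work

print(gini(even))    # 0.0
print(gini(skewed))  # close to 1 (high concentration)
```

The same measure can be applied to page views, streams, or citations; which factors drive the coefficient up or down is precisely the lecture's first question.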

Specific attention will be given to textual and transmedia content in addition to music and the moving image.

Networked computers totally transform the modes of production, diffusion and reception of all categories of documents. They also affect preservation and retrieval. As a result, our documentary world is being totally reshaped. However, this recasting of the documentary sphere is not just technological: various actors holding specific forms of power are striving to locate and hold new forms of social and institutional positioning that will allow them to keep or even improve the degree of power they held in the print/radio/television/film world.

The case of scientific publishing offers a good opportunity to study this process: the Open Access movement, with its twists and turns, is being shaped by conflicting forces acting within the affordances of the Internet. Examining how it has evolved permits a better understanding of which forces are at work, and how they confront each other. It is a complex story that involves researchers, research institutions and their managers, subsidizing agencies or charities, private companies and governments. It affects private citizens, educators and decision makers.

I will speak about the challenge of diversity in a Socratic classroom. I will tell the story of Harvard Law School retiring its Royall shield in response to demands from Reclaim. I will explain the depressing effect of social conflict on classroom discourse and introduce you to an online modality that allows students in my class to engage in a synchronous, pseudonymous, threaded textual discussion in response to a prompt. This addition to the rhetorical modalities of the classroom facilitates forthright responses from all students. Their articulation and sharing of ideas and responses online invigorates the ensuing face-to-face discussion in class.

Different forces are shaping the practice and the goals of science in the new century. One of them is the Internet and its related technologies, combined with the philosophical postulate of openness in science: the Internet increases and extends the openness of science in new ways, from enabling open access to large amounts of resources to fostering collaborations based on Web 2.0 technologies. In such a context, in 2007 a well-known magazine even predicted that the Big Data paradigm would make the scientific method obsolete: with the availability of huge amounts of data and supercomputing, the traditional, hypothesis-driven scientific method would become obsolete; sophisticated algorithms and statistical tools delving into massive amounts of data would automatically turn them into knowledge. This lecture will give an overview of the new paradigms of the so-called “Science 2.0” revolution, investigating the phenomenon from multiple perspectives (philosophical, statistical, normative) and laying the basis for critical reflection on the matter, with a focus on the role of data.

Bruce Sterling will describe the developing technical trends in the Internet of Things, including vocal computing and single-use wireless buttons. Jasmina Tesanovic of "Casa Jasmina" will talk about the "Internet of Women Things" and open-source projects in the experimental domestic space of the Torino Fab Lab.

Wednesday 29 June

15:00 - 16:30 Examination

Requirements to pass the course

Politecnico di Torino will grant 3 credits for this course. Requirements to pass the course: attend all lectures and pass the examination.