Abstracts

Nils Backhaus: Yet Another Big Brother? Privacy in Cloud Computing.

The cloud is both a buzzword and a technological transition in the way we store and process data. Cloud computing means entrusting computing resources to systems that are managed by external parties on remote servers. This transition affects several IT processes and the users involved in them. On the one hand, computing becomes more flexible and cost-effective thanks to ubiquitous network access, on-demand self-service models, and elastic resources. On the other hand, new challenges regarding privacy, security, and trust arise, since the user is no longer involved in the collection, processing, storage, and disclosure of their data (Cavoukian, 2008).
This contribution starts with a short cloud computing taxonomy, including the evolution of the cloud as computing resources straight from the socket, a preliminary definition, and a summary of cloud deployment and service models. This background is necessary in order to describe the different agents acting in the cloud, their digital identities, and their needs for privacy. Several privacy threats and risks emerge from cloud computing and cloud service adoption. These will be linked to a number of aspects that best illustrate the privacy issues in cloud computing (cf. Pearson, 2013): To begin with, the capacity for processing big data and extending data mining offers boundless possibilities but also potential threats to privacy. Second, the remote storage and processing of data leads to a lack of control over the cloud user’s data. Third, cloud service providers may, deliberately or inadvertently, disclose the user’s data to themselves or to third parties (cf. the current mass surveillance disclosures). Fourth, regulatory complexity, litigation, and legal uncertainty make cloud computing a legal gray area and render it ever more nebulous.
All of these uncertainties create potential risk, especially when sensitive data is stored or processed in a cloud solution. Many of these problems are closely related to security issues, since privacy harms are often the result of security gaps. Both security and privacy can be directly linked to trust in cloud services, and a lack of trust is, in turn, an obstacle to cloud adoption. The last part of the talk addresses design issues and guidelines for building cloud computing services with strong privacy protections, making the safest and best of cloud computing for the user (Pearson, 2009). Conceivable societal scenarios will be discussed, and finally the question is raised whether cloud computing has opened Pandora’s box to a new era of mass surveillance and the end of individual privacy.

The app ecosystem for smart mobile devices plays a noteworthy part in mobile business today. The number of available apps, as well as the number of app downloads and installations, is growing rapidly. The main motivation of smartphone users is to explore and make use of the almost unlimited potential of their devices; thus, they install and use new apps unconcernedly. App developers and providers, on the other hand, aim for fast deployment of their apps: for them, functionality and usability are the main concerns. A negative side effect of this development has been revealed in many research studies and in cases of privacy breaches on smartphones. Many apps need to access and process a multiplicity of personal or context-related data on the device to offer their users a personalized service (e.g. location-based services).
However, current smartphone platforms do not provide users with an appropriate level of insight into applications’ processing of sensitive information. Often, the user is not aware that sensitive resources have been accessed and processed by applications. This leads to a biased conception of how smartphone application usage impacts users’ privacy. Important principles such as informational self-determination, transparency, and user control are often violated.
My research focuses on the development of new concepts and means to provide smartphone users with comprehensible and transparent information on how app usage impacts their privacy. The focus lies on new privacy user interfaces that present users with individualized privacy information based on the applications’ behaviour. We have developed proof-of-concept prototypes for the Android platform and the Google Play Store and conducted two lab experiments to evaluate the effectiveness of our novel concepts and user interfaces in different contexts (runtime and app discovery). This talk will focus on the objectives and results of the user studies and share the key findings of the evaluations with the audience.

Michael Balasch: E-Health in the Smart Home – Legal framework and users’ perception in the SmartSenior project’s field test

This presentation demonstrates results from the largest German research project in the field of Ambient Assisted Living (AAL), “SmartSenior – Intelligent services for senior citizens” (2009-2012, initiated and coordinated by Telekom Innovation Laboratories, and co-financed by the Federal Ministry of Education and Research (BMBF)). 28 partners, including large corporations, research institutes and small and medium sized enterprises (SMEs) from various industries, developed innovative services that enable older people to continue living in their own homes as long as possible, and stay independent, safe, healthy and mobile in old age through appropriate services supported by technology.
The prototypes developed in the project were not only tested and demonstrated in laboratory environments: in a three-month field test in Potsdam, 35 seniors aged between 55 and 88 were asked to test the functionality, usability and helpfulness of the prototypes in their own homes. They used the central service portal, an interactive training module and all the other components and services developed in the project for the home environment. Health monitoring at home, shown exemplarily with weight, blood pressure and even electrocardiogram (ECG) measurements, was an important part of that. To provide services for situations where individual help is needed, to help identify such situations automatically where necessary, and to answer research questions, a considerable amount of data was collected and processed. The project partners placed a set of sensors in the flats, used the mobile phone for location information, provided medical devices, and stored their data in a patient record. A set of well-defined, open interfaces allowed information exchange between the services.
Extensive discussions took place between the project partners and their legal departments, with potential future users, and with the data protection authorities in order to meet all requirements for the field tests, including the necessary approval by the ethics committee. In the end, this goal was achieved. How this was done, and what the users said, will be presented in this talk.

Stefan Brandenburg: On conducting research in the Internet age – subjects’ (data) drain is researchers’ (data) gain?

Modern technology allows gathering almost any type and amount of data. Hence, the temptation is high to use all technical capabilities to collect whatever is storable. However, one is not allowed to do so: manufacturers and users of technical artifacts are obliged to adhere to common data privacy rules (e.g. Ropohl, 1996). Engineers and researchers in particular bear a degree of responsibility for the application of technical artifacts. While this is unambiguous for rocket science, it is far less clear for any field gathering human data.
Hence, the present talk highlights international conventions on data privacy and their role in research and development. Moreover, it emphasizes the special role of engineers and researchers when it comes to people’s privacy rights. The talk supports its line of argumentation with examples of recent research in Human-Computer Interaction. Finally, ways are discussed that help listeners find their way through the jungle of principles and guidelines; here, online tutorials and online references are presented.

Recent developments in mobile computing, with an increasing number of location-enabled mobile devices, have given rise to new requirements for privacy awareness and protection in location-based services (LBS). Traditional LBS applications were developed in a reactive manner: the user actively provides her current location to the service and receives information based on that location. Modern LBS applications, in contrast, track the recent movement of a mobile client in order to provide location-relevant information proactively. Especially these proactive LBS applications, which constantly keep track of a user’s location, should be built with privacy protection in mind. Although the technology is rather new, a number of approaches already exist that help in designing a privacy-aware location-based service.

Obfuscation modifies location tracking information to a level where the path of a tracked user cannot be determined precisely. The obfuscated information is still sufficient for the set of LBS applications that only need coarse-grained position accuracy. In this respect, LBS providers should determine their exact needs for position accuracy and update intervals. Additionally, various forms of aggregation can extract or summarize only the relevant information of tracks and discard the raw data. The LBS provider can also limit itself by storing location information only for a specific time interval. Encryption of location data enhances user privacy further: for example, location information can be stored separately in encrypted databases where a user’s location can only be accessed with the correct key. This can be extended by rotating keys periodically, so that location information can no longer be associated with a specific user after a certain amount of time. Furthermore, the LBS provider can completely anonymize the data by removing any link between a user’s identifier and the location information.
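As an illustration, the grid-snapping form of obfuscation and the per-cell aggregation described above can be sketched in a few lines of Python. The cell size and helper names are illustrative assumptions, not part of any particular LBS:

```python
import math
from collections import Counter

def obfuscate(lat, lon, cell_deg=0.01):
    """Snap a coordinate to the centre of a coarse grid cell.

    At 0.01 degrees a cell spans roughly 1 km, which is often enough
    for coarse-grained LBS queries while hiding the exact path.
    (Hypothetical cell size; real services would tune this.)
    """
    def snap(v):
        return math.floor(v / cell_deg) * cell_deg + cell_deg / 2
    return snap(lat), snap(lon)

def aggregate(track, cell_deg=0.01):
    """Summarize a raw GPS track as per-cell visit counts.

    The raw points are discarded; only the coarse summary is kept,
    implementing the aggregation idea described above.
    """
    return Counter(obfuscate(lat, lon, cell_deg) for lat, lon in track)
```

For instance, two nearby fixes inside the same ~1 km cell collapse into one aggregate entry, so the precise path between them is no longer recoverable from the stored data.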

Another important aspect of developing privacy-aware LBS applications is the mode of operation and the consent of the user. The LBS provider can follow two different approaches. First, the user can be actively asked by the LBS application to consent to the usage of her current location (opt-in). Current opt-in solutions only allow the user to decide whether to use the LBS application or not; more advanced approaches would allow the user to determine specifically how the LBS application may use her location information, e.g. the precision granularity or storage time. Second, the LBS provider tracks the user’s location without her consent, e.g. in indoor environments via WiFi tracking, but offers an opt-out solution in which the user can avoid being tracked by explicitly stating it. However, it is not yet known whether this solution is fully accepted legally.
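A granular opt-in model of the kind sketched above could, for instance, be represented by a small per-app consent record; the field names, defaults, and enforcement rule below are purely hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentPolicy:
    """Hypothetical per-app consent record, going beyond a binary opt-in."""
    opted_in: bool = False       # no consent by default
    max_precision_m: int = 1000  # finest granularity the app may receive (metres)
    retention_days: int = 7      # how long the provider may store location fixes

def effective_precision(policy: ConsentPolicy, requested_m: int) -> Optional[int]:
    """Return the precision the service may actually use, or None without consent."""
    if not policy.opted_in:
        return None
    # never deliver finer data than the user consented to
    return max(requested_m, policy.max_precision_m)
```

With such a record, an app requesting 10 m accuracy from a user who consented only to 500 m granularity would simply receive 500 m data, realizing the "precision granularity" control mentioned above.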

Legal requirements have not kept pace with the technological development and do not restrictively limit the usage of location data. Nevertheless, LBS providers should enable their users to use LBS applications in a confident and secure manner.

Stephan Gauch: From Digital Divide to Privacy Divide

Since the mid-1990s there has been growing concern about the digital divide: the division between those who have the access, skills, and capacity to operate in the digital domain and those who do not. This presentation will highlight some of the similarities between the concept of the digital divide and what some consider the next crucial differentiation in terms of digital inequality: the privacy divide. Drawing on the literature, a concept is developed of how to grasp the privacy divide and its importance, how it relates to previous notions of digital divides, how to frame it theoretically and, finally, how future research might shed a more distinct light on the phenomenon and on how it might affect future technology developments.

Lydia Kraus: Users’ decision making when selecting Android apps – how important is privacy?

The presentation will give an overview of first results from a lab experiment recently conducted at the Quality and Usability Lab. In the experiment, Android smartphone users were presented with pairs of Android apps with similar functionality but different numbers of permissions. We used different user interfaces to comparatively communicate privacy risks to the users and to measure whether this influences their decisions. We also investigated the interplay between perceived privacy and trust, and whether user-intrinsic characteristics such as privacy concern are related to the decision making.

Tobias Hirsch: Closeness and other factors influencing willingness to disclose

Interpersonal disclosure is known to depend on the situation, the kind of data, and the person one shares with (who, how, where, what, when). Yet little is known about which features of a personal relationship influence disclosure.
The talk presents first results of a survey-based study conducted among 22 employees of the Quality and Usability Lab. Factors of the interpersonal relationship (e.g., closeness, trustworthiness) and other factors (e.g., sensitivity of the data) will be discussed with regard to their influence on the willingness to disclose.

Maija Poikela: I do (not) care if I’m being tracked – mobile users and the privacy paradox

An ever-growing number of location-based applications give their developers access to vast data sets, which can be used to build a complete profile of an unaware user. While users have good reasons for concern regarding this location disclosure, they seemingly paradoxically continue using these applications.
This presentation gives an overview of the users’ privacy concerns for location disclosure, and presents some reasons and factors for why the users continue using such applications. Also, some initial results from a field experiment using a location-based polling application are discussed.

Ubiquitous computing is characterized by the integration of computing aspects into the physical environment. Physical and virtual worlds start to merge as physical artifacts gain digital sensing, processing, and communication capabilities. This development introduces a number of privacy challenges. Physical boundaries lose their meaning in terms of privacy demarcation. Furthermore, the tight integration with the physical world necessitates the consideration not only of observation and information privacy aspects but also of disturbances of the user’s physical environment. Due to the resulting complexity of such systems, users have difficulty estimating the potential privacy implications of their actions and decisions.
From a user’s perspective, privacy is a dynamic regulation process in which exposure is adjusted based on social context and personal experiences in order to maintain the degree of openness required by the current activity. We propose a dynamic privacy adaptation process for ubiquitous computing that leverages the user’s context and individually learned privacy preferences to either provide individualized privacy recommendations to the user or automatically adapt privacy controls accordingly. The process relies on three major components. A privacy context model captures privacy-relevant context information in order to detect privacy-relevant context changes in the user’s physical and virtual environment. A privacy decision engine determines the need for privacy adaptation based on the user’s previously learned privacy preferences and derives a sensible course of action with case-based reasoning. Depending on the confidence in the reasoning result, the user is either notified about the required change and provided with individualized options, or configuration changes are performed autonomously. Based on explicit and implicit feedback, the proposed system adapts to the individual user over time. The third component is the realization of derived or selected privacy policies, which requires mechanisms to enforce privacy policies in different environments.
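To make the decision engine concrete, a minimal case-based reasoning step could look as follows. The feature vector, similarity measure, threshold, and action names are simplifying assumptions for illustration, not the actual implementation:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# A privacy-relevant context is reduced here to a toy feature vector,
# e.g. (location sensitivity, audience size, data sensitivity) in [0, 1].
Context = Tuple[float, float, float]

@dataclass
class Case:
    context: Context
    action: str  # privacy setting the user chose in this context

def similarity(a: Context, b: Context) -> float:
    """Inverse-distance similarity between two contexts, in (0, 1]."""
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + dist)

def decide(cases: List[Case], current: Context,
           threshold: float = 0.8) -> Tuple[str, Optional[str]]:
    """One case-based reasoning step: reuse the most similar past case.

    Above the confidence threshold the adaptation is applied autonomously;
    below it, the action is only recommended; with no past cases the
    user is asked directly.
    """
    if not cases:
        return ("ask_user", None)
    best = max(cases, key=lambda c: similarity(c.context, current))
    confidence = similarity(best.context, current)
    mode = "auto_apply" if confidence >= threshold else "recommend"
    return (mode, best.action)
```

A context close to a previously seen one thus triggers an autonomous adaptation, while an unfamiliar context only yields a recommendation; the user's response to either becomes a new case, which is how the system adapts over time.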
The practicality of the outlined approach for dynamic privacy adaptation is discussed based on our experiences with respective prototypes in the domains of computer-supported collaborative work and ambient assisted living.