Machine-to-Machine: Vision, Technologies & Applications

The unprecedented communication paradigm of machine-to-machine (M2M), facilitating 24/7 ultra-reliable connectivity between a previously unseen number of automated devices, is currently gripping both the industrial and academic communities. Whilst applications are diverse, the in-home market is of particular interest, since it is undergoing a fundamental shift from machine-to-human communications towards fully automated M2M. The aim of this keynote is thus to provide academic, technical and industrial insights into the latest key aspects of wireless M2M networks, with particular application to the emerging smart grid and smart home verticals.
I will provide an introduction to the particularities of M2M systems, mainly in the context of smart homes. Architectural, technical and privacy requirements, and thus applicable technologies, will be discussed. Notably, we will dwell briefly on the capillary and mainly cellular embodiments of M2M. The focus of capillary M2M, useful for real-time data gathering in homes, will be on IEEE (802.15.4e) and IETF (6LoWPAN, ROLL, CoAP) standards-compliant low-power multihop networking designs; furthermore, for the first time, low-power Wi-Fi will be dealt with and positioned within the ecosystem of capillary M2M. The focus of cellular M2M, useful for both data gathering and the increasing multimedia content in smart homes, will be on the latest activities, status and trends in leading M2M standardization bodies, with technical focus on ETSI M2M and 3GPP LTE-MTC.
Open technical challenges, along with the industry's vision on smart grid and smart home developments, will be discussed during the talk.

Algorithms for the Analysis of Bio-Sequences

In the past few years, Next Generation Sequencing technologies have generated a huge amount of data at a very high pace. With thousands of fully sequenced genomes available, there is a compelling need for efficient algorithms to extract meaningful information from these data. In this talk I will present my research in the field of algorithms for bioinformatics analysis, with a focus on efficient pattern discovery techniques. In particular, compact approaches and alignment-free techniques will be discussed. Compact approaches partition the search space into classes to speed up computation and reduce the size of the output. Alignment-free techniques use substring composition to compute similarity between sequences. These techniques are the solution of choice for full-genome comparison, where alignment-based techniques are very slow.
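As a minimal illustration of the alignment-free idea, the sketch below compares two sequences via the cosine similarity of their k-mer count profiles. The particular similarity measure and the choice of k are assumptions made for this example, not choices stated in the abstract.

```python
from collections import Counter
from math import sqrt

def kmer_profile(seq, k=3):
    """Count all overlapping k-mers (length-k substrings) of a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(a, b):
    """Cosine similarity of two k-mer count profiles; 1.0 = identical composition."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

p = kmer_profile("ACGTACGTAC")
q = kmer_profile("ACGTACGTAG")
print(round(cosine_similarity(p, p), 3))  # identical sequences -> 1.0
print(cosine_similarity(p, q) < 1.0)      # a single mutation lowers the similarity
```

No alignment is ever computed: the cost is linear in the sequence lengths, which is why such measures scale to whole genomes.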

Intelligent Preference Reasoning for Multi-Agent Decision Making

Preferences are ubiquitous in everyday decision making. Therefore, they are essential ingredients in every reasoning tool. Preferences are mainly studied in artificial intelligence in the context of multi-agent decision making, where each agent expresses its preferences over a set of possible decisions and the goal is to find the best collective decision. Preferences have classically been studied in a social choice context, and in particular in voting theory, where several voters express their preferences over the candidates and a voting rule is used to elect the winning candidate. Since this scenario is similar to multi-agent decision making, many interesting papers in the multi-agent area have tried to adapt social choice results to the multi-agent setting by taking into account issues that do not occur in a social choice context: a large set of candidates with a combinatorial structure, formalisms to model preferences compactly, preference orderings including indifference and incomparability, uncertainty, as well as computational concerns. This is the basis of a new research area called computational social choice, which studies how social choice and artificial intelligence can fruitfully cooperate to give innovative and improved solutions to aggregate preferences given by multiple agents. This talk will present this interdisciplinary area of research and some of my recent results regarding the issues mentioned above.
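To make the voting-theory setting concrete, here is a toy sketch of one classical voting rule, the Borda count; the rule and the candidate names are illustrative only and are not specific to the results presented in the talk.

```python
from collections import Counter

def borda_winner(rankings):
    """Borda count: a candidate in position i of a ranking over m candidates
    earns m - 1 - i points; the candidate with the highest total wins."""
    scores = Counter()
    for ranking in rankings:
        m = len(ranking)
        for i, cand in enumerate(ranking):
            scores[cand] += m - 1 - i
    return scores.most_common(1)[0][0]

# Three voters rank candidates a, b, c from most to least preferred.
votes = [["a", "b", "c"], ["b", "a", "c"], ["b", "c", "a"]]
print(borda_winner(votes))  # prints "b" (scores: a=3, b=5, c=1)
```

The computational concerns mentioned above arise exactly here: with a combinatorially structured candidate set, even evaluating such a rule, let alone manipulating it, can become computationally hard.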

Biometric Identity Management for Standard Mobile Medical Networks

The explosion of healthcare costs over the last decade has prompted the ICT industry to respond with solutions for reducing costs while improving healthcare quality. The recently released ISO/IEEE 11073 family of standards is the first step towards interoperability of mobile medical devices used in patient environments.
A successful introduction and large-scale usage of mobile e-health systems hinges on two key factors: interoperability and security. ISO/IEEE recently published the final version of the 11073 family of standards, which ensures interoperability of data transmission and of the monitoring and control of vital signs between mobile medical devices used in a Personal Area Network (PAN). These specifications do not, however, include any security procedures for identity management or data encryption.
As a rising number of patients move towards homecare, there is a growing need to create PANs of mobile medical devices. The usage of this type of network is also contingent on security factors: the clinical data measured, transmitted and archived centrally must be correctly assigned to the patient using the medical device, and to no one else.
The lecture presents a research proposal for enhancing the ISO/IEEE 11073 family of specifications with a novel authentication procedure. The authentication is based on a mutual authentication technique which uses fingerprint biometric information. The research also addressed the difficult challenge of developing an adequate algorithm for generating a biometric key from a fingerprint image.
The implementation results demonstrate that the proposed authentication solution can easily be embedded into the existing ISO/IEEE 11073-20601 Optimized Exchange Protocol (OEP) standard.
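The abstract does not detail the protocol itself, but authentication keyed by a biometric-derived secret can be sketched generically as a challenge-response exchange. In the sketch below, `derive_key` is a hypothetical stand-in for the fingerprint key-generation algorithm developed in the research, and the HMAC-based exchange is a generic construction, not the scheme proposed in the lecture.

```python
import hmac
import hashlib
import os

def derive_key(fingerprint_template: bytes) -> bytes:
    # Hypothetical stand-in for the biometric key-generation algorithm:
    # reduce an enrolled fingerprint template to a fixed-length symmetric key.
    return hashlib.sha256(fingerprint_template).digest()

def respond(key: bytes, challenge: bytes) -> bytes:
    # Prove knowledge of the key without transmitting it.
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Device and manager have both derived the same key from the patient's fingerprint.
key = derive_key(b"<enrolled fingerprint template>")
challenge = os.urandom(16)                 # manager -> device
tag = respond(key, challenge)              # device -> manager
print(hmac.compare_digest(tag, respond(key, challenge)))  # True: device is authenticated
```

Mutual authentication repeats the exchange in the opposite direction, so the device also verifies the manager before any vital-sign data is transmitted.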

Some Recent Work at JHUAPL in Advanced Communications and Networking

Ubiquitous computing and communication are changing our lives in fundamental ways and will continue to do so. They are also creating major new technical challenges at all layers of networking. For example, underwater communication, high-data-rate communication between highly mobile airborne platforms, and inter-planetary communication pose unique new challenges at the physical and link layers. This ubiquity also poses challenges in scaling the basic Internet protocols. In this talk, we will discuss some of these challenges. We will also discuss recent research activities at JHUAPL that address some of them. In particular, we will discuss work on Free Space Optics (FSO) and hybrid FSO-RF communication, protocols for the Interplanetary Internet, and new addressing and routing schemes to scale the Internet.

In recent years, many theoretical and experimental works have demonstrated the importance of multiqubit photon states operating in a high-dimensional Hilbert space. By entangling different degrees of freedom (DOFs) of a single particle, or of two or more particles, fundamental tests of quantum mechanics and innovative quantum information protocols have been realized.

I will present some of the most relevant results we have obtained with hyperentangled 2-photon states. I will also show the possibilities offered by "integrated quantum circuits" to overcome the structural limits of conventional bulk optics, in view of an efficient implementation of multiqubit entangled photon states of increasing complexity.

Computing with Evolving Data

We formulate and study a new computational model for dynamic data. In this model, the data changes gradually over time and the computation has access to only a small part of the data in each step. The goal is to design algorithms that output solutions to computational problems on the data at any given time. As the data is constantly changing and the algorithm may be unaware of these changes, it cannot be expected to always output the exact solution; we are interested in algorithms that guarantee good approximate solutions.

We study fundamental computational problems, including sorting and selection, where the true ordering of the elements changes over time and the algorithm can only probe the order of a few pairs in each step; and connectivity and minimum spanning trees in graphs, where the edges' existence and weights change over time and the algorithm can only track these changes by probing a few vertices or edges per step. This framework captures the inherent trade-off between the complexity of maintaining an up-to-date view of the data and the quality of the results computed with the available view.