The advent of the World Wide Web has radically changed Internet usage from host-to-host communication to service access and data retrieval. The majority of services used by Internet clients are content-centric (e.g. the web). However, the original Internet revolves around the host-to-host communication for which it was conceived. Even if the Internet has been able to address the challenges posed by new applications, there is an evident mismatch between the architecture and its current usage. Many projects in national research agencies propose to redesign the Internet architecture around named data. Such research efforts are identified under the name Information-Centric Networking. This thesis focuses on the Content-Centric Networking (CCN) proposal. We first analyze the CCN communication model, with particular focus on bandwidth- and storage-sharing performance. We compute closed formulas for the data delivery time, which we use in the second part of the thesis as a guideline for network protocol design. Second, we propose some CCN congestion control and forwarding mechanisms. We present a first window-based, receiver-driven flow control protocol, the Interest Control Protocol (ICP). We also introduce a hop-by-hop congestion control mechanism to obtain early congestion detection and reaction. We then extend the original ICP congestion control protocol by implementing a Remote Adaptive…
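A window-based, receiver-driven controller of this kind can be sketched as a simple AIMD loop on the interest window. The class below is an illustrative simplification only: the class name, constants and update rules are ours, not the ICP specification given in the thesis.

```python
class InterestWindow:
    """Receiver-driven AIMD window over pending interests (illustrative
    sketch, in the spirit of ICP; not the thesis's exact protocol)."""

    def __init__(self, eta=1.0, beta=0.5, wmin=1.0, wmax=100.0):
        self.w = wmin          # current window: interests allowed in flight
        self.eta = eta         # additive increase per window of Data packets
        self.beta = beta       # multiplicative decrease factor on timeout
        self.wmin, self.wmax = wmin, wmax

    def on_data(self):
        # Additive increase: roughly +eta per full window of received Data.
        self.w = min(self.wmax, self.w + self.eta / self.w)

    def on_timeout(self):
        # Multiplicative decrease: an interest timeout signals congestion.
        self.w = max(self.wmin, self.w * self.beta)
```

The receiver keeps at most `w` interests outstanding, so the congestion control is entirely driven by the data consumer, as in the thesis's receiver-driven design.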

This dissertation is concerned with a longitudinal study of a computer network technology, the Commotion wireless mesh software, and the assemblages of actants that come into contact with it (people, objects, organizations, discourses, etc.). I argue that this apparatus produces different versions of itself that uniquely relate the concepts of agency, infrastructure and the Commons, because it explicitly claims to be both technical and political within different socio-historical folds. By assuming such an overt political stance, it invites us to think through the notion of mediation in a new light (infrastructural mediation). My research methods reflect different interpretations of this software, seeking to understand whether this wireless mesh network technology represents a compromise solution for redefining the forces that constitute telecommunications infrastructure and its hold on the social bond. To do so, my dissertation expands on a particular ethnographic path which, “by the middle”, attempts to understand the ways in which the existence of a socio-technical system is established. I argue that establishing a socio-technical apparatus does not amount to pulling it out of thin air, but rather to making it become what it is. The findings reflect the successive trials and errors that go into this process of developing a socio-technical and mediatic form that has yet to be recognized, while also shoring up the constitutive elements of a mediation process between the Commons and telecommunications infrastructure.

The aim of this thesis is to address the design of iterative MIMO receivers using LDPC error-correcting codes. MIMO techniques enable capacity increases in wireless networks with no additional frequency resources. The combination of MIMO with the multicarrier modulation technique OFDM has made them the cornerstone of emerging high-rate wireless networks. Optimal reception can be achieved using joint detection and decoding, but at the expense of a huge complexity that makes it impractical, so disjoint reception is the most widely used approach. The design of iterative receivers for applications using LDPC codes, such as Wi-Fi (IEEE 802.11n), is constrained by the standard code structure, which is not optimized for this kind of receiver. By observing the effect of the number of iterations on performance and complexity, we underline the interest of scheduling LDPC decoding iterations and turbo-equalization iterations. We propose to define schedules for the iterative receiver in order to reduce its complexity while preserving its performance. Two approaches are used: static and dynamic scheduling. The second part of this work concerns multiuser MIMO using Spatial Division Multiple Access. We explore and evaluate the interest of using iterative reception to cancel residual inter-user interference.

Lafaye, Michaël. Modélisation de plate-forme avionique pour exploration de performance en avance de phase [Avionics platform modelling for early-phase performance exploration]. Doctoral dissertation, Paris, ENST, 2012. Available from: http://www.theses.fr/2012ENST0065

Nowadays, real-time critical embedded systems are more and more complex, due to an increase in the number of integrated components. Following that trend, the development complexity of avionic systems increases too, so early modeling processes are increasingly used to anticipate platform performance and help size the platforms. In particular, the exploration of hardware resource usage is a key aspect of performance exploration. Current processes make it possible to model an avionic platform from the requirements down to the architectural level of abstraction, but they do not allow a behavioral model of the platform. Thus, they allow neither exploring the hardware resource usage of the platform nor comparing architectural alternatives at an early phase of the development cycle. My PhD work presents our avionic platform modeling and simulation process, which addresses that problem. The goal is to complement current modeling processes to offer more accurate early performance analysis, and compare them with the…

As more than fifty countries have launched an open data policy, this doctoral dissertation investigates the emergence and implementation of such policies. It is based on the analysis of public sources and on an ethnographic inquiry conducted in seven French local authorities and institutions. By retracing six defining moments of the “open data principles” and their implementation by a French institution, Etalab, this work shows how open data has brought attention to data, particularly in their raw form, considered as an untapped resource, the “new oil” lying beneath organisations. The inquiry shows that the opening process generally begins with an identification phase marked by progressive and uncertain explorations. It allows us to understand that data are…

Nowadays, cellular technology is almost everywhere. It has had explosive success over the last two decades, and the volume of traffic will keep increasing in the near future. For this reason, it is also regarded as one cause of worldwide energy consumption, with a high impact on carbon dioxide emissions. On the other hand, new mathematical tools have enabled the conception of new models for cellular networks: one of these tools is stochastic geometry, and more particularly the spatial Poisson point process. In the last decade, researchers have successfully used stochastic geometry to quantify the outage probability, throughput or coverage of cellular networks, by treating the deployment of mobile stations and/or base stations as Poisson point processes on a plane. These results also take into account the impact of mobility on the performance of such networks. In this thesis, we apply the theory of the Poisson point process to solve some problems of cellular networks, in…
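The kind of coverage result obtained with stochastic geometry can be illustrated by a small Monte Carlo experiment: base stations drawn from a Poisson point process on a disc, the typical user at the origin served by its nearest base station, Rayleigh fading and power-law path loss. This is a generic textbook setup, not the specific models studied in the thesis; all parameter names are ours.

```python
import math
import random

def _poisson(rng, mu):
    # Knuth's method; adequate for the moderate intensities used here.
    L, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def coverage_probability(lam=1.0, alpha=4.0, T=1.0, R=8.0, trials=1500, seed=1):
    """Monte Carlo estimate of downlink coverage P(SIR > T) for a typical
    user at the origin. Base stations form a Poisson point process of
    intensity lam on a disc of radius R; Rayleigh fading (exponential power
    gains) and path loss r^(-alpha); the serving station is the nearest one."""
    rng = random.Random(seed)
    covered = 0
    for _ in range(trials):
        n = _poisson(rng, lam * math.pi * R * R)
        if n == 0:
            continue
        # Distances of a uniform point on a disc have density 2r/R^2.
        radii = [max(R * math.sqrt(rng.random()), 1e-6) for _ in range(n)]
        power = [rng.expovariate(1.0) * r ** (-alpha) for r in radii]
        i0 = min(range(n), key=lambda i: radii[i])   # nearest base station
        interference = sum(power) - power[i0]
        if power[i0] > T * interference:
            covered += 1
    return covered / trials
```

For Rayleigh fading and `alpha = 4`, the well-known closed-form result for the infinite plane gives a coverage probability of about 0.56 at `T = 1` (0 dB), which the simulation reproduces approximately despite the finite disc.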

Considering polynomials over the Galois field with two elements, our attention is on the divisibility of the trinomials x^(am) + x^(bs) + 1, for m > s ≥ 1, by an irreducible polynomial of degree r. We contribute the following result: if there exist positive integers m, s such that the trinomial x^(am) + x^(bs) + 1 is divisible by an irreducible polynomial of degree r over F2, then a and b are not divisible by 2^r − 1. For this type of trinomial, we conjecture that the ratios π_M(a,b)/π_M(1,1) tend to a finite limit (depending on a and b) when M tends to infinity. Our research then turns to cyclic codes of rate 1/2 over the two finite fields F3 and F5, and we examine which of them are isodual. The so-called fundamental problem of coding theory is to find the largest value of d for which a code with parameters [n, k, d] over Fq exists. In this context, we have optimized this distance for cyclic codes of rate 1/2 over F3 and F5: up to length 74 for the ternary cyclic codes and up to length 42 for those over F5. We have also constructed seven classes of isodual cyclic codes over the field with 3 elements and three classes over the field with 5 elements.
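The divisibility tests involved can be carried out directly with polynomials over GF(2) encoded as bitmasks, one bit per coefficient. The sketch below (our own helper names; a brute-force illustration, not the thesis's method) checks whether a given irreducible polynomial divides a trinomial x^a + x^b + 1:

```python
def gf2_mod(a, m):
    """Remainder of polynomial a modulo m over GF(2).
    Polynomials are bitmasks: bit i is the coefficient of x^i."""
    dm = m.bit_length() - 1            # degree of the modulus
    while a.bit_length() - 1 >= dm:
        # cancel the leading term of a
        a ^= m << (a.bit_length() - 1 - dm)
    return a

def trinomial_divisible(a, b, m):
    """Does the polynomial m divide x^a + x^b + 1 over GF(2)?"""
    t = (1 << a) ^ (1 << b) ^ 1
    return gf2_mod(t, m) == 0
```

For example, with m = x^2 + x + 1 (irreducible, r = 2, so 2^r − 1 = 3), the divisible trinomials x^2 + x + 1 and x^4 + x^2 + 1 have exponents 2, 1 and 4, 2, none of which is divisible by 3, consistent with the stated result.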

This doctoral thesis in Management Sciences concerns the WITE 2.0 project, dedicated to the analysis and design of a technical and organizational ICT device: an integrated platform for teleworking. This platform allows one to work "remotely", in connected or disconnected mode, from any device (PC, phone, tablet), as a "thin client" and in a cloud-computing-style work environment. Several questions emerged from the design of the platform, concerning the role of information and communication technologies (ICT) in the development of remote working. Our research design is divided into two research actions: first, we wanted to map the diversity of remote working configurations, and second, we wanted to understand how the codes and norms of appropriation of the new technologies (used for the platform) take shape. We followed a "situated action" perspective and a qualitative methodology based on semi-structured interviews and observations. In our results, we describe the realities of remote working, the limitations of the technologies and the tactics built by the workers as they "enact" the technology remotely. We uncovered norms of use, often tacit, and values of use of these new technologies. Finally, we give some managerial recommendations concerning the technical, usage and service aspects.

Nowadays, with the advent of deregulation, service providers aim to be more competitive and to attract more subscribers in order to cope with high market pressure. For this purpose, today's providers support a user-centric approach that consists in quickly providing user-oriented services. This user-centric approach becomes more and more significant with the emergence of the next-generation networks and services (NGN/NGS) context. Within this context, where network convergence and service convergence are omnipresent, the end-user becomes more nomadic and demands access to any service, anywhere, anytime and by any means. His goal is to dynamically compose a personalized service session by converging a set of multi-domain services (Telco, Web and IT), and then to maintain the continuity of this service session throughout his spatial and temporal mobility. Within the scope of this thesis, we propose a novel service architecture, namely the NGN/NGS Middleware, which adopts a horizontal, distributed, event-driven and service-oriented approach and is based on a novel service model. In addition, we propose two solutions for service continuity management, based on virtual communities and on a semantic handover. These solutions take into consideration the user's preferences and ambient context. In the end, we think we could answer…

In modern wireless communication systems, the per-user data rate demand is constantly growing. To sustain it, network operators try to deploy cellular systems with more cells and to apply more efficient spectrum-reuse techniques. One possible way to increase system throughput is to bring the user closer to the transmitting base station, and hence to deploy a very dense network infrastructure. Such a setup, however, results in strong interference situations. Interference has been identified as the main bottleneck of modern wireless cellular communication systems, all the more so with small, dense cells. This consideration has led to intense research activity that has recently pushed network operators and manufacturers to include more proactive and efficient ways to suppress or control interference. From an information-theoretic point of view, this problem can be studied mathematically as an interference channel. In the first part of this thesis, we focus our attention on beamforming design for the interference channel, with particular focus on the MIMO case. There, we propose the joint optimization of linear transmitters and receivers according to two criteria: interference alignment and weighted sum-rate maximization. The second part of the thesis is devoted to the beamforming design problem in cognitive radio settings. We start by considering an underlay…

This thesis focuses on the extraction and analysis of Web data objects, investigated from different points of view: temporal, structural, semantic. We first survey different strategies and best practices for deriving temporal aspects of Web pages, together with a more in-depth study of Web feeds for this particular purpose, and other statistics. Next, in the context of Web pages dynamically generated by content management systems, we present two keyword-based techniques that perform article extraction from such pages. Automatically acquired keywords guide the process of object identification, either at the level of a single Web page (SIGFEED) or across different pages sharing the same template (FOREST). Finally, in the context of the deep Web, we present a generic framework that aims at discovering the semantic model of a Web object (here, a data record) by, first, using FOREST for the extraction of objects and, second, representing the implicit rdf:type similarities between the object attributes and the entity of the form as relationships that, together with the instances extracted from the objects, form a labeled graph. This graph is further aligned with an ontology such as YAGO for the discovery of the unknown types and relations.

Network tomography is the study of a network's traffic characteristics using measurements. This subject has already been addressed by a whole community of researchers, especially to answer ISPs' need for knowledge of the residential Internet traffic they have to carry. One of the main aspects of the Internet is that it evolves very quickly, so there is a never-ending need for Internet measurements. In this work, we address the issue of residential Internet measurement from two different perspectives: passive measurements and active measurements. In the first part of this thesis, we passively collect and analyse statistics of residential users' connections…

Dealing with the requirements of reconfigurable radio architectures in the vehicular domain is a very challenging task. Solutions can be found in the context of Software Defined Radio (SDR), under whose umbrella flexible hardware platforms supporting a wide range of different wireless communication standards are designed. One of them is the OpenAirInterface ExpressMIMO platform, developed by Eurecom and Télécom ParisTech. The main objectives of this thesis are to propose the first receiver chain prototype for ExpressMIMO, to assess the applicability of the platform to latency-critical standards, to identify design bottlenecks, and to propose and implement solutions to overcome the identified limitations. The standard of interest in this context is IEEE 802.11p, which is required for car-to-car communication. Our analysis reveals that the Front-End Processing (FEP) DSP engine is heavily loaded and that the required configuration time exceeds the pure execution time for short vectors. To meet this challenge we introduce an Application-Specific Instruction-set Processor (ASIP) as the solution of choice when dealing with strong latency requirements. To complete the receiver chain we further present a first Preprocessor prototype, which connects the external A/D and D/A converters with the remaining baseband engine. In this context we focus on a generic, flexible and hardware-optimized Sample Rate…

Cardiac implants such as ICDs are life-saving devices for cardiac arrhythmias. In other conditions, such as heart failure, CRT implants are prescribed to restore the heart rhythm. Such treatment consists of delivering electrical stimuli to the cardiac tissue via electrodes on the stimulation lead. Conventionally, stimulation leads come in either unipolar or bipolar configurations, which have been found sufficient for pacing the right atrium and right ventricle; however, studies have shown the benefits of a multi-electrode system for pacing the left ventricle, which is essential for cardiac resynchronization. This thesis discusses the design and optimization of a multi-electrode system capable of alleviating the limitations and constraints related to left ventricular stimulation. We first present an implementation of such a system, which was taped out in a 0.18 µm technology. The chip also features a specially designed communication protocol that enables low-power operation and quick configuration. Thereafter, we present the design and implementation of a default connection unit to ensure the compatibility of our multi-electrode lead with devices on the market. This unit was also taped out in a 0.18 µm technology. Finally, we present a proof-of-concept study for the adaptation and integration of non-volatile memory technologies within the multi-electrode system. The employment of such technologies…

Optical fiber sensors for civil engineering are not a new idea. Their interest is based mainly on the intrinsic properties of optical fibers: electromagnetic neutrality, high multiplexing capacity, and access to measurements over long distances. These sensors may cover numerous functions of traditional sensors: detection, localization and surveillance. Thanks to interactions between the light and the optical fiber, such as Brillouin scattering, the optical fiber can act, along its whole length, as a continuously distributed sensor. The phenomenon of Brillouin scattering is well studied owing to its high scattering efficiency, its dependence on temperature and strain, and its multi-kilometre reach. However, the dual sensitivity of the Brillouin frequency to temperature and strain is problematic for the simultaneous measurement of these two parameters. We present a possibility for discriminating temperature and strain with the accuracies desired for structural health monitoring.
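The dual-sensitivity problem is commonly written as a linear system: with two measurands responding to both temperature and strain (here, generically, the Brillouin frequency shift and a second observable such as the Brillouin power, with generic coefficients; the thesis's own choice of measurands and coefficients may differ), discrimination amounts to inverting

```latex
\[
\begin{pmatrix} \Delta\nu_B \\ \Delta P_B \end{pmatrix}
=
\begin{pmatrix} C_{\nu T} & C_{\nu\varepsilon} \\ C_{P T} & C_{P\varepsilon} \end{pmatrix}
\begin{pmatrix} \Delta T \\ \Delta\varepsilon \end{pmatrix},
\qquad
\det\begin{pmatrix} C_{\nu T} & C_{\nu\varepsilon} \\ C_{P T} & C_{P\varepsilon} \end{pmatrix} \neq 0,
\]
```

where the non-zero determinant condition expresses that the two measurands must have sufficiently different temperature and strain sensitivities for the inversion to be well conditioned.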

Probabilistic XML is a probabilistic model for uncertain tree-structured data, with applications to data integration, information extraction, or uncertain version control. We explore in this dissertation efficient algorithms for evaluating tree-pattern queries with joins over probabilistic XML or, more specifically, for approximating the probability of each item of a query result. The approach relies on, first, extracting the query lineage over the probabilistic XML document, and, second, looking for an optimal strategy to approximate the probability of the propositional lineage formula. ProApproX is the probabilistic query manager for probabilistic XML presented in this thesis. The system allows users to query uncertain tree-structured data in the form of probabilistic XML documents. It integrates a query engine that searches for an optimal strategy to evaluate the probability of the query lineage. ProApproX relies on a query-optimizer-like approach: exploring different evaluation plans for different parts of the formula and predicting the cost of each plan, using a cost model for the various evaluation algorithms. We demonstrate the efficiency of this approach on datasets used in a number of the most popular previous works on probabilistic XML querying, as well as on synthetic data. An early version of the system was demonstrated at the ACM SIGMOD 2011 conference. First steps towards the new query solution were discussed in an EDBT/ICDT PhD Workshop paper (2011). A fully redesigned version, which implements the techniques and studies presented in this thesis, was published as a demonstration at CIKM 2012. Our contributions are also part of an IEEE ICDE
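One of the simplest evaluation algorithms such a cost model can choose is possible-worlds (Monte Carlo) sampling of the lineage formula: draw each independent event, evaluate the propositional formula, and average. The function below is an illustrative sketch of this one strategy only (names are ours); ProApproX selects among several algorithms per sub-formula.

```python
import random

def approx_lineage_prob(lineage, probs, samples=20000, seed=0):
    """Monte Carlo approximation of P(lineage formula is true).
    `lineage`: function from a valuation dict {event: bool} to bool.
    `probs`:   dict mapping each independent event variable to its probability.
    Illustrative sketch of possible-worlds sampling, not the full system."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        world = {v: rng.random() < p for v, p in probs.items()}
        if lineage(world):
            hits += 1
    return hits / samples
```

For the lineage (x1 AND x2) OR x3 with P(x1) = P(x2) = 0.5 and P(x3) = 0.2, the exact probability is 1 − (1 − 0.25)(1 − 0.2) = 0.4, which the sampler approximates.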

While 3D surfaces are essentially represented using triangle meshes in the domain of digital geometry, the structures that allow users to interact with them are varied and adapted to the different geometry-processing tasks targeted by the user. This thesis presents results on structures of various dimensions and geometric representations, going from internal structures such as analytical curve skeletons for shape modeling, to on-surface structures allowing automatic selection of feature handles for shape deformation, to external control structures known as “cages” offering a high-level representation of animated 3D data stemming from performance capture. Results on spatial functions are also presented, in particular for the Mean Value Coordinates, for which the analytical formulae of the gradients and Hessians are provided, and for biharmonic functions, for which a finite-element basis is given for the resolution of the biharmonic Laplace problem with mixed Dirichlet/Neumann boundary conditions, as well as their applications to the deformation of 3D shapes.

This dissertation studies the extension of the Itô formula to the case of distribution-valued paths of bounded variation, lifted by processes which are regular in the sense of Malliavin calculus. We make optimal hypotheses, which gives us access to many applications. The first chapter is a primer on Malliavin calculus. The second chapter provides useful results on the topology of the Schwartz class and of the space of tempered distributions. In the third chapter, we give optimal conditions under which a tempered distribution may be composed with a random variable, and we study the Malliavin regularity of the object thus defined. Interpolation techniques give access to results in fractional spaces. We also give results for the case where the tempered distribution is itself stochastic. These results allow us to obtain, in Chapter 4, a weak Itô formula under hypotheses much weaker than those usually made in the literature. We also give an Itô-Wentzell version and an anticipative version. In the case where the process to which the Itô formula is applied is the solution of an SDE, we give a more precise result, which we use to study the regularity of the multi-dimensional local time. Finally, the fifth chapter solves a variational problem under hypotheses much weaker than the usual assumption of hypoellipticity.
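For orientation, the classical Itô formula whose hypotheses are being weakened reads, for a function $f \in C^2$ and a continuous semimartingale $X$,

```latex
\[
f(X_t) = f(X_0) + \int_0^t f'(X_s)\, dX_s + \frac{1}{2} \int_0^t f''(X_s)\, d\langle X \rangle_s .
\]
```

The dissertation relaxes the $C^2$ assumption on $f$ (down to tempered distributions, possibly themselves stochastic) and compensates by requiring Malliavin regularity of the process $X$.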

This thesis deals with different aspects of mesh processing and the ways those operations can be carried out in parallel or with distributed memory, now that GPUs and supercomputers are more and more commonly used. We present surface and volumetric mesh smoothing algorithms based upon image-processing techniques (bilateral filter, local histograms). After those geometric considerations, we discuss topological methods, such as local remeshing, which make it possible to generate, from a tetrahedral mesh, a layer of good-quality prisms and hexahedra, allowing fluid-mechanics simulations in those near-surface areas. Finally, we present a meshing technique based upon particle interactions, in order to construct quad-dominant meshes.

Recent years have witnessed a massive evolution of mobile communications. When no agreement exists between network providers, changing the attached network still means breaking the session and relying on the application to recover the lost data. At the same time, it is hardly possible for a mobile user to control the connectivity of his terminal. The objective of this thesis is to present the concept of an innovative technological framework for the autonomous control of multimode terminals in heterogeneous and non-federated wireless environments. The aim is to enable a self-configuring terminal to connect and roam seamlessly across independent networks, while respecting its user's choices and preferences. The target scheme involves abstraction and cross-layer mechanisms. It takes into account the constraints of heterogeneous wireless systems and autonomous architectures, and enables generic services such as smart access network selection, connectivity and session management. This scheme applies to the mobile terminal, with mechanisms independent of the network infrastructure. The thesis analyses how existing technologies are enhanced and combined with new features to achieve this objective, and gives a description of the overall concept and of its implementation. A simulated model is used to assess the validity of the proposed framework. Diverse…

We are witnessing in recent years a steady growth of the so-called structured Web, in which documents (Web pages) are no longer quasi-textual but data-centric, presenting structured content and complex objects. Such schematized pages are often generated dynamically by means of formatting templates over a database, possibly using user input via forms (the hidden Web). Current Web search platforms only allow retrieving Web pages through traditional keyword-search methods, which are not adapted to querying the structured Web. Indeed, keyword search is semantically…

The subjects addressed in this thesis are inspired by research problems faced by the Lokad company. These problems are related to the challenge of designing efficient parallelization techniques for clustering algorithms on a Cloud Computing platform. Chapter 2 provides an introduction to Cloud Computing technologies, especially the ones devoted to intensive computations. Chapter 3 details more specifically Microsoft's Cloud Computing offer: Windows Azure. The following chapter details technical aspects of cloud application development and provides some cloud design patterns. Chapter 5 is dedicated to the parallelization of a well-known clustering algorithm: Batch K-Means. It provides insights into the challenges of a cloud implementation of distributed Batch K-Means, especially the impact of communication costs on implementation efficiency. Chapters 6 and 7 are devoted to the parallelization of another clustering algorithm, Vector Quantization (VQ). Chapter 6 provides an analysis of different parallelization schemes for VQ and presents the various speedups to convergence they provide. Chapter 7 provides a cloud implementation of these schemes. It highlights that it is the online nature of the VQ technique that enables an asynchronous cloud implementation, which drastically reduces the communication costs introduced in Chapter 5.
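The distributed Batch K-Means iteration has a natural map-reduce shape, and it is precisely the "reduce" exchange of partial sums that carries the communication cost discussed above. The function below is a sequential sketch of one such synchronous iteration (our own simplification: in a real Azure deployment each shard would run on its own worker and the partial sums would travel through shared storage or queues).

```python
def kmeans_step(workers_data, centroids):
    """One synchronous Batch K-Means iteration in map-reduce style.
    workers_data: list of per-worker lists of points; points and centroids
    are tuples of floats. Returns the recomputed centroids."""
    k, d = len(centroids), len(centroids[0])
    sums = [[0.0] * d for _ in range(k)]
    counts = [0] * k
    # "map" phase: each shard would be processed in parallel on its worker.
    for shard in workers_data:
        for p in shard:
            j = min(range(k),
                    key=lambda c: sum((p[i] - centroids[c][i]) ** 2
                                      for i in range(d)))
            counts[j] += 1
            for i in range(d):
                sums[j][i] += p[i]
    # "reduce" phase: merge the partial statistics (the communication step)
    # and recompute each centroid as the mean of its assigned points.
    new_centroids = []
    for j in range(k):
        if counts[j]:
            new_centroids.append(tuple(sums[j][i] / counts[j]
                                       for i in range(d)))
        else:
            new_centroids.append(tuple(centroids[j]))  # empty cluster: keep
    return new_centroids
```

The synchronization barrier before the reduce phase is what the asynchronous online VQ variant of Chapters 6 and 7 avoids.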

This thesis is a volume of 134 pages and includes 5 research articles. It contributes to the empirical literature that has developed since the early 2000s on the changes introduced by Internet commerce in cultural goods. It examines in particular a set of questions on the complementarity or substitutability of physical and virtual distribution channels, the effect of the "Long Tail" theory, and price dispersion on the Internet. The interest of this thesis is to provide empirical evidence for the debate, thanks to the creation of databases obtained by automated capture of data observable on the Internet. The statistical and econometric results of these studies detail the specifics of best-selling books, CDs and DVDs across distribution channels (Amazon, Amazon Marketplace, physical stores) and formats (paper books / ebooks). Regarding price dispersion, the results show a low variability of prices across Amazon Marketplace sellers and a low impact of traditional measures of reputation (seller ratings) compared with the size of sellers' catalogues, challenging the use of ratings as a proxy for reputation.

While RFID systems are one of the key enablers of the prototyping of pervasive computing applications, the deployment of RFID technologies also comes with new privacy and security concerns, ranging from people tracking and industrial espionage to product cloning and denial of service. Cryptographic solutions to tackle these issues have generally been challenged by the limited resources of RFID tags, and by formalizations of RFID privacy that are believed to be too strong for such constrained devices. It follows that most of the existing RFID-based cryptographic schemes failed at ensuring tag privacy without sacrificing RFID…

This thesis has two main parts. Part I uses stochastic analysis to provide bounds on the overload probability of different systems thanks to concentration inequalities. Although the results are general, we apply them to real wireless network systems such as WiMAX and multiclass user traffic in an OFDMA system. In Part II, we find more connections between the topology of the coverage of a sensor network and the topology of its corresponding simplicial complex. These connections highlight new aspects of Betti numbers, the number of k-simplices, and the Euler characteristic. Then, we use algebraic topology in conjunction with stochastic analysis, assuming that the positions of the sensors are points of a Poisson point process. As a consequence we obtain, in d dimensions, the statistics of the number of k-simplices and of the Euler characteristic, as well as upper bounds on the distribution of the Betti numbers. We also prove that the number of k-simplices tends to a Gaussian distribution as the density of sensors grows, and we specify the convergence rate. Finally, we restrict ourselves to one dimension. In this case, the problem becomes equivalent to solving an M/M/1/1 preemptive queue. We obtain analytical results for quantities such as the distribution of the number of connected components and the probability of complete coverage.
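The quantities involved can be made concrete on a tiny one-dimensional example: build the Vietoris–Rips complex of sensor positions on a line (a simplex is any set of pairwise close sensors) and compute the Euler characteristic as the alternating sum over simplex dimensions. This brute-force sketch (our own helper names, exponential in the number of points, for illustration only) exploits the fact that in this 1-D setting each connected component's complex is contractible, so the Euler characteristic simply counts components.

```python
from itertools import combinations

def rips_simplices(points, eps):
    """All simplices of the Vietoris-Rips complex of 1-D points at scale eps:
    a k-simplex is a set of k+1 points that are pairwise within eps."""
    n = len(points)
    close = lambda i, j: abs(points[i] - points[j]) <= eps
    simplices = []
    for k in range(1, n + 1):
        for s in combinations(range(n), k):
            if all(close(i, j) for i, j in combinations(s, 2)):
                simplices.append(s)
    return simplices

def euler_characteristic(simplices):
    # chi = N_0 - N_1 + N_2 - ... , summing (-1)^dim over all simplices
    return sum((-1) ** (len(s) - 1) for s in simplices)
```

For sensors at 0, 0.5, 1.0 and 5.0 with eps = 0.6, the complex has four vertices and two edges, giving chi = 2, the number of connected components of the covered region.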

The design of circuits to operate in critical environments, such as those used in control-command systems at nuclear power plants, is becoming a great challenge with technology scaling. These circuits have to pass a number of tests and analysis procedures in order to be qualified for operation. In the case of nuclear power plants, safety is considered a very high-priority constraint, and circuits designed to operate in such critical environments must be in accordance with several technical standards, such as IEC 62566, IEC 60987 and IEC 61513. In such standards, reliability is a main consideration, and methods to analyze and improve circuit reliability are highly required. The present dissertation introduces methods to analyze and to improve the reliability of circuits in order to facilitate their qualification according to the aforementioned technical standards. Concerning reliability analysis, we first present a fault-injection-based tool used to assess the reliability of digital circuits. Next, we introduce a method to evaluate the reliability of circuits taking into account the ability of a given application to tolerate errors. Concerning reliability improvement techniques, we first propose two different strategies to selectively harden a circuit. Finally, we introduce a method to automatically partition a TMR design based on a given reliability requirement.
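The TMR (Triple Modular Redundancy) principle behind the partitioning method can be illustrated by a bitwise majority voter, a minimal sketch of the general idea rather than the dissertation's actual tool:

```python
def tmr_vote(a, b, c):
    """Bitwise 2-out-of-3 majority vote over three redundant copies of a
    word: each output bit is correct as long as at most one of the three
    copies is corrupted in that bit position."""
    return (a & b) | (a & c) | (b & c)
```

Partitioning a TMR design then amounts to deciding, per partition, which logic is triplicated and where such voters are inserted, trading area against the reliability target.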

During the last decade, real-time video streaming over ad-hoc networks has gathered increasing interest, because of the attractive possibility of deploying a visual communication system anytime and anywhere, without the need for a pre-existing infrastructure. A wide range of target applications, from military and rescue operations to business, educational and recreational scenarios, has been envisaged, which has created great expectations with respect to the involved technologies. The goal of this thesis is to provide an efficient and robust real-time video streaming system over mobile ad-hoc networks, proposing cross-layer solutions that overcome the limitations of both the application and network solutions available at this time. Our contributions cover several aspects of the mobile video streaming paradigm: a new multiple-description video coding technique, which provides an acceptable video quality even in the presence of high loss rates; a novel cross-layer design for an overlay creation and maintenance protocol, which, with a low overhead, manages in a distributed fashion a set of multicast trees, one for each description of the stream; an original distributed congestion-distortion optimisation framework, which, through a compact representation of the topology information, enables the nodes to learn the structure of the overlay and optimise their behaviour accordingly; and, finally, an integration…

In the last decades, Intelligent Transportation Systems (ITS) have been considered one of the most prominent emerging research areas, owing to their promising role in promoting traffic efficiency and enhancing road safety. ITS cooperative safety applications, being the most vital and critical, have attracted a great deal of attention. The effectiveness of these applications depends largely on the efficient exchange of two main types of information: periodic awareness messages, carrying one-hop location information about the surrounding environment, and multi-hop event-driven messages generated upon detection of a safety situation. Due to the large-scale nature of ITS, this information is expected to be subject to severe congestion, which may compromise its reliable reception. The goal of this thesis is the reliable and robust control of safety-related information, reducing channel congestion while at the same time taking into account the requirements of safety applications. We first address event-driven safety information. We propose a multi-hop policy shown to improve the dissemination of event-driven information. However, it…

Today, new higher-speed, low-cost and low-consumption optical sources are becoming a necessity for the deployment of access and metropolitan networks. The aim of this thesis is to study, experimentally and by simulation, two techniques to combat chromatic dispersion effects through chirp engineering of the source. The first technique concerns directly modulated DFB (Distributed FeedBack) lasers. First, a complete and flexible model of a DFB laser, developed during the thesis, was used to confirm the experimental study of the effect of the facet phase on the chirp behavior. The results showed the existence of two families of lasers, according to the position of the lasing mode with respect to the bandgap. Second, a theoretical and experimental study demonstrated the chirp stabilization and control of DFB lasers in the presence of a well-adjusted external optical feedback. The second technique concerns the dual-modulation concept of an integrated modulated laser (D-EML: Dual Electroabsorption Modulated Laser), exploiting the adjustment of the chirp resulting from the juxtaposition of a frequency modulation applied to the laser and an intensity modulation applied to the modulator. Experimental and theoretical evaluation of D-EML performance has proven its compatibility with high bit rates (20, 25 and 40 Gb/s) and its effectiveness with respect to the simple…

The evolution of IT networks, chiefly the Internet, is rooted in the emergence of prominent paradigms such as mobility and social networking. This development naturally prompts a reorganization of how the spread of data is controlled throughout the network. Considering access to services such as video or voice on demand from terminals that may be fixed or mobile, such as smartphones, as well as the permeability of sensitive information provided to social networks, these factors call for a thorough examination of digital identity as a concept. They also intrinsically raise a full-fledged reconsideration of the concepts of security and trust. The contribution of this thesis project lies, in a first part, in the analysis of the manifold existing digital identity frameworks, as well as the study of current authentication protocols and of the trust issues raised by the lack of a trusted environment such as smart cards. In a second part, as an answer to the concerns raised in the first part, we advocate an identity framework strongly bound to the TLS authentication protocol and embedded in a secure component, thus providing the security assets mandatory for today's networks while naturally fitting a wide range of terminals, be they fixed or mobile. In a last part, we finally exhibit a few practical applications of this identity framework, which have been thoroughly tested and validated,…