
Special Issue Information

Dear Colleagues,

Information theory has been applied in communications for over half a century, starting from the pioneering work of Shannon. The aim of this Special Issue is to encourage researchers to present original and recent developments on information theory for 5G communication systems and algorithms. 5G wireless communication systems are envisaged to support an exponential increase in the number of connected devices and the corresponding data demands. The use of innovative technologies in the context of 5G, such as Software Defined Networking, Network Function Virtualization, Cloud Computing, and the Internet of Things, is expected to attract increasing attention from the research community. Mobile infrastructure resources are evolving from closed physical equipment running private software to dynamic and open software instances running on top of virtualized infrastructure. This evolution has created a new market based on the provisioning of customized mobile services. However, the lack of coordination and management of these resources limits the development of these novel technologies.

The present Special Issue focuses on design and management issues in future networks, especially in the context of 5G mobile systems. It aims to provide a holistic view of research challenges and opportunities in the future management, analysis, and monitoring of this emerging area. For this purpose, submissions of comprehensive overviews and surveys of future networks, as well as original papers related to these techniques, are invited. Any paper submitted to this Special Issue should be relevant to entropy, information theory, probability theory, or related aspects.

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

How to measure the uncertainty of the basic probability assignment (BPA) function is an open issue in Dempster–Shafer (D–S) theory. The main contribution of this paper is a new belief entropy for measuring the uncertainty of a BPA. The proposed belief entropy is based on Deng entropy and on the probability interval consisting of lower and upper probabilities. In addition, under certain conditions, it can be transformed into Shannon entropy. Numerical examples are used to illustrate the efficiency of the new belief entropy in measuring uncertainty.
Full article
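The new belief entropy itself is not specified in the abstract, but the Deng entropy it builds on has a standard definition that can be sketched directly (function and focal-element names below are illustrative, not from the paper):

```python
from math import log2

def deng_entropy(bpa):
    """Deng entropy of a basic probability assignment (BPA).

    bpa maps focal elements (frozensets) to masses summing to 1:
        E_d(m) = -sum_A m(A) * log2( m(A) / (2**|A| - 1) )
    When every focal element is a singleton, the denominator is 1
    and this reduces to Shannon entropy.
    """
    return -sum(m * log2(m / (2 ** len(A) - 1))
                for A, m in bpa.items() if m > 0)

# A singleton-only BPA behaves exactly like Shannon entropy:
uniform = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
# A multi-element focal set also carries non-specificity:
vacuous = {frozenset({'a', 'b'}): 1.0}
```

For `uniform` the result is 1 bit, as for a fair coin; for `vacuous` it is log2(3), reflecting the extra uncertainty of not knowing which element of the set holds.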

Correctly estimating the features characterizing human mobility from mobile phone traces is a key factor in improving the performance of mobile networks, as well as in mobility model design and urban planning. Most related works base their conclusions on location data derived from the cells where each user sends or receives calls or messages, data known as Call Detail Records (CDRs). In this work, we test whether such data sets provide enough detail on users’ movements to accurately estimate some of the most studied mobility features. We perform the analysis using two different data sets, comparing CDRs with an alternative data collection approach. Furthermore, we propose three filtering techniques to reduce the biases detected in the fraction of visits per cell, the entropy and entropy rate distributions, and predictability. The analysis highlights the need to contextualize mobility results with respect to the data used, since the conclusions are biased by the mobile phone trace collection approach.
Full article
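As a minimal illustration of the visit-distribution entropy studied here (names and the toy trace are ours, not the paper's), the entropy of one user's per-cell visit fractions can be computed from a CDR-like sequence of cell ids:

```python
import math
from collections import Counter

def visit_entropy(cells):
    """Shannon entropy (bits) of the fraction of visits per cell,
    computed from a sequence of cell ids (e.g. one user's CDR trace)."""
    counts = Counter(cells)
    n = len(cells)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A user alternating between two cells yields 1 bit of uncertainty;
# a user only ever seen in one cell yields 0 bits.
```

Biases of the kind the paper discusses appear because CDRs only sample locations at call/message events, so the empirical distribution fed to this function may misrepresent the true visit fractions.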

Network virtualization can offer more flexibility and better manageability for the next generation Internet. With the increasing deployment of virtual networks in military and commercial networks, a major challenge is to ensure virtual network survivability against hybrid multiple failures. In this paper, we study the problem of recovering virtual networks affected by hybrid multiple failures in substrate networks and provide an integer linear programming formulation to solve it. We also propose a heuristic algorithm to tackle the complexity of the integer linear programming formulation, which includes a faulty virtual network reconfiguration ranking method, a hybrid multiple failures ranking algorithm, and a virtual node migration method. In both the reconfiguration ranking and node migration methods, multiple ranking indicators are combined in a suitable way based on weighted relative entropy. In the hybrid multiple failures ranking algorithm, the virtual node and its connected virtual links are re-embedded first. Evaluation results show that our heuristic method not only has the best acceptance ratio and normal operation ratio, but also achieves the highest long-term average revenue-to-cost ratio compared with other virtual network reconfiguration methods.
Full article
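The paper's exact weighted-relative-entropy formulation is not reproduced in the abstract; the sketch below shows one common way to combine several normalized ranking indicators into a single relative-entropy-based distance, which candidates (e.g. faulty virtual networks) could then be ranked by. All names and the weighting form are assumptions for illustration:

```python
import math

def weighted_relative_entropy(p, q, w):
    """Weighted relative entropy D_w(p || q) = sum_i w_i * p_i * ln(p_i / q_i).

    p: normalized indicator profile of a candidate,
    q: reference/ideal profile, w: per-indicator weights.
    A larger value means the candidate deviates more from the reference.
    """
    return sum(wi * pi * math.log(pi / qi)
               for wi, pi, qi in zip(w, p, q) if pi > 0)
```

With equal weights this is the ordinary Kullback–Leibler divergence; the weights let some indicators (e.g. node load vs. link bandwidth) count more in the ranking.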

Near-optimal transmit beamformers are designed for multiuser multiple-input single-output interference channels with slowly time-varying block fading. The main contribution of this article is a method for deriving closed-form solutions for effective beamforming in both the low and high signal-to-noise ratio regimes. The proposed method leverages side information obtained from the channel correlation between adjacent coding blocks. More specifically, our methodology is based on a linear algebraic approach, which is more efficient than the optimal scheme based on the Gaussian input in the sense that it reduces the average number of search space dimensions for designing the near-optimal transmit beamformers. The proposed method is shown via computer simulations to exhibit near-optimal performance in terms of the average sum rate.
Full article

Nowadays, a great deal of critical information and services are hosted on computer systems. Proper access control to these resources is essential to avoid malicious actions that could cause huge losses to home and professional users. Access control systems have evolved from the first password-based systems to modern mechanisms using smart cards, certificates, tokens, biometric systems, etc. However, when designing a system, it is necessary to take into account its particular limitations, such as connectivity, infrastructure, or budget. In addition, one of the main objectives must be to ensure the system's usability, but this property is usually in tension with security. Thus, the use of passwords is still common. In this paper, we present a new password-based access control system that aims to improve password security with minimal impact on system usability.
Full article
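The paper's specific scheme is not detailed in the abstract; as a baseline for what password-security improvements typically build on, here is standard salted, iterated password hashing using only the Python standard library (the iteration count and function names are illustrative choices, not the paper's):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune to your hardware

def hash_password(password, salt=None):
    """Derive a salted PBKDF2-HMAC-SHA256 digest; store (salt, digest)."""
    if salt is None:
        salt = os.urandom(16)  # a unique salt defeats precomputed tables
    digest = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the digest and compare in constant time to avoid
    leaking information through timing side channels."""
    candidate = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

The slow, salted derivation raises the attacker's cost per guess without any visible change for the user, which is the usability/security trade-off the abstract alludes to.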

In Fast Software Encryption (FSE) 2015, the stream cipher Sprout was proposed along with a new design idea: stream ciphers with a small internal state that use the secret key not only in the initialization but also in the keystream generation. Sprout turned out to be insecure, and an improved version was presented at FSE 2017. We introduced the Fruit stream cipher informally in 2016 on the IACR ePrint archive, and a few cryptanalyses of it were published. Fortunately, the main structure of Fruit proved resistant. Fruit-80 is now presented as the final version, which is easier to implement and is secure. The combined size of the LFSR and NFSR in Fruit-80 is only 80 bits (for an 80-bit security level), whereas resistance to classical time-memory-data tradeoff (TMDTO) attacks normally requires an internal state size of at least twice the security level. To satisfy this rule and to design a concrete cipher, we used some new design ideas. It seems that the bottleneck in designing an ultra-lightweight stream cipher is TMDTO distinguishing attacks. A countermeasure was suggested previously, and another countermeasure is proposed here. Fruit-80 is better than other small-state stream ciphers in terms of initialization speed and hardware area. Using the new idea, it is possible to redesign many stream ciphers and achieve a significantly smaller area.
Full article

It was recently studied how to achieve the optimal degrees of freedom (DoF) in a multi-antenna full-duplex system with partial channel state information (CSI). In this paper, we revisit the DoF of a multi-antenna full-duplex system using opportunistic transmission under partial CSI, in which a full-duplex base station having M transmit antennas and M receive antennas supports a set of half-duplex mobile stations (MSs), each having a single antenna. Assuming no self-interference, we present a new hybrid opportunistic scheduling method that achieves the optimal sum DoF under an improved user scaling law. Unlike the state-of-the-art scheduling method, our method is designed so that the scheduling role between downlink MSs and uplink MSs is well-balanced. It is shown that the optimal sum DoF of 2M is asymptotically achievable provided that the number of MSs scales faster than SNR^M, where SNR denotes the signal-to-noise ratio. This result reveals that, in our full-duplex system, better performance on the user scaling law can be obtained without extra CSI, compared to prior work that showed a required user scaling condition (i.e., the minimum number of MSs guaranteeing the optimal DoF) of SNR^(2M−1). Moreover, the average interference decay rate is analyzed. Numerical evaluation is performed not only to validate our analysis but also to show the superiority of the proposed method over the state-of-the-art method.
Full article

Uplink and downlink channel estimation in massive Multiple Input Multiple Output (MIMO) systems is an intricate issue because of the increasing channel matrix dimensions. The channel feedback overhead using traditional codebook schemes is very large, which consumes more bandwidth and decreases the overall system efficiency. The purpose of this paper is to decrease the channel estimation overhead by taking advantage of sparse attributes and also to optimize the Energy Efficiency (EE) of the system. To cope with this issue, we propose a novel approach using Compressed Sensing (CS), Block Iterative Support Detection (Block-ISD), Angle of Departure (AoD), and Structured Compressive Sampling Matching Pursuit (S-CoSaMP) algorithms to reduce the channel estimation overhead, and compare them with traditional algorithms. CS uses the temporal correlation of time-varying channels to produce a Differential Channel Impulse Response (DCIR) between two CIRs that are adjacent in time slots; the DCIR is sparser than the conventional CIR and can thus be compressed easily. Block-ISD uses the spatial correlation of the channels to obtain block sparsity, which results in lower pilot overhead. AoD quantizes the channels, whose path-AoDs vary more slowly than the path gains, and this information is utilized to reduce the overhead. S-CoSaMP deploys structured sparsity to obtain reliable Channel State Information (CSI). MATLAB simulation results show that the proposed CS-based algorithms reduce the feedback and pilot overhead by a significant percentage and also improve the system capacity compared with traditional algorithms. Moreover, the EE level increases with increasing Base Station (BS) density and UE density, and with lower hardware impairment levels.
Full article
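The temporal-correlation idea behind the DCIR can be sketched numerically (the toy channel and all names below are ours, not the paper's): subtracting two adjacent, highly correlated CIRs leaves only the few taps that actually changed, and it is this sparser difference that compressed sensing exploits.

```python
def differential_cir(h_prev, h_curr, threshold=1e-6):
    """Differential channel impulse response (DCIR) between adjacent
    time slots; returns the difference and its significant-tap count."""
    d = [c - p for p, c in zip(h_prev, h_curr)]
    nnz = sum(1 for x in d if abs(x) > threshold)
    return d, nnz

# Toy dense CIR at slot t-1; at slot t only two taps change.
h_prev = [1.0 + 0.01 * k for k in range(64)]
h_curr = list(h_prev)
h_curr[3] += 0.5
h_curr[17] += 0.5

dcir, nnz = differential_cir(h_prev, h_curr)
# The DCIR has 2 significant taps versus 64 in the raw CIR,
# so far fewer measurements are needed to recover it.
```

This is exactly the "greater sparsity" property the abstract attributes to the DCIR, shown on a deliberately simple stand-in channel.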

Fog computing extends the cloud computing paradigm by placing resources close to the edges of the network to deal with the upcoming growth of connected devices. Smart city applications, such as health monitoring and predictive maintenance, will introduce a new set of stringent requirements, such as low latency, since resources can be requested on-demand simultaneously by multiple devices at different locations. It is then necessary to adapt existing network technologies to future needs and design new architectural concepts to help meet these strict requirements. This article proposes a fog computing framework enabling autonomous management and orchestration functionalities in 5G-enabled smart cities. Our approach follows the guidelines of the European Telecommunications Standards Institute (ETSI) NFV MANO architecture, extending it with additional software components. The contribution of our work is its fully-integrated fog node management system alongside the proposed application layer Peer-to-Peer (P2P) fog protocol based on the Open Shortest Path First (OSPF) routing protocol for the exchange of application service provisioning information between fog nodes. Evaluations of an anomaly detection use case based on an air monitoring application are presented. Our results show that the proposed framework achieves a substantial reduction in network bandwidth usage and in latency when compared to centralized cloud solutions.
Full article

In recent years, a significant increase in the number and impact of Distributed Denial of Service (DDoS) threats has been reported by different information security organizations. These attacks typically target the depletion of the computational resources of the victims, hence drastically harming their operational capabilities. Inspired by these methods, Economic Denial of Sustainability (EDoS) attacks share a similar motivation, but adapted to Cloud computing environments, where the denial is achieved by damaging the economy of both suppliers and customers. The most common EDoS approach is therefore to make the offered services unsustainable by exploiting their auto-scaling algorithms. In order to contribute to their mitigation, this paper introduces a novel EDoS detection method based on the study of entropy variations in the metrics taken into account when deciding auto-scaling actuations. Through the prediction and definition of adaptive thresholds, unexpected behaviors capable of fraudulently triggering the hiring of new resources are distinguished. To demonstrate the effectiveness of the proposal, an experimental scenario adapted to the singularities of EDoS threats and the assumptions driven by their original definition is described in depth. The preliminary results show high accuracy.
Full article
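The entropy-variation idea with adaptive thresholds can be sketched as follows. This is a generic sliding-baseline detector, not the paper's exact predictor; the windowed metric (here, request source identifiers), the k-sigma threshold, and all names are assumptions for illustration:

```python
import math
from collections import Counter, deque

def window_entropy(events):
    """Shannon entropy (bits) of one observation window, e.g. the
    source identifiers of requests driving auto-scaling decisions."""
    counts = Counter(events)
    n = len(events)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

class EntropyDetector:
    """Flags a window whose entropy deviates from the moving baseline
    by more than k standard deviations (an adaptive threshold)."""

    def __init__(self, history=20, k=3.0):
        self.history = deque(maxlen=history)
        self.k = k

    def observe(self, events):
        h = window_entropy(events)
        if len(self.history) >= 2:
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            anomalous = abs(h - mean) > self.k * math.sqrt(var) + 1e-9
        else:
            anomalous = False  # not enough baseline yet
        self.history.append(h)
        return h, anomalous
```

A sudden entropy collapse (e.g. one source issuing all requests) or spike then stands out against the learned baseline instead of a fixed threshold, which is the adaptive behavior the abstract describes.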

Dynamic adaptive streaming over Hypertext Transfer Protocol (HTTP) is an advanced video streaming technology for dealing with the uncertainty of network states. However, this technology has one drawback: network states change frequently and continuously. The quality of a video stream fluctuates along with the network changes, which might reduce the quality of service. In recent years, many researchers have proposed adaptive streaming algorithms to reduce such fluctuations. However, these algorithms only consider the current state of the network, and might therefore produce inaccurate short-term estimates of video quality. In this paper, we therefore propose a method using fuzzy logic and a moving-average technique to reduce mobile video quality fluctuation in Dynamic Adaptive Streaming over HTTP (DASH). First, we calculate the moving averages of the bandwidth and buffer values for a given period. On the basis of the differences between the real and average values, we propose a fuzzy logic system to deduce the video quality representation for the next request. In addition, we use the entropy rate of a bandwidth measurement sequence to measure the predictability/stability of our method. The experimental results show that our proposed method reduces video quality fluctuation and improves bandwidth utilization by 40% compared to existing methods.
Full article
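The moving-average half of the approach can be sketched as below. The fuzzy inference step is replaced here by a crisp rule (take the more conservative of the instantaneous and smoothed bandwidth) purely for illustration; bitrate values, window length, and class names are our assumptions, not the paper's:

```python
from collections import deque

class QualitySelector:
    """Picks a representation bitrate for the next DASH request from the
    gap between the current bandwidth and its moving average (a crisp
    stand-in for the paper's fuzzy rules)."""

    def __init__(self, bitrates, window=5):
        self.bitrates = sorted(bitrates)   # available bitrates, kbps
        self.bw_hist = deque(maxlen=window)

    def next_quality(self, bandwidth_kbps):
        self.bw_hist.append(bandwidth_kbps)
        avg = sum(self.bw_hist) / len(self.bw_hist)
        # Smoothing damps the reaction to short spikes and dips,
        # which is what reduces quality fluctuation.
        usable = min(bandwidth_kbps, avg)
        candidates = [b for b in self.bitrates if b <= usable]
        return candidates[-1] if candidates else self.bitrates[0]
```

Because the selector never requests above the smoothed estimate, a one-segment bandwidth spike does not trigger an up-switch that would have to be reverted immediately.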

There has been growing interest in sleep management recently, and sleep care services using mobile or wearable devices are under development. However, devices with a single sensor have limitations in analyzing various sleep states. If Internet of Things (IoT) technology, which collects information from multiple sensors and analyzes it in an integrated manner, can be used, then various sleep states can be measured more accurately. Therefore, in this paper, we propose a Smart Model for Sleep Care to provide a service that measures and analyzes the sleep state using various sensors. In this model, we designed and implemented a Sleep Information Gathering Protocol to transmit the information measured between physical sensors and sleep sensors. Experiments were conducted to compare the throughput and power consumption of this new protocol with those of the protocols used in existing services: we achieved about twice the throughput and a 20% reduction in power consumption, confirming the effectiveness of the proposed protocol. This protocol is meaningful in that it can be applied to a Smart Model for Sleep Care that incorporates IoT technology, and allows expanded sleep care if used together with services for treating sleep disorders.
Full article

Mobile service selection is an important but challenging problem in service and mobile computing, and quality of service (QoS) prediction is a critical step in service selection in 5G network environments. Traditional methods, such as collaborative filtering (CF), suffer from a series of defects, such as failing to handle data sparsity. In mobile network environments, abnormal QoS data are likely to result in inferior prediction accuracy. Unfortunately, these problems have not attracted enough attention, especially in mixed mobile network environments with different network configurations, generations, or types. An ensemble learning method for predicting missing QoS in 5G network environments is proposed in this paper. It rests on two key ideas: a newly proposed similarity computation method for identifying similar neighbors, and an extended ensemble learning model for discovering and filtering fake neighbors from the preliminary neighbor set. Moreover, three prediction models are proposed: two individual models and one combined model, which exploit similar user neighbors and similar service neighbors, respectively. Experimental results on two real-world datasets show that our approaches produce superior prediction accuracy.
Full article
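For readers unfamiliar with the CF baseline the paper improves on, the core neighbor-based prediction step can be sketched as follows. This is the textbook cosine-similarity formulation, not the paper's new similarity measure; the sparse-dict representation and all names are illustrative:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity over co-observed QoS entries, where each user
    is a sparse dict: service id -> observed QoS value."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[s] * v[s] for s in common)
    den = (math.sqrt(sum(u[s] ** 2 for s in common))
           * math.sqrt(sum(v[s] ** 2 for s in common)))
    return num / den if den else 0.0

def predict_qos(target, neighbors, service):
    """Similarity-weighted average of neighbors' QoS for a service
    the target user has not yet invoked."""
    pairs = [(cosine_similarity(target, nb), nb[service])
             for nb in neighbors if service in nb]
    total = sum(s for s, _ in pairs)
    if total == 0:
        return None
    return sum(s * q for s, q in pairs) / total
```

Data sparsity hurts this baseline because `common` is often empty or tiny, and a "fake" neighbor with coincidentally similar co-observations skews the weighted average — precisely the two defects the proposed method targets.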

Recently, green networks have been considered one of the hottest topics in Information and Communication Technology (ICT), especially in mobile communication networks. In a green network, energy saving at network nodes such as base stations (BSs), switches, and servers should be achieved efficiently. In this paper, we consider a heterogeneous network architecture in 5G networks with separated data and control planes, where a macro cell manages control signals and a small cell manages data traffic. We then propose an optimized handover scheme based on context information such as reference signal received power, speed of user equipment (UE), traffic load, call admission control level, and data type. The main objective of the proposed optimal handover is to reduce either the number of handovers or the total energy consumption of BSs. To this end, we develop optimization problems with either the minimization of the total number of handovers or the minimization of the energy consumption of BSs as the objective function. The solution of the optimization problem is obtained by particle swarm optimization, since the developed optimization problem is NP-hard. Performance analysis via simulation, based on various probability distributions of the characteristics of UEs and BSs, shows that the proposed optimized handover based on context information performs better than the previous call admission control based handover scheme in terms of the number of handovers and total energy consumption. We also show that the proposed handover scheme can efficiently reduce either the number of handovers or the total energy consumption by applying either handover minimization or energy minimization, depending on the objective of the application.
Full article
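Particle swarm optimization, the solver the abstract names, can be sketched in its standard form. The handover/energy objective itself is not given in the abstract, so a toy cost stands in for it; the inertia and acceleration coefficients are common textbook values, not the paper's:

```python
import random

def pso_minimize(cost, dim, bounds, n_particles=20, iters=100, seed=1):
    """Minimal particle swarm optimization: each particle tracks its own
    best position, the swarm tracks the global best, and velocities pull
    particles toward both (here standing in for a handover-count or
    BS-energy objective)."""
    rnd = random.Random(seed)
    lo, hi = bounds
    pos = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, social coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost
```

PSO needs only objective evaluations, no gradients, which is why it suits combinatorial-flavored NP-hard objectives like counting handovers.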

Outage probabilities are important measures of the performance of wireless communication systems, but obtaining them requires first determining detailed system parameters and then carrying out complicated calculations. When there are multiple candidate diversity techniques applicable to a system, the diversity order can be used to roughly but quickly compare the techniques over a wide range of operating environments. For a system transmitting over frequency selective fading channels, the diversity order can be defined as the number of multi-paths if all multi-paths have equal energy. However, diversity order may not be adequately defined when the energy values differ. To obtain a rough value of the diversity order, one may use the number of multi-paths or the reciprocal of the multi-path energy variance. Such definitions are not very useful for evaluating the performance of diversity techniques, since the former is meaningful only when the target outage probability is extremely small, while the latter is reasonable only when the target outage probability is very large. In this paper, we propose a new definition of diversity order for frequency selective fading channels. The proposed scheme is based on Rényi entropy, which is widely used in biology and many other fields. We provide various simulation results to show that the diversity order under the proposed definition is tightly correlated with the corresponding outage probability, and thus the proposed scheme can be used to quickly select the best diversity technique among multiple candidates.
Full article
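One natural way to turn Rényi entropy into an "effective number of paths" is to exponentiate it, as sketched below. The paper's exact mapping may differ; the order alpha = 2 and all names are our illustrative choices:

```python
import math

def renyi_diversity_order(path_energies, alpha=2.0):
    """Effective diversity order of a multipath profile via Rényi entropy.

    Normalizes the path energies to a distribution p, computes
    H_alpha = ln(sum_i p_i**alpha) / (1 - alpha), and returns exp(H_alpha).
    Equal-energy paths give exactly the number of paths; unequal
    energies give a smaller effective order, interpolating smoothly.
    """
    total = sum(path_energies)
    p = [e / total for e in path_energies]
    h = math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)
    return math.exp(h)
```

This behaves as the abstract requires at both extremes: for equal energies it recovers the path count (the regime where diversity order is classically well-defined), while a profile dominated by one strong path collapses toward an effective order of 1.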

5G networks are expected to provide significant advances in network management compared to traditional mobile infrastructures by leveraging intelligence capabilities such as data analysis, prediction, pattern recognition, and artificial intelligence. The key idea behind these capabilities is to facilitate the decision-making process in order to solve or mitigate common network problems in a dynamic and proactive way. In this context, this paper presents the design of the Self-Organized Network Management in Virtualized and Software Defined Networks (SELFNET) Analyzer Module, whose main objective is to identify suspicious or unexpected situations based on metrics provided by different network components and sensors. The SELFNET Analyzer Module provides a modular architecture driven by use cases, in which analytic functions can be easily extended. This paper also proposes the data specification that defines the data inputs to be taken into account in the diagnosis process. This data specification has been implemented in different use cases within the SELFNET Project, proving its effectiveness.
Full article

Nowadays, different protocols coexist in the Internet, which provides services to users. Unfortunately, distributed control decisions and management make networks hard to control, resulting in inefficient and unpredictable network behaviour. Software Defined Networking (SDN) is a new concept of network architecture that intends to be more flexible and to simplify network management with respect to traditional architectures. Both aspects are possible because of the separation of the control plane (controller) and the data plane (switches) in network devices. OpenFlow is the most common protocol for SDN networks and provides the communication between the control and data planes. Moreover, decoupling the control and data planes enables a quick evolution of protocols and their deployment without replacing data plane switches. In this survey, we review SDN technology, the OpenFlow protocol, and their related works. Specifically, we describe technologies such as Wireless Sensor Networks and Wireless Cellular Networks, and how SDN can be included within them in order to solve their challenges. We classify the different solutions for each technology according to the problem they address.
Full article