Cryptography
Latest open access articles published in Cryptography at http://www.mdpi.com/journal/cryptography
Publisher: MDPI (support@mdpi.com). Language: en. License: Creative Commons Attribution (CC-BY).

Cryptography, Vol. 1, Pages 25: Cryptographically Secure Multiparty Computation and Distributed Auctions Using Homomorphic Encryption
http://www.mdpi.com/2410-387X/1/3/25
We introduce a robust framework that allows for cryptographically secure multiparty computations, such as distributed private-value auctions. Security is guaranteed by two-sided authentication of all network connections, homomorphically encrypted bids, and the publication of zero-knowledge proofs of every computation. This also allows a non-participant verifier to verify the result of any such computation using only the information broadcast on the network by each individual bidder. Building on previous work on such systems, we design and implement an extensible framework that puts the described ideas into practice. Apart from the actual implementation of the framework, our biggest contribution is the level of protection we are able to guarantee against attacks described in previous work. To provide guidance to users of the library, we analyze the use of zero-knowledge proofs in ensuring the correct behavior of each node in a computation. We also describe the use of the library to perform a private-value distributed auction, as well as other challenges in implementing the protocol, such as auction registration and certificate distribution. Finally, we provide performance statistics on our implementation of the auction.


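The additively homomorphic encryption at the heart of such sealed-bid computations can be illustrated with a toy Paillier sketch. The primes and bid values below are invented for illustration and far too small to be secure; a real deployment would use keys of at least 2048 bits.

```python
# Toy Paillier cryptosystem: multiplying ciphertexts yields an encryption of
# the sum of the plaintexts, which is the property sealed-bid aggregation uses.
# Illustrative only -- the parameters here offer no security whatsoever.
import math
import random

p, q = 293, 433          # toy primes; real keys use primes of >= 1024 bits
n = p * q
n2 = n * n
g = n + 1                # standard generator choice for Paillier
lam = math.lcm(p - 1, q - 1)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    u = pow(c, lam, n2)
    ell = (u - 1) // n                          # the Paillier L function
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (ell * mu) % n

# Homomorphic property: the product of ciphertexts decrypts to the sum of bids.
bid_a, bid_b = 41, 17
total = decrypt(encrypt(bid_a) * encrypt(bid_b) % n2)
print(total)  # 58
```

A verifier who receives only ciphertexts and zero-knowledge proofs can check such an aggregation without ever learning an individual bid.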
Authors: Anunay Kulshrestha, Akshay Rampuria, Matthew Denton, Ashwin Sreenivas. doi: 10.3390/cryptography1030025. Published: 2017-12-12.

Cryptography, Vol. 1, Pages 24: Anomalous Traffic Detection and Self-Similarity Analysis in the Environment of ATMSim
http://www.mdpi.com/2410-387X/1/3/24
Internet utilisation has steadily increased, predominantly due to the rapid recent development of information and communication networks and the widespread distribution of smartphones. As a result of this increase in Internet consumption, various types of services, including web services, social networking services (SNS), Internet banking, and remote processing systems, have been created. These services have significantly enhanced global quality of life. However, as a negative side-effect of this rapid development, serious information security problems have also surfaced, leading to serious Internet privacy invasions and network attacks. In an attempt to contribute to addressing these problems, this paper proposes a process to detect anomalous traffic using self-similarity analysis in the Anomaly Teletraffic detection Measurement analysis Simulator (ATMSim) environment. Simulations were performed to measure normal and anomalous traffic. First, normal traffic and traffic for each attack, including Address Resolution Protocol (ARP) spoofing and distributed denial-of-service (DDoS) attacks, was measured for 48 h over 10 iterations. Hadoop was used to facilitate the processing of the large amount of collected data: the data were stored in the Hadoop Distributed File System (HDFS) and then processed with MapReduce. ATMSim, a new detection platform built on Hadoop, was used to identify anomalous traffic, after which a comparative analysis of the normal and anomalous traffic was performed through self-similarity analysis. The collected traffic fell into four categories according to the attack methods used: normal local area network (LAN) traffic, DDoS attack, ARP spoofing, and combined DDoS and ARP attack. ATMSim, the anomaly traffic detection system, was used to determine whether real attacks could be identified effectively.
To achieve this, ATMSim was used in simulations for each scenario to test its ability to distinguish between normal and anomalous traffic. The graphical and quantitative analyses in this study, based on self-similarity estimation for the four traffic types, showed a burstiness phenomenon when anomalous traffic occurred, and self-similarity values were high. This differed significantly from the results obtained for normal traffic, such as LAN traffic. In further studies, this anomaly detection approach could be combined with biologically inspired techniques that can predict behaviour, such as artificial neural networks (ANN) or fuzzy approaches.


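Self-similarity analysis of traffic traces rests on estimating the Hurst parameter H. A minimal sketch of one common estimator, the aggregated-variance (variance-time) method, follows; the block sizes and synthetic series below are illustrative assumptions, not the configuration used in the paper. For a self-similar series the variance of the m-aggregated series scales as m^(2H-2), so the slope of log(variance) versus log(m) gives H; bursty anomalous traffic yields H well above 0.5.

```python
# Aggregated-variance Hurst estimator: aggregate the series at several block
# sizes, then fit the slope of log(var) vs log(m); slope = 2H - 2.
import math
import random

def hurst_aggregated_variance(series, block_sizes=(1, 2, 4, 8, 16, 32)):
    xs, ys = [], []
    for m in block_sizes:
        # average the series over non-overlapping blocks of size m
        blocks = [sum(series[i:i + m]) / m
                  for i in range(0, len(series) - m + 1, m)]
        mean = sum(blocks) / len(blocks)
        var = sum((b - mean) ** 2 for b in blocks) / len(blocks)
        xs.append(math.log(m))
        ys.append(math.log(var))
    # least-squares slope of the variance-time plot
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 1 + slope / 2

random.seed(0)
# i.i.d. traffic with no long-range dependence should give H near 0.5
noise = [random.gauss(0, 1) for _ in range(4096)]
print(round(hurst_aggregated_variance(noise), 2))
```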
Authors: Hae-Duck Jeong, WonHwi Ahn, Hyeonggeun Kim, Jong-Suk Lee. doi: 10.3390/cryptography1030024. Published: 2017-12-12.

Cryptography, Vol. 1, Pages 23: FPGA Implementation of a Cryptographically-Secure PUF Based on Learning Parity with Noise
http://www.mdpi.com/2410-387X/1/3/23
Herder et al. (IEEE Transactions on Dependable and Secure Computing, 2017) designed a new computational fuzzy extractor and physical unclonable function (PUF) challenge-response protocol based on the Learning Parity with Noise (LPN) problem. The protocol requires no irreversible state updates on the PUFs for security, such as burning irreversible fuses, and can correct for significant measurement noise compared to PUFs using a conventional (information-theoretically secure) fuzzy extractor. However, Herder et al. did not implement their protocol. In this paper, we give the first implementation of a challenge-response protocol based on computational fuzzy extractors. Our main insight is that “confidence information” does not need to be kept private if the noise vector is independent of the confidence information, e.g., the bits generated by ring oscillator pairs that are physically placed close to each other. This leads to a construction that is a simplified version of the design of Herder et al. (also building on a ring oscillator PUF). Our simplifications allow for a dramatic reduction in area by making a mild security assumption on ring oscillator physical obfuscated key output bits.


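The Learning Parity with Noise problem that secures the protocol can be sketched compactly: each sample pairs a random vector a with the inner product ⟨a, s⟩ mod 2, flipped with some small probability. The dimension and noise rate below are illustrative assumptions, not the parameters of Herder et al.

```python
# LPN sample generation: b = <a, s> + e (mod 2) with Bernoulli noise e.
# Recovering s from noisy samples is believed to be computationally hard,
# while the noiseless relation is what the fuzzy extractor must reconstruct.
import random

def lpn_sample(secret, noise_rate, rng):
    a = [rng.randrange(2) for _ in secret]
    parity = sum(x * y for x, y in zip(a, secret)) % 2
    e = 1 if rng.random() < noise_rate else 0
    return a, (parity + e) % 2

rng = random.Random(42)
s = [rng.randrange(2) for _ in range(16)]   # toy 16-bit secret

# without noise, every sample satisfies the parity relation exactly
a, b = lpn_sample(s, 0.0, rng)
assert b == sum(x * y for x, y in zip(a, s)) % 2

# with noise, a small fraction of samples is flipped -- this models the
# measurement noise the computational fuzzy extractor has to tolerate
samples = [lpn_sample(s, 0.125, rng) for _ in range(1000)]
errors = sum((bb != sum(x * y for x, y in zip(aa, s)) % 2)
             for aa, bb in samples)
print(errors)  # roughly 125 of the 1000 parity bits are flipped
```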
Authors: Chenglu Jin, Charles Herder, Ling Ren, Phuong Nguyen, Benjamin Fuller, Srinivas Devadas, Marten van Dijk. doi: 10.3390/cryptography1030023. Published: 2017-12-09.

Cryptography, Vol. 1, Pages 22: Learning Global-Local Distance Metrics for Signature-Based Biometric Cryptosystems
http://www.mdpi.com/2410-387X/1/3/22
Biometric traits, such as fingerprints, faces and signatures, have been employed in bio-cryptosystems to secure cryptographic keys within digital security schemes. Reliable implementations of these systems employ error correction codes formulated as simple distance thresholds, although these may not effectively model the complex variability of behavioral biometrics like signatures. In this paper, a Global-Local Distance Metric (GLDM) framework is proposed to learn cost-effective distance metrics, which reduce within-class variability and augment between-class variability, so that the simple error correction thresholds of bio-cryptosystems provide high classification accuracy. First, a large number of samples from a development dataset is used to train a global distance metric that differentiates within-class from between-class samples of the population. Then, once user-specific samples are available for enrollment, the global metric is tuned to a local user-specific one. Proof-of-concept experiments on two reference offline signature databases confirm the viability of the proposed approach. Distance metrics are produced based on concise signature representations consisting of about 20 features and a single prototype. A signature-based bio-cryptosystem designed using the produced metrics achieves average classification error rates of about 7% and 17% for the PUCPR and the GPDS-300 databases, respectively. This level of performance is comparable to that obtained with complex state-of-the-art classifiers.


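As a heavily simplified illustration of the underlying idea, not the GLDM framework itself, one can learn a diagonal Mahalanobis-style metric from enrollment samples: weighting each feature by the inverse of its within-class variance shrinks within-class distances relative to between-class distances, so a single threshold separates the two. All feature values below are invented.

```python
# Toy metric learning: per-feature weights from within-class variance.
# This is a hypothetical sketch of the general principle, not the paper's method.

def learn_diagonal_metric(enrollment_samples):
    dims = len(enrollment_samples[0])
    means = [sum(s[d] for s in enrollment_samples) / len(enrollment_samples)
             for d in range(dims)]
    variances = [sum((s[d] - means[d]) ** 2 for s in enrollment_samples)
                 / len(enrollment_samples) for d in range(dims)]
    # stable features (low within-class variance) get high weight
    weights = [1.0 / max(v, 1e-9) for v in variances]
    return means, weights

def metric_distance(x, prototype, weights):
    return sum(w * (a - b) ** 2
               for w, (a, b) in zip(weights, zip(x, prototype))) ** 0.5

# genuine samples vary a lot in feature 0 but very little in feature 1
genuine = [[10.0, 5.0], [14.0, 5.1], [12.0, 4.9], [8.0, 5.0]]
prototype, weights = learn_diagonal_metric(genuine)

d_genuine = metric_distance([11.0, 5.05], prototype, weights)
d_forgery = metric_distance([11.0, 6.0], prototype, weights)
print(d_genuine < d_forgery)  # True: the off-class sample is farther away
```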
Authors: George Ekladious, Robert Sabourin, Eric Granger. doi: 10.3390/cryptography1030022. Published: 2017-11-25.

Cryptography, Vol. 1, Pages 21: A Cryptographic System Based upon the Principles of Gene Expression
http://www.mdpi.com/2410-387X/1/3/21
Processes of gene expression, such as regulation of transcription by the general transcription complex, can be used to create hard cryptographic protocols that should not be breakable by common cipher-attack methodologies. The eukaryotic processes of gene expression permit expansion of DNA cryptography into complex networks of transcriptional and translational coding interactions. I describe a method of coding messages into genes and their regulatory sequences, transcription products, regulatory protein complexes, transcription proteins, translation proteins and other required sequences. These codes then serve as the basis for a cryptographic model based on the processes of gene expression. The protocol provides a hierarchical structure that extends from the initial coding of a message into a DNA code (ciphergene), through transcription, and ultimately translation into a protein code (cipherprotein). Security is based upon unique knowledge of the DNA coding process, all of the regulatory codes required for expression, and their interactions. This results in a set of cryptographic protocols that is capable of securing data at rest and data in motion, and of providing an evolvable form of security between two or more parties. The conclusion is that implementation of these protocols will enhance security and substantially burden cyberattackers, who would need to develop new forms of countermeasures.


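The coding pipeline can be sketched with hypothetical mappings: bytes become DNA bases, the DNA is "transcribed" to RNA (T replaced by U), and the RNA is grouped into codons. The base and codon assignments below are invented stand-ins for the paper's coding and regulatory schemes, not its actual tables.

```python
# Toy message -> DNA -> RNA -> codon pipeline; the mappings are hypothetical.
BASES = "ACGT"

def encode_dna(message: bytes) -> str:
    # each byte becomes four bases (its base-4 digits, most significant first)
    return "".join(BASES[(b >> shift) & 3]
                   for b in message for shift in (6, 4, 2, 0))

def decode_dna(dna: str) -> bytes:
    out = []
    for i in range(0, len(dna), 4):
        b = 0
        for ch in dna[i:i + 4]:
            b = (b << 2) | BASES.index(ch)
        out.append(b)
    return bytes(out)

def transcribe(dna: str) -> str:
    return dna.replace("T", "U")   # DNA -> messenger RNA

def to_codons(rna: str) -> list:
    # group into triplets, dropping any incomplete trailing codon
    return [rna[i:i + 3] for i in range(0, len(rna) - len(rna) % 3, 3)]

gene = encode_dna(b"key")          # a toy "ciphergene"
assert decode_dna(gene) == b"key"
print(to_codons(transcribe(gene)))
```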
Author: Harry Shaw. doi: 10.3390/cryptography1030021. Published: 2017-11-21.

Cryptography, Vol. 1, Pages 20: Performance Analysis of Secure and Private Billing Protocols for Smart Metering
http://www.mdpi.com/2410-387X/1/3/20
Traditional utility metering is to be replaced by smart metering. Smart metering enables fine-grained utility consumption measurements. These fine-grained measurements raise privacy concerns due to the lifestyle information that can be inferred from the precise times at which utilities were consumed. This paper outlines and compares two privacy-respecting time-of-use billing protocols for smart metering and investigates their performance on a variety of hardware. These protocols protect the privacy of customers by never transmitting the fine-grained utility readings outside of the customer’s home network. One protocol favors complexity on the trusted smart meter hardware, while the other uses homomorphic commitments to offload computation to a third device. Both protocols are designed to operate on top of the existing cryptographic secure channel protocols in place on smart meters. Proof-of-concept software implementations of these protocols have been written, and their suitability for real-world application to low-performance smart meter hardware is discussed. These protocols may also have application to other privacy-conscious aggregation systems, such as electronic voting.


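Homomorphic commitments of the kind used to offload computation can be illustrated with a toy Pedersen-style commitment: multiplying two commitments yields a commitment to the sum of the committed readings, so a third device can total a bill without ever seeing an individual reading. The tiny group parameters below are illustrative and insecure (and in a toy like this the discrete log of h to base g is knowable, which a real setup must prevent).

```python
# Toy Pedersen-style commitment over the order-q subgroup of Z_p*, p = 2q + 1.
import random

q = 1019                  # prime subgroup order
p = 2 * q + 1             # 2039, also prime (a safe prime)
g = 4                     # squares mod p generate the order-q subgroup
h = 9

def commit(m, r):
    return (pow(g, m, p) * pow(h, r, p)) % p

r1, r2 = random.randrange(q), random.randrange(q)
reading_1, reading_2 = 30, 12      # e.g. consumption in two billing slots

c1, c2 = commit(reading_1, r1), commit(reading_2, r2)
# homomorphism: c1 * c2 is a commitment to the sum under randomness r1 + r2
assert (c1 * c2) % p == commit(reading_1 + reading_2, (r1 + r2) % q)
print("aggregate commitment verified")
```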
Authors: Tom Eccles, Basel Halak. doi: 10.3390/cryptography1030020. Published: 2017-11-17.

Cryptography, Vol. 1, Pages 19: Practical Architectures for Deployment of Searchable Encryption in a Cloud Environment
http://www.mdpi.com/2410-387X/1/3/19
Public cloud service providers offer an infrastructure that gives businesses and individuals access to computing power and storage space on a pay-as-you-go basis. This allows these entities to bypass the usual costs associated with having their own data centre, such as hardware, construction, air conditioning and security costs, making this a cost-effective solution for data storage. If the data being stored is of a sensitive nature, encrypting it prior to outsourcing it to a public cloud is a good method of ensuring its confidentiality. With the data encrypted, however, searching over it becomes unfeasible. In this paper, we examine different architectures for supporting search over encrypted data and discuss some of the challenges that need to be overcome if these techniques are to be engineered into practical systems.


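One classic architecture for search over encrypted data can be sketched as an encrypted inverted index: the client keys the index by a deterministic HMAC of each keyword and later sends that HMAC as a search token. This is a simplified illustration of the general approach, not a scheme from the paper; practical designs also encrypt the document identifiers and address access-pattern leakage.

```python
# Minimal searchable-index sketch: the server stores tokens, never keywords.
import hmac
import hashlib

KEY = b"client-secret-key"   # hypothetical key held only by the data owner

def token(keyword: str) -> bytes:
    return hmac.new(KEY, keyword.encode(), hashlib.sha256).digest()

def build_index(docs: dict) -> dict:
    index = {}
    for doc_id, text in docs.items():
        for w in set(text.split()):
            index.setdefault(token(w), []).append(doc_id)
    return index

def search(index: dict, keyword: str) -> list:
    # the cloud only ever sees the token, not the keyword itself
    return index.get(token(keyword), [])

index = build_index({
    "doc1": "cloud storage encryption",
    "doc2": "cloud billing",
})
print(sorted(search(index, "cloud")))   # ['doc1', 'doc2']
print(search(index, "encryption"))      # ['doc1']
```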
Authors: Sarah Renwick, Keith Martin. doi: 10.3390/cryptography1030019. Published: 2017-11-15.

Cryptography, Vol. 1, Pages 18: Synchronization in Quantum Key Distribution Systems
http://www.mdpi.com/2410-387X/1/3/18
In descriptions of quantum key distribution systems, much attention is paid to the operation of quantum cryptography protocols, while the synchronization process of quantum key distribution systems remains insufficiently studied. This paper contains a general description of quantum cryptography principles. A two-line fiber-optic quantum key distribution system with phase coding of photon states, operating in transceiver and coding station synchronization mode, was examined. The quantum key distribution system was built on the basis of a scheme with automatic compensation of polarization mode distortions. Single-photon avalanche diodes were used as optical radiation detectors. It was estimated how the parameters of the optical detectors used in quantum key distribution systems affect the detection of the time frame containing the attenuated optical pulse in synchronization mode, with respect to its probabilistic and time-domain characteristics. A design method was given for the process that detects the time frame that includes an optical pulse during synchronization. The paper describes the main quantum communication channel attack methods based on removing a portion of the optical emission. It also describes the developed synchronization algorithm, which takes into account the time required to restore the photodetector’s operational state after a photon has been registered during synchronization. The computer simulation results for the developed synchronization algorithm were analyzed, and the efficiency of the algorithm in protecting the synchronization process from unauthorized gathering of optical emission is demonstrated.


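The frame-detection step can be sketched as picking the time bin whose accumulated photocount stands out from the dark-count background over repeated passes. All rates, bin counts and pass counts below are invented for illustration and are not the parameters analyzed in the paper.

```python
# Toy synchronization frame detection: accumulate per-bin detector clicks
# over several passes and select the bin with the maximum total count.
import math
import random

random.seed(7)
N_BINS = 200
PULSE_BIN = 137            # position of the attenuated pulse (unknown to receiver)
DARK_RATE = 0.2            # mean dark counts per bin per pass (invented)
PULSE_RATE = 2.5           # mean photocounts in the pulse bin per pass (invented)

def poisson(lam):
    # simple Poisson sampler (Knuth's method) -- adequate for small lambda
    threshold, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

def measure_pass():
    counts = [poisson(DARK_RATE) for _ in range(N_BINS)]
    counts[PULSE_BIN] += poisson(PULSE_RATE)
    return counts

totals = [0] * N_BINS
for _ in range(20):                 # accumulate 20 synchronization passes
    for i, c in enumerate(measure_pass()):
        totals[i] += c
detected = max(range(N_BINS), key=totals.__getitem__)
print(detected)  # 137
```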
Authors: Anton Pljonkin, Konstantin Rumyantsev, Pradeep Singh. doi: 10.3390/cryptography1030018. Published: 2017-10-31.

Cryptography, Vol. 1, Pages 17: Leveraging Distributions in Physical Unclonable Functions
http://www.mdpi.com/2410-387X/1/3/17
A special class of Physical Unclonable Functions (PUFs) referred to as strong PUFs can be used in novel hardware-based authentication protocols. Strong PUFs are required for authentication because the bit strings and helper data are transmitted openly by the token to the verifier, and therefore are revealed to the adversary. This enables the adversary to carry out attacks against the token by systematically applying challenges and obtaining responses in an attempt to machine learn, and later predict, the token’s response to an arbitrary challenge. Therefore, strong PUFs must both provide an exponentially large challenge space and be resistant to machine-learning attacks in order to be considered secure. We investigate a transformation called temperature–voltage compensation (TVCOMP), which is used within the Hardware-Embedded Delay PUF (HELP) bit string generation algorithm. TVCOMP increases the diversity and unpredictability of the challenge–response space, and therefore increases resistance to model-building attacks. HELP leverages within-die variations in path delays as a source of random information. TVCOMP is a linear transformation designed specifically for dealing with changes in delay introduced by adverse temperature–voltage (environmental) variations. In this paper, we show that TVCOMP also increases entropy and expands the challenge–response space dramatically.


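The kind of linear compensation described above can be sketched as follows: delays measured at an environmental corner are standardized by that corner's own mean and range, then rescaled to reference values, so the distributions seen by bitstring generation are comparable across corners. The reference constants and delay values below are made-up placeholders, not HELP's actual parameters.

```python
# TVCOMP-style linear compensation: standardize by the measured mean/range,
# then rescale to a reference mean/range.
def tvcomp(delays, ref_mean=50.0, ref_range=20.0):
    mean = sum(delays) / len(delays)
    span = max(delays) - min(delays)
    return [((d - mean) / span) * ref_range + ref_mean for d in delays]

# the same paths measured at two corners: a shifted-and-scaled version
nominal = [48.0, 50.0, 53.0, 55.0, 60.0]
hot_low_voltage = [d * 1.3 + 7.0 for d in nominal]   # toy environmental shift

a = tvcomp(nominal)
b = tvcomp(hot_low_voltage)
# after compensation the two corners yield numerically identical values,
# so relative path comparisons -- and hence the generated bits -- agree
print(all(abs(x - y) < 1e-9 for x, y in zip(a, b)))  # True
```

Because any affine shift d' = k*d + c cancels in (d' - mean')/range', the compensated values are invariant to this class of environmental variation, which is exactly why a threshold-based bit generation step stays stable.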
Authors: Wenjie Che, Venkata Kajuluri, Fareena Saqib, Jim Plusquellic. doi: 10.3390/cryptography1030017. Published: 2017-10-30.

Cryptography, Vol. 1, Pages 16: A Text-Independent Speaker Authentication System for Mobile Devices
http://www.mdpi.com/2410-387X/1/3/16
This paper presents a text-independent speaker authentication method adapted to mobile devices. Special attention was placed on delivering a fully operational application that achieves a sufficient level of reliability and efficient operation. To this end, we have excluded the need for any network communication: both the training and the identification processes are completed directly on the mobile device, through the extraction of linear prediction cepstral coefficients and the use of the naive Bayes algorithm as the classifier. Furthermore, the authentication decision is enhanced to overcome misidentification through access privileges that the user should attribute to each application beforehand. To evaluate the proposed authentication system, eleven participants were involved in an experiment conducted in quiet and noisy environments. Public speech corpora were also employed to compare this implementation to existing methods. Results showed efficient consumption of mobile resources, and the overall classification performance was accurate with a small number of samples. Our authentication system might therefore be used as a first security layer, as part of a multilayer authentication scheme, or as a fall-back mechanism.


A Text-Independent Speaker Authentication System for Mobile Devices
Florentin Thullier, Bruno Bouchard, Bob-Antoine Menelas. doi: 10.3390/cryptography1030016. Cryptography, Vol. 1, Issue 3, Article 16, 2017-09-22. http://www.mdpi.com/2410-387X/1/3/16

Cryptography, Vol. 1, Pages 15: Beyond Bitcoin: A Critical Look at Blockchain-Based Systems
http://www.mdpi.com/2410-387X/1/2/15
More than eight years after the launch of Bitcoin, the decentralized transaction-ledger functionality implemented through blockchain technology is being used not only for cryptocurrencies, but also to register, confirm, and transfer any kind of contract and property. In this work, we analyze the most relevant functionalities and known issues of this technology, with the intent of pointing out behaviours that are not as efficient and reliable as they should be when viewed from a broader perspective.


Beyond Bitcoin: A Critical Look at Blockchain-Based Systems
Diego Romano, Giovanni Schmid. doi: 10.3390/cryptography1020015. Cryptography, Vol. 1, Issue 2, Article 15, 2017-09-01. http://www.mdpi.com/2410-387X/1/2/15

Cryptography, Vol. 1, Pages 14: Recursive Cheating Strategies for the Relativistic FQ Bit Commitment Protocol
http://www.mdpi.com/2410-387X/1/2/14
In this paper, we study relativistic bit commitment, which uses timing and location constraints to achieve information-theoretic security. Using those constraints, we consider a relativistic bit commitment scheme introduced by Lunghi et al. This protocol was shown to be secure against classical adversaries as long as the number of rounds performed in the protocol is not too large. In this work, we study classical attacks on this scheme. We use the correspondence between this protocol and the CHSHQ game (a variant of the CHSH game) to derive cheating strategies for this protocol. Our attack matches the existing security bound for some range of parameters and shows that the scaling of the security in the number of rounds is essentially optimal.


Recursive Cheating Strategies for the Relativistic FQ Bit Commitment Protocol
Rémi Bricout, André Chailloux. doi: 10.3390/cryptography1020014. Cryptography, Vol. 1, Issue 2, Article 14, 2017-08-24. http://www.mdpi.com/2410-387X/1/2/14

Cryptography, Vol. 1, Pages 13: Transparent, Auditable, and Stepwise Verifiable Online E-Voting Enabling an Open and Fair Election
http://www.mdpi.com/2410-387X/1/2/13
Many e-voting techniques have been proposed but are not widely used in practice. One of the problems associated with most existing e-voting techniques is a lack of transparency, leading to a failure to deliver voter assurance. In this work, we propose a verifiable, viewable, and mutually restraining e-voting protocol that exploits existing multi-party political dynamics, such as those in the US. The new e-voting protocol consists of three original technical contributions (a universally verifiable voting vector, forward and backward mutual-lock voting, and in-process check and enforcement) that, along with a public real-time bulletin board, resolve the apparent conflicts in voting, such as anonymity vs. accountability and privacy vs. verifiability. In particular, trust is split equally among tallying authorities who have conflicting interests and will technically restrain each other. The voting and tallying processes are transparent and viewable to anyone, which allows any voter to visually verify that his or her vote is indeed counted and allows any third party to audit the tally, thus enabling an open and fair election. Depending on the voting environment, our interactive protocol is suitable for small groups where interaction is encouraged, while the non-interactive protocol allows large groups to vote without interaction.
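As a toy illustration of the voting-vector idea only (the paper's full protocol adds encryption, mutual-lock voting, and in-process checks on top of this), each published ballot can be viewed as a vector with a single 1 in the chosen candidate's slot, so anyone can recompute the tally from the public vectors:

```python
# Toy position-based voting vectors: a ballot is a 0/1 vector with exactly
# one 1 in the slot of the chosen candidate.
def ballot(choice, num_candidates):
    v = [0] * num_candidates
    v[choice] = 1
    return v

# Anyone can audit the result by re-summing the published ballots
# component-wise: slot i of the tally is candidate i's vote count.
def tally(ballots):
    return [sum(col) for col in zip(*ballots)]

ballots = [ballot(0, 3), ballot(2, 3), ballot(0, 3)]
print(tally(ballots))  # [2, 0, 1]
```

Publishing raw ballots like this would of course destroy anonymity, which is exactly the gap the paper's cryptographic machinery is designed to close.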


Transparent, Auditable, and Stepwise Verifiable Online E-Voting Enabling an Open and Fair Election
Xukai Zou, Huian Li, Feng Li, Wei Peng, Yan Sui. doi: 10.3390/cryptography1020013. Cryptography, Vol. 1, Issue 2, Article 13, 2017-08-17. http://www.mdpi.com/2410-387X/1/2/13

Cryptography, Vol. 1, Pages 12: Multiparty Delegated Quantum Computing
http://www.mdpi.com/2410-387X/1/2/12
Quantum computing has seen tremendous progress in the past few years. However, due to limitations in the scalability of quantum technologies, it seems that we are far from constructing universal quantum computers for everyday users. A more feasible solution is the delegation of computation to powerful quantum servers on the network. This solution was proposed in previous studies of blind quantum computation, with guarantees for the secrecy of both the input and the computation being performed. In this work, we further develop this idea of computing over encrypted data to propose a multiparty delegated quantum computing protocol in the measurement-based quantum computing framework. We prove the security of the protocol against a dishonest server and against dishonest clients, under the assumption of common classical cryptographic constructions.


Multiparty Delegated Quantum Computing
Elham Kashefi, Anna Pappa. doi: 10.3390/cryptography1020012. Cryptography, Vol. 1, Issue 2, Article 12, 2017-07-30. http://www.mdpi.com/2410-387X/1/2/12

Cryptography, Vol. 1, Pages 11: Simple, Near-Optimal Quantum Protocols for Die-Rolling
http://www.mdpi.com/2410-387X/1/2/11
Die-rolling is the cryptographic task where two mistrustful, remote parties wish to generate a random D-sided die-roll over a communication channel. Optimal quantum protocols for this task were given by Aharon and Silman (New Journal of Physics, 2010), but they are based on optimal weak coin-flipping protocols that are currently very complicated and not well understood. In this paper, we first present very simple classical protocols for die-rolling that have decent (and sometimes optimal) security, which is in stark contrast to coin-flipping, bit-commitment, oblivious transfer, and many other two-party cryptographic primitives. We also present quantum protocols based on the idea of integer-commitment, a generalization of bit-commitment where one wishes to commit to an integer. We analyze these protocols using semidefinite programming and finally give protocols that are very close to Kitaev's lower bound for any D ≥ 3. Lastly, we briefly discuss an application of this work to the quantum state discrimination problem.
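For intuition about the task, here is a standard classical commit-then-reveal die-rolling sketch (not one of the protocols analyzed in the paper; a hash commitment like this is only computationally hiding and binding, which is exactly why information-theoretic treatments are interesting):

```python
import hashlib
import secrets

D = 6  # number of die faces

def commit(value):
    # Commit to value in 0..D-1 by hashing it with a random nonce.
    nonce = secrets.token_bytes(16)
    digest = hashlib.sha256(nonce + bytes([value])).hexdigest()
    return digest, nonce

def verify(digest, nonce, value):
    return hashlib.sha256(nonce + bytes([value])).hexdigest() == digest

# Each party picks a random contribution and commits before seeing
# the other's value; commitments are exchanged first, then openings.
a, b = secrets.randbelow(D), secrets.randbelow(D)
ca, na = commit(a)
cb, nb = commit(b)
assert verify(ca, na, a) and verify(cb, nb, b)
roll = (a + b) % D + 1
print(roll)  # uniform in 1..6 as long as at least one party is honest
```

The sum mod D is uniform if either contribution is uniform, so neither party can bias the outcome without breaking the commitment.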


Simple, Near-Optimal Quantum Protocols for Die-Rolling
Jamie Sikora. doi: 10.3390/cryptography1020011. Cryptography, Vol. 1, Issue 2, Article 11, 2017-07-08. http://www.mdpi.com/2410-387X/1/2/11

Cryptography, Vol. 1, Pages 10: Password-Hashing Status
http://www.mdpi.com/2410-387X/1/2/10
Computers are used in our everyday activities, with high volumes of users accessing the provided services. One-factor authentication consisting of a username and a password is the common choice for authenticating users on the web. However, poor password-management practices are exploited by attackers who disclose users' credentials, harming both users and vendors. On most of these occasions, the user data were stored in the clear or had merely been processed by a cryptographic hash function. Password-hashing techniques are applied to fortify this user-related information. The standardized primitive is currently PBKDF2, while other widely used schemes include Bcrypt and Scrypt. The evolution of parallel computing enables several password-hash cracking attacks. The international cryptographic community conducted the Password Hashing Competition (PHC) to identify new, efficient, and more secure password-hashing schemes suitable for widespread adoption. PHC advanced our knowledge of password hashing; further analysis efforts revealed security weaknesses, and novel schemes were designed afterwards. This paper provides a review of password-hashing schemes up to the first quarter of 2017, together with a performance evaluation on a common setting in terms of code size, memory consumption, and execution time.
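The standardized primitive mentioned, PBKDF2, is available directly in Python's standard library; a minimal salted-hash sketch (the iteration count here is an illustrative choice, not a recommendation from the paper):

```python
import hashlib
import hmac
import os

def hash_password(password, iterations=200_000):
    # Fresh random salt per password defeats precomputed (rainbow) tables;
    # the iteration count slows down brute-force guessing.
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, dk

def check_password(password, salt, iterations, dk):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # Constant-time comparison avoids leaking how many bytes matched.
    return hmac.compare_digest(candidate, dk)

salt, iters, dk = hash_password("correct horse battery staple")
print(check_password("correct horse battery staple", salt, iters, dk))  # True
print(check_password("wrong guess", salt, iters, dk))                   # False
```

Memory-hard schemes such as Scrypt and the PHC winners additionally raise the memory cost of each guess, which is the dimension PBKDF2 lacks.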


Password-Hashing Status
George Hatzivasilis. doi: 10.3390/cryptography1020010. Cryptography, Vol. 1, Issue 2, Article 10, 2017-06-27. http://www.mdpi.com/2410-387X/1/2/10

Cryptography, Vol. 1, Pages 9: Cryptanalysis and Improvement of ECC Based Authentication and Key Exchanging Protocols
http://www.mdpi.com/2410-387X/1/1/9
Elliptic curve cryptography (ECC) is extensively used in various multifactor authentication protocols. In this work, various recent ECC-based authentication and key exchange protocols are subjected to threat modeling and static analysis to detect vulnerabilities and to harden them against threats. This work demonstrates how currently used ECC-based protocols are vulnerable to attacks. If protocols are vulnerable, the damage could include critical data loss and elevated privacy concerns. The protocols considered in this work differ in their use of security factors (e.g., passwords, PINs, and biometrics), encryption, and timestamps. The threat model considers various kinds of attacks, including denial of service, man-in-the-middle, weak authentication, and SQL injection. Countermeasures to reduce or prevent such attacks are suggested. Beyond the cryptanalysis of current schemes and the proposal of new schemes, the proposed adversary model and criteria provide a benchmark for the systematic evaluation of future two-factor authentication proposals.
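As a self-contained illustration of the kind of primitive these protocols build on, here is Diffie–Hellman key agreement over a textbook toy curve, y² = x³ + 2x + 2 over F₁₇ (real protocols use standardized curves and constant-time implementations; this is purely pedagogical):

```python
# Toy elliptic-curve Diffie-Hellman. Points are (x, y) tuples; None is the
# point at infinity. Curve: y^2 = x^3 + A*x + B over F_P, with base point G.
P, A, B = 17, 2, 2
G = (5, 1)  # generates the full group of 19 points on this curve

def add(p1, p2):
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                      # inverse points sum to infinity
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return x3, (m * (x1 - x3) - y1) % P

def mul(k, pt):
    # Double-and-add scalar multiplication.
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

a_priv, b_priv = 3, 7
a_pub, b_pub = mul(a_priv, G), mul(b_priv, G)
print(mul(a_priv, b_pub) == mul(b_priv, a_pub))  # True: same shared point
```

The shared point agrees because a·(b·G) = b·(a·G); an eavesdropper seeing only the public points faces the elliptic-curve discrete logarithm problem.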


Cryptanalysis and Improvement of ECC Based Authentication and Key Exchanging Protocols
Swapnoneel Roy, Chanchal Khatwani. doi: 10.3390/cryptography1010009. Cryptography, Vol. 1, Issue 1, Article 9, 2017-06-13. http://www.mdpi.com/2410-387X/1/1/9

Cryptography, Vol. 1, Pages 8: Analysis of Entropy in a Hardware-Embedded Delay PUF
http://www.mdpi.com/2410-387X/1/1/8
The magnitude of the information content associated with a particular implementation of a Physical Unclonable Function (PUF) is critically important for security and trust in emerging Internet of Things (IoT) applications. Authentication, in particular, requires the PUF to produce a very large number of challenge-response pairs (CRPs) and, of even greater importance, requires the PUF to be resistant to adversarial attacks that attempt to model and clone the PUF (model-building attacks). Entropy is critically important to the model-building resistance of the PUF. A variety of metrics have been proposed for reporting entropy, each measuring the randomness of information embedded within PUF-generated bitstrings. In this paper, we report the Entropy, MinEntropy, conditional MinEntropy, interchip Hamming distance, and National Institute of Standards and Technology (NIST) statistical test results for bitstrings generated by a Hardware-Embedded Delay PUF called HELP. The bitstrings are generated from data collected in hardware experiments on 500 copies of HELP implemented on a set of Xilinx Zynq 7020 SoC Field Programmable Gate Arrays (FPGAs) subjected to industrial-level temperature and voltage conditions. Special test cases are constructed which purposely create worst-case correlations for bitstring generation. Our results show that the processes proposed within HELP to generate bitstrings add significantly to their entropy, and that classical re-use of PUF components, e.g., path delays, does not result in the large entropy losses commonly reported for other PUF architectures.
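One of the reported metrics, MinEntropy, is simply -log₂ of the probability of the most likely symbol; for a bitstring it can be computed in a few lines:

```python
import math
from collections import Counter

def min_entropy_per_symbol(bits):
    # MinEntropy H_min = -log2(p_max), where p_max is the empirical
    # probability of the most frequent symbol. It lower-bounds the
    # Shannon entropy and reflects the best single-guess attack.
    counts = Counter(bits)
    p_max = max(counts.values()) / len(bits)
    return -math.log2(p_max)

print(min_entropy_per_symbol("0101010101"))  # 1.0 bit: balanced bitstring
print(min_entropy_per_symbol("0001000100"))  # ~0.32 bits: zeros dominate
```

A biased bitstring thus scores far below one bit per symbol even though a naive frequency count might look "mostly random".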


Analysis of Entropy in a Hardware-Embedded Delay PUF
Wenjie Che, Venkata Kajuluri, Mitchell Martin, Fareena Saqib, Jim Plusquellic. doi: 10.3390/cryptography1010008. Cryptography, Vol. 1, Issue 1, Article 8, 2017-06-07. http://www.mdpi.com/2410-387X/1/1/8

Cryptography, Vol. 1, Pages 7: Maximum-Order Complexity and Correlation Measures
http://www.mdpi.com/2410-387X/1/1/7
We estimate the maximum-order complexity of a binary sequence in terms of its correlation measures. Roughly speaking, we show that any sequence with small correlation measure up to a sufficiently large order k cannot have very small maximum-order complexity.
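The correlation measure of order k can be computed by brute force for short sequences; the sketch below assumes the standard Mauduit–Sárközy-style definition (maximum, over lag sets and window lengths, of the absolute correlation sum for a ±1 sequence):

```python
from itertools import combinations

def prod_at(e, i, ds):
    # Product e[i+d1] * ... * e[i+dk] for the lag set ds.
    p = 1
    for d in ds:
        p *= e[i + d]
    return p

def correlation_measure(e, k):
    # C_k(e) = max over lags 0 <= d1 < ... < dk and window lengths M
    # (with M + dk <= N) of |sum_{i<M} e[i+d1]*...*e[i+dk]|.
    # Brute force: feasible only for small N, but it shows the definition.
    n = len(e)
    best = 0
    for ds in combinations(range(n), k):
        limit = n - ds[-1]               # largest admissible window length
        for m in range(1, limit + 1):
            s = sum(prod_at(e, i, ds) for i in range(m))
            best = max(best, abs(s))
    return best

seq = [1, -1, 1, 1, -1, -1, 1, -1]       # a short +/-1 sequence
print(correlation_measure(seq, 2))
```

A sequence with small C_k for all small k looks "locally random", and the paper shows this forces its maximum-order complexity to be non-trivially large.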


Maximum-Order Complexity and Correlation Measures
Leyla Işık, Arne Winterhof. doi: 10.3390/cryptography1010007. Cryptography, Vol. 1, Issue 1, Article 7, 2017-05-13. http://www.mdpi.com/2410-387X/1/1/7

Cryptography, Vol. 1, Pages 6: Garbled Quantum Computation
http://www.mdpi.com/2410-387X/1/1/6
The universal blind quantum computation (UBQC) protocol enables an almost-classical client to delegate a quantum computation to an untrusted quantum server (in the form of a garbled quantum circuit) while the security for the client is unconditional. In this contribution, we explore the possibility of extending verifiable UBQC to achieve further functionalities, following the analogous research for classical circuits (Yao 1986). First, exploiting the asymmetric nature of UBQC (the client prepares only single qubits, while the server runs the entire quantum computation), we present a "Yao"-type protocol for secure two-party quantum computation. As in the classical setting, our quantum Yao protocol is secure against a specious (quantum honest-but-curious) garbler, but in our case against a (fully) malicious evaluator. Unlike previous work on quantum two-party computation by Dupuis et al. (2010), we do not require any online quantum communication between the garbler and the evaluator, and thus no extra cryptographic primitive. This feature allows us to construct a simple universal one-time compiler for any quantum computation using one-time memory, similarly to the classical work of Goldwasser et al. (2008), while being more efficient than the previous work of Broadbent et al. (2013).


Privacy in a Digital, Networked World: Technologies, Implications and Solutions. By Sherali Zeadally and Mohamad Badra. Springer International Publishing: 418 pp.; $51.89; ISBN-10: 3319084690, ISBN-13: 978-3319084695
Nicolas Sklavos. doi: 10.3390/cryptography1010005. Cryptography, Vol. 1, Issue 1, Book Review 5, 2017-03-19. http://www.mdpi.com/2410-387X/1/1/5

Cryptography, Vol. 1, Pages 4: Cryptography in Wireless Multimedia Sensor Networks: A Survey and Research Directions
http://www.mdpi.com/2410-387X/1/1/4
Wireless multimedia sensor networks will play a central role in the Internet of Things, providing content-rich information for countless monitoring and control scenarios. As more applications rely on multimedia data, security concerns gain attention, and new approaches arise to provide security for such networks. However, the usual resource constraints on the processing, memory, and energy of multimedia-based sensors have brought distinct challenges for data encryption, which have driven the development of different security approaches. In this context, this article presents the state of the art of cryptography in wireless multimedia sensor networks, surveying innovative works in this area and discussing promising research directions.


Cryptography in Wireless Multimedia Sensor Networks: A Survey and Research Directions
Daniel Costa, Solenir Figuerêdo, Gledson Oliveira. doi: 10.3390/cryptography1010004. Cryptography, Vol. 1, Issue 1, Review 4, 2017-01-05. http://www.mdpi.com/2410-387X/1/1/4

Cryptography, Vol. 1, Pages 3: A Privacy-Preserving, Mutual PUF-Based Authentication Protocol
http://www.mdpi.com/2410-387X/1/1/3
This paper describes an authentication protocol using a Hardware-Embedded Delay PUF called HELP. HELP derives randomness from within-die path-delay variations that occur along the paths within a hardware implementation of a cryptographic primitive, such as AES or SHA-3. The digitized timing values that represent the path delays are stored in a database on a secure server (verifier) as an alternative to storing PUF response bitstrings. This enables the development of an efficient authentication protocol that provides both privacy and mutual authentication. The security properties of the protocol are analyzed using data collected from a set of Xilinx Zynq FPGAs.


A Privacy-Preserving, Mutual PUF-Based Authentication Protocol
Wenjie Che, Mitchell Martin, Goutham Pocklassery, Venkata Kajuluri, Fareena Saqib, Jim Plusquellic. doi: 10.3390/cryptography1010003. Cryptography, Vol. 1, Issue 1, Article 3, 2016-11-25. http://www.mdpi.com/2410-387X/1/1/3

Cryptography, Vol. 1, Pages 2: Balanced Permutations Even–Mansour Ciphers
http://www.mdpi.com/2410-387X/1/1/2
The r-round Even–Mansour block cipher is a generalization of the well-known Even–Mansour block cipher to r iterations. Attacks on this construction were described by Nikolić et al. and Dinur et al. for r = 2, 3. These attacks are only marginally better than brute force but are based on an interesting observation (due to Nikolić et al.): for a "typical" permutation P, the distribution of P(x) ⊕ x is not uniform. This naturally raises the following question. Call a permutation for which the distribution of P(x) ⊕ x is uniform "balanced": is there a sufficiently large family of balanced permutations, and what is the security of the resulting Even–Mansour block cipher? We show how to generate families of balanced permutations from the Luby–Rackoff construction and use them to define a 2n-bit block cipher from the 2-round Even–Mansour scheme. We prove that this cipher is indistinguishable from a random permutation of {0,1}^(2n) for any adversary who has oracle access to the public permutations and to an encryption/decryption oracle, as long as the number of queries is o(2^(n/2)). As a practical example, we discuss the properties and the performance of a 256-bit block cipher that is based on our construction and uses the Advanced Encryption Standard (AES), with a fixed key, as the public permutation.
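The balance property is easy to check exhaustively on a tiny domain: P(x) ⊕ x is constant for the identity permutation, but hits every value exactly once for a balanced permutation, such as multiplication by a fixed element other than 1 in GF(4):

```python
from collections import Counter

def xor_profile(perm):
    # Multiset of P(x) XOR x over the whole domain; perm[x] is P(x).
    return Counter(p ^ x for x, p in enumerate(perm))

# P(x) = x is a permutation, but P(x) XOR x is constant:
# all the mass sits on a single value (maximally unbalanced).
print(xor_profile([0, 1, 2, 3]))               # Counter({0: 4})

# Multiplication by alpha in GF(4) (elements encoded 0,1,2,3) makes
# P(x) XOR x = (alpha XOR 1) * x, a bijection: a balanced permutation
# in the paper's sense, every XOR value occurring exactly once.
balanced = [0, 2, 3, 1]                        # x -> alpha * x in GF(4)
print(sorted(xor_profile(balanced).values()))  # [1, 1, 1, 1]
```

The attacks of Nikolić et al. exploit exactly the skew visible in the first profile, which motivates constructing large families whose profile looks like the second.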


Balanced Permutations Even–Mansour Ciphers
Shoni Gilboa, Shay Gueron, Mridul Nandi. doi: 10.3390/cryptography1010002. Cryptography, Vol. 1, Issue 1, Article 2, 2016-04-01. http://www.mdpi.com/2410-387X/1/1/2

Cryptography, Vol. 1, Pages 1: Cryptography: A New Open Access Journal
http://www.mdpi.com/2410-387X/1/1/1
Cryptography has a very long history, from ancient ciphers, such as the Caesar cipher, through machine (or rotor) ciphers during WWI and WWII, to modern ciphers, which play a fundamental role in providing confidentiality, integrity, and authentication services during the transmission, processing, and storage of sensitive data over open or public networks. [...]
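The Caesar cipher mentioned here, a fixed shift of the alphabet, takes only a few lines:

```python
def caesar(text, shift):
    # Shift each letter by `shift` positions, wrapping around the
    # alphabet; non-letters pass through unchanged.
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

msg = "Attack at dawn"
enc = caesar(msg, 3)
print(enc)              # Dwwdfn dw gdzq
print(caesar(enc, -3))  # decrypting is shifting back
```

With only 25 non-trivial keys it falls to trivial exhaustive search, which is precisely the gap between such historical ciphers and the modern primitives this journal covers.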


Cryptography: A New Open Access Journal
Kwangjo Kim. doi: 10.3390/cryptography1010001. Cryptography, Vol. 1, Issue 1, Editorial 1, 2016-02-15. http://www.mdpi.com/2410-387X/1/1/1