ideas about modeling and testing of communication protocols


Introduction

Simultaneously, the ICAO released updated and new versions of its test specifications for eMRTDs and corresponding inspection systems. Since the Technical Advisory Group (TAG) of ICAO has endorsed the update on the ICAO website, these test specifications can now be referenced officially and can also be used for certification. Additionally, there is a new test specification Part 5 available, focusing on PKI objects like certificates and revocation lists.

ISO7816_R_07: Positive test case to verify the behavior of a PACE-protected eMRTD in response to the INTERNAL AUTHENTICATE

Test suite ISO7816_T: New test cases for Chip Authentication (these tests were moved from BSI TR-03105 Part 3.2 with minor corrections, and the robustness tests with invalid class bytes were removed)

LDS_E_01: Test the LDS tag of the DG14 object

LDS_E_02: Test the ASN.1 encoding of the SecurityInfos for Chip Authentication

LDS_E_03: Test the ASN.1 encoding of the ChipAuthenticationPublicKeyInfo

LDS_E_04: Test the ASN.1 encoding of the ChipAuthenticationInfo

LDS_K_05: Test the ASN.1 encoding of a PKCS#7 signedData object

LDS_K_06: Test the value that is encoded into the signedData element

LDS_K_07: Test the SignerInfo element of the signedData structure

LDS_K_08: Test the signing certificate used to verify the EF.CardSecurity object
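The LDS tests above mostly inspect BER-TLV and ASN.1 structures. As a rough illustration of what a check like LDS_E_01 looks at, here is a minimal sketch (not the official test logic) that parses the outermost TLV object of an EF.DG14 dump and verifies its tag is '6E':

```python
# Hedged sketch of the kind of structure LDS_E_01 inspects: EF.DG14 is a
# BER-TLV object whose outermost tag must be '6E'. This toy parser only
# handles the tag and length forms needed for the illustration.

def parse_tlv(data: bytes):
    """Return (tag, value) of the outermost BER-TLV object."""
    tag, i = data[0], 1
    if tag & 0x1F == 0x1F:                      # two-byte tag
        tag = (tag << 8) | data[i]; i += 1
    length = data[i]; i += 1
    if length & 0x80:                           # long form: next N bytes hold the length
        n = length & 0x7F
        length = int.from_bytes(data[i:i + n], "big"); i += n
    return tag, data[i:i + length]

# Minimal fake DG14: tag 6E, length 3, dummy content (not real SecurityInfos)
dg14 = bytes.fromhex("6E03010203")
tag, value = parse_tlv(dg14)
assert tag == 0x6E and value == bytes([1, 2, 3])
```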

Modified test cases in Version 2.11

The ICAO test specification part 3 contains the following modified test cases:

ISO7816_A_2: SELECT Application command with CLA byte ‘8F’ is removed

ISO7816_C_15: Test case removed (invalid CLA byte)

ISO7816_C_19: Test case removed (invalid CLA byte)

ISO7816_D_2: Test case removed (invalid CLA byte)

ISO7816_E_2: Test case removed (invalid CLA byte)

ISO7816_F_2: Test case removed (invalid CLA byte)

ISO7816_G_2: Test case removed (invalid CLA byte)

ISO7816_P_04: Test case removed (invalid CLA byte)

ISO7816_P_18: Test case removed

ISO7816_P_37: Test case removed (invalid CLA byte)

ISO7816_P_38: Test case removed (invalid CLA byte)

ISO7816_P_39: Test case removed (invalid CLA byte)

ISO7816_P_40: Test case removed (invalid CLA byte)

ISO7816_P_66: Hint that in test scenario step 1 the length encoding of DO '80' must be correct

ISO7816_P_67: Hint that in test scenario step 1 the length encoding of DO '80' must be correct

ISO7816_R_01: Generic length encoding of APDU

ISO7816_R_02: Generic use of DO ’97’ and length encoding of APDU

ISO7816_R_03: Hint that the INTERNAL AUTHENTICATE command must use Secure Messaging if an access control mechanism is supported

ISO7816_R_04: Hint that the INTERNAL AUTHENTICATE command must use Secure Messaging if an access control mechanism is supported

ISO7816_S_03: Padding indicator '01' in DO '85' is removed (odd INS)

ISO7816_S_04: Padding indicator '01' in DO '85' is removed (odd INS)

LDS_A_03: LDS version 1.8 is now also accepted in EF.COM

LDS_A_05: Purpose of test case is corrected

LDS_C_04: Check that the number of Biometric Information Group Templates (BIGT) is one

LDS_D_04: Check also the LDS security object version in test scenario step 4; a future version of Doc 9303 Part 10 requires that in LDS v1.8 the signedData certificates field SHALL include the Document Signer Certificate (CDS)

LDS_D_06: Check that the LDS Version Info element does not exist in test scenario step 8, and handle LDS version 1.8 in this test case as well

LDS_D_07: Check that the validity period of the signing certificate MUST be within the validity period of the country signing certificate in test scenario step 4

Test suite LDS_E: Numbering of test cases has changed

LDS_E_07: Check also the hash algorithm output length of the signatureAlgorithm

LDS_J_05: ECDSA is also handled in test scenario step 3 from now on

ICAO Test Specification Part 4 (Inspection Systems)

ICAO test specification Part 4 version 2.11 is also the result of a typical ISO process, finalized at the same ISO SC17 WG3 TF4 meetings in Paris and Tokyo mentioned above. This specification also contains some interesting changes compared with the previous version 2.10 released in 2016:

Configuration of default EAC MRTD specifies EF.DG15

Specification of a default PACE-CAM protected eMRTD

Clarifications concerning PACE and PACE-CAM

Additionally, there are some clarifications and minor editorial changes.

LDS_D_39: Test case verifies that the inspection system checks the signature in EF.CardSecurity and detects an invalid signature

Modified test cases in Version 2.11

The ICAO test specification part 4 contains the following modified test cases:

ISO7816_C_01: Test case includes a three-line MRZ from now on

ISO7816_C_03: Editorial correction of protocol name

ISO7816_C_29: Use new configuration for PACE CAM protected eMRTD

ISO7816_C_30: Use new configuration for PACE CAM protected eMRTD

ISO7816_C_31: Use new configuration for PACE CAM protected eMRTD

ISO7816_C_32: Use new configuration for PACE CAM protected eMRTD

ISO7816_C_33: Use new configuration for PACE CAM protected eMRTD

ISO7816_C_34: Use a special configuration to indicate that the IS performs Passive Authentication

ISO7816_C_36: Change context from PACE-CAM to PACE (CAM -> GM)

ISO7816_C_39: Use new configuration for PACE CAM protected eMRTD

ISO7816_E_01: Use signature production function B.6

ISO7816_E_02: Clarification of EF.DG14 and specification of EF.DG15 and using of signature production function B.6

ISO7816_E_03: Use signature production function B.6

ISO7816_E_04: Use signature production function B.6

ISO7816_E_05: Use signature production function B.6

ISO7816_E_06: Use signature production function B.6

ISO7816_E_07: Use signature production function B.6

ISO7816_E_08: Use signature production function B.6

ISO7816_E_09: Perform Active Authentication with method B.6 of ISO/IEC 9796-2 and use RSA-SHA1; hint that the RSA operation during AA must result in a value bigger than n/2 (with n being the modulus) to ensure that method B.6 is really used in this test case
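For readers unfamiliar with method B.6, the minimization it applies can be sketched in a few lines; the numbers below are toy values, not real RSA parameters:

```python
# Hedged sketch of the ISO/IEC 9796-2 "alternative signature production
# function" (method B.6). Instead of returning the raw RSA result s = m^d mod n,
# the signer returns min(s, n - s). If s were always below n/2, the minimized
# value would equal s and the test could not tell B.6 apart from plain
# signature production -- hence the hint that s must be bigger than n/2.

def b6_signature(s: int, n: int) -> int:
    """Apply the B.6 minimization to a raw RSA signature value s."""
    return min(s, n - s)

def b6_verify_candidates(sigma: int, n: int):
    """A verifier must consider both pre-images of the minimization."""
    return {sigma, n - sigma}

n = 1000                                   # toy modulus, not a real RSA modulus
assert b6_signature(700, n) == 300         # s > n/2: minimization changes the value
assert b6_signature(300, n) == 300         # s < n/2: unchanged, B.6 not observable
assert 700 in b6_verify_candidates(b6_signature(700, n), n)
```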

The European Commission (EC) organised another eMRTD interoperability test. This time the event took place in Ispra at the Joint Research Centre of the European Union. The objective of this interoperability test was to assure that countries and companies have established a stable PACE protocol in their eMRTDs, i.e. ePassports and ID cards.

Test setup of eMRTD interoperability test

Every participant had the chance to submit up to two different sets of documents with different implementations. Altogether, 42 different samples were available at the beginning of the event. Two samples didn't pass the smoke test, and one sample was not suitable to be tested by the labs. All remaining samples were tested in two different procedures: a crossover test and a conformity test. Twelve document verification system providers with 16 different solutions took part in the crossover test. And 23 document providers threw 42 sets in the ring (28 from countries, 14 from industry).

Information concerning documents

The document providers describe the features of their chips in the implementation conformance statement (ICS). Not all ICS were filled in consistently, so the following information concerning the documents should be read carefully. Concerning the LDS version, 16 providers reported version 1.7 to be used in their documents, and three reported version 1.8, while all others didn't deliver any information concerning the version number.

The following diagram describes the relation between DH and ECDH in PACE:

PACE algorithms

The following diagram describes the relation of the mapping protocols in PACE:

Mapping protocols

Besides the MRZ, 16 documents also supported a CAN as a password to get access to the stored data.

Again, the number of PACEInfos stored in EF.CardAccess varied:

28 documents stored one PACEInfo,

Eight documents stored two PACEInfos,

One document each stored three, seven, or ten PACEInfos.

Investigations concerning EF.ATR/INFO in documents

The file EF.ATR/INFO allows storing some information about the chip that enables the reader to handle the chip optimally. This way the chip can announce the ideal buffer size to be used with extended length during reading and writing. In the context of the event I had a closer look at EF.ATR/INFO: 26 of the 42 documents stored an EF.ATR/INFO, but 5 of them didn't offer information concerning extended length and buffer sizes there. So in the end I investigated the reading buffer sizes of 21 documents of the eMRTD interoperability test, with the following result:

Seven documents support buffer sizes between 1 and 1.5 KByte,

Three documents support buffer sizes between 1.5 and 2 KByte,

Eleven documents support buffer sizes of ~64 KByte.

Buffer sizes in EF.ATR/INFO (reading)

With these large buffer sizes, data groups like DG2 (storing the facial image) or DG3 (storing the fingerprints) can be read with only one command. This allows the inspection system to read the content of the chip faster and improves the reading time at eGates.
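As a sketch of why the announced buffer size matters, here is how an inspection system might encode a READ BINARY command with short versus extended Le. The class and offset bytes are illustrative, and a real reader wraps the command in Secure Messaging after PACE or BAC:

```python
# Hedged sketch: building a READ BINARY command APDU with short or extended
# length, as a reader might do when EF.ATR/INFO announces a large buffer.

def read_binary_apdu(offset: int, le: int) -> bytes:
    """READ BINARY (INS 0xB0) with short or extended Le encoding."""
    header = bytes([0x00, 0xB0, (offset >> 8) & 0x7F, offset & 0xFF])
    if le <= 256:
        return header + bytes([le & 0xFF])            # short Le ('00' means 256)
    # extended Le: '00' marker byte plus two length bytes
    return header + bytes([0x00, (le >> 8) & 0xFF, le & 0xFF])

# Short length: at most 256 bytes per command -> many round trips for DG2
assert read_binary_apdu(0, 256) == bytes.fromhex("00B0000000")
# Extended length: request e.g. 32 KByte of DG2 in a single READ BINARY
assert read_binary_apdu(0, 0x8000) == bytes.fromhex("00B0000000" + "8000")
```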

Results of conformity testing

During the conformity test, all three test labs performed 18,135 test cases altogether. Less than 1 percent of these test cases failed.

Overall results (layer 6 and layer 7):

Passed: 11,563

Failed: 155 (0.86%)

Not applicable: 6,417

Layer 6 (16,614 test cases performed):

Passed: 10,885

Failed: 124 (0.74%)

Layer 7 (1,521 test cases performed):

Passed: 679

Failed: 32 (2.10%)

The following diagram shows failed test cases per document during the eMRTD interoperability test:

Number of failures per document

The diagram below shows the number of failures per test case during the eMRTD interoperability test:

Failures per test case

Observations during conformity testing

There are minor differences between implementation conformance statement (ICS) and chip.

Test results differ between test labs in some test cases.

There are differences in error handling between the test tools and labs (e.g. a missing CAN causes a failure at one lab and a 'not applicable' at another lab).

There are relatively more failures on layer 7 (personalisation) than on layer 6 (COS).

During IT Flash Paderborn #3 I gave a short presentation and demo concerning the security risks of smart homes. With the described passive attack you can profile all residents of a smart home, and with the described active attack you can manipulate the smart home and the devices used there.

From the movie ‘The Lives of Others’

In one of my previous blog posts I described how to run a passive attack on a smart home in the context of the EnOcean protocol. With the collected information you can set up a profile of all people living in this home.

For the passive attack I used a new tool that I have owned for a few weeks now: HackRF One. It's a typical piece of hardware used in the context of Software Defined Radio (SDR). You can see this helpful tool in the following picture (Source: Great Scott Gadgets):

HackRF One

HackRF One started as a Kickstarter project a few years ago and is used by a large community in the area of reverse engineering protocols. In my demonstration I used HackRF One on the one hand to find the exact frequency used by my EnOcean devices, and on the other hand to capture and replay EnOcean telegrams.

In a first step you need the exact frequency used by your EnOcean devices. To find this frequency I used the tool gqrx, an open source software defined radio receiver powered by GNU Radio. The tool visualizes the frequencies used in your environment. For example: EnOcean works at 868 MHz, but gqrx helps you to find the exact frequency of this protocol, in my case 868.290 MHz. The following screenshot shows the way gqrx works (Source: gqrx website):

Screenshot of gqrx

As soon as you have found the exact frequency, you can use the software distributed with HackRF One to capture and replay the messages in your smart home. In the demonstration I used a push button and a light actuator adapter to visualize the attack.

In the case of EnOcean there are mechanisms available to protect against these attacks. One of them is called 'Rolling Code', where telegrams are encrypted, which makes the capture-and-replay attack described here useless. Coming back to the attack itself: the following command stores the captured traffic in a file:

hackrf_transfer -r switch.raw -f 868290000

Once the traffic is stored in a file, you can send this information again (capture and replay) with your HackRF One with the following command:

hackrf_transfer -t switch.raw -f 868290000 -x 47

You can find some more information in the slides of the presentation. As you can see, smart home devices should be used carefully if you want to protect your privacy. Today it's very easy to eavesdrop on and manipulate a smart home. So always keep these security risks in mind when you plan your smart home.

The European Commission will organize conformity and interoperability tests for eMRTDs, together with a conference, on 25 and 26 September 2017. The event will be held at the European Commission Joint Research Centre (JRC) premises in Ispra, Italy. The tests will focus on the latest access control specifications (e.g. the operation of the PACE protocol with Chip Authentication Mapping). This security mechanism, known as "Password Authenticated Connection Establishment with Chip Authentication Mapping" (PACE-CAM), is specified in Technical Guideline BSI TR-03110 "Advanced Security Mechanisms for Machine Readable Travel Documents and eIDAS token" and combines PACE and Chip Authentication (CA) into one protocol, leading to faster ID document verification.

The conference will take place on the second day (26/09/2017) and will include speakers from the EU Commission, ICAO (requested) and Member States (requested). At the end, the high-level aggregate results of the tests will be presented.

The main beneficiaries of these tests are EU Member States. Depending on the number of EU Member States that participate in the event, and provided that it is possible from an organisational perspective, a limited number of non-EU ICAO Member States and private sector travel document manufacturers will be allowed to participate in the test (on a first come, first served basis).

During Apple's Worldwide Developers Conference (WWDC) this week, the company shared the latest information concerning support of the Near Field Communication (NFC) protocol in iOS 11. Developers coding for iOS 11 will be able to create apps that can read NFC tags. This opens the door for the wireless exchange of information between an iPhone and various connected devices in a user's environment.

Apple and NFC (Wikipedia)

Currently, the NFC chip in the iPhone is only used to handle contactless Apple Pay transactions. But with the new framework called Core NFC the company provides the foundation for multiple use cases by third-party apps. Using Core NFC, you can read tags of types 1 through 5 that contain data in the NFC Data Exchange Format (NDEF). At present the API supports only read access to the tags; hopefully, the possibility to write to tags will be integrated in the future. With the new framework Apple could let third-party developers make use of NFC in new ways, or it could simply expand NFC functions beyond Apple Pay for use in its own apps and services. The documentation says: "For example, your app might give users information about products they find in a store or exhibits they visit in a museum."

There are also first code samples available, e.g. iOS11-NFC-Example implemented by Hans Knöchel. In his GitHub repository Hans describes a quick example showing how to use the API in iOS 11 and Swift 4. The new framework requires at least Xcode 9, iOS 11 and an iPhone 7 or iPhone 7 Plus.

However, the possibilities for NFC outside of the banking area look set to expand with Apple's next-generation mobile operating system. So I'm looking forward to blogging about a future version of iOS that also allows complex contactless protocols known from the eID context, like Chip Authentication Mapping (PACE-CAM). This would enable the iPhone to read ID cards or ePassports using ISO 14443 for contactless communication. Nevertheless, this first step in iOS opens a world of possibilities for new apps on iPhones.

I wish you a happy and prosperous new year 2017. Thanks for visiting my blog in 2016, and thanks for your feedback in various forms like comments and discussions. In 2016 I published several articles on this blog, and I plan to continue writing about my activities in the area of protocols and testing in 2017 as well. Before I start with new blog posts in the next weeks, here is a short summary of 2016.

First time visiting CeBIT as a blogger in 2016

In 2016, I visited CeBIT in Hanover as an accredited blogger for the first time. Several companies supported bloggers, and CeBIT itself established rooms and areas in the exhibition to work and refresh. ePassports were not a focus of the exhibition, but several companies and organisations demonstrated their ideas concerning IoT and the protocols used there.

Blogger Press Card CeBIT 2016

protocolbench is now a registered trade mark

Last year I decided to protect the name of this blog, so I've registered the word mark 'protocolbench' at the German Patent and Trade Mark Office (DPMA). Under register number 302015219473 you can find more information about the trade mark.

Certificate of word mark ‘protocolbench’

Support this blog

In the middle of December I decided to use Flattr and Patreon on my blog. These services allow visitors and readers to support my blog in an easy way. If you like this blog, please support my work by donating via Flattr.

New job at secunet

And last but not least, I'm working for a new company. Since September 2016 I've been working as a Principal at secunet Security Networks AG in the division 'Homeland Security'. My area of responsibility is similar to the one before: testing, eID, standardisation and of course GlobalTester. With the new facilities in Paderborn, secunet has established its ninth location in Germany.

Conclusion and summary

So, in 2017 I will again publish new blog posts here in the context of protocols and testing. One of the next articles, for example, will describe the digitalisation of ePassports. If you have ideas for subjects you are interested in, or subjects you work on that you would like to give more visibility, just contact me.

Motivation

Typically, protocols connect two different systems. In an open system with several stakeholders, interoperability between these systems is a fundamental requirement. There are various ways to assure this interoperability. In this blog post I present two popular approaches, in cooperation with my colleague Dr. Guido Frank, who works at the German Federal Office for Information Security (BSI).

Interoperability

Interoperability is a characteristic of a product or system, whose interfaces are completely understood, to work with other products or systems, present or future, in either implementation or access, without any restrictions (Source: Wikipedia).

From a system perspective, this means that all implementations need to comply with the same technical specifications. Interoperability is essential because these are open systems with different stakeholders. It refers to the ability of cross-system protocols to work together.

Crossover Testing vs. Conformity Testing

To ensure interoperability, implementations need to be tested. In general, there are different approaches to test systems or implementations.

Crossover Testing

The scope of a crossover test is to test every system component against all other system components. This procedure allows detecting incompatibilities between existing implementations with a fixed release status.

The effort to perform this kind of test increases disproportionately with every additional instance of the system. Therefore, these tests can practically only be performed with a small number of involved test partners. The following figure illustrates the interaction between different systems in a crossover test.

Crossover Testing

Another problem of crossover testing is maintenance: every new implementation, or any new version of an existing implementation, must be tested with every corresponding system, which again increases the testing effort significantly. A benefit of crossover testing can be found in the early phase of development, when it helps developers to implement their own system and can serve as an indicator of being on the right track. On the other hand, this kind of testing only covers the positive case of two correct systems testing each other ('smoke test'); the behaviour of the systems in bad cases is not tested. Additionally, with two different systems tested against each other, it is difficult to decide in case of a failure which system behaves correctly and which implementation has to be changed.

Conformity Testing

The purpose of conformity tests is to verify that a system implements the specifications correctly (i.e. it "conforms" to the specifications). These specifications need to be defined by the stakeholders and finally implemented, e.g. by test labs in their conformity test tools.

This way, the test suites verify the implementation under test with protocol data units which mimic both "correct" and "incorrect" behaviour of the system. The figure below illustrates the interactions between the test suite implementation and the system in a conformity test. The test definitions have to be validated with regard to a harmonised interpretation among test participants.

Conformity Testing

Conclusion

Both testing approaches allow assuring interoperability with other system components to a certain extent. But with the increasing complexity of the systems under test and the increasing overall number of systems, crossover testing becomes more and more extensive. Only conformity testing scales adequately with the complexity and the number of systems. The following diagram illustrates the increasing effort of crossover tests in relation to conformity tests.

Compared efforts of crossover testing and conformity testing
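The scaling argument can be made concrete with a few lines of Python; the run counts are the idealized minimum, assuming one test run per pairing:

```python
# Hedged sketch: why crossover testing does not scale. With n implementations,
# a full crossover test needs one run per unordered pair, i.e. n*(n-1)/2
# combinations, while conformity testing needs only n runs (each
# implementation against the one reference test suite).

def crossover_runs(n: int) -> int:
    return n * (n - 1) // 2

def conformity_runs(n: int) -> int:
    return n

for n in (4, 16, 42):   # 42 sample documents took part in the InterOp test above
    print(n, crossover_runs(n), conformity_runs(n))
# e.g. for 42 documents: 861 crossover combinations vs. 42 conformity runs
```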

Direct tests between a subset of implementations can be useful during the implementation or integration phase of a node. Such tests could also be performed via bilateral agreement between different stakeholders, e.g. via pilots. Experience from such tests could also be used as additional input for conformity tests. Crossover testing, or a central coordination of such tests, is not necessary for this purpose. As soon as there are several system components to be tested, conformity testing should be chosen.

The benefits of such an approach to interoperability testing can also be seen in the so-called 'InterOp tests' that have been performed for more than a decade in the context of eMRTD. Detailed failure analysis allows improving the stability of the whole eMRTD system. Additionally, the results of 'InterOp tests' help not only to improve the stability of ePassports and inspection systems but also to improve the quality of (test) specifications and test tools.

Another open system with several vendors is banking: all cash cards and credit cards must fulfill international test specifications. This kind of interoperability testing allows customers to use their cards worldwide at the cash machines of various banks.

To assure systematic interoperability, it is necessary to set up conformity test specifications that systematically test the requirements defined in the technical specifications. The tests should not only define good cases but also bad cases that mirror the pitfalls typically occurring in a system. Only this way can real system-wide interoperability be assured.

Setup of test specification

Important components of a conformance test specification are:

Description of general test requirements

Test setup / Testing environment

Definition of suitable test profiles / implementation profiles

Implementation Conformance Statement (ICS)

Definition of testing or configuration data

Definition of test cases according to a unified data structure

Each test case should concentrate on a single feature to be tested

The following structure of test specifications has been established since the beginning of eMRTD testing in 2005. It is based on the ISO/OSI layer model, where data is tested on layer 7 and protocols are tested on layer 6.

Typical structure of a test case in this context:

Test case ID: unique identifier for each test case

Purpose: objective of the test case

Version: current version of this test case independent from the test specification

Reference: where this feature / behaviour is specified

Preconditions: setup of test case

Test scenario: description of test case, step by step

Postconditions: teardown of the test case

Test cases can be combined into test suites to cluster test cases with similar topics or objectives. As the test specifications need to be implemented in suitable testing tools, it is useful to define the test cases in a way that eases their implementation, e.g. via XML using a suitable XML schema.
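As an illustration of the last point, a test case following the structure above could be serialized to XML along these lines; the element names and schema are made up for this sketch, not taken from an official test suite:

```python
# Hedged sketch: the test case structure described above, serialized as XML so
# that a test tool can import it. Element names are illustrative; real test
# suites define their own schema.
import xml.etree.ElementTree as ET

def build_test_case(tc_id, purpose, version, reference, pre, steps, post):
    tc = ET.Element("testcase", id=tc_id)
    ET.SubElement(tc, "purpose").text = purpose
    ET.SubElement(tc, "version").text = version
    ET.SubElement(tc, "reference").text = reference
    ET.SubElement(tc, "preconditions").text = pre
    scenario = ET.SubElement(tc, "testscenario")
    for i, step in enumerate(steps, 1):          # steps are numbered 1..n
        ET.SubElement(scenario, "step", number=str(i)).text = step
    ET.SubElement(tc, "postconditions").text = post
    return tc

tc = build_test_case(
    "LDS_E_01", "Test the LDS tag of the DG14 object", "2.11",
    "Doc 9303 Part 10", "Open ePassport application",
    ["Select and read EF.DG14", "Check that the outermost tag is '6E'"],
    "Reset the chip")
print(ET.tostring(tc, encoding="unicode"))
```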

Currently I'm preparing a project where an ePassport has to be tested. These tests start with the booklet and end with the chip. During the preparation, the need for a test specification overview popped up. This need was the root of a new service here on this blog: an overview of all current specifications in the domains of this blog, starting with eMRTDs and their corresponding inspection systems. To list all current specifications I've added a new page called 'test specifications' to the menu above. I will keep this list up-to-date in the future: with every new version of a test specification I will update this list. Currently the list contains test specifications released by ICAO and BSI. Both organisations are at the forefront of implementing tests in the context of eMRTD and the corresponding back-end systems. The certification schemes of BSI and ANSSI are also based on these test specifications.

Test specifications are "living documents", which leads to several modifications over time. You need the test specifications listed here to prove conformity and finally certify your passport or inspection system.

With every new protocol you need some additional or modified test cases in the specifications. Maintenance is also an important factor in keeping the test cases up-to-date. Additionally, I will also list test specifications of other domains, like IoT, in the near future.

Introduction

Simultaneously with Part 3, the ICAO also released version 2.10 of the test specification 'RF and Protocol Testing Part 4' to test the interoperability of inspection systems (IS) in the context of eMRTD. Since the Technical Advisory Group (TAG) of ICAO has endorsed the update on the ICAO website, the test specification can now be referenced officially. Version 2.10 of the test specification contains some significant modifications compared with the previous version 1.01 released in 2013:

LDS_D_36: Security Object with LDS Version 1.8 but wrong version number

LDS_D_37: Security Object with LDS Version 1.7 but Version number 1

LDS_D_38: EF.SOD with future LDS Version 1.9

Modified test cases in Version 2.10 Update

Due to the new document structure of version 2.10, it's difficult to detect all modifications. Therefore, please be aware that the list of modified test cases may be incomplete and that there might be more changes compared to the previous version 1.01.

ISO7816_C_04: Added new OID for PACE-CAM in table corresponding to test case

ISO7816_D_07: Test case deleted

With the release of this test specification, version 2.10 is relevant for certification. So from now on, your inspection system must fulfill these conformity tests to achieve a certificate.

Introduction

There is an update of ICAO's test specification 'RF and Protocol Testing Part 3' available. The specification focuses on conformity testing and protocol testing for eMRTDs implementing protocols like BAC and PACE.

The Technical Advisory Group (TAG) of ICAO endorsed the updated release on the ICAO website, so from now on the test specification can be referenced officially. Version 2.10 of the test specification contains some major modifications:

Additional test cases for PACE-CAM (this includes modifications of existing test cases and also new test cases especially for PACE-CAM).

New test suite 7816_S to verify access rights (read and select) of EF.CardSecurity.

New test suite LDS_K to test presence and coding of SecurityInfo structures in EF.CardSecurity

The referenced documents are updated to Doc 9303 Edition 7, and old specifications including supplements are replaced.

With the 7th edition of Doc 9303, the wording changed from 'PACEv2' and 'SAC' to 'PACE'.

And of course there are some minor editorial corrections.

The interim version 2.08 of this test specification was used during the interoperability test in London in 2016 (first results of this event can be found in a previous post). This version was prepared at the meeting of ISO WG3 TF4 in Berlin to establish a valid version for the test event. Version 2.10 includes all these updates and some minor changes. In the following, the updates of version 2.10 are listed in more detail.

New test cases in layer 7

LDS_E_09: Test that EF.DG14 contains at least one valid set of SecurityInfos for Chip Authentication. A chip supporting PACE-CAM must also support CA.

LDS_I_05: Verify that EF.CardAccess contains at least one valid PACEInfo for PACE-GM or PACE-IM as an additional mapping procedure if PACE-CAM is supported.

LDS_K_01: Test the ASN.1 encoding of the SecurityInfos.

LDS_K_02: Verify the ASN.1 encoding of the ChipAuthenticationPublicKey.

LDS_K_03: Test the coherency between the EF.CardSecurity and EF.CardAccess.

LDS_K_04: Verify that the parameterID also denotes the ID of the Chip Authentication key used, i.e. the chip MUST provide a ChipAuthenticationPublicKeyInfo with keyID equal to parameterID.
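The coherency check behind LDS_K_03 can be sketched as a simple set comparison; the SecurityInfo tuples below are illustrative stand-ins for the real ASN.1 structures, and the protocol names are used as plain strings:

```python
# Hedged sketch of the idea behind LDS_K_03: every SecurityInfo published in
# the unprotected EF.CardAccess must also appear in the signed EF.CardSecurity.
# SecurityInfos are modeled here as (protocol, version, parameterId) tuples
# instead of full ASN.1 structures.

def coherent(card_access: set, card_security: set) -> bool:
    """EF.CardSecurity must contain at least the infos from EF.CardAccess."""
    return card_access <= card_security

pace_cam = ("id-PACE-ECDH-CAM-AES-CBC-CMAC-128", 2, 13)   # illustrative values
ca_info  = ("id-CA-ECDH-AES-CBC-CMAC-128", 1, 13)         # illustrative values

assert coherent({pace_cam}, {pace_cam, ca_info})          # consistent chip
assert not coherent({pace_cam, ca_info}, {pace_cam})      # EF.CardSecurity incomplete
```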

Modified test cases in layer 7

LDS_I_02: Added OIDs for PACE-CAM and new step 3 (to check that a valid OID is present for each declared configuration).

LDS_I_03: Added OID for PACE-CAM.

LDS_I_04: Test case deleted.

LDS_J_04: Correction in referenced RFC.

Previous ideas to migrate this test specification to an ISO document have been cancelled for political reasons. Part 3 (eMRTD) and Part 4 (inspection systems) will remain ICAO documents, whereas Part 1 (durability of ePassports) and Part 2 (contactless interface) are still being migrated to ISO documents (ISO 18745-1 and ISO 18745-2).

Introduction

A few days before the Eclipse Neon release, the Eclipse Foundation released several projects in the context of IoT (Internet of Things). The Eclipse IoT working group is engaged in projects like SmartHome, Kura, Paho and OM2M.

The Internet of Things is all about connecting devices (sensors and actuators) to the internet. You can find these devices in automobiles, wearables, industrial factories, homes, etc. A key challenge is the complexity of implementing an IoT solution where you need to deal with various hardware platforms, manage the IoT gateways to connect the devices to the internet, manage the connectivity and interoperability, and integrate the data in the existing systems and databases.

An important way to reduce this complexity is to create reusable libraries and frameworks that abstract and implement the key features. This is exactly the approach of Eclipse IoT: delivering several technologies combined in an open source stack, with all the key features and standards you need to develop your own IoT solution. Furthermore, the Eclipse Foundation has set up a community with more than 200 contributors to assure the enhancement of the IoT stack.

The current release includes Eclipse SmartHome version 0.8 and Eclipse Paho version 1.2. The projects Eclipse Kura and Eclipse OM2M will be available later this month. Additionally, the foundation has started a new project proposal called Eclipse Kapua, whose goal is to create a modular integration platform for IoT devices and smart sensors that bridges Operational Technology and Information Technology.

Eclipse IoT

The Eclipse IoT ecosystem contains standards, gateways and of course frameworks. The following paragraphs describe these modules, each with a reference to a relevant Eclipse project in this domain. Please keep in mind that this list is not complete; there are currently 24 different projects in the Eclipse IoT ecosystem.

IoT stack

The following graphic shows the structure of the Eclipse IoT stack. The stack includes frameworks, open communication standards and a gateway that provides management services. Most modules are based on Java and OSGi; OSGi is a modular system and service platform for Java that implements a complete and dynamic component model. With these components, the Eclipse IoT stack addresses the key requirements in IoT: interoperability, security and connectivity.

Eclipse Open IoT Stack

Open communication standards

Support for the various protocols used in this domain is an elementary feature of any IoT stack: all devices must be connected, secured and managed. Accordingly, the Eclipse projects in the IoT ecosystem support the relevant protocols and standards:

MQTT: Eclipse Paho delivers MQTT client implementations (Java, C/C++, JavaScript). A corresponding MQTT broker is implemented in Eclipse Mosquitto (written in C). MQTT (Message Queuing Telemetry Transport) is a lightweight publish/subscribe messaging protocol specified in ISO/IEC 20922. The new version 1.2 of Paho adds, for example, WebSocket support to the Java and Python clients.
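To illustrate the publish/subscribe routing that a broker like Mosquitto performs, here is a sketch of MQTT topic filter matching (single-level wildcard `+`, multi-level wildcard `#`) following the rules of the MQTT specification; the function name is chosen for this example:

```python
def topic_matches(topic_filter, topic):
    """Return True if an MQTT topic filter matches a published topic.
    '+' matches exactly one topic level, '#' (last level only) matches
    the remainder of the topic, including the parent level itself."""
    f_levels = topic_filter.split("/")
    t_levels = topic.split("/")
    for i, level in enumerate(f_levels):
        if level == "#":                       # multi-level wildcard
            return True
        if i >= len(t_levels):                 # filter is longer than topic
            return False
        if level != "+" and level != t_levels[i]:
            return False
    return len(f_levels) == len(t_levels)      # no trailing topic levels left

print(topic_matches("sensors/+/temperature", "sensors/kitchen/temperature"))  # True
print(topic_matches("sensors/#", "sensors/kitchen/temperature"))              # True
print(topic_matches("sensors/+", "sensors/kitchen/temperature"))              # False
```

Note that per the specification a filter like `sport/#` also matches the parent topic `sport` itself, which the sketch above reproduces.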

Lightweight M2M: The server side of LwM2M is delivered by Eclipse Wakaama (C/C++) and the client side by Eclipse Leshan (Java). Wakaama provides an API for server applications to send commands to registered LwM2M clients. Leshan relies on the Eclipse IoT Californium project for the CoAP and DTLS implementation.
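As an illustration of the CoAP layer that Californium provides underneath LwM2M, the following sketch encodes the fixed 4-byte CoAP message header defined in RFC 7252 (version, message type, token length, code, message ID); the function name is an assumption for this example:

```python
import struct

def encode_coap_header(version, msg_type, token, code, message_id):
    """Encode the fixed 4-byte CoAP header (RFC 7252) followed by the token.
    byte 0: version (2 bits) | type (2 bits) | token length (4 bits)
    byte 1: code, bytes 2-3: message ID (network byte order)."""
    byte0 = (version << 6) | (msg_type << 4) | len(token)
    return struct.pack("!BBH", byte0, code, message_id) + token

# Confirmable (type 0) GET request (code 0.01) with a 2-byte token:
header = encode_coap_header(1, 0, b"\xca\xfe", 0x01, 0x1234)
print(header.hex())   # 42011234cafe
```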

DNSSEC: Eclipse Tiaki provides a DNSSEC implementation in Java. The Domain Name System Security Extensions (DNSSEC) are specified by the IETF for securing certain kinds of information provided by the Domain Name System (DNS).

DTLS: Eclipse TinyDTLS implements the Datagram Transport Layer Security (DTLS) standard in C. The implementation covers both the client and the server state machine. DTLS is a communication protocol that provides security for datagram-based transports like UDP.

Gateways

A gateway manages the connectivity between devices and provides a platform for the applications on top of it. Eclipse Kura offers a set of services that help to manage the devices and applications deployed on IoT gateways. Like other Eclipse projects, the gateway is based on Java and OSGi services. Kura manages the cloud connectivity, supports different protocols and configures the network.

Frameworks

Eclipse IoT provides a set of frameworks:

Eclipse SmartHome is a set of Java and OSGi services for building and running Smart Homes. It is designed to run on “simple” devices like the Raspberry Pi, BeagleBone Black or Intel Edison. Additionally, this framework supports the diverse protocols typically used in Smart Homes, like EnOcean, KNX or ZigBee, which allows devices and systems from different vendors to communicate with each other. The new version 0.8 of Eclipse SmartHome now contains a new REST interface that allows easier interaction with clients. Furthermore, new bindings are supported, e.g. for DigitalStrom.

Eclipse SCADA is a set of Java and OSGi services that implements nearly all of the services required to run a SCADA (supervisory control and data acquisition) system. As one type of an Industrial Control System (ICS) Eclipse SCADA delivers functions for data acquisition, monitoring, visualization, etc. Additionally, the framework supports typical industrial automation standards like ModBus, Siemens PLC, SNMP and OPC.

Eclipse OM2M is an implementation of ETSI’s M2M standard. It provides a horizontal Service Capability Layer (SCL) that can be deployed in an M2M network.

Summary

In conclusion, Eclipse IoT provides an open source stack including all relevant frameworks, protocols and standards needed to develop your own IoT application. The stack lets you develop new devices but also modernise existing ‘legacy’ devices.

During the 10th Security Document World (SDW) 2016, an additional interoperability test for eMRTDs with PACE took place in London. In the context of ePassports this was the 14th such event since the first one in Canberra in 2003. This time two test labs, 17 document providers and twelve inspection system providers were involved. Here I will focus on the conformity test (test labs and document providers) and the InteropTest results. The event was organised by the colleagues of secunet. The following document providers delivered 27 samples in total:

Arjo Systems

Atos IT Solutions and Services

Bundesdruckerei

Canadian Banknote Company

cryptovision

De La Rue

Gemalto

ID&Trust

Imprimerie Nationale Group

Iris Corporation Berhad

MaskTech

Morpho

NXP Semiconductors

Oberthur Technologies

PAV Card

Polygraph combine ‘Ukraina’

PWPW

And the following test laboratories performed a subset of tests focusing on PACE (and of course PACE-CAM):

Keolabs (France)

HJP Consulting / TÜViT (Germany)

The test cases performed during the event were based on ICAO’s test specification ‘RF Protocol and Application Test Standard for eMRTD – Part 3‘ Version 2.08 RC2, including some minor adaptions from the last WG3TF4 meeting in Berlin. The final version 2.08 of this test specification will be released soon, and the deltas will be listed in an additional blog post here. With focus on PACE-CAM, the following test suites were performed by both test labs:

In the preliminary InteropTest results presented by Michael Schlüter during the SDW, he mentioned that 8,502 test cases were performed during conformity testing by the test labs and that 98% of the relevant test cases were passed by the samples. Additionally, the test results of both labs were fairly consistent. The test case that caused the most failures verifies the ChipAuthenticationPublicKey in EF.CardSecurity (LDS_K_2); here we need some clarification in Doc 9303 and subsequently in the test specification.

During the crossover test three problems were detected. First, the sequence PACE, CA, TA was performed correctly, while the sequence PACE-CAM, TA caused problems during the inspection procedure of some readers. This may be due to the fact that PACE-CAM is specified in an ICAO document and TA in a BSI document. Second, some inspection systems had problems with alternative File IDs for EF.CVCA: an alternative FID can be defined in TerminalAuthenticationInfo (see A.1.1.3 in TR-03110 Part 3) and must then be used by the inspection system to address and read EF.CVCA. The third and most unpleasant surprise in the InteropTest results was that around 50% of the inspection systems did not perform Passive Authentication (PA) correctly. During the preparation of the InteropTest a wrong CSCA certificate was distributed, and 50% of the systems did not detect it; in other words, 50% of the inspection systems failed Passive Authentication. During the conference Dr. Uwe Seidel (Bundeskriminalamt, BKA) noted that this number mirrors the real world and that PA is indeed a current problem in border control.
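The core of Passive Authentication can be sketched as follows. This is a strongly simplified model (hypothetical data structures, no CMS signature verification or real certificate parsing; a complete implementation follows Doc 9303 Part 11), but it shows why a wrong CSCA certificate must cause the check to fail:

```python
import hashlib

def passive_authentication(data_groups, sod_hashes, dsc_issuer, trusted_cscas):
    """Strongly simplified Passive Authentication sketch:
    1. every read data group must hash to the value listed in EF.SOD;
    2. the Document Signer Certificate must chain to a trusted CSCA.
    A real implementation must also verify the CMS signature of EF.SOD
    and the DSC signature itself."""
    for dg_number, content in data_groups.items():
        if hashlib.sha256(content).digest() != sod_hashes.get(dg_number):
            return False                  # manipulated or wrong data group
    if dsc_issuer not in trusted_cscas:
        return False                      # untrusted (e.g. wrong) CSCA
    return True

dg1 = b"P<UTOERIKSSON<<ANNA<MARIA<..."
sod = {1: hashlib.sha256(dg1).digest()}
print(passive_authentication({1: dg1}, sod, "CSCA-UTO", {"CSCA-UTO"}))   # True
print(passive_authentication({1: dg1}, sod, "CSCA-EVIL", {"CSCA-UTO"}))  # False
```

An inspection system that skips the trust-chain step of this check would accept the wrongly distributed CSCA certificate, which is exactly the failure observed during the InteropTest.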

The InteropTest results can be summed up in two statements:

There is a very good quality of the eMRTD samples.

Reader vendors have still some work to do, especially to implement Passive Authentication correctly.

A detailed test report of this event and the InteropTest results will be published by secunet in June 2016.

Update: The final test report can now be downloaded here (after a short registration at the SDW website).