Don’t ask WHY I need to understand how to do ASN.1 encoding, but I haz to

Nice how-to with example here. I’m not saying it makes sense (“ignore this bit, then shift right nibble of the leftmost byte to left by 1 position…etc”) but this is how it works.
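To convince myself the "shift the nibbles around" description actually boils down to something mechanical, here is a small Python sketch of the OID content-octet rule as I understand it from X.690: the first two arcs collapse into a single value (40 × arc1 + arc2), and every arc is then written in base-128 with the high bit set on all bytes except the last. The OID below is the well-known RSA arc, used purely as an example.

```python
def encode_arc(value):
    """Encode one OID arc in base-128: 7 data bits per byte,
    high bit (0x80) set on every byte except the last."""
    out = [value & 0x7F]          # last byte: high bit clear
    value >>= 7
    while value:
        out.append((value & 0x7F) | 0x80)  # continuation bytes
        value >>= 7
    return bytes(reversed(out))

def encode_oid(arcs):
    """Encode an OID's content octets per X.690: the first two
    arcs are combined into one value, 40 * arc1 + arc2."""
    body = encode_arc(40 * arcs[0] + arcs[1])
    for arc in arcs[2:]:
        body += encode_arc(arc)
    return body

# 1.2.840.113549 (RSA's arc)
print(encode_oid([1, 2, 840, 113549]).hex())  # 2a864886f70d
```

So 1.2 becomes 0x2A (42), 840 becomes 0x86 0x48, and 113549 becomes 0x86 0xF7 0x0D, matching the worked examples floating around.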

Challenge number 2: try to encode my own stuff, which is not as simple as this nice example 🙂

Why do I have these issues? Mwell:

1. GlobalPlatform specifies, in its Secure Element Access Control spec, that the binding between a Secure Element (JavaCard) application and a handset application shall be done according to a set of management rules. These rules can either be provisioned in an ARA-M (Access Rule Application Master) and one or more ARA-Cs (Access Rule Application Clients), a set of “databases” containing the access rules, or stored via the PKCS#15 support that should be/is present on the cards. PKCS#15 can be implemented as a smartcard filesystem structure, which is what I am studying now…

2. The PKCS#15 standard is, of course, defined by RSA and specifies what the structure of this filesystem looks like. My interest is in the Object Identifiers (OIDs) of the data objects, which GP uses on smartcards in order to regulate access to smartcard applications from the handset.

3. OIDs are written in ASN.1 notation, and their binary encoding rules (BER/DER) are, of course, specified in yet another standard, ISO/IEC 8825-1 (ITU-T X.690).

Now, the funny part of this deal is that the examples provided by these 3 different standards look alike… and not so much 😛 That being said, an enthusiastic Cristina spent the entire day computing examples of ASN.1 encoding on paper and, for the same OID, managed to get a different encoding out of each standard’s explanation.

Asked for help from my smarter/more experienced friends around the Internets and got close to ASN.1 enlightenment. Still, some things don’t make any sense.

Have any of you actually used ASN.1 encoding? I understand how to encode the initial OID arcs. I also understand how to encode integers that are smaller than 127. And I can encode integers that are larger than 128 _and_ represented on 2 bytes.

What I cannot manage is encoding integers that are larger than 127 yet supposedly represented on 1 byte. Nor can I properly encode integers that span more than 2 bytes.
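For what it’s worth, here is a small Python sketch of the base-128 grouping as I read it from X.690, which may be why that first case refuses to work: each byte only carries 7 data bits, the 8th being a continuation flag, so a value above 127 can never fit in a single byte under this rule. The values below are illustrative, not my actual data.

```python
def show_groups(value):
    """Split a value into 7-bit groups (most significant first) and
    show them in binary, with the continuation bit (0x80) set on
    every byte except the last, as X.690 prescribes."""
    groups = [value & 0x7F]       # last byte: continuation bit clear
    value >>= 7
    while value:
        groups.append((value & 0x7F) | 0x80)
        value >>= 7
    groups.reverse()
    return " ".join(f"{g:08b}" for g in groups)

print(show_groups(127))  # 01111111            -- fits in one byte
print(show_groups(128))  # 10000001 00000000   -- already needs two
print(show_groups(840))  # 10000110 01001000   -- i.e. 0x86 0x48
```

The same mechanism extends to arbitrarily many bytes: 113549 splits into three groups, 0x86 0xF7 0x0D, which might be the “more than 2 bytes” case I keep tripping over.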