Question:
A stream cipher operates on a data stream of 6-bit characters using a simple mono-alphabetic substitution technique. Estimate and explain the number of different substitution alphabets possible. The key is effectively the substitution alphabet, which can be expressed as a $384$-bit number (i.e. $64 \times 6$ bits).

Discuss the security of this system compared with DES/3DES and a one-time pad, providing a full justification for your conclusions.

This is a past exam question which I am struggling to solve. I am not a security expert, nor do I intend to move in that direction as a career path. The module is part of my MSc course and has nothing to do with my career. :-) So please bear with me.

Answer (what I have so far):

The number of different substitution alphabets is $26!$ (factorial), assuming the English alphabet. It is also assumed that letters can be in any position and cannot repeat themselves.

Why is the key suddenly expressed as a $384$-bit number? I don't understand. Why $64 \cdot 6$?

We don't have $64$ characters in the alphabet... if each character of the English alphabet is represented with $6$ bits, then shouldn't it be $6 \cdot 26 = 156$ bits?

Now I am thinking: if the question says $64$, maybe the alphabet is custom (e.g. including uppercase letters plus some digits and symbols), so the number of permutations is $64!$ (factorial)?

What does this key look like (or what would an example of such a key be)?
I understand keyword substitution, where the key is a single keyword, say hatred:
plain:  abcdefghijklmnopqrstuvwxyz
cipher: hatredbcfgijklmnopqsuvwxyz
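To check my understanding, here is a small sketch of how I believe that cipher alphabet is built (this construction is my own assumption, using the keyword hatred):

```python
import string

# Build a cipher alphabet from a keyword: keyword letters first
# (duplicates removed), then the unused letters in alphabetical order.
keyword = "hatred"
seen = []
for c in keyword + string.ascii_lowercase:
    if c not in seen:
        seen.append(c)
cipher_alphabet = "".join(seen)
print(cipher_alphabet)  # hatredbcfgijklmnopqsuvwxyz
```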

Can you help me determine what the substitution alphabet key should look like?

I went to Wikipedia's page on six-bit character codes, and it only made things worse in my head. Is this link even relevant?

I would appreciate examples more than answers. Thank you. If you find my question too long, please ignore the part about the comparison with other encryption systems.

Yes, there are not 26 characters, but 64 – this is mentioned in the question. (These might be something like uppercase, lowercase, digits and some punctuation, or simply 6-bit units without any meaning, like an encoded image file.) Your Wikipedia link shows some examples of what your six-bit alphabet might represent. But this doesn't have to concern you as a cryptographer.
– Paŭlo Ebermann♦ Jul 31 '13 at 19:58

1 Answer
1

The format they are proposing for the key seems to be a bit-packed array. First, with 6 bits there are 64 possible values ($0$–$63$). Now, imagine you have replacement rules (your key) like these:

0 -> 17

1 -> 43

2 -> 12

...

63 -> 8

This means: when encrypting, replace all occurrences of the value 0 with 17, all 1s with 43, etc. These values may correspond to letters, but that is not necessary at this point. Since the numbers on the left side of the mapping table are sequential, an alternative way of writing this key is $17, 43, 12, \ldots, 8$; this can easily be transformed back into the more verbose form above. Each number in the list takes 6 bits to represent (since it must be $<64$), and there are 64 of them, so the total number of bits needed to store the key is $64 \cdot 6 = 384$.
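To make the packing concrete, here is a minimal sketch (the variable names are my own, not from the question) of generating such a key, packing it into a 384-bit integer, and substituting a stream of 6-bit values:

```python
import random

symbols = list(range(64))          # all 6-bit values 0..63
key = symbols[:]                   # the key is a permutation of 0..63
random.shuffle(key)

# Pack the 64 six-bit entries into one 384-bit number.
packed = 0
for v in key:
    packed = (packed << 6) | v
assert packed.bit_length() <= 384  # 64 * 6 = 384 bits

# Unpack again (most-significant entry first) and check round-trip.
unpacked = [(packed >> (6 * (63 - i))) & 0x3F for i in range(64)]
assert unpacked == key

# Mono-alphabetic substitution on a stream of 6-bit values.
plaintext = [0, 1, 2, 63]
ciphertext = [key[v] for v in plaintext]
inverse = [0] * 64
for i, v in enumerate(key):
    inverse[v] = i
assert [inverse[c] for c in ciphertext] == plaintext
```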

In a way, this key size is deceiving. A key of $l$ bits normally implies that there are $2^l$ possibilities for the key. Here, you were right to use factorials: there are only $64! \approx 2^{296}$ possible keys. To see why, think about what happens if the key is sent but the last 6-bit number is left off. We can figure out what it is just by seeing which value hasn't been used yet. Even if the last two 6-bit numbers are left off, there are only two possibilities for the key ($a,b$ or $b,a$). So clearly, sending 384 bits means sending more than we strictly need.
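A quick sanity check of these numbers (a sketch; the truncated-key reconstruction mirrors the argument above):

```python
import math
import random

# The effective key entropy is log2(64!), not 384 bits.
bits_needed = math.log2(math.factorial(64))
print(round(bits_needed))  # 296

# Recovering a truncated key: if the last entry of a permutation of
# 0..63 is dropped, it must be the single value not yet used.
key = list(range(64))
random.shuffle(key)
truncated = key[:-1]
missing = (set(range(64)) - set(truncated)).pop()
assert missing == key[-1]
```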

The 6-bit character code is just one possible way to convert the 6-bit numbers the algorithm uses into characters that are useful for humans.