Conversions

Humans work in decimal; computers work in binary. So how does
one convert a decimal number to a binary number?

DECIMAL to BINARY Conversion

When converting from decimal to binary, the mathematical method is simplest.
Take the decimal number you want to convert and repeatedly divide it by two,
keeping track of the remainder after each division. Each division by two
either comes out even (remainder 0) or leaves a remainder of one (1).
Continue until the quotient reaches zero. Writing the remainders in reverse
order (the last remainder first) gives the equivalent binary value.
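The repeated-division method above can be sketched in Python (the function name and structure are illustrative, not part of the original text):

```python
def decimal_to_binary(n):
    """Convert a non-negative integer to a binary string
    by repeatedly dividing by two and collecting remainders."""
    if n == 0:
        return "0"
    remainders = []
    while n > 0:
        remainders.append(str(n % 2))  # remainder is always 0 or 1
        n //= 2                        # integer division by two
    # Remainders are generated least-significant bit first,
    # so reverse them to get the binary value.
    return "".join(reversed(remainders))

print(decimal_to_binary(44))  # prints 101100
```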

REVERSE THE ORDER OF REMAINDERS
The bits, in the order they were generated, are 001101. Reversing the order of the bits, we get 101100.
Properly padded with leading zeroes to fill out one byte, we get 00101100.
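The reversal and padding steps can be shown directly (a small sketch using the bit string from the example):

```python
bits = "001101"               # remainders in the order they were generated
reversed_bits = bits[::-1]    # reverse to get the binary value: "101100"
byte = reversed_bits.zfill(8) # pad with leading zeroes to one byte
print(reversed_bits)          # prints 101100
print(byte)                   # prints 00101100
```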

BINARY to DECIMAL

Each binary column has a corresponding decimal value (a power of two). Add
the decimal values of all the columns that contain a '1' and you get the
decimal equivalent.
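The column-sum method can be sketched as follows (the function name is illustrative; each '1' bit contributes its column's power of two):

```python
def binary_to_decimal(bits):
    """Sum the decimal column values (powers of two)
    for every column that holds a '1'."""
    total = 0
    # Walk the bits from the rightmost column (value 2**0) leftward.
    for position, bit in enumerate(reversed(bits)):
        if bit == "1":
            total += 2 ** position  # this column's decimal value
    return total

print(binary_to_decimal("00101100"))  # prints 44
```

Applied to the earlier example, 00101100 has ones in the columns worth 32, 8, and 4, and 32 + 8 + 4 = 44.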