Pfc_n_cst_numerical.of_Decimal_ULong and .of_Decimal_UInt return only 0 or 1

Description

According to the documentation, the pfc_n_cst_numerical.of_Decimal_ULong and pfc_n_cst_numerical.of_Decimal_UInt functions should return the decimal representation of their binary-number argument. Instead, they return only 0 or 1, depending on the (valid, non-null) argument.

If you change this part of the code in pfc_n_cst_numerical.of_Decimal_ULong