The IDs are signed 32-bit integers, so you are actually trying to set a negative value here.
Interpreted as a signed 32-bit integer, 0xFFFFF000 ends up being -4096. The symbol parser is a bit picky here: it expects signed values and requires the user to write the sign explicitly for negative ones. For the decimal notation that is absolutely correct, as the value can't be represented as a positive signed 32-bit integer, but one might argue whether the hex value should be accepted.
Using -4096 (0xFFFFF000 as a signed 32-bit integer) in c4d_symbols.h actually works, though I'm not convinced it's a good idea in the end. I'd rather say stick to values ranging from 0 to 2147483647 (0x7FFFFFFF).

Ooopsy, just realizing you are looking at my very own source... I didn't touch it for quite a long time... and indeed I did some strange things back then. Nowadays I probably wouldn't define masks in that place, but rather in some header file Cinema is not parsing.

I have a new version of that plugin, which will probably never get uploaded (I never received much feedback, so I figured nobody would be interested). If you want, I can send it to you. Even more confusing code, but who knows what it's good for.

Ahahahahahahah!! It is a small world, after all.
Yes, this is ColorTable.
I was asked by someone who bought some of my plugins whether I would make it work for R15, R16 and R17, since it was a plugin he used a lot in R13.
Is it possible to make it work for those versions?