question about casting

Code:

char c = 0xff;
unsigned int a;
a = (unsigned int) c;

How come in the above cast "a" becomes UINT_MAX, i.e. 0xffffffff? Since I am
casting to unsigned, shouldn't it become 0x000000ff, or would I have to declare
c as unsigned char for that to happen?

>How come in the above cast "a" becomes UINT_MAX, i.e. 0xffffffff?
Sign extension. Your char type is signed by default, so storing 0xff in it gives it the value -1. Converting -1 to unsigned int wraps modulo UINT_MAX + 1, which produces 0xffffffff when int is 32 bits; the effect is the same as sign extending the byte.
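
A minimal demo of this, assuming a platform where plain char is signed and int is 32 bits (both are implementation details, so your output may differ):

Code:

#include <stdio.h>

int main(void)
{
    char c = 0xff;                      /* implementation-defined; c == -1 if char is signed */
    unsigned int a = (unsigned int) c;  /* -1 converted to unsigned int wraps to UINT_MAX */

    printf("c as int: %d\n", (int) c);  /* prints -1 on signed-char platforms */
    printf("a: 0x%x\n", a);             /* prints 0xffffffff with a 32-bit int */
    return 0;
}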

>or would I have to declare c as unsigned char in order for that to happen?
Try it and see.
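
For what it's worth, here is the unsigned char version, a quick sketch assuming 8-bit chars:

Code:

#include <stdio.h>

int main(void)
{
    unsigned char c = 0xff;             /* always 255; there is no sign to extend */
    unsigned int a = (unsigned int) c;  /* value-preserving conversion, so a == 255 */

    printf("a: 0x%08x\n", a);           /* prints 0x000000ff */
    return 0;
}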

That's just weird advice that has nothing to do with the question. Moreover, it is indeed safe to compare signed values with signed values, and unsigned values with unsigned values, without changing the logical answer: keeping the same signedness on both sides of a comparison avoids the implicit conversions that can silently turn a negative value into a huge unsigned one and flip the result.
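
A short illustration of why mixed-signedness comparisons bite (the values are hypothetical, chosen just for the demo):

Code:

#include <stdio.h>

int main(void)
{
    int s = -1;
    unsigned int u = 1;

    /* Mixed comparison: the usual arithmetic conversions turn s into
       a huge unsigned value, so the test is false even though -1 < 1
       mathematically. Most compilers warn about this. */
    if (s < u)
        printf("-1 < 1u is true\n");
    else
        printf("-1 < 1u is false after conversion\n");

    /* Same signedness on both sides gives the expected answer. */
    if (s < (int) u)
        printf("-1 < 1 is true with matching signedness\n");
    return 0;
}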