(#61) Can you automatically/implicitly convert a char to a short?

The answer given for this is: "No. They're the same bit-depth, but since chars are unsigned they might have a higher positive value than a short can accept."

I don't quite understand this. If by "automatically/implicitly convert a char to a short" we mean something like the following snippet, where the char literal 'c' is assigned straight to a short variable:
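    public class CharQuestion {
        public static void main(String[] args) {
            short s = 'c';          // the char literal 'c' (character code 99) assigned straight to a short
            System.out.println(s);  // prints 99
        }
    }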

then this compiles and runs OK, and I did it automatically/implicitly. Now, if we try to assign a character whose value is greater than 32767, I understand that this will cause a compilation error, but does that mean I can't "automatically/implicitly" try to do it (just like with casting)?

Pawel Nowacki wrote:The answer given for this is:
"No. They're the same bit-depth, but since chars are unsigned they might have a higher positive value than a short can accept."

I don't quite understand this. If by "automatically/implicitly convert a char to a short" we mean something like the snippet above, where the char literal 'c' is assigned straight to a short variable, then this compiles and runs OK, and I did it automatically/implicitly.
Now, if we try to assign a character whose value is greater than 32767, I understand that this will cause a compilation error, but does that mean I can't "automatically/implicitly" try to do it (just like with casting)?

Your example isn't taking a char and implicitly converting it to a short. You are just assigning the character code for 'c' to a short: 'c' is a compile-time constant, and its value (99) fits in a short, so the compiler accepts the assignment.
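For example (the variable names here are just for illustration), the same kind of constant assignment stops compiling as soon as the constant no longer fits in a short:

    short ok     = 'c';      // 'c' is 99, within short's range of -32768..32767, so this compiles
    short tooBig = '\uFFFF'; // '\uFFFF' is 65535, outside short's range, so this line is a compile error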

The following code, on the other hand, declares little as a char and gives it the value 'c'. The next line declares t as a short and tries to assign the value of little to it. When the compiler tries to implicitly convert little to a short, it fails with an error along these lines (the exact wording depends on the javac version):
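    char little = 'c';  // little holds the character code 99, but it is a variable, not a constant
    short t = little;   // the compiler will not narrow a char variable to a short implicitly

    // error: incompatible types: possible lossy conversion from char to short

An explicit cast, short t = (short) little;, is what it takes to get that line past the compiler.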

SCJA
When I die, I want people to look at me and say "Yeah, he might have been crazy, but that was one zarkin frood that knew where his towel was."