const is for variables whose values do not change. They are similar to #defined symbols, but with type checking.

Code:

#define a 1
const int b=2;

b is known to the compiler as an integer, with all the usual type checking. But a is just lexically replaced with 1 wherever it occurs by the preprocessor, and this can lead to surprising bugs or confusing compile errors if it's used in the wrong context.

const is also used on function parameters to indicate that the parameter won't be changed by the function. The compiler will refuse to compile code that modifies a parameter you have declared const.

*DO NOT* "work around" const by casting its address to a non-const pointer. If you need a modifiable value, copy it into another variable instead. The reason is that modifying a const object is undefined behaviour, and an optimising compiler will take const into account. So:

Code:

const int i=10;
printf("%d",i);
printf("%d",i);

the compiler may optimise i away altogether and rewrite this as

Code:

printf("%d",10);
printf("%d",10);

so if you've been too clever and done something like:

Code:

const int i=10;
printf("%d",i);
int *j=(int *)&i; /* cast needed to "defeat" const - don't do this */
*j *= 2;
printf("%d",i);

then the output might still be 10 10. Worse, the behaviour can differ between builds. In a debug build with no optimisation (optimisations make debugging tricky, so they're usually switched off) the output might be 10 20. Then, when you're satisfied the code is correct and do a release build, the optimiser kicks in and the same code outputs 10 10. Because writing to a const object is undefined behaviour, the compiler is entitled to do any of this. Bugs that only reproduce in release builds and not in debug builds are incredibly difficult to find, and any kind of "hacking" like this opens the door to a whole world of pain that is so easily avoidable.