That is not true; depending on the compiler, a #define may take up more memory than a const. The thing is that the #define effectively creates a copy (possibly several copies, if you use it a lot) of the value, whereas the const int is a single object. Think about it: where is the 4 stored? It must be in memory somewhere.

I'm no Arduino compiler expert, but it could be that #define is faster when it comes to execution.

I'll modify my assertion, then:

Quote

const int MyValue = 42;

will create an integer that takes up memory; subsequent references to MyValue will generate an address reference

It's in the instruction itself, not in a separate variable space in memory anywhere.

Here's the thing:

For programs compiled with avr-gcc optimized for space (which is what Arduino does), values of variables designated "const" are put directly into the code. So it costs nothing at run-time to use the const... thingie. (Note that this action is not part of any C or C++ language standard, and it is not absolutely, positively, unequivocally guaranteed that this will be the case with all compilers or even all versions of avr-gcc, but it is my observation using avr-gcc version 4.3.4 with arduino-0022 on my Linux system.)

First, compile the Blink sketch using the literal 13 everywhere the LED pin is referred to. (That's the way it is in my version of arduino-0022/examples/1.Basics/Blink.pde.)

Then use #define ledPin 13 at the top of the sketch and change all occurrences of "13" to "ledPin" and compile again.

Of course you will get the exact same code, right? I mean, that's what #define is all about, right? (Don't take my word for it---try it.)

Now use const int ledPin = 13; instead of the #define. Compile again. I get the same code. Exactly the same. Therefore I conclude that there is no "penalty" for using the more robust construct for this example with this compiler and this version of Arduino. (See Footnote.)

Now, as has been mentioned, in general, using a "const variable" instead of #define can give the compiler some additional error-checking capabilities, since type checking can be performed on usage of the "const variable." This can be a Very Good Thing, since it can help the compiler find some of our usage errors.

The main other difference (to me) is that the variables designated "const..." appear to the compiler to be just like variables (except, of course, the user program won't be allowed to change the value at run time).

That means, for example, if you have a function that takes a pointer as an argument, you can send it the address of the "const variable," but you would not be able to do this with a #defined identifier.

void foo(const int *x)
{
    // Do something with whatever it is that "x" is pointing to
}

Now, for programs like this last example, it's not a matter of whether there is some memory allocated (somewhere) for the "const int" identifier. (There is.) It's a matter of whether you can do the same thing with a #define statement. (You can't).

Maybe this isn't such a BFD for beginners, but I think it is interesting and, maybe, even useful...

Regards,

Dave

Footnote: When I said that I got "exactly the same code," I don't just mean that I got the same sketch size. I examined the results of executing avr-objdump -d on the generated .elf files in the Arduino build directory. A "diff" showed no differences at all among the three cases of the blink sketch that I mentioned above.

Here's the disassembled code for the loop() function for all three versions. The "ldi r24, 0x0D" instructions are the places where the compiler is using the value of ledPin.

I hate to repeat myself, but there is no guarantee that the resulting executable code will always be the same for the different cases. If people have different results, maybe they can share them with us. (Be sure to tell us what version of avr-gcc you are using, and what version of Arduino---I don't know for sure that they have always used the same optimization switches with the same effect that I reported here.)

With Arduino, a const int will always be optimized down to a literal, just like define. Arduino users do not have to worry about other compilers or architectures or whatever.

Generally, the compiler (which handles const int) is considered superior to the preprocessor (which handles #defines). It's type-safe and respects scope. Among C++ developers I know, it's uncontroversial to avoid the preprocessor. But frankly, for most Arduino use, it's an esoteric difference.

It would be kind of a programming puzzle to construct a bug that one could introduce by using #define instead of const in regular usage.

Don't you need to add "static" to ensure that? Depending on the optimizer always makes me nervous. If you care about this level of optimization, you should be paying close attention to how things actually happen in YOUR program.

For a constant like "4", having a strong type is not always a good thing.

Quote

That means, for example, if you have a function that takes a pointer as an argument, you can send it the address of the "const variable," but you would not be able to do this with a #defined identifier.

Noted. I'm searching Google every time for references on Arduino, and most of the time it brings me to this forum. I can't fully understand it yet, but it will be useful as my understanding advances.

With regard to the memory consumption of const vs. #define: it is just not true that const will always use up more memory. It basically depends on how the consts are used. The proper approach is to use them for type safety.

However, it is important to keep in mind that string constants are a different issue altogether. They do consume RAM, which is very tight on the Arduino. Whoever has memory issues needs to take care of string constants, and dealing with them is not achieved with #define either; actually, #define may end up using even more memory. Dealing with string constants requires the use of the PROGMEM macros.

In any case: most programs around would benefit much more from type safety than from a questionable memory-footprint reduction. Unless you are ***very*** tight on memory, #define is the wrong way.

Apropos the calculation of f1: because a1 is considered an integer, the division is an integer division, so the result is calculated first and then converted to a float (result = 1). In the second case (the calculation of f2), since a2 is defined as a float, the variable b is promoted to a float and the division is floating-point, so the value of f2 comes out as 1.5.

I realise this is a contrived example, and there are a number of ways around it, including putting (float) in the #define, or using 2.0 as the value rather than 2. The purpose is to show that behaviour is less predictable when you use #define.