It is even used in the tutorials on the Arduino site, for example in Blink, the first tutorial that most people run.

I read somewhere that const int is preferred over #define. Why isn't this encouraged right from the beginning, rather than letting people develop bad habits from the outset? I noticed it a while back, but recently it has started to irritate me, hence the question.

Memory/processing/computing-wise, is a const int, an enum, or for that matter a #define, better than a plain int? That is, does it occupy less memory, get stored in different memory (flash, EEPROM, SRAM), execute faster, or compile more quickly?

With pins in particular, the simplistic form of the basic Arduino API functions like digitalWrite doesn't encourage proper embedded design, i.e. using masks and a single memory address for the entire port.
– crasic Aug 21 '15 at 1:52
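A minimal illustration of the port-masking approach crasic mentions, assuming an Uno-class AVR board where digital pin 13 is bit 5 of PORTB:

    void setup () {
      DDRB |= _BV(DDB5);       // configure pin 13 (PORTB bit 5) as an output
    }

    void loop () {
      PORTB |= _BV(PORTB5);    // set the bit through a mask: pin 13 HIGH
      delay (500);
      PORTB &= ~_BV(PORTB5);   // clear the bit: pin 13 LOW
      delay (500);
    }

One register address here covers eight pins at once, which is the "single memory address for the entire port" idea.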

Another commenter has suggested that using byte will not necessarily improve performance, because numbers smaller than int will be promoted to int (see Integer Promotion Rules if you want more on this).
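For instance, a small example of the promotion the commenter is referring to:

    byte a = 200;
    byte b = 100;
    int sum = a + b;   // a and b are promoted to int before the addition,
                       // so sum is 300, not a wrapped-around byte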

However, in the context of a const identifier, the compiler will generate efficient code in any case. For example, disassembling "blink" shows the difference. The fragment below is a representative sketch of what avr-gcc typically emits, not the exact listing:
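    int led = 13;                 // original form: a plain int lives in SRAM

    // digitalWrite (led, HIGH) compiles to something like:
    //    lds  r24, <led>        ; load the pin number from SRAM
    //    ldi  r22, 0x01         ; HIGH
    //    call digitalWrite

    const byte constLed = 13;     // const form: the value is folded in

    // digitalWrite (constLed, HIGH) compiles to something like:
    //    ldi  r24, 0x0D         ; pin number as an immediate constant
    //    ldi  r22, 0x01         ; HIGH
    //    call digitalWrite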

The compiler knows when it can fit a number into one register and when it can't. However, it is good practice to use coding that indicates your intent: making it const makes it clear that the number won't change, and making it byte (or uint8_t) makes it clear that you are expecting a small number.

Confusing error messages

Another major reason to avoid #define is the error messages you get if you make a mistake. Consider this "blink" sketch, which has an error:
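A reconstruction of the kind of sketch meant here; the stray "=" in the #define is the deliberate mistake:

    #define LED = 13   // the mistake: a stray "=" in the #define

    void setup() {
      pinMode(LED, OUTPUT);   // line 4: this is where the compiler complains
    }

    void loop() {
      digitalWrite(LED, HIGH);
      delay(500);
      digitalWrite(LED, LOW);
      delay(500);
    }

Compiling that produces an error message along these lines (exact wording varies with the compiler version):

    sketch.ino: In function 'void setup()':
    sketch.ino:4: error: expected primary-expression before '=' token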

You look at the first highlighted line (line 4) and don't even see a "=" symbol; the line looks fine. Now, it's fairly obvious what the problem is here (= 13 is being substituted for LED), but when the offending line is 400 lines further down in the code, it isn't obvious that the problem is with the way LED is defined.

"How many pins do you have?" is a very good point, Nick, as most boards have pin counts only in the tens, not the hundreds (let alone greater than 255), so an int is overkill... that is, until Arduino finally comes out with the Tera board... :-)
– Greenonline Aug 14 '15 at 10:14

As Ignacio rightly states, it's basically because they don't know better. And they don't know better because the people who taught them (or the resources they used when learning) didn't know better.

Much of the Arduino code and many of the tutorials out there are written by people who have never had any training in programming and are very much "self taught" from resources produced by people who are themselves self taught, with no proper training in programming.

Many of the snippets of tutorial code I see around the place (and especially those that are only available within YouTube videos --- urgh) would earn a fail mark if I were marking them in an exam.

Yes, a const is preferred over a non-const, and even over a #define, because:

A const (like a #define, unlike a non-const) does not allocate any RAM

A const (like a non-const, but unlike a #define) gives the value a type

The second point there is of particular interest. Unless specifically told otherwise with an embedded type-cast ((long)3), a type suffix (3L), or the presence of a decimal point (3.0), a #define will always be an integer, and all mathematics performed on that value will be as if it were an integer. Most of the time that's not a problem, but you can run into interesting scenarios when you try to #define a value that is larger than an integer can store, such as #define COUNT 70000, and then perform mathematical operations on it with other int values. By using a const you get to tell the compiler "this value is to be treated as this variable type", so you would instead use const long count = 70000; and all would work as expected.
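For illustration, a minimal sketch of my own, assuming an AVR target where int is 16 bits (note that the exact typing of the macro is debated in the comments below):

    #define COUNT 70000            // the macro's type follows the literal rules
                                   // (see the comments below for the details)
    const long count = 70000;      // here the type is stated explicitly

    // A case where int-typed arithmetic genuinely bites: operations between
    // int literals overflow before any widening takes place.
    #define MS_PER_DAY (1000 * 60 * 60 * 24)      // int maths: overflows on AVR
    const long msPerDay = 1000L * 60 * 60 * 24;   // the L suffix forces long maths

    void setup () {
      Serial.begin (9600);
      Serial.println (msPerDay);            // 86400000, as intended
      Serial.println ((long) MS_PER_DAY);   // not 86400000: the int maths
                                            // already overflowed
    }

    void loop () { }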

It also has the knock-on effect that the type is checked when you pass the value around the place. Try passing a const long to a function that expects an int, and the compiler can complain about narrowing the variable's range (or even fail to compile altogether, depending on the scenario). Do that with a #define and it may just silently carry on, giving you the wrong results and leaving you scratching your head for hours.
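A small sketch of the kind of mismatch described (startTimer is a hypothetical function of my own; the comments below argue the macro case is diagnosed just as well):

    // Hypothetical function that expects an int.
    void startTimer (int ticks) { /* ... */ }

    const long count = 70000;

    void setup () {
      startTimer (count);   // with -Wconversion enabled, GCC warns here:
                            // conversion from 'long int' to 'int' may change value
    }

    void loop () { }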

If you define a macro that expands to a value that won't fit in an int the compiler treats the value as having the smallest type in which it will fit (modulo rules about signed vs. unsigned). If you're on a system where int is 16 bits, #define count 70000 will result in count looking like a long, just as if it had been defined as const long count = 70000;. Further, if you pass either of those versions of count to a function expecting int, any sane compiler will treat them the same.
– Pete Becker Aug 14 '15 at 17:44

1

I agree with @PeteBecker - a construct like #define COUNT 70000 does not truncate into an int, but the compiler treats it as a type large enough to hold that number. It is true that it might not be obvious when you use COUNT that it isn't an int, but you could say the same thing about a const long anyway.
– Nick Gammon♦ Aug 15 '15 at 4:29

2

"a #define will always be an integer" That is not true. You are taking the rules of integer literals and applying them to preprocessor macros. It's like comparing apples and pop music. The expression COUNT in your example is replaced before compilation with the expression 70000, which has a type defined by the rules of literals, just like 2 or 13L or 4.0 are defined by the rules of literals. The fact that you use #define to alias those expressions is irrelevant. You can use #define to alias arbitrary chunks of C code, if you like.
– Lightness Races in Orbit Aug 15 '15 at 17:12
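A quick illustration of the literal rules this comment invokes, assuming avr-gcc where int is 16 bits:

    #define A 2       // 2 is an int literal
    #define B 13L     // 13L is a long literal
    #define C 4.0     // 4.0 is a double literal
    #define D 70000   // 70000 doesn't fit in a 16-bit int,
                      // so the literal's type is long

    // A macro need not be a value at all; it can alias an arbitrary chunk of code:
    #define PULSE(pin) do { digitalWrite (pin, HIGH); digitalWrite (pin, LOW); } while (0)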

Arduino already has a bad rep in the larger EE community because of the mediocre-to-terrible hardware designs put out by the community. Shouldn't we try to give a sh*t about something?
– Ignacio Vazquez-Abrams Aug 21 '15 at 0:12

2

"Most projects aren't going to involve risk of life or finances..." No surprise there. Who would want to involve Arduino where there's any chance of risk after looking at the community at large.
– Ignacio Vazquez-Abrams Aug 21 '15 at 1:06

2

It's 'wrong' not because it doesn't work in one particular situation but because, compared to doing it 'right,' there are more situations in which it doesn't work. This makes the code fragile; changes to the code can cause mysterious failures that eat up debugging time. The compiler's type checking and error messages are there to help you catch those sorts of errors earlier, rather than later.
– Curt J. Sampson Apr 2 '17 at 6:38

As a 2-week newbie to Arduino, I'd pick up on the general idea of Arduino being occupied by non-programmers. Most sketches I have examined, including those on the Arduino site, show a total lack of order, with sketches that do not work and barely a coherent comment in sight. Flow charts are non-existent, and the "Libraries" are an unmoderated jumble.