My question is why use the integer variable declaration instead of just defining all numerical variables (excluding double etc.) as long?

Unless you're performing an operation like in a for loop where you can guarantee that the value won't exceed the 32,767 limit, is there an impact on performance or something else that would dictate not using long?

6 Answers

So, the benefit is in reduced memory space. An Integer takes up half the memory that a long does. Now, we are talking about 2 bytes, so it's not going to make a real difference unless you're storing a TON of integers.

BUT on a 32 bit system, a 16 bit integer gets silently converted to a long without the benefit of the larger range of numbers to work with. Overflows still happen and it takes just as much memory. Performance may even be hurt because the datatype has to be converted (at a very low level).
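A minimal sketch of that point: even though the value is stored in 32 bits internally, the 16-bit range is still enforced, so an Integer overflows where a Long does not.

```vba
Sub IntegerOverflowDemo()
    Dim i As Integer
    i = 32767          ' the maximum value an Integer can hold
    ' i = i + 1        ' uncommenting this raises run-time error 6 (Overflow),
                       ' despite the 32-bit internal storage

    Dim l As Long
    l = 32767
    l = l + 1          ' fine: a Long can hold values up to 2,147,483,647
    Debug.Print l
End Sub
```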

Not the reference I was looking for but....

My understanding is that the underlying VB engine converts Integers to Longs even if the variable is declared as an Integer, so a slight speed decrease can be noted. I have believed this for some time, and perhaps that's also why the statement above was made; I didn't ask for the reasoning.

Traditionally, VBA programmers have used integers to hold small
numbers, because they required less memory. In recent versions,
however, VBA converts all integer values to type Long, even if they're
declared as type Integer. So there's no longer a performance advantage
to using Integer variables; in fact, Long variables may be slightly
faster because VBA does not have to convert them.

So, in summary, there's almost no good reason to use the Integer type these days, unless you need to interop with an old API call that expects a 16-bit int.

One thing worth pointing out is that some old API functions may expect parameters that are 16-bit (2-byte) Integers, and if you are on a 32-bit system and try to pass an Integer (that is already a 4-byte Long) by reference, it will not work due to the difference in byte length.
– user2140173 Nov 4 '14 at 9:04


@It'sbeenapleasure I don't understand. If you pass a Long by reference to a function that expects an Integer by reference, it will use the first two bytes, and since numbers are stored little-endian, it will work (provided that only the two lower bytes are meaningful, which is likely if this Long is simply an Integer stored in 32 bits).
– user1220978 Apr 26 '15 at 7:16


The other time that you must use Integer is when declaring a Type whose layout and size are important, either because you're passing the Type to an API, serializing/deserializing a file, or copying bytes with LSet/RSet.
– ThunderFrame Apr 9 '17 at 20:01
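A sketch of that Type-layout case, assuming a hypothetical binary file format whose on-disk record uses 2-byte fields (the Type name, field names, and file name are illustrative):

```vba
' Hypothetical on-disk record layout: the format dictates 2-byte fields,
' so changing these Integers to Longs would shift every later field.
Private Type FileHeader
    Version As Integer      ' bytes 0-1
    RecordCount As Integer  ' bytes 2-3
    Flags As Long           ' bytes 4-7
End Type

Sub ReadHeader()
    Dim hdr As FileHeader
    Open "data.bin" For Binary Access Read As #1  ' assumed file name
    Get #1, 1, hdr   ' Get reads exactly LenB(hdr) bytes into the Type
    Close #1
    Debug.Print hdr.Version, hdr.RecordCount
End Sub
```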

That MSDN article is wrong and/or misleading. It might refer to e.g. Integers being internally processed as Longs inside a processor, but for all observable purposes an Integer still takes two bytes. See answers and comments under stackoverflow.com/q/26717148/11683 for extended discussion.
– GSerg Sep 3 '17 at 13:06


You don't seem to state anywhere in your answer that msdn.microsoft.com/en-us/library/office/… might be wrong or misleading; rather, you are presenting it as the MSDN documentation you were really looking for. However, my comment was mostly directed at future readers rather than at you, because you participated in that other question, even though your answer there, in my opinion, should not have been accepted (this one, I believe, should have been).
– GSerg Sep 3 '17 at 13:18

an Integer is 16 bits and can represent a value between -32,768 and 32,767

a Long is 32 bits and can represent -2,147,483,648 to 2,147,483,647

and there is a LongLong (VBA7 on 64-bit Office only) which is 64 bits and can handle about 9.2 quintillion (up to 9,223,372,036,854,775,807)
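The three ranges above, as a quick declaration sketch (the LongLong branch assumes 64-bit Office, where that type exists):

```vba
Sub TypeRanges()
    Dim i As Integer    ' 16 bits: -32,768 to 32,767
    Dim l As Long       ' 32 bits: -2,147,483,648 to 2,147,483,647
    #If Win64 Then
    Dim ll As LongLong  ' 64 bits (VBA7 on 64-bit Office only):
                        ' up to 9,223,372,036,854,775,807
    #End If
End Sub
```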

One of the most important things to remember here is that datatypes differ by both language and platform. In your world of VBA a Long is 32 bits, but in C# a long is 64 bits. This can introduce significant confusion.

Although VBA does not support them, when you move to another language in .NET, Java, or elsewhere, I much prefer to use system datatypes such as Int16, Int32 and Int64, which allow me to be much more transparent about the values that can be held in these datatypes.

Even though this post is four years old, I was curious about this and ran some tests. The most important thing to note is that a coder should ALWAYS declare a variable as SOMETHING. Undeclared variables clearly performed the worst (undeclared are technically Variant)

Long did perform the fastest, so Microsoft's recommendation to always use Long instead of Integer makes sense. I'm guessing the same is true of Byte, but most coders don't use it.
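A rough sketch of the kind of benchmark described (the loop count is arbitrary and results will vary by machine and Office build):

```vba
Sub BenchmarkLongVsInteger()
    Dim t As Single
    Dim j As Long
    Dim a As Integer, b As Long

    ' Time repeated arithmetic on an Integer variable.
    t = Timer
    For j = 1 To 10000000
        a = 100: a = a + 1: a = a - 1
    Next j
    Debug.Print "Integer: "; Timer - t; " seconds"

    ' Same work on a Long variable for comparison.
    t = Timer
    For j = 1 To 10000000
        b = 100: b = b + 1: b = b - 1
    Next j
    Debug.Print "Long:    "; Timer - t; " seconds"
End Sub
```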

The undeclared variable is a variant. Dim a as Variant, b as Variant, c As Variant should give the same result as not "dimming" at all.
– Vityata Sep 4 '18 at 21:41


Neat! Happy to see someone did some benchmarking!
– RubberDuck Sep 4 '18 at 23:33

@Vityata, I was aware that "undeclared" SHOULD mean Variant, but keep in mind the backstory to this whole discussion was Microsoft automatically converting Integer to Long despite it being declared! So I wasn't sure what to expect with whole numbers that weren't declared. Seeing the result, it seems most likely they truly are Variants, but that's why I used the name "Undeclared" vs. "Variant". Plus that's probably a lot more useful to lower-level coders. However, I'll go ahead and add that in now. Thanks for the feedback.
– PGCodeRider Sep 7 '18 at 7:00

@PGCodeRider - it would probably be interesting to create a DLL library with C++, add it to VBA, and benchmark it there as well; the results could be revealing. I have done something similar here - vitoshacademy.com/…
– Vityata Sep 7 '18 at 8:38

In some situations using a Long is a necessity. If you're looping through rows in a large Excel file, the variable that holds the row number should be a Long.
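For example, a worksheet in modern Excel has 1,048,576 rows, far beyond an Integer's 32,767 limit, so a row counter declared as Integer would overflow partway down the sheet:

```vba
Sub CountNonEmptyRows()
    ' The row counter must be a Long: worksheet rows go to 1,048,576.
    Dim r As Long, lastRow As Long, filled As Long
    lastRow = ActiveSheet.Cells(ActiveSheet.Rows.Count, 1).End(xlUp).Row
    For r = 1 To lastRow
        If Not IsEmpty(ActiveSheet.Cells(r, 1).Value) Then
            filled = filled + 1
        End If
    Next r
    Debug.Print filled; " non-empty cells in column A"
End Sub
```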

However, sometimes you will know that an integer can handle your problem and using a long would be a waste of space (memory). Individual variables really don't make much of a difference, but when you start dealing with arrays it can make a big difference.

In VBA7, Integers are 2 bytes and Longs are 4 bytes.

If you have an array of 1 million numbers between 1 and 10, using an Integer array would take up about 2MB of RAM, compared to roughly 4MB of RAM for a long array.
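A quick sketch of that comparison; `LenB` reports the per-element width, from which the array totals follow:

```vba
Sub ArrayMemoryDemo()
    ' One million elements: roughly 2 MB as Integer vs roughly 4 MB as Long
    ' (a Byte array of the same length would be roughly 1 MB).
    Dim smallNums(1 To 1000000) As Integer
    Dim bigNums(1 To 1000000) As Long
    Debug.Print LenB(smallNums(1))  ' 2 bytes per Integer element
    Debug.Print LenB(bigNums(1))    ' 4 bytes per Long element
End Sub
```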

For an array of numbers between 1 and 10, you could use a BYTE array instead, but I understand the point you are making.
– ChrisB Aug 31 '17 at 16:51

This was probably true some 15 years ago, but now the answer is really outdated.
– Vityata Sep 5 '18 at 10:24

How is this outdated? Problems in pandas where you have millions of rows (big data/machine learning), or in networking when transferring large amounts of data (e.g. physics experiments).
– Alter Sep 5 '18 at 14:05

@Alter - I guess you missed the point of the original question. Take a look at the accepted answer.
– Vityata Sep 7 '18 at 8:42

Looks like I understood it. Also, the accepted answer points out that it's talking about 32-bit systems when it says there's no difference.
– Alter Sep 7 '18 at 12:13

Perhaps a CS grad can explain it better, but yes, an Integer requires less memory to store (which may not be relevant in a very simple VBA script).

It also guarantees/validates that the variable will always be a whole number (non-decimal). I'm fairly sure that if you store 6.7 in an Integer, it will be rounded to 7 in VBA.

It may not be best practice to do this, but there may be cases where it's needed. I think the main purpose is data validation/integrity -- in particular, for loops where something needs to occur X times, where a decimal wouldn't make sense.
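A small sketch of that rounding behavior; note that VBA rounds on assignment rather than truncating, and uses banker's rounding for exact .5 cases:

```vba
Sub RoundingDemo()
    Dim n As Integer
    n = 6.7
    Debug.Print n          ' 7: assignment rounds, it does not truncate
    ' Banker's rounding: .5 goes to the nearest even number.
    Debug.Print CInt(6.5)  ' 6
    Debug.Print CInt(7.5)  ' 8
End Sub
```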

Primary difference is memory though -- in VBA, a Long is 32 bits and an Integer is 16 bits.