Someone wrote a strcpy function (I think that was it... this was over 10 years ago now) inside a method (because they didn't want the overhead of calling strcpy... sigh).

They clued in that it wouldn't work for Japanese characters, so they added an "if" at the start to handle ASCII or Unicode. At that point the code was about a screen long... likely hurting instruction-cache locality and erasing their supposed savings from inlining the code.

The code was identical save for the types (so it should have used a macro).

Of course the strcpy that they wrote was much much much slower than the hand tuned assembler one that was in the standard library...

Of course if they had just done it all as a macro it could have been replaced with a call to strcpy...

Its beauty lies in its simplicity. It's been said it has all the speed of assembly language combined with the readability of ... assembly language :-) I prefer it over the bloated C++ (although I do prefer Java in my day job due to its huge library).

This is actually pretty common; as far as I understand, it comes from converting legacy code from Pascal, or something of the sort. I've seen it in several projects; I'm not sure why people don't just use search-and-replace...

@Bernard - Of course not! As a Perlite, I get frustrated when I remember I don't have until() or unless() in C. Or postfix notation. If anyone can come up with a macro to allow me to write "i++ if(i);" in C, I will give you five dollars. And a free hug.

This allows a C structure that contains a member variable called class to be handled by a C++ compiler. There are two headers with this construct in it; one of them also contains '#undef class' at the end and the other doesn't.

When I first came across macros in C they had me stumped for days. Below is what I was faced with. I imagine it makes perfect sense to C experts and is super efficient however for me to try and work out what exactly was going on meant cutting and pasting all the different macros together until the whole function could be viewed. Surely that's not good practice?! What's wrong with using a plain old function?!

Forgive me as a non-C++ programmer: Is the main problem here that a threadsafe function is converted into a non-threadsafe one? Or that InterlockedIncrement expects a pointer, so now you'll increase the pointer instead of what it's pointing at? Or both?

The problem is that InterlockedIncrement is *normally* an atomic function defined in the Windows API. So when people call InterlockedIncrement, they expect to call into a function that is guaranteed to be executed atomically. Instead, someone defined a macro with the same name, which evaluates to a plain, non-atomic increment.

They should still be literals, mon; name 'em by purpose/intent, not by the alternate symbol. In some COBOL code I heard about, they made a variable FIVE = 5, then later had code saying set FIVE = 10... people were really surprised when they did var + FIVE and got var + 10.

This is an example where the macro really was better than a non-macro solution:

In a non-macro solution, classes, functions, and variables have to be built to keep track of what ID each message has. The developer may or may not make the message-ID tracking complicated, whereas this is easier to read and debug.

In addition, it's easier to add new messages just by adding the message into the source.

The disadvantage of this situation is that the file has to be included in all code that uses messages. Compile time would increase whenever a message is edited.

Now I come along and sort the #defines... and the protocol changes. Or I get the Doxygen religion and document all the message codes, and the protocol changes. At least an enum is stable under the latter change.

You can suppress this by #define'ing NOGDI before including windows.h, provided of course that you don't need to use any of the various GDI functions. There are a bunch of other macros, such as WIN32_LEAN_AND_MEAN, NOMINMAX, etc., that suppress other things from being defined or included.
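For reference, the usual arrangement looks like this (these are real, documented Windows SDK switches; shown as a fragment since it needs the Windows headers):

```c
/* Must come before the include to have any effect. */
#define WIN32_LEAN_AND_MEAN  /* skip rarely-used headers (cryptography, RPC, winsock...) */
#define NOMINMAX             /* don't define the min/max macros */
#define NOGDI                /* skip GDI declarations entirely */
#include <windows.h>
```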

Problem is that many Windows API functions are actually macros. One that bit me was GetTickCount(). Since I do most of my programming outside of Windows, I found all the defines in the Windows headers and then made my own include file which defined them all, to verify compatibility beforehand.

@Mark Ingram: the problem is that if you have a header file declare a function named GetObject(), and that header doesn't include windows.h, but another file includes windows.h before your header, you'll end up with different names and a compiler or linker error.

Haha, I just ran into another variant of this on Windows, calling a method called "FillMemory()" for a memory test class. Actual error: "Not enough actual parameters for macro RtlFillMemory". See: http://msdn.microsoft.com/en-us/library/aa366561(VS.85).aspx

I think we have a winner. It's real-world, it's a ridiculously bad idea, and it's affected a huge number of innocent programmers. Whoever is responsible for this gem at Microsoft should be considered a war criminal... The best part is that Microsoft didn't think twice about using such amazingly common names, like GetObject, SendMessage or CreateWindow.

Why is that so evil? The macros simplify the code (to some extent). Good programming languages allow you to create mini-languages that mirror the problem domain. C offers you the preprocessor as the main option, unless you want to spell out the portable assembler...

Genuine code - I thought I'd removed it, but apparently not. I must have done so in some temporary branch and never gotten permission to check it back into the main code. One more item for the 'to do' list.

Yes, that's right: no closing braces in any of the functions. Syntax highlighting was a mess, so he used vi to edit (not vim - vim has syntax coloring!)

He was a Russian programmer who had mostly worked in assembly language. He was fanatical about saving as many bytes as possible because he had previously worked on systems with very limited memory. "It was for satellite. Only very few byte, so we use each byte over for many things." (bit fiddling, reusing machine instruction bytes for their numeric values) When I tried to find out what kinds of satellites, I was only able to get "Orbiting satellite. For making to orbit."

He had two other quirks: A convex mirror mounted above his monitor "For knowing who is watching", and an occasional sudden exit from his chair to do a quick ten pushups. He explained this last one as "Compiler found error in code. This is punishment".

In case someone thinks that this was one lone crazy programmer: I have also seen a very similar thing done for return statements and braces. Even worse, however, the macros were not named such that it was obvious there was actually a return occurring!

I think programmers (myself included) would be a lot more fit if we all did 10 pushups every time a compiler found an error in our code. This might also reduce the occurrence of testing by compilation.

I can assure you, that person wasn't a typical Russian programmer. We *REAL* Russian programmers tend to punish ourselves with kettlebell exercises and (occasionally) with horseshoe bending (though it's considered more of an Architect thing) :))

I like the other one better. This one shows something close to Java being written with a few macros. The other one shows exact Java being written with a plethora of sneaky macros and structs with function members. The first one was a cheap joke, whereas the second one was an elaborate and well-thought-out joke.

Two points: one, this paste messed up the original indentation. And two, the code looks fine for what it is: 1970s Unix C by a fervent Algol-68 fan. If _why the lucky stiff can express himself in a quirky style, why can't Steve Bourne? Of course, someone condemned to maintain it who doesn't know Algol 68 may not appreciate this chance to broaden their own tastes.

Good macros: (although personally I dislike the double parentheses required to use this syntax; I prefer either vararg macros (C99 only) or something like PRINTF_0, PRINTF_1, etc, depending on the number of arguments)

@Marten: unless that is something like "(void)(LookupDebugId(id), ConvertToString(id))", in which case "ConvertToString" will still be called, just its return value is ignored, and "LookupDebugId" could be non-existent. If you are in Visual Studio land, you have __noop, which does what you want.

@Mark - It declares `public` and `static` as nothing, `void` as `int`, and `main(x)` as `main()`, so `public static void main(String[] args)` turns into `int main()`. Then `System` turns into `S s;s`, so `System.out.println("Hello World!");` turns into `S s; s.out.println("Hello World!");` which calls the `println` function in the `F` struct in the `S` struct.

I was pretty new to the practice of using macros and used this macro, but I expected the function that I passed to it to fail. And I was doing it in a background thread, so for days I was stumped as to why my entire app was "crashing".

As an aside, if only std::tr1::function was around when this macro was written, I would have a week of my life back!

Bear in mind that using CALL_AND_CHECK() twice in one scope will result in multiply defining `result`, and even if it didn't `if (x) CALL_AND_CHECK(foo, y) else CALL_AND_CHECK(foo, z)` would have surprising results.

This pattern can be used to do all sorts of (terrible) stuff... generate switch statements or huge if else blocks, or interface with "real" code. You could even use it to ::cough:: generate a context menu in a non-oo context menu system ::cough::. Not that I'd ever do anything so lame.

Since Qt is a really big framework there's little reason to use other libraries, especially boost::signals. Those macros make perfect sense in the code that uses Qt. If you don't want them, just disable them and use Q_WHATEVER. Not a problem at all.

@iconiK, true, the signals/slots are just tags for the MOC to pick up; they could at least have made the macro qt::signal so there was less chance of collision. PS: you can build Qt with a flag to rename them Q_SIGNAL and Q_SLOT.

@Martin, actually MOC looks for Q_SIGNAL, Q_SIGNALS, Q_SLOT and Q_EMIT. The QtGlobal header file (not sure if this one) includes #defines for signals, slots and emit for your convenience; those can easily be disabled by a #define if they get in the way, thus your argument is not well thought out.

It's even worse when four of your other external dependencies also define min/max of their own, of varying degrees of suckiness, ranging from badly-parenthesised macros to well-written templates, and one of them just has to make it impossible to be undefined or otherwise skip these... In my book the language is 50% to blame though.

Seemed like a good idea sometimes, but if you need to, you should probably just string-replace anyway. Protected is fairly evil; allowing internal access to descendants isn't much better than just making the items public...

Notice that we haven't even gotten to any code yet; this is just to get to the first actual bit of code. This actually happens (in almost, but not exactly, the same way) for several functions, each of which, in the end, only has 4 possible variations (which are also mostly copy/paste with slight variations and #ifdefs of their own).

A coworker and I found these two gems in some of our code for object streaming. These macros were instantiated in EVERY SINGLE class file that did streaming. Not only is this hideous code spewed all over our code base, when we approached the original author about it, he wrote a 7 page article on our internal wiki defending this as the only possible way to accomplish what he was attempting to do here.

Needless to say, it has since been refactored out and is no longer used in our code base.

I maintain code that has gotos in macros. So a function will have a label at the end but no visible goto in the function code. To make matters worse, the macro sits at the end of other statements, usually off the screen unless you scroll horizontally.

I saw one that a colleague of mine had the pleasure of working with. They tried to make a custom-implementation of string interning, so they re-implemented strings using a massive number of macros that (of course) didn't work properly. Trying to figure out what it did made my eyes explode because of all the ##'s scattered about.

At Lucent, I once took a look at the source code of Steve Bourne's original Unix shell, and found he'd used the C pre-processor to make C look like Pascal or Algol. The part dealing with if statements looked like this:

A friend of mine told me he'd done some maintenance on it in the mid-1990s, and it was still the same. (There's a lesson here for us in the inherent conservatism of a code base.)

Of course Steve did this as an experiment in the early days, and I'm sure would have had second thoughts if he'd written it later.

Update: According to Wikipedia's Bourne Shell article, the macros gave it an Algol 68 flavor. And, the full set of macros is here! They apparently influenced the founders of the International Obfuscated C Code Contest.

That's more like Algol than Pascal - it is Algol that uses backwards keywords (like 'fi') to mark the end of constructs. The shell uses that, in general. Fun question: why is the end of a loop in Bourne shell marked by '`done`' and not '`od`'?

You can grab the code from CVS and take a look. I had put some more details about it into my blog post a while ago when I stumbled upon it: http://bnpcs.blogspot.com/2009/02/preprocessor-directives-for-ruby-not.html If not for the problem with debugging the resulting code (the problem of having hugely long lines if they are generated by such a "language"), it could have been even usable as a practical code generator for C.

I agree that for the most part, macros are horrible to use, but I have found a few instances where they have been useful.

This one is actually brilliant IMHO, as you can only get something similar with sprintf, which then requires resource allocation and whatnot; plus, all the work is done entirely by the preprocessor.

// Macro: Stringize
//
// Converts the parameter into a string
//
#define Stringize( L ) #L
// Macro: MakeString
//
// Converts the contents of a macro into a string
//
#define MakeString( L ) Stringize(L)
// Macro: $LINE
//
// Gets the line number as a string
//
#define $LINE MakeString( __LINE__ )
// Macro: $FILE_POS
//
// Gets the current file name and line number in a format that Visual Studio
// can interpret, so double-clicking the line in the output window jumps to it
//
// NOTE: For VS to properly interpret this, it must be at the start of the line (can only have whitespace before)
//
#define $FILE_POS __FILE__ "(" $LINE ") : "

The other one, which I loathe to use but find extremely useful, is doing something like this, which basically allows me to quickly generate templates that have a variable number of template parameters.

Although this makes "Callback.inl" kind of horrible to read, it does completely eliminate rewriting the same code with a different number of arguments. I should also mention that "Callback.inl" #undefs all of the macros at the end of the file; hence, the macros themselves won't interfere with any other code, it just makes "Callback.inl" a little harder to write (reading and debugging isn't too hard though).

Yea, that's pretty bad. m4's syntax isn't great. I once dug into its docs thinking that I could use it for some other stuff, and found that the authors didn't include a looping construct because you could build one of your own using recursion. I didn't use m4 for anything after that.

And that's if they get it right; I've seen versions with all possible permutations of parenthesis present or not. I've seen it defined twice in the same header file.

Mainly my argument applies to Windows (though I assume other OS SDKs have something similar), where just about everyone seems to feel the need to define this macro in their project's header, and I don't understand why.

WinNT.h (which is included by Windows.h) defines a very nice version that does some template voodoo to cause compile time errors if you pass a pointer type instead of an array.

Of course it falls back to exactly what I wrote above if you are building a C program, but I would still not redefine something the SDK has by default for no reason.

If you're writing code on Windows, you're already dragging in windows.h. Unless you're writing small console apps and the like. I mean, you can try to stick to just the CRT functions, but eventually you'll want to do something you need Win32 for.

I'm not fond of the Boost Preprocessor stuff. I attempted once to figure out how to use it (we had Boost in the project anyway...), but as near as I could tell, using it would make my error messages SO unreadable that it wasn't worth it.

I liked the idea of the equivalent of looping macros, but it was just too much.

At the time it seemed like a good idea to "pass" a macro as an argument into another macro. (I just couldn't stand the thought of defining a list of values in multiple places.) The code here is contrived (and not very motivating), but gives you the idea:

A "technical manager" who had formerly been a coder introduced the following wonderful macros into our C++ project because he thought that checking for NULL values in DOM parsing routines was just too much work:

TRYSEGV
CATCHSEGV

Under the covers, these used setjmp, longjmp, and a signal handler for SIGSEGV to emulate the ability to "catch" a segfault.

Of course, nothing in the code reset the jump pointer once the code had exited the scope of the original TRYSEGV macro invocation, so any segfault in the code would return to the (now invalid) jump_env pointer.

The code would immediately die there, but not before destroying the program stack and rendering debugging more or less pointless.

When I was new to game programming I was writing a frustum for a camera class in a game that I wrote, and I had really strange errors in my code.

It turns out that Microsoft had some #defines for near and far in windows.h, which caused my _near and _far variables to error on the lines that contained them. It was very difficult to track the problem down because I was a newbie at the time and they only existed on four lines in the whole project, so I didn't realise right away.

I once had to port a C application from Unix to Windows, the specific nature of which shall remain unnamed to protect the guilty. The guy who wrote it was a professor unaccustomed to writing production code, and had clearly come to C from some other language. It also happens that English wasn't his first language, though in the country he came from the majority of people speak it quite well.

His application made heavy use of the preprocessor to twist the C language into a format he could better understand. But the macros he used the most were defined in a header file named 'Thing.h' (seriously), which included the following:

The entire project (~60,000 LOC) was written in a similar style -- macro hell, weird names, Olde-English jargon, etc. Fortunately we were able to throw the code out, since I found an OSS library which performed the same algorithm dozens of times faster.

Many years ago I had the fun task of porting the original Transport Tycoon from the PC to the Mac. The PC version was written entirely in assembler so we had to go through the whole source code and port it to 'PC' C code first and then port that to the Mac. Most of the code was OK, even object-oriented in places. However, the world rendering system was unbelievable. For anyone who's not played the game, the world can be viewed at one of three zoom levels. The code for this was something along the lines of:

macro DrawMacro <list of arguments>
    a couple of thousand lines of assembler with loads of conditionals
    based on the macro arguments

DrawZoomLevel1:
    DrawMacro <list of magic numbers>

DrawZoomLevel2:
    DrawMacro <list of more magic numbers>

DrawZoomLevel3:
    DrawMacro <list of even more magic numbers>

We must have been using a slightly older version of MASM as the macro would crash the assembler when we tried to assemble it.

car and cdr are used frequently in Lisp, and are short, and since function calls weren't very fast in the implementation language, I optimized my code by implementing those two Lisp functions as macros. (Data were stored in arrays, since there were no structs. CONS_OFFSET is the constant 1000.)

I worked a long time to find the problem, until I finally checked what that short line was expanded to by the pre-processor. It was expanded to a 31,370-character line, which I have here split into 502 lines for clarity:

This is just so evil. It's random, which means it fires in different places every time; it changes the return statement, which usually has some code on it that could fail all by itself; it changes an innocent-looking keyword that you'd never get suspicious about; and it uses an exception from the std namespace, so searching through your own sources won't find its origin. Just brilliant.

Just tested this one; at least it doesn't compile by default, because of a missing include for the random function, and it gets red-squiggled then. If you have the include by accident, however, things get worse - VC++ 2010 still marks it as a keyword and does not show the macro expansion tooltip, so no help from the IDE to find this :-/

The NFS code in BSD kernels uses goto between macros. It's still in use, and the code actually works. I know of several persons who have tried to clean it up, but all of them have given up after a while - it's just too messy.

I once saw a macro package that would alias every C keyword to let you effectively program in Klingon. That's right, Klingon. (Un)fortunately, the project was abandoned and taken down several years ago.