So, there are a bunch of questions appearing here asking whether X is evil, or whether Y is evil.

My view is that there are no language constructs, algorithms, or anything else that are inherently evil, only ones that are badly used. Hell, if you look hard enough, there are even valid uses of goto.

So does absolute evil, that is, something which is utterly incompatible with best practice in all instances, exist in programming? And if so, what is it? Or is it just bad programmers not knowing when something is appropriate?

Edit: To be clear, I'm not talking about things programmers do (such as not checking return codes or not using version control - those are choices made by bad programmers); I mean tools, languages, statements, whatever, that are just bad...


40 Answers
1. There are reasons to have them. 2. Type inference in Haskell is a type of implicitness that I love.
– Matt Ellen, Dec 21 '10 at 14:57

+1 for magic numbers, bane of my life at a very big company. Also, "lost in the distant past" reasons for scaling by 1000 instead of 1024, leading to an enormous overhead in a critical loop that everyone was scared to eliminate because they didn't know what else would be affected.
– geekbrit, Dec 21 '10 at 15:01

Meh, implicitness is very neat sometimes. Explicitness is for assembly-programmers.
– Macke, Dec 21 '10 at 18:57

Re #2: I see what you did there... Sorry about the downvotes from people who I'm guessing didn't get the joke.
– Larry Coleman, Dec 21 '10 at 19:05

The reason so many people think there are evil things is that it is pounded into their heads when they first take their programming classes: "Don't use goto! Always normalize your databases! Never, ever use multiple inheritance!" These are hammered in because these "evil" practices are so easily abused, not because they are inherently bad. There are so few valid uses of them that you can get away with saying "never" at first. What is truly evil is saying "there is no reason to consider anything that is not a best practice," because there is always a place where that very practice is exactly the right tool.

+1: A while back, I added a new goto to some C code I was working on that already had a bunch of them. It was quicker than refactoring it. I told my wife, who's also a programmer, and she asked, "Honey, are you okay? Do you have a fever?" I'm currently working on some code where another guy wrote a C goto that jumps into the middle of a loop. I'd never write it myself, and I curse inwardly every time I see it, but it meets the ultimate test: it works.
– Bob Murphy, Dec 21 '10 at 16:23

+1 - I worked on some C code right out of school where "goto exit" or "goto error_exit" was used often. I have to say, it made for cleaner-looking code than having multiple return statements. My motto is that everything has its place. (Even singletons ;-) )
– eSniff, Dec 22 '10 at 1:48
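
To make the pattern concrete, here is a minimal sketch of the "goto error_exit" style eSniff describes; the function name, file name, and buffer size are invented for illustration:

    #include <stdio.h>
    #include <stdlib.h>

    /* One cleanup point at the end instead of multiple return statements. */
    int process_file(const char *path)
    {
        int rc = -1;            /* assume failure until proven otherwise */
        char *buf = NULL;
        FILE *fp = fopen(path, "rb");
        if (fp == NULL)
            goto error_exit;

        buf = malloc(4096);
        if (buf == NULL)
            goto error_exit;

        if (fread(buf, 1, 4096, fp) == 0)
            goto error_exit;    /* empty file or read error */

        rc = 0;                 /* success falls through to the cleanup */

    error_exit:
        free(buf);              /* free(NULL) is a safe no-op */
        if (fp != NULL)
            fclose(fp);
        return rc;
    }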

@JonHopkins -- which is why I much prefer to talk of "good practices".
– Richard, Dec 22 '10 at 9:43

Guns go a long way in helping people kill people. What’s worse, I don’t understand the point you’re trying to make: who ever claimed that dev-tools are evil?! There must be something clever in this answer since it got so many up-votes. But I completely fail to see it.
– Konrad Rudolph, Dec 22 '10 at 11:22

+1 I was going to add this if someone else hadn't
– Conrad Frix, Dec 21 '10 at 22:22

Really? "I don't care if this fails or not" is never valid, not in any possible context? Not everything done needs to be guaranteed to succeed or fail, especially in small throwaway utilities. Another case where not thinking first is the real evil.
– SilverbackNet, Dec 21 '10 at 23:02

I see that in legacy code All. The. Time. What's worse is that people question me when I change it.
– George Stocker, Dec 22 '10 at 2:01

I add an email notification in it first that sends me an "empty catch block hit" type of exception, and release it to prod so I can see WHY that catch is actually there, and under what conditions it's being hit. Then I fix it.
– CaffGeek, Dec 22 '10 at 5:35

I would vote for checked exceptions in Java :D
– Nils, Dec 22 '10 at 21:05

Perhaps I can flip the question around, and ask if there is anything in programming that is absolutely and perfectly good? If you can't think of one thing (I know I can't), then the concept of evil is also just as muddy.

There are common behaviors that lead to mistakes, misunderstandings, and other general confusion--but to say that language feature X is inherently evil is to admit that you really don't understand the purpose of feature X.

There are common behaviors that can save a lot of heartache and avoid some misunderstandings--but to say that language feature Y is inherently good is to admit that you don't fully understand all the implications of using feature Y.

We are a people of finite understanding and strong opinions--a dangerous combination. Hyperbole is just a way of expressing our opinions, exaggerating facts until they become fiction.

Nevertheless, if I can avoid behaviors that lead to problems and pursue behaviors that avoid them, I just might be a bit more productive. At the end of the day that's what it's all about.

Is anything in programming absolutely and perfectly good? Clear, well-written documentation?
– James, Dec 21 '10 at 22:22

@James Clear, well-written documentation is often used as an excuse for unreadable code, and can become shackles that need to be maintained together with the code if it's overdone. Clear, well-written, but completely outdated documentation can also become pure evil.
– Eugene Yokota, Dec 21 '10 at 22:49

@bold: I actually ran into that some years ago; thanks to some garbled macro definitions, both TRUE and FALSE wound up evaluating to the same value (0, IIRC). Made for an interesting afternoon.
– John Bode, Dec 22 '10 at 17:29

@bold Or is it the other way around?
– Mateen Ulhaq, Mar 30 '11 at 22:44

Nothing, really nothing, could ever excuse the way this program increments the loop counter. It's just undefined behavior that happens to do the right thing on my machine, with my compiler and my default options.
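
The program in question isn't reproduced here, but a classic hypothetical example of this kind of increment is modifying the counter twice without an intervening sequence point, which is undefined behavior in C even though most compilers happen to produce a plain increment:

    #include <stdio.h>

    int main(void)
    {
        /* Undefined behavior: i is modified twice between sequence
           points. Many compilers happen to emit a simple increment,
           but the standard makes no promises at all. */
        for (int i = 0; i < 5; i = i++ + 1)
            printf("%d\n", i);
        return 0;
    }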

So does absolute evil, that is, something which is utterly incompatible with best practice in all instances, exist in programming? And if so, what is it?

Yes; the standard C library function gets(). It's evil enough that the C standards committee has officially deprecated it, and it is expected to be gone from the next version of the standard. The mayhem caused by that one library call is scarier than the prospect of breaking 30+ years' worth of legacy code -- that's how evil it is.

gets() takes a single argument, which is the address of a buffer. Characters are read from standard input into the buffer until a newline is seen. Because all it receives is the address of the buffer, gets() has no idea how big the buffer is. If the buffer is sized for 10 characters and the input stream contains 100, those extra 90 characters are written to the memory immediately following the buffer, potentially clobbering the stack. As a result, it's a favored malware exploit. It is unsafe and insecure by design.
– John Bode, Dec 22 '10 at 15:28
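
A minimal sketch of the hazard, with fgets() as the bounded replacement (the buffer size here is arbitrary); the gets() call is left commented out so the sketch compiles on toolchains that have already removed it:

    #include <stdio.h>

    int main(void)
    {
        char buf[10];

        /* Evil: gets() cannot know buf holds only 10 bytes; a longer
           input line silently writes past the end of the array. */
        /* gets(buf); */

        /* Bounded alternative: fgets() stops after sizeof buf - 1
           characters and keeps the trailing newline, if it fits. */
        if (fgets(buf, sizeof buf, stdin) != NULL)
            printf("read: %s", buf);
        return 0;
    }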

The Billion Dollar Mistake: "I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object-oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. In recent years, a number of program analysers like PREfix and PREfast in Microsoft have been used to check references, and give warnings if there is a risk they may be null. More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution, which I rejected in 1965." – C.A.R. Hoare, 2009

We have very myopic views of "evil". People who kill lots of other people are evil. People who steal from others are evil. Every nation (that I know of) has some evil in their past. Some would like to deny it.

Is there evil in programming? We innocent programmers might like to think "not really". However, once I had a conversation with the inventor of a widely-used hierarchical database, on this very subject. Want to know who was one of the best customers? The secret police of Communist Poland.

Is there evil in the world now? You bet. And are they using programmers? You bet.

I'm surprised no one has floated globals as a true evil. There's no better way to end up programming in an environment where you have no idea what the parameters are and have virtually no control over what happens to them. Chaos! I have a strict ban on the use of global variables in all of my coding.

Find me a non-trivial application that doesn't have globals. Oh, they're wrapped up and neater and protected and in a class and in a framework and generally better... but if there's just one of it for the whole application, it's pretty much a global. The real issue is the use of inappropriately scoped variables (see the sketch below).
– Murph, Dec 22 '10 at 9:23

Global variables don't deserve such a bad reputation. Like @Murph said, some things are global. The file system is global (all processes use the same one); your process is global to all of its threads; memory is global (another process can use up your memory and crash you while it stays safely inside its own memory parachute); the user himself is global (think in terms of UI design). It's not wrong to model something inherently global as a global variable -- or a Singleton, or a Service Locator, or maybe a Registry.
– kizzx2, Dec 22 '10 at 18:09
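
For what it's worth, here is a tiny invented sketch of the hidden coupling that "inappropriately scoped" really means; neither function mentions the other, yet one silently changes what the other prints:

    #include <stdio.h>

    int g_scale = 1000;   /* inappropriately scoped: anyone may change it */

    void print_size(long bytes)
    {
        printf("%ld\n", bytes / g_scale);
    }

    void unrelated_feature(void)
    {
        g_scale = 1024;   /* "fixes" its own output, breaks print_size */
    }

    int main(void)
    {
        print_size(2048000);   /* prints 2048 */
        unrelated_feature();
        print_size(2048000);   /* prints 2000 -- changed from far away */
        return 0;
    }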

I personally find Donald Knuth's phrase "premature optimization is the root of all evil" to be the first evil thing in programming, from an experienced point of view (which is to say, I have fallen for this myself).

Actually, the phrase says something like: don't optimize for a particular environment, a particular PC, or a particular set of users before you have gotten deep into the problem.

It means such and such is something you should avoid most of the time, but not something you should avoid all the time. For example, you will end up using these "evil" things whenever they are "the least evil of the evil alternatives." It's a joke, okay? Don't take it too seriously.

Even more evil than setjmp/longjmp is using setjmp/longjmp to implement a poor man's super lightweight threading library (as I've seen done way back in the DOS days). Delightfully evil code! :-)
– Brian Knoblauch, Jul 17 '12 at 12:45
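
For readers who haven't met the pair: a minimal sketch of setjmp/longjmp used as a non-local "throw," which is the sanctioned use; the threading trick Brian describes builds on the same machinery plus manual stack juggling:

    #include <setjmp.h>
    #include <stdio.h>

    static jmp_buf on_error;

    static void deep_inner_work(int fail)
    {
        if (fail)
            longjmp(on_error, 1);   /* unwinds straight back to setjmp */
        printf("work completed\n");
    }

    int main(void)
    {
        if (setjmp(on_error) != 0) {   /* returns nonzero after longjmp */
            printf("recovered from the error\n");
            return 1;
        }
        deep_inner_work(1);
        return 0;
    }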

I don't know Java, but why is this evil? Are you saying it should only be used on ints, and that byte isn't an int? I find >>> a bit weird, since I know it as D's unsigned shift right, which should never result in a -1.
– acidzombie24, Dec 21 '10 at 21:52

I think there are evil things in programming, but I don't use the term pejoratively.

Evil is when code pretends to behave one way but in reality behaves in a very different fashion, in a way that hurts an unenlightened, rational programmer. I often refer to this as a type of "magic." Magic is anything whose functionality is hidden from the programmer, and it comes in different styles.

Example: in Scheme, the functions "car" and "cdr" could be implemented using functions only; however, they are not. Instead, they are implemented imperatively at a lower level, because that runs faster on most computers. I'd call this "white magic." It's not evil, but it's definitely magic.

By comparison, the unique number NaN in JavaScript is not equal to any other number... even itself. This is "black magic." I don't want to get into a discussion of why you have NaN in JavaScript (or why you have both Infinity and NaN), but you can see why such a simple concept would be useful to a language with only floating-point numbers. However, having a constant number which cannot be tested for in the same way as other constant numbers is not something one would expect. Fortunately, JavaScript provides isNaN to help solve this issue, but if you are unaware of NaN's unique property you might write the following code and get burned:

if(x == NaN)

or if you're clever you might try the following with the same results

if(x === NaN)

I jokingly refer to this as getting "mana burned" (it is magic, after all...).

I realize there are good reasons why you would not want something that is not a number to compare equal to anything, even itself, but you have to remember that this behavior comes from IEEE 754 itself: a NaN has a concrete bit pattern in memory like any other floating-point value, yet the comparison operators treat it as unordered. If you expect JavaScript's NaN to compare like an ordinary constant, you are liable to get burned. This is both deceptive and frustrating, the former being the reason I call it evil.
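
The same IEEE 754 semantics apply well beyond JavaScript; here is a minimal C sketch of the wrong tests and the right one:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double x = NAN;        /* a quiet NaN (C99 math.h) */

        if (x == NAN)          /* always false: NaN is unordered */
            printf("never prints\n");
        if (x == x)            /* also false when x is NaN */
            printf("never prints either\n");
        if (isnan(x))          /* the correct test */
            printf("x is NaN\n");
        return 0;
    }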

I'm gonna turn this around and say that while there's no absolute evil, there are tools and constructs that make it far more likely that we feeble humans, with our limited skull size, will make mistakes.

So I'd say you can talk about the evilness of a construct based on how likely people are to make mistakes with it. Sure, you can cut bread with a knife or with a chainsaw gripped by the blade, but one is more likely to cause damage than the other, even though you may be able to pull it off with enough care.

Programming, per se, is not, I think, inherently evil. However, programming is very often a social activity, and disrespecting those around you can be very evil. People often forget that most code is going to be shared with others; mostly read, sometimes written too. Be it open source, a product that a company is releasing, or a small patch job a consultant is hired for, programs are going to be read.

That's half the reason why so many "considered harmful" articles exist, or why people say "never". Making life difficult for others is the very root of all evil. Isn't it?