
Pretty much all absolute statements about programming are wrong, but I guess you're looking for specific examples?
– Jon Hopkins, Dec 21 '10 at 13:49

Actually almost all absolute statements pronounced by humans are wrong. Ignorance and arrogance are not limited to programming.
– user8685, Dec 21 '10 at 13:53

To avoid confusion, maybe this post should be renamed "Mythical Absolutes in IT."
– Mark Freedman, Dec 21 '10 at 14:08

It depends on your employees and the software you are using; it can go either way. I think, in fact, that because most software is cheap in comparison to the time it takes to train people, whether software is free or commercial has little bearing on the TCO.
– dan_waterworth, Dec 21 '10 at 14:22

but at least you have a chance to fix things.
– user1249, Dec 21 '10 at 14:44

To supply another frequently false absolute: Free software is useless. I think you'll find my false absolute much more common in business than yours, although yours is more common in programming communities.
– David Thornley, Dec 21 '10 at 15:49

I think you can generalize this to "The cost of software is primarily indicated by the purchase price". Goes back to scheduling training for the enterprise-grade project management software being rolled out in a few days to replace the enterprise-grade project management software rolled out last year to replace the enterprise-grade project management software in use before that...
– Shog9♦, Dec 22 '10 at 4:04

Forget all-nighters. In my early 20s, I had nowhere near the number of distractions, obligations, conflicting priorities, etc. that now compete for my attention daily. Consequently, I was a hell of a lot more productive, even on the days when I only worked a few hours...
– Shog9♦, Dec 22 '10 at 4:06

@Dean, I've heard it at least 0.5 times per year, since 1984, when I first started programming.
– Lars Wirzenius, Dec 22 '10 at 8:43

Although this one does work, it is often taken out of context (I could swear many people conveniently ignore the 'premature' bit). Some people seem to read this as "any optimisation is the root of all evil".

I wish those people would read more of the text around that small quote:

There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified. It is often a mistake to make a priori judgments about what parts of a program are really critical, since the universal experience of programmers who have been using measurement tools has been that their intuitive guesses fail.
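Knuth's advice (measure first, then tune only the code the measurements flag) can be sketched roughly like this. The function names and workload here are hypothetical illustrations, not from the quote:

```cpp
#include <cassert>
#include <chrono>
#include <numeric>
#include <vector>

// Two interchangeable implementations of the same task: summing a vector.
long long sum_loop(const std::vector<int>& v) {
    long long total = 0;
    for (int x : v) total += x;
    return total;
}

long long sum_accumulate(const std::vector<int>& v) {
    return std::accumulate(v.begin(), v.end(), 0LL);
}

// Measure a callable in microseconds. The point of the quote: run this on
// real workloads first, and only hand-tune code it shows to be critical.
template <typename F>
long long time_us(F f) {
    auto start = std::chrono::steady_clock::now();
    f();
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::microseconds>(end - start)
        .count();
}
```

Only after `time_us` (or a real profiler) identifies one of these as a hot spot in the full program does it belong to the "critical 3%" worth hand-tuning; until then, either implementation is fine.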

There may be good habits to get into, also. In C++, ++i to increment a variable as a stand-alone expression will never be worse than i++, and sometimes may be a lot more efficient.
– David Thornley, Dec 21 '10 at 15:51
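The `++i` versus `i++` point can be demonstrated with a small sketch: for built-in `int`s the compiler generates identical code for both forms, but for a class type, `i++` must construct a temporary copy that `++i` avoids. The `Counter` type below is a toy stand-in for an iterator, written only to count those copies:

```cpp
#include <cassert>

// Toy iterator-like type that counts copy-constructions, to compare
// pre-increment (++i) with post-increment (i++) as stand-alone statements.
struct Counter {
    int value = 0;
    static int copies;  // total copy-constructions so far
    Counter() = default;
    Counter(const Counter& o) : value(o.value) { ++copies; }
    Counter& operator++() {  // ++i: increment in place, no copy
        ++value;
        return *this;
    }
    Counter operator++(int) {  // i++: must copy the old value first
        Counter tmp(*this);
        ++value;
        return tmp;
    }
};
int Counter::copies = 0;

// Apply ++c n times; returns how many copies that forced (none).
int copies_with_pre(int n) {
    Counter::copies = 0;
    Counter c;
    for (int i = 0; i < n; ++i) ++c;
    return Counter::copies;
}

// Apply c++ n times, discarding the temporary each call (at least n copies).
int copies_with_post(int n) {
    Counter::copies = 0;
    Counter c;
    for (int i = 0; i < n; ++i) c++;
    return Counter::copies;
}
```

With a heavyweight iterator, those discarded temporaries are exactly why `++i` as a stand-alone statement is never worse than `i++`.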

++ I'm an A1 pest on this subject. So much stupid stuff is said about it, and so very few people actually know what they're talking about. They say "measure measure". They say "get the right algorithm", all of which totally misses the point. The point is to actually do performance tuning and in the process learn what approaches to take, and what well-known and recommended practices to avoid.
– Mike Dunlavey, Dec 21 '10 at 23:11

This statement refers, of course, to Turing-equivalence and is generally used to refer to a language with features lacking in the preferred language of the person making the statement. It would be more accurate to say that syntax is a difference that results in more differences when it comes to actual coding and maintenance.

Most of the time, they really don't. They may think they do, and some may even be in the neighborhood, but they really don't. The rare few who really do know what they want can't describe it well enough for you to get it right the first time.

In consulting, this is often called the "presenting problem"; in development, it's the 80/20 rule; in programming, it's the admonition to expect to throw away the first prototype.

Users don't really know what they want/need because they don't understand what is truly possible... and what isn't!

Often found in business circles, rarely among the programming community.

Often associated with the naive belief that commercial software vendors actually take responsibility for their products. If I had a nickel for every time I've heard "someone to sue", I'd get myself a much nicer home box and monitors.

Often also associated with the belief that nobody would do good work except for money.

Maybe this was true back when people could substitute a general ledger notebook for a database, but in this day and age, an efficient and competent programming staff can save a company great piles of money. Piles of money saved = piles of money earned.

"You're not a real programmer: you're not constantly typing".
At first, I hoped it was just a joke, but no, he was serious.

A short moment of uncomfortable silence ensued before I proceeded to explain that thinking through the implications, weighing which type of implementation to use before coding, and even sketching diagrams to structure things are also part of the process. x_x

On the other hand, the colleague he was implicitly comparing me to is an obsessive typist: he's my best mate and a great guy, but he writes the first thing that springs to mind and only thinks about it while he's typing, so he's hitting backspace more than half the time. That looks great for "looking busy" when management comes around, but it makes the code quite a pain to maintain or integrate with the rest of the application.

It also depends on your target. If you're developing software that runs on a server, it might be worth it to just buy a bigger server. But if your software runs on the client, and you have a wide distribution, then it's not cheaper to ask all 100,000 of your users to upgrade their PC...
– Dean Harding, Dec 21 '10 at 21:51

Once you learn a language [commonly C++], all other languages are trivial to learn

I have never met a university professor who didn't genuinely believe that this statement was true. I've argued it till I was blue in the face, but they have all been insistent that a language is a simple implementation detail and that languages are, as such, interchangeable.

Aren't languages trivial to learn? Are we talking about the language, or the ecosystem around the languages?
– Morten, Dec 21 '10 at 18:59

If you've got the concepts, the syntax is generally easy to learn. The tools and libraries might take longer, and of course new concepts can take years.
– David Thornley, Dec 21 '10 at 22:49

It's probably true in the case of C++; I've been learning it for 20 years, and I still find rules and exceptions and whatnot. When I finish learning it, I'll probably know all the other languages already. Hence, trivial to learn.
– egarcia, Dec 21 '10 at 23:37

I think the point is that once you have mastered one language you will have learned the skills necessary to master the next one. Honestly... your professors are exactly right. You go from not knowing the details to knowing which of the details you know not.
– instanceofTom, Dec 22 '10 at 0:43

To all commenters: try going from C++ to Haskell or Lisp. Heck, even Python or Ruby. And I'm not talking about libraries or syntax, but concepts, abstractions, workflows.
– Mauricio Scheffer, Dec 23 '10 at 22:13

Learning a language means understanding its syntax, and is therefore trivial.

"Programming language" is not synonymous with "spoken/written language" in this way. Knowing the syntax of a programming language is equivalent to knowing the alphabet of a spoken/written language. Really learning a language involves much, much more than its syntax.
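To illustrate with a hypothetical pair of functions: someone who knows only C++'s syntax can write the first version below, but reaching for the second requires knowing the RAII idiom, which no grammar table teaches.

```cpp
#include <memory>

// Syntactically valid, but if anything between new and delete threw,
// the buffer would leak; every early return also needs its own delete.
int manual_style() {
    int* buffer = new int[10];
    buffer[0] = 42;
    int result = buffer[0];
    delete[] buffer;  // cleanup is easy to forget or skip on early returns
    return result;
}

// The idiom a fluent C++ programmer reaches for: tie the resource's
// lifetime to a stack object, so the array is freed automatically on
// every exit path, including exceptions.
int raii_style() {
    auto buffer = std::make_unique<int[]>(10);
    buffer[0] = 42;
    return buffer[0];
}
```

Both compile and return the same value; the difference is idiom, not syntax, which is exactly the part of "learning a language" that takes time.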

I'm sure this will seem controversial, but there are plenty of scripts and small projects out there that have 0 bugs. But of course, that won't be the code you're writing right now, or that project that's about to go live ;)

If you are a Programmer you should inherently understand the business logic behind ALL your company's software.

Even though we are capable of writing software, it does not mean we will inherently understand the internal business logic of every piece of software our company owns or requires us to write.

In my experience, customers assume we have a fundamental understanding of the business logic when proposing new software, and they often become frustrated when you ask for further direction. I think there is an assumption that since tasks on the computer are "quick", the time required to plan and write a piece of software should be proportionally short. Customers can become even more frustrated when you explain the dangers of the implementation they are asking for. It's like a one-way street of knowledge: the programmer is expected to fully and immediately understand the business logic, but the customer steps back from understanding the mechanics of the software, expecting it simply to get done.

In a similar vein, management will assume that since you are a programmer you should inherently be able to use every piece of company software without training.

Splitting and branching a product will make maintenance more efficient.

I've seen this one burn companies very badly several times in my career. It's very tempting to think that different things will break in the slightly different, specialised versions of products, so that teams working in their own specialized "bubbles" can efficiently maintain their own stuff. But more often than not, this just leads to a mess where the same things have to be fixed in every separate branch, and inter-team communication and integration becomes a nightmare.

Also: this development model often leads to the "Mythical Man Month" thing happening.

It's more likely that inter-team communication was already poor. Branching isn't a silver bullet, but if it's done properly, it's extremely useful. Anything done badly is a problem, and branching requires a high level of group coordination/cohesion.
– Slomojo, Dec 23 '10 at 23:11

It is true that one should be wary about absolutes, but the statement you quote is not an absolute statement per se, but rather a rule of thumb that is arguably true for many, though not all, cases (such as the example given by Dan). Truth be told, I can't recall any objectionable statement that programmers actually treat as an absolute.