All programming languages have their design flaws, simply because no language can be perfect, just as with most (all?) other things. That aside, which design flaw in a programming language has annoyed you the most through your history as a programmer?

Note that a language being "bad" just because it isn't designed for a specific thing isn't a design flaw, but a feature of its design, so don't list such annoyances. If a language is ill-suited for what it is designed for, that is of course a flaw in the design. Implementation-specific and under-the-hood things do not count either.



Note that this is constructive for language designers (mistakes to avoid), should anyone question how constructive this question is.
– Anto Mar 5 '11 at 16:08


@greyfade: Not really, this is about flaws in the actual language; that seems to be about things which lower adoption of a language, which could include a bad standard library or just a bad website for the language. Some answers list e.g. bad syntax, but that isn't a specific design flaw.
– Anto Mar 5 '11 at 16:54

49 Answers

One of my big annoyances is the way switch cases in C-derived languages default to falling through to the next case if you forget to use break. I understand that this is useful in very low level code (e.g. Duff's Device), but it is usually inappropriate for application-level code, and is a common source of coding errors.

I remember in about 1995 when I was reading about the details of Java for the first time, when I got to the part about the switch statement I was very disappointed that they had retained the default fall-through behaviour. This just makes switch into a glorified goto with another name.
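To make the hazard concrete, here is a minimal C-style sketch of my own (not from the original answer): a single missing break silently merges two cases.

#include <stdio.h>

void describe(int color) {
    switch (color) {
    case 0:
        printf("red\n");      /* missing break: control falls into the next case */
    case 1:
        printf("blue\n");
        break;
    default:
        printf("unknown\n");
        break;
    }
}

int main(void) {
    describe(0);   /* prints both "red" and "blue" -- rarely what was meant */
    return 0;
}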

@Christopher Mahan: switch doesn't have to work that way. For example, Ada's case/when statement (equivalent to switch/case) does not have fall-through behaviour.
– Greg Hewgill Mar 5 '11 at 22:40


@Greg: switch-like statements in languages unrelated to C don't have to work that way. But if you use C-style control flow ({...}, for (i = 0; i < N; ++i), return, etc.), language inference will make people expect switch to work like C, and giving it Ada/Pascal/BASIC-like semantics would have confused people. C# requires break in switch statements for the same reason, although it makes this less error-prone by forbidding silent fallthrough. (But I wish you could write fall; instead of the ugly goto case.)
– dan04 Mar 6 '11 at 8:05


Then don't write switch, write if() else. The thing you're complaining about is what makes switch better: it's not a true/false conditional, it's a numeric conditional, and that makes it different. You could also write your own switch function, though.
– jokoon Mar 6 '11 at 11:40


-1 I consider it a benefit to allow fall-through; there is no danger from it other than stupidity, yet it grants additional function.
– Orbling Mar 8 '11 at 0:25


The problem isn't that switch allows fallthrough. It's that most uses of fallthrough are not intentional.
– dan04 Mar 8 '11 at 0:57

I've never really liked the use of = for assignment and == for equality testing in C-derived languages. The potential for confusion and errors is too high. And don't even get me started on === in Javascript.

Better would have been := for assignment and = for equality testing. The semantics could have been exactly the same as they are today, where assignment is an expression that also produces a value.
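The classic failure mode, in a minimal sketch of my own (not from the original answer): an assignment compiles where a comparison was intended.

#include <stdio.h>

int main(void) {
    int x = 0;
    /* Intended: if (x == 5). This instead stores 5 in x, and the
       condition is the value 5, which is always true. */
    if (x = 5) {
        printf("always reached, x is now %d\n", x);
    }
    return 0;
}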

@Nemanja Trifunovic: I originally thought of that suggestion, but it has an unfortunate ambiguity in C with less-than comparison against a negative number (i.e. x<-5). C programmers wouldn't tolerate that kind of required whitespace :)
– Greg Hewgill Mar 6 '11 at 2:35


@Greg: I would prefer := and == because it would be too easy to forget the : and not be notified, as is already the case (though reversed) when you forget a = today. I am thankful for compiler warnings on this...
– Matthieu M. Mar 6 '11 at 13:31


On almost all keyboards I've ever used, ":=" requires changing the shift key while typing it. On the one I'm using now, ':' is uppercase and '=' is lowercase, and I've had that reversed. I type a lot of assignments, and don't need that sort of typing hassle in them.
– David Thornley Mar 7 '11 at 19:00


@David Thornley: Code is read many more times than it is written. I don't buy any arguments about "typing hassle".
– Greg Hewgill Mar 7 '11 at 19:06


@Greg Hewgill: Sure it's read more often than written. However, the problem between = and == isn't in reading, because they're distinct symbols. It's in writing and making sure you got the right one.
– David Thornley Mar 7 '11 at 19:45

The choice of + in Javascript for both addition and string concatenation was a terrible mistake. Since values are untyped, this leads to byzantine rules that determine whether + will add or concatenate, depending on the exact content of each operand: for example, 1 + '1' yields the string '11', while 1 - '1' yields the number 0, because - always coerces to numbers.

It would have been easy in the beginning to introduce a completely new operator such as $ for string concatenation.

@Barry: Not really. + makes a lot of sense as a string concatenation operator in a strongly typed language. The problem is that Javascript uses it but is not strongly typed.
– Mason Wheeler Mar 6 '11 at 6:02


@Mason: + does not make sense for concatenation, because concatenation is definitely not commutative. It's an abuse as far as I am concerned.
– Matthieu M. Mar 6 '11 at 13:36


@Matthieu: Umm... why does that matter? Concatenation isn't commutative (like addition is), but the addition of two strings is nonsensical so no one thinks of it that way. You're inventing a problem where none exists.
– Mason Wheeler Mar 6 '11 at 13:49


Just switch to C++ and add any kind of esoteric overloading to the operator of your choice.
– Newtopian Mar 7 '11 at 5:50


@Matthieu: Commutativity isn't the issue here. If JS had a Matrix class, would you consider it an abuse for it to have an * operator?
– dan04 Mar 8 '11 at 0:56

Actually it is a very nice language, with a few bad design choices in it. This is the major one: if you don't scope a variable, it will be scoped as a global. The good thing is that by using Doug Crockford's JSLint program you can catch this error and a bunch more.
– Zachary K Mar 5 '11 at 16:36


Just use var whenever you declare a variable and you are good to go. And don't tell me it's too much typing because Java forces you to declare all the types twice and nobody complains about it being a shitty design choice.
– davidk01 Mar 6 '11 at 6:10


@davidk01: People do complain about that. Similarly, people complained about having to declare std::map<KEY, VALUE>::const_iterator variables in C++ enough that auto is being added to that language.
– dan04 Mar 6 '11 at 7:03

The preprocessor in C and C++ is a massive kludge, creates abstractions that leak like sieves, encourages spaghetti code via rat's nests of #ifdef statements, and requires horribly unreadable ALL_CAPS names to work around its limitations. The root of these problems is that it operates at the textual level rather than the syntactic or semantic level. It should have been replaced with real language features for its various use cases. Here are some examples, though admittedly some of these are solved in C++, C99 or unofficial but de facto standard extensions:

#include should have been replaced with a real module system.

Inline functions and templates/generics could replace most of the function-call use cases (see the sketch after this list).

Some kind of manifest/compile time constant feature could be used for declaring such constants. D's extensions of enum work great here.

Real syntax tree level macros could solve a lot of miscellaneous use cases.
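To make the textual-level problem concrete, here is a minimal sketch of my own (not from the original answer), using the textbook MAX macro: the macro duplicates side effects, while a real function evaluates each argument exactly once.

#include <algorithm>

// A textual macro pastes its arguments in wherever they appear.
#define MAX(a, b) ((a) > (b) ? (a) : (b))

int main() {
    int i = 5, j = 1;
    int m = MAX(i++, j);     // expands to ((i++) > (j) ? (i++) : (j)): i++ runs twice
    int n = std::max(i, j);  // an inline function: each argument evaluated once
    return (m == 6 && i == 7 && n == 7) ? 0 : 1;
}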

@dsimcha: I agree with the #include issue, but the module system was invented... afterward! And C and C++ aim for a maximum of backward compatibility :/
– Matthieu M. Mar 6 '11 at 13:34


I agree that this is a massive pain in various body parts, but the multi-pass design has the advantage of simplicity: a C compiler doesn't need to know much about the context of operation before it can successfully compile a chunk of C code. This can only be qualified as a design error if you can show that the costs of using a hypothetical module system within the C language itself (e.g. C++-like classes) is always lower than or comparable to the present cpp-based #include hacking.
– reinierpost Mar 8 '11 at 9:00


@Stephen: I agree that Java with a preprocessor might be better than Java without, but only because Java doesn't have several of the "real" features necessary to replace the preprocessor. In languages like D, which include such features and Python, which gets flexibility in other ways by being dynamic, I don't miss it one bit.
– dsimcha Mar 10 '11 at 3:55

There are lessons to be learned, but the lessons are rarely clear cut, and to understand them you have to understand the technical trade-offs ... and the historical context. (For instance, the cumbersome Java implementation of generics is a consequence of an overriding business requirement to maintain backwards compatibility.)

IMO, if you are serious about designing a new language, you need to actually use a wide range of existing languages (and study historical languages) ... and make up your own mind what the mistakes are. And you need to bear in mind that each of these languages was designed in a particular historical context, to fill a particular need.

If there are general lessons to be learned they are at the "meta" level:

You cannot design a programming language that is ideal for all purposes.

You cannot avoid making mistakes ... especially when viewed in hindsight.

Many mistakes are painful to correct ... for users of your language.

You have to take account of the background and skills of your target audience; i.e. existing programmers.

-1 -- programming languages follow design goals. A feature of a language that works against these goals is a design flaw, unless it is a necessary compromise for one of its other goals. Not many languages are made with the intention of satisfying everyone, but all languages should attempt to satisfy the people they set out to satisfy in the first place. This kind of postmodernist political correctness is quite stifling to programming language research and development.
– Rei Miyasaka Mar 9 '11 at 22:08


Seems to me that the OP already accounted for your answer in the second paragraph of the question.
– Aidan Cully Mar 10 '11 at 9:14

C was not designed to be a one-size-fits-all language, hence it cannot be a design error. It was designed to model a CPU, as a portable assembler avoiding CPU-specific assembler code. However, it was fixed in Java.
– user1249 Mar 5 '11 at 17:10


I think the bigger problem is newer languages that continue to use the same meaningless terms, despite history showing us it's a terrible idea. At least the C guys noticed their mistake and created standard int types.
– Mark H Mar 5 '11 at 17:48


A char is not a UTF-8 unit. A UTF-8 character can take more than 8 bits to store. C is not a language of the 1970s; I'm using it for a project now (voluntarily).
– dan_waterworth Mar 5 '11 at 18:45


C is little more than a high-level abstraction of the PDP-11 processor. For example, pre- and post-increment were directly supported by the PDP-11.
– bit-twiddler Mar 5 '11 at 22:32


This is a terribly misguided answer. First off, C and C++ are not interchangeable. Second, the language spec clearly defines what a char is: "An object declared as type char is large enough to store any member of the basic execution character set." Third, C is not a "language of the 70's"; it is a language that lives close to the hardware and is probably the language that ultimately allows all of your high-level abstractions to actually make sense to a CPU. You come off as a person who knows only high-level languages and has no appreciation for how things actually work. -1
– Ed S. Mar 5 '11 at 23:18

Null references. They were introduced in ALGOL in the 60s, and exist in most of the commonly used programming languages today.

The better alternative, used in languages like OCaml and Haskell, is the maybe. The general idea is that object references cannot be null/empty/non-existent unless there's an explicit indication that they may be so.

(Although Tony's awesome in his modesty, I think almost anyone would have made the same mistake, and he just happened to be first.)
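For a flavor of the alternative in C-family terms: C++'s std::optional (a much later addition, C++17) makes absence part of the type. A minimal sketch of my own, not from the original answer:

#include <iostream>
#include <optional>
#include <string>

// The possible absence of a value is visible in the return type,
// so callers must unwrap it before use.
std::optional<std::string> find_user(int id) {
    if (id == 42) return "Alice";
    return std::nullopt;   // an explicit "no value", not a null pointer
}

int main() {
    if (auto user = find_user(7)) {
        std::cout << *user << '\n';
    } else {
        std::cout << "no such user\n";
    }
}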

@user14579: Every language that supports any kind of set or string or array has {}, but such is still semantically appropriate and will not crash unless you already have something that could possibly cause an array bounds error -- but that's another issue. You can process an empty string to uppercase all the characters, which will result in an empty string. You try the same thing on a null string, and without proper consideration, it'll crash. The problem is that this proper consideration is tedious, often forgotten, and makes it difficult to write single-expression functions (i.e. lambdas).
– Rei Miyasaka Mar 10 '11 at 20:15


I make money every time I type null...oh right someone loses money every time I type null. Except this time.
– kevpie Mar 12 '11 at 12:59


@umlcat - when you have languages with pattern matching like OCaml, Haskell, and F#, using the Maybe x | None pattern prevents you from forgetting the null case at compile-time. No amount of compile-time trickery can catch an error in languages where null is the established idiom. Since you have to explicitly choose not to deal with the null case in languages that have the Maybe and Some monad, they have a serious advantage over the "null" approach.
– JasonTrue Mar 12 '11 at 22:27


@Jason -- I like to think of maybe as a null opt-in, whereas the null exception is an opt-out. Of course there's some things to be said about the difference between runtime errors and compile-time errors too, but just the fact that null is essentially injecting behavior is noteworthy in itself.
– Rei Miyasaka Mar 13 '11 at 0:11

I get the feeling that the people who designed PHP didn't use a normal keyboard; they don't even use a Colemak keyboard, or they would have realized what they were doing.

I am a PHP developer. PHP isn't fun to type.

Who::in::their::right::mind::would::do::this()? The :: operator requires holding shift and then two key presses. What a waste of energy.

Although->this->is->not->much->better. That also requires three key presses with the shift being in between the two symbols.

$last = $we.$have.$the.$dumb.'$'.$character. The dollar sign is used a tremendous number of times and requires an awkward stretch up to the very top of the keyboard, plus a shift key press.

Why couldn't they design PHP to use keys that are much faster to type? Why couldn't we.do.this() or have vars start with a key that only requires a single keypress - or none at all (JavaScript) - and just pre-define all vars (like I have to do for E_STRICT anyway)!

@nikie Actually, the XML-based ASPX code is a language, so you can twist this one to work. :D
– CodexArcanum Mar 9 '11 at 21:58


The real problem with ASP.NET, I think, is how hard it tries to hide the details of the web from the programmer. There's actually some really neat, useful stuff going on in ASP.NET, but you have to fight so hard and dig so deep to get at it.
– CodexArcanum Mar 9 '11 at 21:59


On the other hand there are thousands and thousands of simple and successful data collection apps out there that were put together using the "classic" desktop app approach. The only bad thing was that until MVC the only Microsoft option was Windows Forms.
– ElGringoGrande Mar 12 '11 at 2:27

For me it is PHP's absolute lack of naming and argument ordering conventions in its standard library.

Though JASS's necessity to nullify references after the referenced object was released/removed (or the reference would leak and several bytes of memory would be lost) is more serious; but since JASS is a single-purpose language, it is not that critical.

The lack of conventions in PHP's stdlib is arguably not a language design flaw.
– delnan Mar 5 '11 at 16:15


@delnan: The lack of conventions is a result of how PHP was designed, and therefore has a lot to do with the language design. Also it is not clear to me that there is a clear distinction between libraries and language. Lisp in particular has a proud tradition of bootstrapping one language on top of another.
– btilly Mar 8 '11 at 17:35


The truly remarkable thing about JASS was that it had reference counting on handles, but wouldn't clean them up unless they were manually destroyed (and the graphic interface created functions that leaked memory everywhere)!
– Strilanc Oct 7 '11 at 17:39

Well, not even Guido can get everything right at once...
– delnan Mar 5 '11 at 21:12


@delnan, oh I know, and python < 3 is still a staggeringly good language, but it is a little annoying to have a better language in the form of python 3.x that I can't use because it breaks all of the modules that I need.
– dan_waterworth Mar 6 '11 at 6:19

Java's primitive types break the principle that everything is a descendant of java.lang.Object, which from a theoretical point of view leads to additional complexity in the language specification, and from a practical perspective makes the use of collections extremely tedious.

Autoboxing helped alleviate the practical drawbacks but at the cost of making the specification even more complicated and introducing a big fat banana skin: now you can get a null pointer exception from what looks like a simple arithmetic operation.

Perl tried many ideas. Some were good. Some were bad. Some were original and not widely copied for good reason.

One is the idea of context - every function call takes place in list or scalar context, and can do entirely different things in each context. As I pointed out at http://use.perl.org/~btilly/journal/36756 this complicates every API, and frequently leads to subtle design issues in Perl code.

The next is the idea of tying syntax and data types so completely. This led to the invention of tie to allow objects to masquerade as other data types. (You can also achieve the same effect using overload, but tie is the more common approach in Perl.)

Another common mistake, made by many languages, is to start off by offering dynamic scoping rather than lexical. It is hard to revert this design decision later, and leads to long-lasting warts. The classic description of those warts in Perl is http://perl.plover.com/FAQs/Namespaces.html. Note that this was written before Perl added our variables and static variables.

People legitimately disagree on static versus dynamic typing. I personally like dynamic typing. However it is important to have enough structure to let typos be caught. Perl 5 does a good job of this with strict. But Perl 1-4 got this wrong. Several other languages have lint checkers that do the same thing as strict. As long as you are good about enforcing lint checking, that is acceptable.

If you're looking for more bad ideas (lots of them), learn PHP and study its history. My favorite past mistake (long ago fixed because it led to so many security holes) was defaulting to allowing anyone to set any variable by passing in form parameters. But that is far from the only mistake.

Yea, Perl has a lot of mistakes, because the folks who built it were trying new ideas, and when you do that you often get them wrong. (Perl also has some very good stuff, and is the standard for Regexps that everyone else seems to have copied)
– Zachary K Mar 5 '11 at 17:04


Lisps were originally dynamically scoped, and over time got changed to being lexically scoped (at least in Scheme and Common Lisp). It's not impossible to change.
– David Thornley Mar 7 '11 at 19:07


@david-thornley: It is impossible unless you sacrifice backwards compatibility somewhere. Scheme was always lexically scoped. Common Lisp was lexically scoped from the time it was standardized, but various Lisp communities had their struggles adopting it. And Emacs Lisp is still using dynamic scoping, even though there has been a desire to change it for a long time.
– btilly Mar 7 '11 at 19:41


BTW, many of the things people dislike Perl for weren't invented in Perl but taken from other languages, mostly the Bourne shell.
– reinierpost Mar 8 '11 at 9:05

FORTRAN's insignificant blanks. The rule pervaded the specification. The END card had to be defined as a card with an 'E', an 'N', and a 'D' in that order in columns 7-72, and no other nonblanks, rather than a card with "END" in the proper columns and nothing else.

It led to easy syntactic confusion. DO 100 I = 1, 10 was a loop control statement, while DO 100 I = 1. 10 was a statement that assigned the value 1.1 to a variable called DO100I. (The fact that variables could be created without declaration, their type depending on their first letter, contributed to this.) Unlike other languages, there was no way to use spaces to separate out tokens to allow disambiguation.

It also allowed other people to write really confusing code. There are reasons why this feature of FORTRAN was never duplicated again.

@oosterwal: I certainly did. I might be wrong, but I vaguely remember the language definition being based on punch cards. They were the main way of inputting FORTRAN programs back then, and the idea of an 80-column line with columns 73-80 reserved is from punch cards.
– David Thornley Mar 10 '11 at 14:30

One of the biggest issues with BASIC was the lack of any well-defined method to extend the language beyond its early environments, leading to a bunch of completely incompatible implementations (and a nearly irrelevant post-facto attempt at any standardization).

Almost any language will get bent into general purpose use by some crazy programmer. It's better to plan for that general purpose usage at the beginning in case that crazy idea takes off.

I believe in DSLs (domain-specific languages), and one thing I value in a language is whether it allows me to define a DSL on top of it.

In Lisp there are macros - most people consider this a good thing, as do I.

In C and C++ there are macros - people complain about them, but I was able to use them to define DSLs.

In Java, they were left out (and therefore in C#), and the lack of them was declared to be a virtue. Sure it lets you have intellisense, but to me that's just an hors d'oeuvre. To do my DSL, I have to expand by hand. It's a pain, and it makes me look like a bad programmer, even though it lets me do a heck of a lot more with tons less code.

I'd agree that any language without decent macros has one huge unfixable design flaw. But what do you mean by "they were left out"? The C preprocessor was not any kind of decent macro system. Java is not derived from any proper language with macros.
– SK-logic Mar 6 '11 at 9:04


You can write your DSL in an external macro processing language (like m4, say, among a myriad of others).
– JUST MY correct OPINION Mar 6 '11 at 10:42

@reinierpost: I'm thinking of things I could do in Lisp, such as introduce control structures like differential execution and backtrack. These could be done with Lisp macros. In C/C++ I could do differential execution with C macros (and a little programmer discipline), but not backtrack. With C# I can do neither. What I get in exchange is things like intellisense. BFD.
– Mike Dunlavey Mar 8 '11 at 12:57


@David: The way I did it was I had a macro to wrap around ordinary code, such as a list of statements. It would take the cdr of the list and form a lambda closure out of it (i.e. a continuation) and pass it as an argument to the car of the list. That was done recursively, of course, and would "do the right thing" for conditionals, loops, and function calls. Then the "choice" function just turned into a normal loop. Not pretty, but it was robust. Problem is, it makes it super easy to make overly-nested loops.
– Mike Dunlavey Oct 7 '11 at 21:16

Statements, in every language that has them. They do nothing that you can't do with expressions, and they prevent you from doing lots of things. The existence of a ?: ternary operator is just one example of having to try to get around them. In JavaScript, they are particularly annoying.
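To illustrate the general point in C-style syntax (a sketch of my own, not from the original answer): because ?: is an expression, it composes where a statement cannot.

int sign_of(int x) {
    // ?: is an expression, so a const can be initialized in a single step.
    const int sign = (x >= 0) ? 1 : -1;
    // With statements only, 'sign' would have to start out mutable:
    //   int s; if (x >= 0) s = 1; else s = -1;
    return sign;
}

int main() {
    return sign_of(-7) == -1 ? 0 : 1;
}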

@SK-logic: I suspect that statements were blindly inherited from machine language, through FORTRAN, ALGOL, and COBOL.
– David Thornley Mar 8 '11 at 21:15


I'm pretty sure machine language is the common ancestor, and that's just a reflection of the fact that modern computers based on von Neumann architecture execute instructions sequentially and modify state. Ultimately when IO happens, there are going to be expressions that don't yield meaningful data, so statements aren't entirely useless in indicating semantically that some code has only side effects. Even languages that have a notion of unit type (aka ()) instead of statements have special consideration to ensure that they don't throw warnings or otherwise behave strangely.
– Rei Miyasaka Mar 9 '11 at 21:03

For me, it is the design problem that plagues all of the languages that were derived from C; namely, the "dangling else." This grammatical problem should have been resolved in C++, but it was carried forth into Java and C#.
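To illustrate the ambiguity in a minimal C-style sketch of my own (not from the original answer): the else binds to the nearest if, regardless of indentation.

#include <stdio.h>

void check(int a, int b) {
    if (a)
        if (b)
            printf("a and b\n");
    else                      /* indented as if it paired with 'if (a)'... */
        printf("not a\n");    /* ...but it binds to 'if (b)': runs when a && !b */
}

int main(void) {
    check(1, 0);   /* prints "not a", even though a is true */
    return 0;
}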

One of the core goals of C++ was to be fully backward compatible with C. If they had drastically changed semantic behavior it may not have caught on like it did (or at least, that was the thought at the time)
– Ed S. Mar 5 '11 at 23:31


@Ed S., however, elimination of the "dangling else" problem could have been accomplished by eliminating the <compound_statement> (a.k.a. <block>) grammatical production and incorporating the curly braces into the conditional and iterative control structures, like they did when they added the try/catch exception handling control structure. There is no excuse for not rectifying this grammatical ambiguity in Java and C#. Currently, the defensive workaround for this grammatical ambiguity is to make every statement that follows a conditional or iterative control statement a compound statement.
– bit-twiddler Mar 6 '11 at 0:19


What do you mean by that? This isn’t a “grammatical problem” (it’s completely unambiguous). How would you “resolve” it? I actually find the rules in C satisfactory. Arguably, only a Python-like syntax (= meaningful indentation) can really solve this problem. Furthermore, I’m actually very happy that modern languages do not mandate braces. I agree that all C-like syntaxes suck but dangling-else is the least of their problems.
– Konrad Rudolph Mar 6 '11 at 16:51


Continuing: I think that Python's use of indentation as a means by which to delineate a statement list is weird beyond belief. This technique violates the "separation of concerns" principle by tightly coupling lexical scanning with syntax analysis. A context-free grammar should be able to be parsed without knowing anything about the layout of the source.
– bit-twiddler Mar 6 '11 at 19:33


@bit-twiddler: No, it doesn't. The Python lexer just converts the whitespace to the appropriate INDENT and DEDENT tokens. Once that's done, Python has a pretty conventional grammar (docs.python.org/reference/grammar.html).
– dan04 Mar 8 '11 at 0:53

I think all the answers so far point to a single failing of many mainstream languages:

There is no way to change the core language without affecting backward compatibility.

If this is solved then pretty much all those other gripes can be solved.

EDIT.

This can be solved in libraries by having different namespaces, and you could conceive of doing something similar for most of the core of a language, though this might then mean you need to support multiple compilers/interpreters.

Ultimately I don't think I know how to solve it in a way that is totally satisfactory, but that doesn't mean a solution doesn't exist, or that more can't be done.

Both Java and C# have annoying problems with their type systems due to the desire to maintain backwards compatibility while adding generics. Java doesn't like mixing generics and arrays; C# won't allow some useful signatures because you can't use value types as bounds.

As an example of the latter, consider that

public static T Parse<T>(Type<T> type, string str) where T : Enum

alongside or replacing

public static object Parse(Type type, string str)

in the Enum class would allow

MyEnum e = Enum.Parse(typeof(MyEnum), str);

rather than the tautological

MyEnum e = (MyEnum)Enum.Parse(typeof(MyEnum), str);

tl;dr: think about parametric polymorphism when you start designing your type system, not after you publish version 1.

The inability to restrict types to enum is annoying in C#, but you can kind of work around it like this MyMethod<T>(T value) where T : struct, IComparable, IFormattable, IConvertible But you still have to test for an enum and it's a hack. I think the bigger lack in the C# generics is no support for higher kinds, which would really open up the language to some cool concepts.
– CodexArcanum Mar 9 '11 at 22:10

I feel like I'm opening myself up to get flamed, but I really hate the ability to pass plain old data types by reference in C++. I only slightly hate being able to pass complex types by reference. If I'm looking at a function:

void foo()
{
    int a = 8;
    bar(a);
}

From the calling point, there is no way to tell that bar, which may be defined in a completely different file, is:

void bar(int& a)
{
    a++;
}

Some might argue that doing something like this may just be bad software design, and not to blame the language, but I don't like that the language lets you do this in the first place. Using a pointer and calling bar(&a) at least makes it obvious at the call site that the variable may be modified.

@Jeff: For one thing, the primary reason that reference semantics made their way into C++ was operator overloading, for which uniform reference behaviour simply makes sense. More importantly, though, C++ is designed to be versatile and provide very fine-grained features, even if doing so incurs significant risk of programmer error. So yeah, at least in this particular case, don't blame the language. I'd rather be able to make mistakes than let a language get in my way.
– Jon Purdy Mar 10 '11 at 4:47

ALTER

When I learned COBOL, the ALTER statement was still a part of the standard. In a nutshell, this statement would allow you to modify the targets of GO TO statements during runtime.

The danger was that you could put this statement in some obscure section of code that was rarely accessed and it had the potential to completely change the flow of the rest of your program. With multiple ALTER statements you could make it nearly impossible to know what your program was doing at any point in time.

My university instructor, very emphatically, stated that if he ever saw that statement in any of our programs he would automatically flunk us.

I don't see how to avoid ill-definedness in an evolving language. Or, for that matter, in a predesigned language where the original designer missed some important points (like Pascal).
– David Thornley Mar 11 '11 at 21:02

Classes in C++ are some kind of forced design pattern in the language.

There is practically no difference at runtime between a struct and a class, and it is so hard to see the real, true programming advantage of "information hiding" that I want to put it here.

I'm going to be downvoted for this, but anyway: C++ compilers are so hard to write that the language feels like a monster.
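For reference, a minimal sketch of my own (not from the original answer) showing the only language-level difference, the default access:

struct S { int x; };    // members public by default
class  C { int x; };    // members private by default

int main() {
    S s;
    s.x = 1;            // fine: x is public
    C c;                // c.x = 1; would not compile: x is private
    (void)s; (void)c;
    return 0;
}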

Information hiding is important because it lets you hide implementation-specific details, which are likely to change, from the accessible parts of the API (the "UI" of the API), thus making changes to the program easier and less painful.
– Anto Mar 6 '11 at 20:57

This difference is not the most revolting part of C++, not even close. The only difference is a default access modifier (public for structs, private for classes). C++ is a horrible, monstrous language, but certainly not in this part.
– SK-logic Mar 7 '11 at 12:45


Information hiding is good; you can find discussions of that all over. The only prominent software book I can think of that was against it was Brooks' "The Mythical Man-Month", and he later considered it the biggest mistake in the book. If you don't understand the advantages, you really aren't qualified to make the judgment you're making.
– David Thornley Mar 7 '11 at 19:10

To a human, the concept of (un)signed chars and the necessity to tell the compiler to use unsigned chars as default is surely just as insane as to a mathematician the claim that 2 != 2, because the second 2 is uppercase, bold, or italic.
– Ekkehard.Horner Mar 5 '11 at 23:16


The problem is that C confounds the concept of "char" (i.e., part of a text string) and "byte" (i.e., (u)int_least8_t). Signedness makes perfect sense for small integers, but no sense at all for characters.
– dan04 Mar 6 '11 at 7:16