
First time accepted submitter edA-qa writes "Antiquated, clunky, and unsafe. Though beloved to some, C is a language that many choose to hate. The mass opinion is indeed so negative it's hard to believe that anybody would program anything in C. Yet they do. In fact a lot of things, even new things, are programmed in C. The standard was recently updated and the tools continue to evolve. While many are quick to dismiss the language it is in no danger of disappearing."

...and not some VM? Most of the popular languages these days are all dynamic. And they are very convenient and nice. But if you actually want to know what the machine is actually doing, and want to have a say in such things, C is the way to go.

I mean, unless you want to, you know, use pascal or fortran or something.

Personally the thing I like most about C is that it's not "safe". It doesn't do much memory management for you, and you can easily eat up all the memory or introduce a buffer overflow vulnerability if you're not paying attention. It forces programmers to actually look at what they're doing and consider what it will do in the long run, and that builds good coding habits. I think the majority of people who dismiss C as "too hard" are coming from Java programming. C gives you a lot of power, but, as the clichéd saying goes, "with great power comes great responsibility".

C is going to stay around for a long time in embedded systems. In this environment many microcontrollers still have 4k or less of RAM, and cost less than a postage stamp. In these systems there is virtually no abstraction. You write directly to hardware registers, and typically don't use any dynamically allocated memory. You use C because, assuming you understand the instruction set, you can pretty much predict what assembly instructions it's going to generate and create very efficient code, without the hassle of writing assembly. Additionally, your code is portable for unit testing or, to a lesser degree, to other microcontrollers. This lets you write a program that will run in 3.2k of RAM rather than 4k, which allows a manufacturer to save 4 cents on the microcontroller they pick. That saves you $40,000 when you're making a million of something.

Tons of people love to have something to hate. It might be because they don't like something about it...but I think it's mostly because people like to set up communities held together by rhetoric against a tool or technology perceived and portrayed as an enemy.

"C++ sucks. We are at war with C++. We have always been at war with C++.[1]"

Swap out "C++" for whatever language you like.

Certainly there are going to be cases and scenarios where C is preferable over C++, where C++ is preferable over C, or where Brainfuck is preferable over either. Use the right tool for the right job,[2] and the right tool is whichever allows you to most effectively achieve your goals.

[1] Or, at least since its inception. Or since [insert arbitrary date here].[3][2] For whoever asks "what's the right job for Brainfuck?"... just wait. Someone will eventually come along and get modded +2 Funny when they reply to you.[3] I see what you'll do there, Mr. Connor.

Although it's possible, and has been done, I have a hard time thinking of good reasons to write drivers in higher-level or interpreted languages. Besides, most kernels are written in C, so it makes sense to write drivers in C. When someone trots out a kernel in Python, then I'll jump on the systems-work-in-Python bandwagon.

C is a very simple language, and yet it allows operating on memory directly in a way similar to assembler. C is portable (well, it can be compiled for different platforms). I rather enjoyed the language a while back, but since about '98 I've used Java for most big development, and I am pretty sure that if I had had to do everything in C that I did in Java, it would have taken me much more time.

C is nice in another way: it allows you, and in some sense forces you, to understand the machine in a more intimate way. You end up using parts of the machine (registers, memory addresses); you are more physically connected to the hardware. Java is a high-level abstraction, but then again, to write that much logic it is just better to do it at a higher level, where you don't have to think about the machine.

C is a great tool to control the machine; Java is a great tool to build business applications (I am mostly talking about the back end, but sometimes the front end too).

So I like C, I used to code in it a lot, but I rarely use it nowadays. What's to love about it? Applications written in it can be closely integrated with the specific OS; you can really use the underlying architecture, and talk not just to the CPU but also to the GPU (CUDA). If I wanted to use the GPU from a Java application, I'd probably end up compiling some C code and writing a JNI bridge to it.

C teaches you to think about the machine. I think it gives you an important understanding, and it's faster to develop in than assembler.

Not just control, but predictability. C does exactly what you tell it to do, and you can see what it is doing more easily than in other languages. You can tell C++ to do 'X', but it might slip in a 'Y' and 'Z' without telling you.

Though beloved to some, C is a language that many choose to hate. The mass opinion is indeed so negative it's hard to believe that anybody would program anything in C.

The masses to which you refer are idiots. C is great. It lets you do what you want, how you want. True, you're afforded enough programming rope to easily hang yourself, but you learn not to, and while most things can be more easily done in higher-level languages (you'll have to pry Perl from my cold, dead hands), many things can only be done in languages like C or its derivatives. C is one of those languages that separates the adults from the kids, so put on your big-boy pants, stop whinging about it and step up.

The mass opinion is indeed so negative it's hard to believe that anybody would program anything in C.

Huh? What mass opinion? Where's the evidence for this?

Pick the right tool for the job. C is the right tool for some jobs, specifically jobs like writing drivers or operating systems.

Historically, C won by having an innovative syntax for pointers, which a lot of people liked, and it also won by being a small language that was easy to implement. Because it was small and easy to implement, it ended up being widely available. Ca. 1980, the joke was that C was like masturbation: it might not be what you really want, but it's always available. A lot of people in 2012 may not realize that in the era when C was winning popularity, people didn't usually have access to free compilers, and for many types of hardware (e.g., 8-bit desktops like the TRS-80), there simply weren't any good development tools. Another big win for C was that because it was so widely available, it became easy to find programmers who could code in it; it fed on its own success in a positive feedback loop. This is why languages like Java adopted C-like syntax -- they wanted to ride the coattails of C.

IMO the biggest problems have been when people started to use C for tasks for which it wasn't the right tool. It started creeping up into higher-level applications, where it wasn't really appropriate. This became particularly problematic with the rise of the internet. Networks used to be small and run by people with whom you had personal contact, so nobody really cared about the kind of buffer-overflow vulnerabilities that C is prone to. The attitude was that if you gave a program crazy input that caused it to crash, well, what was the big deal? You crashed the program, and you were only hurting yourself.

I am not claiming it is a language without warts, but I challenge anyone who modded the parent post up to provide a coherent argument as to why C++ is bloated, and what features you could therefore remove without detracting from the effectiveness of the language.

I'm as tired of single-language zealots as I am of single-issue zealots in politics. It's a repetition of the old saw: "When the only tool you have is a hammer, everything starts looking like a nail." C has its applications. C++ has its applications. Perl has its applications. FORTRAN (remember that language?) has its applications. And so on down the list.

The fact is, a true professional doesn't have a single tool, he has a whole toolbox full of tools from which to select one, or two, or three, or more to get a particular job done. Look at your auto mechanic: he doesn't try to use a screwdriver when a torque wrench is called for. Look at the Web developer: he doesn't try to write C code to do Web sites. And no one in their right mind would write the heart of an Internet router in C++ or PHP or (shudder) COBOL. The tool has to be matched to the job.

Sometimes you do the job multiple times, once in an easy-to-debug language to get your algorithms down and your corner cases identified, then a second pass in a language that lets you get closer to the nuts and bolts -- hey, you already have the high-level stuff debugged.

And then you have people who are more comfortable creating tools to get the job done. I don't know how many times I've written a lex/yacc package to generate exactly what I need from a higher-level description... or to give a customer scripting capability suited to the particular task. I call it part of layered programming: using multiple languages in a project to get the job done right, and in a way that can be maintained.

Finally, programming style helps reduce mistakes, as do good development tools like IDEs that do syntax highlighting.

OK, every language has its shortcomings. Even specific implementations of a language will drive you up the wall with its mysteries. But that's part of matching the language to the job.

I'll grant you that string handling in C sucks. It's part of the charm, though, for some projects, because you don't have to worry about the run-time kicking in to do garbage collection at points in your code where timing is critical. But if the job is text processing without real-time constraints, C is one of the worst choices for a tool. So don't use it for that. Use Perl, or Python, or PHP, or any number of other languages where string handling is a first-class citizen. (For a price. For a price.)

That's the difference between a true professional and a one-hit wonder: the former knows his tools, how to use them, and when to use them.

Lots of people hate manual transmissions in cars, too. That doesn't mean there isn't a place for them. I bought a manual transmission truck for the same reason I use C: it lets me get more performance out of lesser hardware, gives me more control, and it's just plain fun to work with.

Some of us were developers at one point in our careers, some of us deal with tech that is extremely close to having to occasionally jump into source code, and some of us are hobbyists who just do it for the hell of it.

Personally, I dabble in it once in awhile, but cannot stand to do it professionally. Why? Because I'd rather dream of naked nubile young ladies (grits optional) than to spend all my sleeping hours mentally untangling someone else's poorly-built code (or worse, my own occasional bork-ups).

That, and it's kind of nice to not have to keep a laptop near the bed anymore. :)

My experience is that the C guys (and gals) often write better Java than the Java guys write Java. Programmers who have never written in C (and/or assembly) often have a poor understanding of how computers actually work.

I totally agree. My point was that C promotes good habits. It's absolutely not a language made for large projects. I suspect that a lot of this is because large projects of that type really didn't exist back when C was being developed.

That, and because of it, you can easily import a C library into any language without having to worry about which compiler was used, or bullshit like that.

I have a C library. I can dynamically import it into C, Python, Java or C# fairly easily, on any platform; I don't even have to recompile the library or have the same compiler. Want to do that with any other language (including C++)? You are in for some pain and suffering.

What are value templates? If you mean things templated on integers, then I respectfully disagree. I use some very nice small-vector/matrix linear algebra libraries which would be pale shadows of themselves without templating on integers.

*Diamond inheritance (just make it illegal)

Is it a big problem? I think I had a design once where it made sense, but I can't for the life of me remember even what the domain was.

The trouble with interfaces is that you often end up having to duplicate code because you can't pull in extra stuff like you can with inheritance.

*The entire algorithms part of the STL.

No way! I, for one, like things like sort (and related), heap, nth_element (one of the very few language standard libraries with this O(n) algorithm!), permute, random_shuffle, binary_search (and related), set_union (and related), min/max, min/max_element, equal, swap, and a few others.

for_each is pretty useless, annoying and unclear.

Many of the others are generally a bit more useful now with lambdas.

The algorithms part of the STL is one of my favourite things, and something I really miss when going to other platforms.

Kill either structs or classes.

I think that's pretty harmless as these things go. For program structure, I use class, since I want everything private by default (for encapsulation). For template programming, the public-by-default struct is useful, since one often has many tiny structs.

It's a small oddity, but it adds maybe 5 or 6 lines to the compiler.

*The iostream libraries. I don't think I've ever seen code that didn't say fuck that and just use C style stdio. They're clunky.

Well, I like the type safety of the C++ library. The formatting stinks and is really painful to use. I ended up writing/finding a few helpers, e.g. ones that use printf style formatting (but with safety) where necessary.

I've actually got pretty used to it. I find having the variables inline in the positions they appear in the output now easier to follow than jumping between a format string and list of variables.

I agree that other languages are more bloated, but I think that if you removed any feature you would lose something significant.

One of the best features about C++ is that you can define your own subset that you'll use. Don't want to use RTTI? Then don't! Worried about exception handling? Then don't use them! This is why C++ can never be slower than C -- in the end, just reduce that subset until you get to C and you're done. Beware, though, some of the features you just left out might actually improve performance over the C equivalent!

However, this subset has to be defined on a project basis, and not left for individual programmers to decide for their own code.

Operator overloading and templates are an abomination, and don't exist in Objective-C.

What's wrong with operator overloading? What if you happen to want a vector or matrix class, or a complex number class? Seems to be a limitation.

But, the nice thing about C++ is RAII. Create an object on the stack, and the constructor is called. The function ends, the object goes out of scope, and the destructor is called. Useful for locks. You can also implement timers and resource counting, and if your class is well behaved, you don't need to worry about resources.

Oh, no, it's not a mystery. The animosity comes from ignorance and lack of ability. Many newer programmers have never programmed in assembly language or C, and would not be able to build a computer out of logic chips. This same demographic has learned Java and believes that HashMap is a magic O(1) thing you can just smear onto any algorithm.
C was initially popular among people who were, in fact, able to build computers out of 7400-series logic, or even transistors, if need be.
In other words, the animosity comes from those who aren't qualified to judge.

Ehm... auditing dynamic languages at the level required for driver-level security is nearly impossible. No, there are no dangling pointers. What there are is race conditions between the cleanup of the casually discarded memory item and its availability for re-use. Churn in the underlying VM can change the consistency of lots of such behaviors without your code ever changing, so there's another headache. Add to that that almost all drivers for anything beyond a USB gadget are, mandatorily, going to have to do some pointer arithmetic, deal with endian-swapped values, and deal with the differences between signed and unsigned fixed-width integer values, and the very idea of using a "modern" language for drivers threatens the sanity of anyone who has ever written a production-worthy driver.

Value templates- yes, templating on the value, rather than the type. It's an extremely niche use case that causes difficult to read code and provides little in the way of benefits. I doubt even 5% of the user base knows it exists. Kill it.

Diamond inheritance- it isn't used much, but when it is, it's a problem, because you don't know when the common base class will be initialized, or which constructor will be called with what values. Make it illegal to solve those problems (or add it to the specification)

STL algorithms- sorry, totally disagree. The entire thing is a horrible hack based on using constructors to make quasi-first order functions. They were never meant to work like that. At best it's confusing, at worst it's unreadable and hard to debug. I bounce any code review I see using them. Worse, it encourages people to make more code like that, which tends to be done worse and be even more unreadable and frequently just wrong (generic programming has a lot of gotchas). I'd rather see the whole thing killed.

Structs vs class- it is pretty harmless, but we're talking bloat. It's bloat. There's no reason to have both. I wouldn't say this is something that has to be fixed, but it's silly to have the two of them.

Yup, I stand by my original opinion- all of these could be cut with no real loss to the language. Then again, if I were writing my own language I'd take C, add constructors, destructors, and C++ style syntax for member functions, add container classes, and call it perfect.

Quite a lot of languages have problems with this. I would say that 100% of the current popular languages can't do this in the standard. So it's not necessarily a valid criticism of C as a standardized language or as an abstract machine-independent language. Individual compilers though typically have a set of pragmas or attributes to handle machine specific details.

One of the best features about C++ is that you can define your own subset that you'll use.

One of the worst features about C++ fanboys is that if their subset doesn't match your subset, you get subset holy wars with "no true Scotsman" fallacies all over: "no true C++ program uses <cstdio>".

Worried about exception handling? Then don't use them!

The entire STL uses new, which throws std::bad_alloc on failure, instead of new(std::nothrow), which returns NULL like old-school std::calloc. So how would one handle out-of-memory conditions on an embedded system with no swap file without A. giving up the containers and algorithms of the STL or B. reimplementing the whole STL yourself so it doesn't throw std::bad_alloc?

I think the problem these days is an entire generation of coders who have never in their lives experienced a UI so responsive that it was impossible to perceive a delay between a keystroke and its effect. They've been raised entirely on laggy Windows textboxes and mouse clicks that just might get around to doing something anytime now.

They don't even know how slow their kit is running. They've never seen one run fast. So they consider similar results satisfactory when they write code.

Nobody uses everything in C++; I estimate that most programmers only ever use 75% of the language. The problem is that everybody uses a different 75%. For instance, diamond inheritance can be a pain, but it is occasionally unavoidable and I am glad it works. STL algorithms are the best part of C++; complex problems reduce down to a few lines of code.

Your one example that is actually bloated is iostreams, which is slow and overkill for almost any program. I wish more C++ text books would ignore iostreams and spend more time on STL.

Macros aren't really outdated. Maybe 95% of their uses can be replaced with enums or inline functions, but there are also times where textual substitution can do useful things.

Templates as a whole are overused and lead to problems, and in particular tend not to co-exist with object-oriented styles. Small simple templates I like, but I see them used to completely replicate a data structure, which leads to bloat. And the bloat in C++ is not bloat in features as you sort of imply, but bloat in the immense size of the executable. Templates really are just a style of textual substitution, except that unlike C macros this is hidden from programmers, who often don't realize how much extra space templates can take up if they're not careful. I don't think there are any compilers yet smart enough to determine when two template instantiations need only one copy at run time, and you end up with standard "libraries" where most of the code is in a header file and recompiled often. To be fair, you can use templates to do some nice libraries (i.e., data structures store void* and the templates just do the type-safety checks for you before casting), except that this is contrary to the STL style. I used to like the Rogue Wave libraries myself.

You need structs in C++ to be compatible with C. One of C++'s goals is to compile proper C programs and keep compatibility where possible. But if you got rid of the class keyword instead you'd have a revolt. So the typical style almost everyone uses is that struct is used as a plain C struct only, and if you need anything beyond what C can do then you use a class instead. What C++ should have done I think is to enforce that style.

I agree, iostream is awful. Originally it wasn't too bad in some ways and early on it did a good job of adding some extra type safety to IO. But it always felt like a bit of a kludge. It is nice to have a more generic output destination like a stream buffer but it came saddled with the operator overloading and you needed to know implementation details to do a lot of useful stuff.

Diamond inheritance is a side effect of C++ never being able to say no. It should have just said "no multiple inheritance". Once it did have multiple inheritance, it should have disallowed the cross-pollination and stuck to a style where multiple inheritance allowed only mixins or interfaces. But they didn't want to say "no" and figured they could use a keyword so that the user could do whatever they wanted. Which resulted in a lot of confusion.

Actually Ruby is a relatively simple language. A good job of making a textual style of Smalltalk (only with a horrible kludge of blocks). The bloated part may be the Ruby on Rails which is not the same thing at all. Ignoring libraries and looking just at the syntax/semantics of the language then Ruby is much simpler than C++.

Yes, we've all heard that refrain: "managed code means no memory leaks"; it's very last-century. Some of us on /. do this coding thing for a living, you know. That's not what I was talking about.

There are memory leaks in the JVM itself. There are memory leaks in the CLR itself. Trust me, it really sucks to be stuck with one of those! Very hard to find and work around whatever's leaking, and you have no leverage at all over the vendor to get the leak fixed.

And, you know, I never had a memory leak in my own code in nearly 20 years of assembly and C++ coding; not because I'm some rock star, but because there are reliable techniques to avoid them. It's harder in C: with the need to match acquire and release in every function, there will always be some human error creeping in. But with good coding standards it's a trivial problem with modern tools.

I don't think I am. Take a survey of average C++ programmers who don't work directly with you on that type of code. See how many of them know they can do that. I actually think 5% is an overestimate. And if that small a percentage actually knows about and uses a feature, I question how much that feature is worth. Especially when it at best offers a minuscule speed boost over just passing the number in as a parameter.

Several points:

Expert-friendly language features enable the creation of better libraries. Library users might not even realise that they are using the advanced features. That doesn't make the advanced features worthless.

Having fixed-size linear algebra objects yields massive speed increases, not tiny ones, since the compiler is able to use stack allocation (which is essentially free), which is vastly faster than malloc/free. Secondly, the optimizer seems much better able to reason about stack-allocated objects, giving another large speed increase.

At best, the speed increase is well over a factor of 10, and I have verified this with benchmarks.

Functors are implemented by overriding the constructor of a class to perform an odd type of initialization and then overloading operator() to do computation.

I think functors just (in general) overload operator(). Nothing special is necessarily done with the constructor.

It's an ugly, difficult-to-understand hack. It completely destroys the idea of what a class is.

I don't see how: a class is an aggregation of methods and members. C++ allows you to use some OO features with classes if you like, but it isn't necessary. What would you prefer?

Can't say I've ever wanted to perform a set difference. But if I did, there'd be a method difference in the class Set, and it would take the second set as a parameter and spit out the result.

OK, so how do you store elements of the set?

What I'm driving at is that in C++, set_difference etc. work on any ordered collection of things. The advantage is that you can choose the container for the collection based on the algorithmic properties you desire (array, RB-tree, linked list, etc.).

Same with a sort- the class would have a sort function. I would reluctantly not bounce using the sort function of the STL since it's so useful, but it's still not the right way of doing things. And it's much more complex than it should be, since the calling code has to worry about things like passing in comparators, when that should really be the job of the sort function.

So, every container class needs to have its own sort function? If so, then they would likely share a lot of common code. Is that what you propose?

Why on earth would the vendor's compiler for that platform provide a standard library with full support for disk based files, POSIX locales, and all sorts of other obscure features when the target is a tiny micro?

The C and C++ standard library in glibc and libstdc++ are both huge to deal with a lot of odd stuff specific to general purpose POSIX class computers.

You wouldn't use the stock gcc iostreams, or the stock glibc on an atmel.

Having recently had to learn TI assembly to do things on some strangely designed DSP boards, I really do think it should be more widely taught. Memory management takes on a whole new meaning once you've had to fit your program (data acquisition and FFTs) and data into one shared 10kword RAM block which has some hardware-reserved blocks and various alignment requirements. I had plenty of conceptual understanding of how computers worked beforehand, but now things like shared busses, registers, stacks, addressing, interrupts, etc. have much more direct meaning to me, and I feel a lot more grounded even when coding in higher-level languages.

...Not to mention I feel grateful now that I don't often have to use ASM.

It's clear that you don't really get a lot of this stuff. Despite that, you seem to be in some position where you can review and reject other people's code.
Your remarks would be hilarious if I didn't think you were serious. For example:

Can't say I've ever wanted to perform a set difference. But if I did, there'd be a method difference in the class Set, and it would take the second set as a parameter and spit out the result.

The whole point about generic algorithms is that you only have to write them once and can then use them with all sorts of containers, including ones that might not have been written yet, as long as the containers satisfy the minimal requirements of the algorithm. So for example, the 'set' in set_difference does not refer to the container type - it is a description of what the algorithm does. The algorithm does not demand a set; you can equally apply it to a sorted vector. Furthermore, the two input sequences to set_difference do not even have to be the same type as long as their elements are compatible, so I can apply it to a set of strings and a sorted vector of strings if I want to. By your argument, I would have to have a set class with a difference method, and a sorted vector class with a difference method. And then if I wanted set's difference method to work with sorted vectors and other compatible sequences, how would that work? I would have to write it as some sort of generic member function anyway.

Same with a sort- the class would have a sort function. I would reluctantly not bounce using the sort function of the STL since it's so useful, but it's still not the right way of doing things. And it's much more complex than it should be, since the calling code has to worry about things like passing in comparators, when that should really be the job of the sort function.

So what you are saying is that instead of having a sort algorithm implemented once, I need to reimplement that algorithm in every class that I might want to sort. So either I guess that I might want to sort it at the time of writing it or, if I didn't get that right, I have to go back and modify the class. Compare that with the non-intrusive sort algorithm. How is what you are proposing good software engineering practice by any stretch of the imagination? And I don't understand your point about comparators. In most cases a type you want to sort probably defines a less-than operator, which is all you need, and you don't need to provide an explicit comparator. It's only when you need to do something special that you need a comparator. How would the sort member function be better?

Here's a hint: go and look up the word 'orthogonal'. It's a key concept in understanding the STL.

No, we stick with C because it is a great middle ground between assembly and high level languages. I would not want to write Python or Java on little microcontrollers. C is a small enough language that lets you write complex code relatively easily while staying close to the hardware.

C's got its warts, it's true. It's a mature language that leaves the programmer in control of the system. It's not supposed to do the fancy things such as garbage collection, object management and so on. I'm glad it doesn't. There are other languages for that.