Rui Lopes writes "After a 5-year hiatus, the IOCCC (International Obfuscated C Code Contest) is back! This marks the 20th edition of the contest. Submissions are open between 12-Nov-2011 11:00 UTC and 12-Jan-2012 12:12 UTC. Don't forget to check this year's rules and guidelines."

Most C coders seem to achieve obfuscation without any additional incentive.

Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed come through practice with the language, not from the ego of its authors.

+1, it all started going downhill when:

- professional language designers abdicated their role, and the void was filled by amateurs
- people who use these languages have no fucking clue what they're doing, and we're all paying the price
- corporations hyped languages for their own purposes, and languages stagnated or, worse, were crapified to an absurd level (witness Java).

professional language designers abdicated their role, and the void was filled by amateurs

I'm not sure how you define a "professional language designer", but I don't think Ritchie was one either way. That's precisely why there are a lot of messy things about the original design of C, such as its declarator syntax, or mixed signed/unsigned arithmetic rules, or implicit int. Some of it, I believe, comes from having the language designed as its compiler was written, with the design tweaked so that it would be easier to implement - it's a hacker's pragmatic approach to making a tool that's needed for the job at hand.

How about stuff that needs to be rewritten from scratch because a target platform can't run C? This is true of the web (or at least it was until Emscripten), and it's still true of Xbox Live Indie Games and Windows Phone 7.

I would hope that no one who actually knows what they're doing would ever create a type with the name __MfxVge__, because symbols starting with a leading double underscore are reserved for use by the implementation. You knew that right? Right? Obviously, because you know how to write comments and spell out complete words.
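For the curious, a minimal sketch of the rule being invoked here (C99 7.1.3): identifiers beginning with a double underscore, or an underscore followed by an uppercase letter, are reserved for the implementation. The __MfxVge__ name comes from the comment above; mfx_vge is a made-up replacement.

    struct __MfxVge__ { int field; };  /* risky: lives in the implementation's reserved namespace */
    struct mfx_vge   { int field; };   /* fine: an ordinary identifier the program owns */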

Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year.

This clearly shows you simply don't understand the problem. A good programmer can (and does) write well structured, clean, DOCUMENTED and maintainable product in any language. The issue has nothing to do with the language used and everything to do with lack of discipline, inexperience and a slapdash and unprofessional attitude. Usually the worst programmers are the ones who think that once the code is written and compiles clean, the job is done. For most of these people there is little hope of educating them as they are incapable of seeing the bigger picture.

Yes, a true communicator switches to any of Earth's languages at will and celebrates the variety, eagerly perfecting his ability in any new language which some committee or group of enthusiasts recently invented. This is a realistic and good use of the copious time every human has available: the sugary topping has always been more important than the meal below.

Sounds great, but all that nice sounding theory doesn't apply in practice. For example, C and C++ in particular are languages that started out "simple" but became quagmires over time. It's impossible to write portable C/C++ code that meets your requirements of "well structured and clean".

Haven't you noticed how every cross-platform C/C++ library starts out with pages and pages of "MY_LIBRARY_INT32" and "MY_LIBRARY_EXPORT" and other redefinitions of "standard" types, keywords, and functions? That's because C is a badly designed language where the behaviour and/or availability of even basic language keywords like "int" is a crap shoot that depends on the compiler and the target processor type.

Multi-threaded programming is particularly easy in those languages, because a lot of their internals are inherently thread-safe. For example, strings are read-only, so they can be passed around risk-free. Similarly, mark & sweep garbage collection is thread-safe, and doesn't suffer from the rare but complex-to-debug memory leaks that occur with reference counting.

java.util.concurrent.atomic is a perfect example of why Java is not a viable choice for the work I'm doing. One of the tasks I currently have to handle is multiprocess disjoint set construction (using the wait-free union-find algorithm), on a very large corpus. This algorithm requires each disjoint set tree node to contain two fields: a reference to its superset, and a rank counter. In Java, the only choice I have is to use an array of AtomicStampedReference<V>, which will always occupy at least two p

Haven't you noticed how every cross-platform C/C++ library starts out with pages and pages of "MY_LIBRARY_INT32" and "MY_LIBRARY_EXPORT" and other redefinitions of "standard" types, keywords, and functions? That's because C is a badly designed language where the behaviour and/or availability of even basic language keywords like "int" is a crap shoot that depends on the compiler and the target processor type.

Well, duh. That's the entire point of C. You're complaining that it's a language close to the metal.
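For readers who haven't seen one of these preambles, here is a sketch of the pattern under discussion. The MY_LIBRARY_* names come from the comment above; the rest is a plausible reconstruction, not any particular library's code, though the MSVC and GCC spellings are real:

    #ifndef MY_LIBRARY_PORTABLE_H
    #define MY_LIBRARY_PORTABLE_H

    #if defined(_MSC_VER)
      typedef __int32 MY_LIBRARY_INT32;               /* MSVC's exact-width spelling */
      #define MY_LIBRARY_EXPORT __declspec(dllexport) /* Windows DLL export */
    #else
      #include <stdint.h>
      typedef int32_t MY_LIBRARY_INT32;               /* C99 spelling */
      #define MY_LIBRARY_EXPORT __attribute__((visibility("default")))
    #endif

    #endif /* MY_LIBRARY_PORTABLE_H */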

But it can't be used to represent characters, because Unicode requires at least 16 bits for the character type.

Never heard of UTF-8 then, I take it? That works fine in regular old char arrays... (Though, to be fair, it introduces other issues, such as you can no longer depend on the length of the string being the number of characters in it...)
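A minimal sketch of both halves of that point, assuming a UTF-8 execution environment: the string fits in a plain char array, but strlen counts bytes rather than characters.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        /* "h\xc3\xa9llo" is "héllo": the é occupies two bytes in UTF-8. */
        const char *s = "h\xc3\xa9llo";
        printf("bytes: %zu\n", strlen(s));  /* prints 6, though there are 5 characters */
        return 0;
    }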

You're making basically the same argument as people were saying back when machine code was what people wrote and C was new. If you have an open mind, you can easily see that C has serious shortcomings by modern language standards.

C offers no abstractions for complex data types. It offers no subtyping. There's no facility for generic programming other than macros, which everyone knows suck. No support for closures or comprehensions. None of these things are "trivially implemented", as you state. Even its syntax sucks, as anyone who's tried to declare a non-trivial function pointer would agree.
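On the function-pointer complaint, the canonical offender is the declaration of signal() from the standard library; a typedef tames it (handler_fn and set_handler are made-up names for illustration):

    #include <signal.h>

    /* signal() takes an int and a pointer to a function (int -> void),
       and returns that same pointer-to-function type: */
    void (*signal(int sig, void (*handler)(int)))(int);

    /* The typedef version reads far more easily: */
    typedef void (*handler_fn)(int);
    handler_fn set_handler(int sig, handler_fn h);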

Many common programming tasks require extensive pointer manipulation in C. Even the best programmers (I'm one of them, and I concede this point) make occasional mistakes with pointers, and they are the worst kind of bug: silently incorrect, or a crash at a random place in the code.

C is perfectly appropriate for some projects, especially with really low-level code (as most C constructs translate directly to assembly). C++ is usually better, as it has a richer typing system and ability to do generic programming, but you need to be an expert as the language is full of pitfalls (which are mostly C's fault). For projects that don't need to be close to the hardware, scripting languages can multiply programmer productivity.

If you occasionally make mistakes with pointers, it is 10,000 times better than what an inexperienced C# developer can do with all the neat language features. You simply have no idea how obfuscated his code can become, and how much it makes you reconsider the repeal of the death penalty...

Many common programming tasks require extensive pointer manipulation in C. Even the best programmers (I'm one of them, and I concede this point)

I'm seriously doubting your professed skill here. You don't ever have to do pointer arithmetic in C, unless you are counting parameter passing as 'extensive pointer manipulation,' but you pass parameters as pointers in Java too (that's why you can get an NPE). The most common use of pointer arithmetic is for array processing, but if you want to be safe you can just use the array[] notation and not worry about understanding pointer arithmetic (I usually do, unless I have a compelling need to use pointer arithmetic). Furthermore I don't even know what you are talking about when you say, "non-trivial function pointer." Aren't all function pointers the same, just a bunch of parameters and a return value? Or are you declaring an array of function pointers or something? That might be where your problems are coming from.
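For concreteness, a short sketch of the two styles being contrasted here; since a[i] is defined as *(a + i), a compiler treats these two functions identically:

    #include <stddef.h>

    long sum_indexed(const int *a, size_t n) {
        long total = 0;
        for (size_t i = 0; i < n; i++)
            total += a[i];               /* readable array notation */
        return total;
    }

    long sum_pointer(const int *a, size_t n) {
        long total = 0;
        for (const int *p = a; p < a + n; p++)
            total += *p;                 /* explicit pointer arithmetic */
        return total;
    }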

From experience I can say by far the thing that takes the most extra time when I am writing in C (compared to Java) is the lack of a good library, with common data structures like hash tables and lists and regular expressions. The number of times I've had to write a generic list library for some random platform, or figure out someone else's nonstandard implementation, is depressing.

Also, the generics in Java are a double-edged sword. They allow more flexibility, but they let you get away with writing incredibly confusing code that can be extremely difficult to understand without a debugger. C code (really) tends to be a lot more readable. The flip side is that it's usually a lot easier to refactor Java code without needing to rewrite a lot of interfaces (even when the interfaces were poorly written in the first place).

Ultimately though, a good programmer will write good code in any language. A poor programmer will likewise write poor code in any language.

Also, the generics in Java are a double-edged sword. They allow more flexibility, but they let you get away with writing incredibly confusing code that can be extremely difficult to understand without a debugger.

The design of generics in Java is flawed in more than one way, but can you give an example of "confusing code that can be extremely difficult to understand", especially "without a debugger" - that last part sounds completely nonsensical to me since Java generics are purely compile-time; they don't have any runtime variability by design (due to erasure).

Anyway, there are many better examples of generics done right. My personal favorite are OCaml functors, which could be easily slapped on top of C with only m

Java is NOT an upgrade to C++. There was a fork in the road, to the left went C++ to the right went Java. C++ took you through a swamp filled with poisonous snakes, quicksand and man eating spiders. Java took you through a haunted forest, with werewolves and zombies.

No matter which path you took, you died before reaching your goal.

Java's got its weaknesses, but there's nothing seriously wrong with the language. The problem is the libraries, especially the enterprise libraries, and I mean both the EJB monstrosity and the consultant hell that passes for lightweight. Misuse of the idea of design patterns is to blame. Let's look for ways to tack on another layer of abstraction we'll never actually use, shall we?

And it seems that you need a string builder class to help you manipulate strings. Sorry, but that tells me that you're doing strings wrong.

It's one of the tradeoffs that you make when you design what your strings look like. In C++, they are mutable, but they are "value types" in Java terms - i.e. copying them around actually copies the buffer, so when you pass a string to a function, it can freely mutate it since it has its own copy. But copying like that ain't cheap - it's why you have to pass by const reference where possible. You often can't return by reference, though; RVO (and, in C++11, move semantics) exist to mitigate that.

while some smart programmers think it's necessary to over-use the preprocessor

And that is ultimately my main beef with C: it's impossible to write non-trivial code that doesn't make use of the pre-processor. Header guards in 2011? Really? C either needs to make an Objective-C-like import statement standard, or else make #pragma once standard and the default, so that only in the rare case where you actually need to include a file more than once do you have to use a pre-processor command. I think the pre-processor is a really useful feature of C, but using it should never be essentially mandatory.
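For anyone who hasn't suffered through it, this is the ritual in question; FOO_H is an arbitrary, hopefully-unique guard name:

    /* foo.h */
    #ifndef FOO_H
    #define FOO_H

    int foo(int x);

    /* The widely supported but non-standard alternative is a single
       line at the top of the header instead of the guard:
           #pragma once
    */

    #endif /* FOO_H */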

It's "broken" in a sense that, if you try real hard to break it, you can find ridiculous corner cases that can do that. In the meantime, there are millions of line of C and C++ code written using it, which compile and work just fine.

Yeah, Python sure flooded the marketplace in the past year. Now, if you'll excuse me, I've got to check the breaking news about the Lewinsky scandal after buying some hot dot-com stocks while on the way to work at the World Trade Center because apparently it's the late nineties again somehow.

You can sigh all you want. Lousy programmers will write lousy code whether it is open source or not, whether it is C or not. I know; I have had to maintain it (and certainly written it at some time). Just sprinkle ten or so unnamed constants in your code for some database offsets and you are in for a few entertaining hours/days of repurposing.

It is called playing to your strengths. Producing well-structured, well-documented, clean and correct code in C would be quite a challenge, and even then it could never approach some of the new languages in those terms.

Most C coders seem to achieve obfuscation without any additional incentive.

You got it wrong: bad coders create bad code. Good coders know how to create good code. In any language.

When someone knows C well enough to create a truly obfuscated or compressed piece of portable C code that follows the rules of the language to a T, i.e. that can be compiled with strict settings or linted, and wins the IOCCC, it's a very good sign that this someone can create excellent C code.

I should know: I won the IOCCC years ago, and cited it many times in my resume. When would-be employers asked me "what's the IOCCC?", I knew they weren't going to be good employers. When they told me "oh, I see you won the IOCCC", they knew I could code good C, and I knew they grokked what I did. Winning the IOCCC helped me land a job a few times.

While your code may be technically correct, compile, and do what it's intended to do, that does not make it good code. It just makes it code that works.

Look at the IOCCC examples posted on Wikipedia. If the average programmer (i.e. your coworker) needs to spend more time puzzling over the extra whitespace and the syntactic monstrosity that comprises the competition than over the problem being solved, then your code design sucks and you've ended up causing more headaches with your "good" code.

I'm not very familiar with this competition but that seems to be the very point. The winning code should be almost impossible to understand unless you are very good. This isn't good code in the traditional sense but in an ironic sense.

Sure, that could be nice as well, but the IOCCC provides great challenges and puzzles, something that a clean code contest wouldn't. And what would you rather see in your newspaper: difficult puzzles or easy ones? Or, for the youngsters here: would you rather play Wordfeud, or type the answer to 1 + 1 over and over again?

Besides that, the IOCCC entries contain mostly well structured and correct code, and afterwards they get documented as well. It's just not readable.
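For anyone who hasn't looked at the entries, here is a deliberately mild, made-up taste of "correct but not readable" (nowhere near contest grade): it compiles, prints "hi", and abuses printf's return value to produce the exit status.

    #include <stdio.h>
    int main(void){return 0*printf("%c%c\n",'h','i');}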

Pardon me, but are you serious? Claiming that code is clean (or correct) because it compiles to a small executable doesn't make it so. The demo scene prides itself on small executables and optimizes for this. Such optimizations are rarely the product of clean and correct code but rather hand-crafted dark compiler (or assembler) magic.

Let me tell you, you don't get to 96k by writing clean code. You get there by writing utter unholy messes, and you get there by cheating like hell, and you get there by using every dirty trick in the book.

Also, you often do it in the week or so before the compo, and continue right up to the deadline, in the party hall, and you do it knowing you will never have to maintain or look at that code ever again after you hand it in.

If you think demoscene code is "clean", you have absolutely zero experience with it.

The entries for the IOCCC can show a lot of cleverness, but nobody in their right mind would accept such code. The beauty of the Underhanded C Contest entries is that the code looks reasonable, but does extremely undesirable things.

Call me paranoid, but this contest and the IOCCC are the reasons why I don't particularly let anything from s.e.l. touch my systems. I am not a good enough coder to be able to tell whether what it's doing is what it says it's doing, or something the CIA wants it to do...

Jim Gettys has a wonderful explanation of this effect in the X server. It turns out that with branch predictions and the relative speed of CPU vs. memory changing over the past decade, loop unrolling is pretty much pointless. In fact, by eliminating all instances of Duff's Device from the XFree86 4.0 server, the server shrunk in size by _half_ _a_ _megabyte_ (!!!), and was faster to boot, because the elimination of all that excess code meant that the X server wasn't thrashing the cache lines as much.
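For context, this is the construct the quote is talking about. A sketch of Duff's Device, lightly adapted (the 1983 original copied to a single memory-mapped output register; this variant increments both pointers, and assumes count > 0):

    void duff_copy(char *to, const char *from, int count) {
        int n = (count + 7) / 8;
        switch (count % 8) {        /* jump into the middle of the unrolled loop */
        case 0: do { *to++ = *from++;
        case 7:      *to++ = *from++;
        case 6:      *to++ = *from++;
        case 5:      *to++ = *from++;
        case 4:      *to++ = *from++;
        case 3:      *to++ = *from++;
        case 2:      *to++ = *from++;
        case 1:      *to++ = *from++;
                } while (--n > 0);
        }
    }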

It's the main reason why C++ does well in microbenchmarks and does so much worse in real-world usage. It encourages a lot of inlining, which reduces branching but increases instruction cache usage. It's difficult to benchmark well, because instruction cache pressure changes over time depending on what else is happening with the system.

C++ also encourages object-oriented programming techniques, which spreads functional responsibility across multiple classes. This naturally leads to more, smaller functions that may also mess with the instruction cache.

Not really. Small functions that are called a lot are fine for icache usage. They're often better than big functions with some hot and some cold code paths. One of the optimisations that I'm currently working on is to move cold code paths out of big functions into a separate function so that they can reduce icache pressure.

I haven't ever used a C++ compiler which didn't have some way to control inlining

Yes, I know, I do work on a couple of C++ compilers...

Some even let you set hard limits on inlining levels or on resulting function size

... but it's always done at that kind of granularity. That's the problem. In every case, inlining can appear to be a locally correct optimisation. Every function that has something small inlined into it will become faster. You can measure that with microbenchmarks. The problem is that doing this everywhere grows total code size, and the resulting icache pressure only shows up in whole-program behaviour, not in the microbenchmark.

And it's not like C can't be inlined just as well

Not as easily. The C specification doesn't define ODR linkage, which makes creating inline functions more difficult. C++ templates get expanded in every compilation unit that uses them.

I wonder if FORTH would be particularly efficient on today's processors, for that reason? It is incredibly compact, with generally tiny functions nestled deeply. An entire FORTH system could sit in a modern cache.

I find all the "C sucks" comments to be both amusing and stupid. Without C code there would literally be no Internet. Every bit you are sending and receiving uses C. The two operating systems that represent 99.99% or more of the running computers that are online run C. Both Windows and Linux use the BSD TCP/IP stack.

If C did not get the job done for this kind of computing then it would have been replaced. The fact that C thrives in the systems programming domain is a tribute to its utility.

A proficient C coder can write clear, maintainable, efficient code that runs on many platforms. This requires both skill and practice. Not everyone is capable of doing this. It requires the ability to keep multiple competing abstractions in mind when coding. I think a lot of people try this and find it difficult and then blame the language. Those who persevere and learn this style of working can usually move on to other kinds of programming and also do excellent work.

Some problem domains require different languages and different skill sets. Personally, I like writing code where I know that if I were to look at the assembly code generated by the compiler I can see how it relates to the C code I wrote. I rarely do this, but it's good to know that I can if I want to. I'm not doing any C coding now, because I always use the language appropriate to the task. But I also know that my C coding skills give me a distinct advantage in solving difficult problems, no matter what they are.

Before I stir up any vitriol, I'm just kidding. I think C is under appreciated precisely because is provides only a thin abstraction that (hopefully) maps well to the target architecture, but otherwise stays out of the way. That is to say, when all you have is a hammer, you can easily shoot yourself in the foot.

Actually, I think I fucked that one up. The bit size is exact, but there is no guarantee that the typedef for a particular size exists, so technically you can't rely on uint16_t in your programs anyway.

The people on the standards committees are either saints, or abject evil scum. Maybe they're in a quantum superposition of both states, as long as you don't open the ISO report and take a look...

The reason why there's no guarantee is that there are actual real-world architectures on which it would be impossible to uphold such a guarantee (at least in a reasonably efficient way). E.g. SHARC [wikipedia.org] only has 32-bit words, so implementing an int16_t would be kinda tricky - sure, the compiler could do it with bitwise operations, but think about how a pointer to int16_t would look, and how a dereferencing operation would work.

In practice, if you need intN_t, you'll probably just assume that it's available, and
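A sketch of the portable approach, for the record: the exact-width typedefs are optional, but C99 makes the least-width types mandatory, and the presence of an exact-width type can be tested via its limit macro.

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        int_least16_t a = 12345;   /* at least 16 bits; always available */
    #ifdef INT16_MAX
        int16_t b = 123;           /* exactly 16 bits; only if the platform has it */
        printf("exact 16-bit type: %d\n", (int)b);
    #endif
        printf("%ld\n", (long)a);
        return 0;
    }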

- I think people who put the "*" of pointer syntax near the variable name and not the type name when declaring pointers should be shot. It should always be int* pointer_to_int, not int *pointer_to_int.

I'm sure my complaints are unwarranted except for the first point.

But that's backwards of what the compiler really does. Consider this:

int* p, q;

What types do p and q have? p is a pointer-to-int; q is an int. By putting the * next to the type name it makes it look like all the things are int*, but they're not. By putting the * with the type (which I did for my first year of C coding) you're making reading the code harder rather than easier. It'd be like writing

a = b * c+d;

and trying to convey that the '+' binds tighter since it doesn't have spaces. That's not what the compiler will do and writing it so only serves to confuse the reader.

In addition, what you see at declaration is representative (modulo the weirdness of array subscript and pointer dereference) of what you'd do to get the type. That is, int ***p means that you'd have to type ***p to get an int. *p means you'd need another ** to get an int, etc.
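Spelled out, the pitfall and the usual workarounds look like this (int_ptr is a made-up typedef name):

    int* p, q;            /* p is int*, q is plain int -- easy to misread */
    int *r, *s;           /* one * per name: both are pointers            */
    typedef int *int_ptr; /* or hide the * behind a typedef               */
    int_ptr t, u;         /* both pointers                                */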

When I write C, I put my splats next to the variable name too; it's just how I've always done it. I think the Pascal way is the best, where a pointer would be declared like this: p: ^integer;. This is very clear in my mind, as it reads "p is a pointer to an integer". int* p or int *p both read backwards to me; I just put the * next to the variable name because that's where it's going to be when I dereference the pointer, so I might as well be consistent in my declarations as well.

If style issues bother you, run your code through a styler after you receive code from your source-code management system and before you commit it back.

Really, style and content in C are as separate as they are in HTML and CSS. If you want a certain way of spacing things, generate a rule that turns everything into your style before you see it and converts back to whatever the agreed style is before you publish it for others. It really makes no difference to the compiler, only the programmer. And the programmer that can't e

This dates back to Algol 68. In Algol, the basic numeric data types were INT and REAL, and you could prepend an arbitrary number of SHORT or LONG to them. Any implementation was guaranteed to support SHORT INT and LONG INT as distinct data types, respectively smaller and larger than INT, but any extra prefixes were conditionally supported - it was always legal to write, but it was implementation-defined whether e.g. LONG LONG INT is larger than LONG INT, or the same. I think it's a fairly interesting convention.

Keyword: cognitive load. Case in point: the hilariously excruciating code example in the Linux man page for snprintf. If you need to jump through all these burning hoops to do something this mundane, imagine how much more your proficient C coder could achieve in a more sensible language with the same amount of effort.

A sensible C coder might use vasprintf instead of the example in that manpage. The fact that all the standard library functions aren't great for all (or sometimes any) use cases is hardly unique to C.
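For comparison, here is a sketch of how a sensible C coder can get vasprintf-like behaviour portably, using only C99's guarantee that vsnprintf(NULL, 0, ...) returns the needed length; xasprintf is a made-up helper name:

    #include <stdarg.h>
    #include <stdio.h>
    #include <stdlib.h>

    char *xasprintf(const char *fmt, ...) {
        va_list ap;
        va_start(ap, fmt);
        int n = vsnprintf(NULL, 0, fmt, ap);  /* measure */
        va_end(ap);
        if (n < 0) return NULL;

        char *buf = malloc((size_t)n + 1);
        if (!buf) return NULL;

        va_start(ap, fmt);
        vsnprintf(buf, (size_t)n + 1, fmt, ap);  /* format for real */
        va_end(ap);
        return buf;  /* caller frees */
    }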

Nonsense. Windows security flaws have historically been due to boneheaded design decisions. Windows was never meant to run as a node on the internet; that functionality was retrofitted when Bill decided to do his famous 180-degree about-turn on the computer highway because he'd missed the internet on-ramp. "Security" was retrofitted ten years later.

Problems with the C standard library certainly do exist and can expose security issues, but Windows security problems exist because the OS design emphasises user friendliness and backwards compatibility over tight protections.

It's like the inside of your home: you don't lock all your cupboards, drawers and doors - that would be painful. E.g. to walk from the kitchen to the living room you'd take out your key, unlock the door, open it, go through, shut it, and relock the door, each time. To make your home livable you keep it insecure, and that's how Windows was designed from the ground up.

But now suppose there's a magic internet wormhole that opens in your toilet room, and anybody can enter your house. Suddenly it makes sense to have locks on all the doors and cupboards etc, but it's too late. Windows + Internet = insecure.

Unix doesn't have this problem, because Unix was always designed as a hotel (multiuser OS) rather than a home. So there's locks on the rooms and the swimming pool needs an access card etc. If a wormhole opens in the hotel lobby or even in one of the guest rooms, there's limited access to most areas by design.

Wrong. On account of backwards compatibility, the available infrastructure was never seriously enforced, and on account of idiotic security flag combinations the options available did not promote real security.

XP SP2 was the first time a (laughable) effort was actually made to enforce some security. With SP2, enforcing security on sensitive API calls meant something trivial like inserting a pop-up dialog box to ask the user for confirmation. All you had to do to bypass it was send a button click message to the dialog.

Actually, during the creation of NT, Microsoft licensed a network stack from Spyder Systems, which was based on the BSD TCP/IP stack. Microsoft replaced this with its own stack for NT 3.5, which was the second version, and I believe that was the one that went into Windows 95. Some small userland utilities persisted after the stack was replaced, and who knows how much BSD code Microsoft's own network stack contains even to this day, seeing as we can't review Microsoft's source. Not that it really matters, a

Actually, during the creation of NT, Microsoft licensed a network stack from Spyder Systems, which was based on the BSD TCP/IP stack. Microsoft replaced this with its own stack for NT 3.5, which was the second version, and I believe that was the one that went into Windows 95.

More to the point, you can still see the heritage in the C API; it's recognizably similar throughout despite many other parts (e.g., file descriptors) being wildly different between Win and Unix. That's OK too. It means that Microsoft have a properly road-tested API in use. (The code itself may have gone, but that would be No Big Deal. While some C code really does survive for multiple decades, it's not really to be expected in any OS. APIs are much longer-lived.)

A number of quick points... Some people just don't know, so here are some practical speaking points...

-C has been around longer than most of the non-C programmers alive. That includes you people on this site, which has the smartest people, from one of the most divisive areas in the civil space: the "tech wars".

-D was such a better language... also, C++ because we never hear about C anymore.

-Java is on its way out, being deprecated by the largest company in the world, which also deprecated Flash (on mobile), which Adobe just acquiesced to, replaced by Google's new iteration. Maybe not in the next 5 years, but it can no longer grow... it will have to get smaller with less support.

-Objective-C, used by Apple Inc., the largest company in the world, is a wholly-compatible superset of (ANSI) C. There are no signs of change here. Big surprise, it's all the same hardware components, just in larger capacities, at faster rates, and smaller form-factor. C can't help us with the flux capacitor... but that has not been added to the standard CPU, memory, memory storage, etc. model.

-Google announced that Android will run a C-like-language in the native space that uses the CPU and GPU. Even with Dart coming our way...

-CUDA... C is relevant in other (all) GPU spaces and is the go-to guy, for the moment, to eke out more performance from a machine.

-And here is where the feelings get hurt: In college, I straddled the EE/CS line while being firmly EE. EEs learn C because it teaches them valuable things about the hardware, being a very light obfuscation. CS departments tend to concentrate on, well, anything else. Flavors of the year, interesting projects, etc. That is their place. My older brother went the CS route, 8 years before I got my turn and went EE. I admire him and his success greatly but I know, push came to shove, I can talk about certain topics without talking about garbage collectors and universal typing.

So, please, if you've never used C in any significant way, just don't comment. Listen. People, young and old, have something to tell you about the most significant programming language ever invented.

And to bring this all together: When you are trying to eke out CPU cycles so your 3D rendering is above 60 fps on that mobile device, you will know why closeness to hardware and C, in particular, may be your best friend. Or a C-like language...

Another way to look at it: People who know C and have worked with it can't just unknow it. They know what you non-C people know, but also have other experience. If MOST of them say C is indispensable, then how about you do the one thing some Tech Assholes never do: Take someone else's advice. And STFU.

Can we just talk about something else that is awesome and not caught up in this stupid argument?