Posted
by
Soulskill
on Friday May 11, 2012 @12:43PM
from the late-bloomers dept.

New submitter IdleThoughts writes "Sometimes it takes a long time to spark a revolution. Long the ugly duckling of programming languages, iOS' Objective-C passed C# in the TIOBE Programming Community Index this month and seems on a trajectory to overtake C++ in the next few months. It was invented in the early 1980s by Brad Cox and Tom Love, with the idea of creating 'Software Integrated Circuits,' and was heavily influenced by Smalltalk — yet another legacy from Xerox PARC, along with desktop GUIs, Ethernet and laser printers. It was adopted early on by Steve Jobs' NeXTStep, the grand-daddy of all that is now OS X. It had to wait, however, for the mobile device revolution to have its day, being ideally suited to the limited resources of portable devices. It's still being actively developed by Apple and others, sporting the new automatic reference counting and static analysis in the Clang compiler. It turns out it has supported dynamic patching of code in applications all along. What more surprises does this venerable language have up its sleeve?"

Properties will also be synthesized by default, so you won't have to write @synthesize statements anymore, and corresponding ivars will be synthesized with an underscore prefixed name.
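A rough sketch of what default synthesis looks like in practice (the class and property names here are made up for illustration):

```objc
#import <Foundation/Foundation.h>

@interface Person : NSObject
@property (nonatomic, copy) NSString *name;
@end

@implementation Person
// No @synthesize needed: the compiler generates -name and -setName:,
// backed by an automatically synthesized ivar named _name.
@end
```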

Objective-C is interesting to follow because it's a language that was once considered totally niche and almost completely irrelevant, but the frameworks were beloved by developers, and the language's keepers kept at it long enough for the world to see how useful the language is. It also has historical significance as the tool used to create the original WorldWideWeb program, as well as in the development of Doom and Quake. John Romero wrote [rome.ro] about him and Carmack simultaneously editing the same map in DoomEd thanks to distributed objects.

It's still verbose and Smalltalk-ish, but the language as a whole has improved drastically since the transition to Clang. According to the mailing list, Apple has more engineers allocated to the language than ever before, and a lot of it has to do with the move away from GCC.

I hear that GCC is working toward being easier to modify, so the competition from Clang has been good for everybody, and it's all open source.

Why? Anyone can look at that line and understand exactly what it does without any knowledge of the APIs (although those of us who are familiar with them will suspect that you probably meant +whitespaceAndNewlineCharacterSet). It's more to type, but given that even vim with the clang plugin can autocomplete from a couple of characters, that's not really an issue. Most code spends a lot more time being read than being written, and Objective-C is very easy to read.
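For context, the line in question is presumably something like this (reconstructed from the surrounding comments; `stringToTrim` is a placeholder name, and this uses the character set the poster suggests was intended):

```objc
NSString *trimmed = [stringToTrim stringByTrimmingCharactersInSet:
                        [NSCharacterSet whitespaceAndNewlineCharacterSet]];
```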

In many ways this is true, but then again, they aren't the same kind of languages. I absolutely love C# syntax and the easy readability of the code. The .NET libraries are also wonderful, and in general I would rather use C# than Objective-C because of this.

But Objective-C is closer to C and C++ than C# is. I would, however, hope that Apple brings something like C# to OS X and iOS. I would start developing with it right away.

The readability is a bit clearer in C#, but Apple is already fixing that in Obj-C with changes like auto-synthesizing properties and making the declarations of common objects simpler, as the initial poster showed (with code examples). But aside from simple things like that, the readability of the code depends a lot more on the programmer than on the language.

If you haven't used Obj-C, at least not on an Apple platform, then that's why you don't know that Apple provides excellent frameworks, very much like MS provides .NET. Check out: https://developer.apple.com/library/mac/navigation/#section=Frameworks [apple.com] Almost anything you want to do, Apple provides the foundational building blocks to help you build the application, so you don't waste time implementing a queue or a list, or writing code to talk to a web server.

Well, it's been years and years since I wrote Objective-C (I worked for AppSoft, doing NeXTStep productivity apps), and I have never touched Ruby, but to me, the Obj-C code is clearer. A hell of a lot more to type, but I know it's stripping the whitespace characters from 'stringToTrim' and creating a new string.

my_string.strip looks like a call to a non-standard (project specific, locally developed) bit of code that could do anything. If I knew Ruby, I might totally know it _was_ standard and what it did.

I just think it's a bit of a shame they went with Objective-C, which is a bit strange; they might have been better off with C++, which has as many quirks as Obj-C but is more widely known and just as performant.

C++ doesn't support the dynamic dispatching that makes Cocoa so great to write for. Apple tried to switch to Java (which has reflection), but it was too slow, so they abandoned that attempt.
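A sketch of the kind of dynamic messaging being referred to; the selector is resolved at runtime, which C++'s compile-time dispatch can't express directly:

```objc
// The method to invoke can be chosen at runtime, even from a string:
SEL sel = NSSelectorFromString(@"lowercaseString");
id obj = @"HELLO";
if ([obj respondsToSelector:sel]) {
    id result = [obj performSelector:sel];  // dynamic message send
}
```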

By GUI widgets are you referring to the drag and drop windows forms crap? Because I don't use that. On the other hand, if you're talking about intellisense, that feature is a freaking godsend and has reduced the amount of time I spend looking at API reference manuals to almost zero.

He's probably referring to the left-hand-side menu containing the template code for buttons, listboxes etc. It's still a stupid remark though. Who cares if you automate some tedious programming stuff - that's the whole point of using computers in the first place!

My problem with the drag and drop widgets isn't that it automates work. I try to automate whatever I can whenever I can. I just don't like the fact that it always seems to do it in such a way that, were I to customize something, it's significantly more difficult than if I coded it myself. It's really a moot point for me though, as I rarely have to do any GUI coding, and when I do it's usually coding a special widget in our library for other devs to use.

Worse. A Windows developer developed Norton. Another Windows developer got drunk one night, had all of his humanity removed, and wrote McAfee. Then there was the infamous Windows developer that did Internet Explorer. I heard he started his career of terror by writing THIS [wikipedia.org] program.

Sounds a lot like the Mono vs. .NET debacle. There's absolutely nothing that says that Apple won't just come around and sue everybody else's butts off for unlicensed use of Objective-C and Apple's copyrighted APIs.

Oh, come on, Apple isn't going to sue you for using Objective-C. They'd send you a sympathy card and a case of aspirin.

Sadly, given the fact that Oracle has sued Google over nine lines of standard library code for the Java language (developed by Sun, bought out by Oracle), it wouldn't shock me at all to hear that Apple has sued over Objective-C.

I know, the parent comment is funny, but the best humor has a grain of truth. In this case, it's a grain of sand in a shoe.

You don't need Xcode to use Obj-C. Clang and gcc are open source and you can use them on Linux and Windows. You can even use clang in Visual Studio!
If you mean that you want to develop OS X or iOS applications then yes, you should at least have one of those around to test on. And please hold any more complaints about Xcode being Mac-only until MS releases Visual Studio that isn't Win-only.

My last Linux dev box was pulled from a dumpster by a friend, and was handed to me. I wiped the Windows XP installation off...

Yes it's a shame how people throw away things that other people might want, and then other people throw garbage on top of it. I've had to wipe egg salad off a few things I've salvaged, but Windows XP? Yeauchk! I hope you burned the towelettes afterwards.

STFU whippersnapper, when I was a kid we cut punchcards out of coke boxes and punched them with our teeth -- and we liked it.

But seriously -- if you don't have a computer to run your programs on, it doesn't much matter whether you have one to write them on or not. There was a day when every computer came with the usual set of programming tools (granted, for much of this period the programming tools, at least on home machines, were rather minimal and essential to using the computer as well). So even if you o

You save $500 on hardware but spend $500 on Microsoft Visual Studio Pro, so it's a wash. However, when you consider that the hardware lets you dual boot Mac OS X or Microsoft Windows, the Mac seems like a win. Especially when you consider that Mac OS X offers a really nice Unix environment, if one is so inclined.

I do? You mean I'm not supposed to download it from my work MSDN account for my personal use? :P

You know, or get the Express version that does everything a hobbyist needs for free.

Emphasize hobbyist; actually, only some hobbyists. No 64-bit code for Express. No Microsoft Foundation Classes; MFC really simplifies Windows user interface coding and is very commonly used in Windows apps. No profile-guided optimization. No remote debugging. No resource editors.

You are not 100% correct there. 64-bit tools are not available in Visual C++ Express by default. To enable 64-bit tools on Visual C++ Express, install the Windows Software Development Kit (SDK) in addition to Visual C++ Express.

Notice how bonch, a notorious shill for corporations such as Apple and Microsoft and with an openly anti-google agenda, happens to post a verbose comment, with source code examples and all, right at the exact same time this piece of Apple propaganda is published on slashdot.

The asterisk next to his name means he's a subscriber, dumbass. Subscribers see articles before non-subscribers. You can write a reply in the box and submit it when the story goes live.

Not in the least. Windows is not tied to a language (you can use whatever you like), whereas iOS is. Now, I can't comment on what languages are available for Windows Phone 7, or what Windows 8 has or will have, but they do not have the platform adoption that iOS does. C# usage is based on its merits, whereas Objective-C usage is based on vendor lock-in.

Given that Java is one of the top languages, it's a valid point if we're talking about vendor lock in monkeying with the list. If it wasn't for Android, Java would be lower on the list.

If we're talking about C#, there are quite a few new APIs under Windows 7 that are C# only. Is that vendor lock in?

Every vendor has a favorite language, and every vendor builds their new runtimes around it. This hardly goes into vendor-lock in. If I want to use Windows Presentation Foundation, I have to use a managed language

I'm not doing anything in Objective-C but I actually like the method name syntax. Code becomes much easier to read and hardly any harder to write if written in a somewhat modern IDE with code completion.

C already has a syntax for declaring and invoking functions, and it already has a syntax for accessing variables bound to instances of structs. Why did they have to invent a completely new syntax for declaring and invoking functions bound to structs?

If "struct.field" is ALREADY a way of accessing a struct's field, and "function()" is ALREADY a way of invoking a function, then why can't "struct.field()" be a way of invoking a function that is a field of

Objective-C is a strict superset of C. Anything it adds over C has its own special syntax and notation, possibly to help reduce confusion. Properties didn't always use dot-notation--you used to have to do [object ivar] in order to access a member variable, and [object setIvar:ivar] to change it. The (relatively new) dot-notation and @property syntax is just shorthand for this functionality, and a welcome thing (though you can still use the old style).
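To illustrate the equivalence (assuming a hypothetical `person` object with a `name` property):

```objc
// Classic message syntax and dot-notation compile to the same accessor calls:
NSString *n1 = [person name];       // getter, message style
NSString *n2 = person.name;         // getter, dot-notation
[person setName:@"Ada"];            // setter, message style
person.name = @"Ada";               // setter, dot-notation
```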

Objective-C used to have a lot of irritating things about it, but I think the language has really improved over the past couple years. Properties, auto-synthesizing, automatic garbage collection, fast enumeration, etc. have all made the language much better. Once I got past the odd messaging syntax, I really came to like it, and I have to wonder how much recent experience some of these vocal haters have with the language.

Look, I understand that people who use their tools daily want to advertise them, and it's a good thing if you like what you're using, but let's face it: Objective-C is just another unsafe, hopelessly outdated extension of C, like C++. It's great for getting things done and sucks less than C++, but it's not in any way a modern language, nor is it based on a great language design.

Before people start flaming me, please consider that programming languages are tools and you choose the right tool for the right purpose and platform, and the availability of libraries is often more important than the language itself. There is no doubt that Objective-C has its place and is useful; just don't try to sell it as the latest great new thingy. Even Apple's own old Dylan was more interesting and innovative as a language than Objective-C.

"Even Apple's own old Dylan was more interesting and innovative as a language than Objective-C."

Agreed. I loved the multi-interface stuff. Why doesn't anyone else pick that up? It would be particularly easy to implement in Bundles. But...

"the availability of libraries is often more important than the language itself"

Bingo. Let's be honest: is any native library set even *remotely* as good as Cocoa out of the box? With the exception of Delphi I've fiddled with them all, and the answer is a resounding "no!". All you have to do is compare the basic text editing widget across libraries and you can draw your own conclusions.

Look, I understand that people who use their tools daily want to advertise them, and it's a good thing if you like what you're using, but let's face it: C is just another unsafe, hopelessly outdated extension of assembly. It's great for getting things done and sucks less than Fortran, but it's not in any way a modern language, nor is it based on a great language design.

Before people start flaming me, please consider that programming languages are tools and you choose the right tool for the right purpose and platform, and the availability of libraries is often more important than the language itself.

Not only does Objective C have an extremely rich set of libraries from both Apple and the community (UIKit and Foundation are arguably the best mobile development APIs out there), but Objective C is compatible with all C and C++ libraries.

So I'm not exactly sure what the point is. I suppose if you have to use a C library one could say "Well see, you have to use C anyway!". But at least for me, the important part is while I'm using C, I'm still encapsulating that code in Obj-C.

Objective-C is just another unsafe, hopelessly outdated extension of C, like C++.

Why do you claim it is "unsafe"? Almost all work done in Objective-C is very "safe", by any measure - mostly you are never using C arrays or the like. Just because they are there does not make the language inherently "unsafe" if that's not how real people use the language.

consider that programming languages are tools and you choose the right tool for the right purpose and platform, and the availability of libraries is often more important than the language itself.

Objective-C currently has some of the most advanced libraries for any platform. It already had great string support and other strong frameworks even before iOS, but with iOS and the Mac taking off, the framework support for really advanced animations, database work, networking, etc. is as good as or better than any other platform's. I came from a Java world and am missing nothing for libraries... not to mention a really good set of open source libraries that offer other abilities in addition to the core frameworks.

In fact, I would go so far as to say the range and quality of design of the frameworks are THE reason to use Objective-C.

People like you just look at when Objective-C was developed and think because of its age it cannot be "modern". What you don't realize is that Objective-C was developing over all that time, just in a fairly parallel path to other languages - I like to refer to it as a "Steampunk" language. It is modern but just not quite the same as other things you are used to, coming from an alternate reality.

You're going to have to come up with real reasons for Objective-C not being "modern", most of which are probably quite out of date by now. Before we can flame you, there need to be specifics which we can skewer...

Why do you claim it is "unsafe"? Almost all work done in Objective-C is very "safe", by any measure - mostly you are never using C arrays or the like. Just because they are there does not make the language inherently "unsafe" if that's not how real people use the language.

There is a common consensus in the CS community that pointers as opposed to references, pointer arithmetic, direct type conversion ("memory overlays"), etc. are unsafe, and a language that makes it easy to use them is "inherently unsafe". (That doesn't have anything to do with actual programming practice. Obviously, you can write "safe" programs in any language, even in machine code, as long as you're very careful.) As a comparison, take Ada, Eiffel, Java, Haskell -- these are all much safer.

There is a common consensus in the CS community that pointers as opposed to references, pointer arithmetic, direct type conversion ("memory overlays") etc. are unsafe

In Objective-C we are really using objects more as references than as pointers.

Basically you come off here as just being afraid of something because you've been told it's scary, not because you've seen real issues.

As a comparison, take Ada, Eiffel, Java, Haskell -- these are all much safer.

Exactly my point. As I said, I was a Java programmer (for almost a decade); Objective-C is not really less safe at this point in practice. I say that in terms of stability and in terms of memory use (since you still do not say what you mean by "safe", and the world offers many perils).

As for "modern": Perhaps you haven't seen any modern programming languages yet?

Snark alert. As I said, I used Java for a LONG time. Before that I knew better languages still, Scheme and other things... Perhaps you have not worked with enough different languages to know what is really "safe" and what is not.

You're still using a garbage collector? Do you watch that operate while gnawing on woolly mammoth bones or what? ARC is a far superior approach as it involves no overhead.

As for contracts... you really don't know Objective-C at all, do you?

You just come off as some ancient CS grad-school twat totally removed from real world programming. I've worked on large systems for multi-national corporations, and now on mobile applications used by millions of people. I don't automatically assume anything anymore, as I have found experience teaches you a lot more than mere theory or some summary of a language you have read on a blog.

Don't judge any language until you've tried to solve real problems with it.

Let me summarize. You chose to ignore almost everything I've said

That summarizes your position. I corrected almost all of your points. You responded to only one of mine, and even there not to a point about the language but to a meta issue of languages and platforms that any third-rate philosophy student could offer up as a supposedly "informed" opinion on computing.

have personally insulted me (makes you wonder who's the real snob?), and made all kinds of presumptions about my background...

Why do you claim it is "unsafe"? Almost all work done in Objective-C is very "safe", by any measure

Objective C, at least as used on iOS, is not a safe language. I don't see how anyone with serious programming experience could believe that.

Here are some things about it that are unsafe. Firstly, it's not garbage collected (on the phone). Manual memory management has a long history of resulting in memory corruptions, leaks, and even security vulnerabilities. Yes, on MacOS X there is GC available, so Apple clea

Seems like every few weeks someone writes another story about the amazing "trends" in the TIOBE Index. As far as I can see, the real trend is: Languages go up in popularity, they go down, they move around, one month it's the First! Time! Ever! that a language has made the list, the next month it's gone again, and C, C++, and Java are always at the top (in varying order). Such variable results suggest that TIOBE's sampling method isn't all that reliable or accurate to begin with, but I think we all have a pretty good idea what languages people are really using and for what.

Thanks for giving clarity. If we went by popularity, we'd all be listening to Rihanna or Gotye (both hit #1), or watching FOX (#1 on cable, #2 on broadcast), or reading Alex Jones' infowars.com (routinely 1 or 2 in the web news index). Popularity is interesting to note but doesn't mean much otherwise.

In fact it can be claimed to be a lineal descendant of NeXT, but it's been greatly modified, and the new UI is a regression from both the Mac and NeXT GUIs.

Also, "iOS - Obj C" is obviously referring to the proprietary dialect of Obj-C used on Apple mobile devices. (Nothing to do with Cisco IOS either; why can't they think of their own names for this stuff?) There are other dialects, notably the GCC version, which is much more widely applicable.

So you're saying OS X GUI is actually inferior to the Classic OS 9, or the old NeXT computer's GUI? Interesting. I jumped from OS 8 to OS 10.2 and didn't really notice any major differences in the desktop (except the new tab bar at the bottom). Maybe I just didn't use it enough. Why do you think OS 10 is inferior?

Objective-C is not exclusive to Apple platforms; they just happen to be one of its most prominent supporters. As a matter of fact, the GNU project has long been a supporter of the language, through its support in GCC and through the GNUstep project.

I dev in ObjC on iOS almost every day, and the language sucks. I think it sucks less than C++, but I'm not sure that says much. The Xcode IDE (which also sucks) and the bolted-on features help, but overall the language hasn't aged as well as plain old C - i.e. while coding in it, you are constantly reminded that it is not a modern programming language. Anytime a language gets in your way, it's a bad thing, and that happens an awful lot with ObjC.

(And before the flames start: yes, I fully recognize that nobody is forcing me to dev for the iOS platform; it's a choice I've made because I make gobs of money off of it. But that doesn't make ObjC suck any less, it just makes me willing to tolerate the suck and grumble about it on /.)

The above combine to make even the most basic operations tedious. Want to trim leading/trailing whitespace off a string? Enjoy [someString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceCharacterSet]]

Immutable arrays, dictionaries, sets, strings. I get it, it can be useful for performance to know something is immutable (maybe, I'm not that convinced). But the common use case is most certainly mutable, so /that/ should be the default, e.g. NSArray should be mutable and then if needed there can exist some NSImmutableArray or something. But no, they did it the other way around.

Memory management (up until recently) was neither fully manual nor fully automatic and ownership was based on naming conventions

Because it's a superset of C, they couldn't provide an object-oriented array class using the normal array syntax, so instead you have tedium like [myArray objectAtIndex:2]

Ditto for strings. Also, you have to prefix string constants with '@'.
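For what it's worth, the literal and subscripting syntax added in recent versions of Clang softens both of these gripes (a sketch, assuming a new-enough compiler):

```objc
NSArray *a = @[@"x", @"y", @"z"];    // array literal
id third = a[2];                     // instead of [a objectAtIndex:2]
NSString *s = @"still needs the @";  // the prefix is the cost of being a C superset
```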

The single inheritance model and a strict class hierarchy discourage writing reusable code. For example, if I have an app that runs on iPad and iPhone and they share a common screen (from the user's perspective, although the layout/design might be significantly different), it's a real task to write a common parent class that holds the common code.

Many violations of don't-repeat-yourself: if you want to have an object with properties like Foo.title, then the code is roughly:
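Pre-auto-synthesis, the repetition being alluded to looked roughly like this (a sketch, with a hypothetical Foo class):

```objc
@interface Foo : NSObject {
    NSString *title;                             // 1. declare the ivar
}
@property (nonatomic, copy) NSString *title;     // 2. declare the property
@end

@implementation Foo
@synthesize title;                               // 3. synthesize the accessors
@end
```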

Similarly, there's no way to have a truly dynamic object with a clean syntax, e.g. in Python/ruby/js/and a host of others you could have a quick little object you use to pass state around, kind of like a struct that has its members added on the fly, e.g.: x = new Object() ; x.name = 'dave' ; x.age = 3. There is literally no way to do that in ObjC - the best you can do is create mutable dictionary and use its verbose syntax.
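A sketch of that dictionary fallback, using the newer literal/subscript syntax to keep the verbosity down:

```objc
// The closest Obj-C analogue to an ad-hoc dynamic object:
NSMutableDictionary *x = [NSMutableDictionary dictionary];
x[@"name"] = @"dave";
x[@"age"]  = @3;
```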

We could have a whole extra thread about the toolchain (Interface Builder is an abomination, Xcode often pegs multiple CPUs when it's just sitting there, if you kill the simulator instead of stopping the app via Xcode then you often have to reboot your host machine before you can run the simulator again) but I digress.

About half of these have been fixed (and were actually new problems that didn't exist in Obj-C 1.0). Obj-C 2.0 introduced some of the "don't repeat yourself" issues with things like properties and consistency issues between the new property syntax and the existing syntax, but with the new LLVM you can just declare properties and not have to synthesize. In addition, ARC takes care of the memory management code. The new LLVM also has Obj-C constants, which take care of things like your dictionary example. You no longer need a backing ivar, so that NSString *title; line in the interface can go.

(Your code is also in bad form. You shouldn't do [title release]. It should be self.title=nil. Much cleaner, and you get the same thing without doing a roundabout around the property. You also clear the reference to reduce any chance of accidentally hitting a dealloc'd object.)
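A sketch of the dealloc style being recommended here (manual retain/release, pre-ARC):

```objc
- (void)dealloc {
    self.title = nil;    // the setter releases the old value and clears the reference
    [super dealloc];
}
```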

As far as complaining about method names... That seems like a personal taste thing... And honestly, it's considered good form to have long method names these days, because the method names themselves become the comments, and it makes extremely clear exactly what the method does. And autocomplete should keep you from having to type out the entire method. There's been a lot of coding standards material written on why this is a good thing. You may not like it, but it seems like a majority of people actually do, and if you're writing APIs or classes, people aren't going to like short method names very much. Writing long method and variable names that describe exactly what they do was one of the first pieces of advice I got in my programming career, and it's served me very well.

It's actually one of my biggest pet peeves when a C developer starts writing Obj-C stuff. They use these short, truncated method names for no apparent reason, when a longer name will reduce confusion when the code is shared. (And making your code reusable is one of the tenets of Obj-C, for very good efficiency reasons.)

As far as Xcode goes, yeah, it sucks. We all know it. But every IDE has its quirks. Eclipse is slow and also somewhat unstable, and Visual Studio has its own quirks. There has been a lot of pressure on Apple to make Xcode better, and indeed each release seems to be more and more stable, with 4.0 being the low point.

Honestly, from your example code, it certainly looks like you've worked with Obj-C, but it doesn't look like you really had a strong understanding of it. Not that I blame you; with Obj-C 2.0 it became a little harder to understand, but Apple is cleaning that up.

Again, rather the point. Otherwise your above example would end up as initWithBitmapData(p,w,h,b,s,a,p,c,f,r,b).

The above combine to make even the most basic operations tedious. Want to trim leading/trailing whitespace off a string? Enjoy [someString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceCharacterSet]]

Autocomplete makes this rather irrelevant, no? Especially the much improved autocomplete in Xcode4+.

Immutable arrays, dictionaries, sets, strings. I get it, it can be useful for performance to know something is immutable (maybe, I'm not that convinced). But the common use case is most certainly mutable, so /that/ should be the default, e.g. NSArray should be mutable and then if needed there can exist some NSImmutableArray or something. But no, they did it the other way around.

That you don't actually understand this point is pretty telling. Performance isn't the only reason to make arrays and such immutable by default. It

There's only one way to find out, and it involves wading through extraordinarily long, unintuitive, and overly verbose object, property, and method names until, surprise!, you find yet another feature of limited utility.

When I evaluate a language, the first thing I do is look at a random block of code and ask myself: is this what I really want to be writing?

When I look at lisp all I see is endless streams of ()()())))) and my brain instantly reboots in a violent seizure.

jQuery would be a decent system if only I could get over the ridiculous hackish syntax needed to work around the underlying JS environment.

ASP and its close neighbors were always a turnoff due to the weird escape sequences you needed to plaster absolutely everywhere; more recently, Razor cleaned that up somewhat.

Objective-C has too many Perlish @ symbols and a ridiculous number of [] [][][ ][][][] [] contraptions all over the place. I know this sounds, and is, shallow, but when I look at code I really need to see the code, not have to peer under layers of syntactic nonsense existing only for convenience or compatibility/interop purposes.

Give me a capable clean language not hacks upon hacks.

Given enough time any language can be made useful... this does not mean I would ever willingly choose to use it. I'm instantly wary of languages with only one killer app (iPhone) unless the language is heavily domain-specific.

What is the percentage of software projects being developed for Mac OS X? iOS, on the other hand, currently dominates the smartphone market, and development on this platform is mainly done in Objective-C, which explains the statistics. But it's a "dynamic" situation, as we are all aware.

Tiobe's data is an indicator of how active the internet-based discussion of each programming language is. Even Tiobe says so themselves:

"What programming languages are hot in the Internet discussions?"

and

"Observe that the TIOBE index is not about the best programming language or the language in which most lines of code have been written." (http://www.tiobe.com/content/paperinfo/tpci/index.html)

All the graph at that link shows is that there's been an increase in the amount of discussion of Objective C. You can say it's due to an increase in adoption, or you could say it's due to people being absolutely fitful with learning it. There's no way to tell what the data means. You may as well google "<language> sucks" and count the results.

I think the author/submitter is being very hopeful in the way they have construed the data.

This seems more likely to be due to the easy money currently seeming to be in iOS apps. It's a big installed base, there's a delivery system, and the consumers have been trained to expect to pay some money for just about everything on it (whereas the usefulness of free 'droid apps generally seems to be way higher - in my, admittedly limited, experience).

I mean, if you have an idea, then the thing you want to do is try and get a few hundred thousand people to buy it for a $1, so that's what everyone is currently doing. I don't think it really says anything beyond that.

I find Android apps are not nearly as useful as similar iOS apps. They are usually slower, uglier, and buggier - free or not.

Given the choice between a free Android app that is a turd, and a great iOS app that costs $1, I'll gladly pay the $1.

Also, for developers, I think there is more to it than just the money. With iOS you can test a reasonable amount of the devices on the market and the screen sizes they use. With Android? Not unless you happen to have a few hundred Android devices kicking around.