A fast look at Swift, Apple’s new programming language

For better or worse, Apple's new language lets you do things your way.

If anyone outside Apple saw Swift coming, they certainly weren't making any public predictions. In the middle of a keynote filled with the sorts of announcements you'd expect (even if the details were a surprise), Apple this week announced that it has created a modern replacement for Objective-C, the programming language the company has used since shortly after Steve Jobs founded NeXT.

Swift wasn't a "sometime before the year's out"-style announcement, either. The same day, a 550-page language guide appeared in the iBooks store. Developers were also given access to Xcode 6 betas, which allow application development using the new language. Whatever changes were needed to get the entire Cocoa toolkit to play nice with Swift are apparently already done.

While we haven't yet produced any Swift code, we have read the entire language guide and looked at the code samples Apple provided. What follows is our first take on the language itself, along with some ideas about what Apple hopes to accomplish.

Why were we using Objective-C?

When NeXT began, object-oriented programming hadn't been widely adopted, and few languages available even implemented it. At the time, then, Objective-C probably seemed like a good choice, one that could incorporate legacy C code and programming habits while adding a layer of object orientation on top.

But as it turned out, NeXT was the only major organization to adopt the language. This had some positive aspects, as the company was able to build its entire development environment around the strengths of Objective-C. In turn, anyone who bought into developing in the language ended up using NeXT's approach. For instance, many "language features" of Objective-C aren't actually language features at all; they are implemented by NeXT's base class, NSObject. And some of the design patterns in Cocoa, like the existence of delegates, require the language introspection features of Objective-C, which are used to safely determine whether an object will respond to a specific message.

The downside of narrow Objective-C adoption was that it forced the language into a niche. When Apple inherited Objective-C, it immediately set about giving developers an alternative in the form of the Carbon libraries, since these enabled a more traditional approach to Mac development.

Things changed with the runaway popularity of the iPhone SDK, which only allowed development in Objective-C. Suddenly, a lot of developers used Objective-C, and many of them already had extensive experience in other programming languages. This was great for Apple, but it caused a bit of strain. Not every developer was entirely happy with Objective-C as a language, and Apple then compounded this problem by announcing that the future of Mac development was Cocoa, the Objective-C frameworks.

What's wrong with Objective-C?

Objective-C has served Apple incredibly well. By controlling the runtime and writing its own compiler, the company has been able to work around some of the limitations the language inherited from NeXT and add new features, like properties, a garbage collector, and the garbage collector's replacement, Automatic Reference Counting.

But some things really couldn't be changed. Because it was basically C with a few extensions, Objective-C was limited to using C's method of keeping track of complex objects: pointers, essentially the memory address occupied by the first byte of an object. Everything, from an instance of NSString to the most complex table view, was passed around and messaged using its pointer.

For the most part, this didn't pose problems. It was generally possible to write complex applications without ever being reminded that everything you were doing involved pointers. But it was also possible to screw up and try to access the wrong address in memory, causing a program to crash or opening a security hole. The same holds true for a variety of other features of C; developers either had to do careful bounds and length checking or their code could wander off into random places in memory.

Beyond such pedestrian problems, Objective-C simply began showing its age. Over time, other languages adopted some great features that were difficult to graft back onto a language like C. One example is what's termed a "generic." In C, if you want to do the same math with integers and floating point values, you have to write a separate function for each—and other functions for unsigned long integers, double-precision floating points, etc. With generics, you can write a single function that handles everything the compiler recognizes as a number.
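In Swift, that single function might look something like this sketch (our own illustration, not Apple's sample code; the exact syntax may shift as the language evolves):

Code:

// One generic function instead of one overload per type.
func largerOf<T: Comparable>(_ a: T, _ b: T) -> T {
    return a > b ? a : b
}

largerOf(3, 7)        // works for Int
largerOf(2.5, 1.5)    // and for Double, with no extra code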

Apple clearly could add some significant features to the Objective-C syntax—closures are one example—but it's not clear that it could have added everything it wanted. And the very nature of C meant that the language would always be inherently unsafe, with stability and security open to compromise by a single sloppy coder. Something had to change.

But why not take the easy route and adopt another existing language? Because of its close relationship with the Cocoa frameworks, Objective-C enabled the sorts of design patterns that made those frameworks effective. Most of the existing, mainstream alternatives didn't provide nearly as neat a fit. Hence, Swift.

What's the deal with 'let'? I've not seen that since my old VB days. I've always considered it ugly syntactic sugar. Why do we have to be polite to our variables?

Assignments in terms of forcefulness:

int x = 7;
let x = 7;
superposition x = 7; (x can be any value; we hope it is 7, but it takes a random value after we read it)
slipItARoofie x = 4.5; (underhanded assignment: how to get it to do something it would not otherwise do)

Also valid: "x = 7 using ambien"

Really people, if you use the word "let," you imply there is a chance it might not actually take on the value of the assignment. I thought this died a long time ago.

As far as I can see, in Swift "let" means "constant", while "var" means "variable". So, "var x = 7" is similar to "int x = 7", while "let x = 7" is similar to "const int x = 7".
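Concretely (a minimal sketch, not official sample code):

Code:

var x = 7      // variable: reassignment is fine
x = 8

let y = 7      // constant: reassignment is a compile-time error
// y = 8      // error: cannot assign to a 'let' constant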

I believe "let" comes from lisp, and really it comes from math (proofs). The implication you see is not one that most people see.

Yes, yes, but it's completely the wrong word. If you're going to make something a constant, "let" is not the word to use. People have used "static" and "const" (for various nuances of fixedness) successfully, and they read a whole lot more intuitively than "let" for a constant.

The biggest problem that I've seen with Swift so far is the way that arrays are managed.

Quote:

If you assign an Array instance to a constant or variable, or pass an Array instance as an argument to a function or method call, the contents of the array are not copied at the point that the assignment or call takes place. Instead, both arrays share the same sequence of element values. When you modify an element value through one array, the result is observable through the other.

That looks fine, arrays are just pointers. Except...

Quote:

For arrays, copying only takes place when you perform an action that has the potential to modify the length of the array. This includes appending, inserting, or removing items, or using a ranged subscript to replace a range of items in the array.

So whether or not an array passed to a function is modified by the function depends on whether the length of the array has been changed.
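To make that concrete, here's a minimal sketch; the comments describe the beta semantics quoted above, which Apple could still change in later releases:

Code:

var a = [1, 2, 3]
var b = a          // no copy yet: a and b share the same storage

b[0] = 99          // no length change, so no copy: a[0] is now 99 too

b.append(4)        // length changes, so b gets its own copy first;
                   // from here on, a and b are independent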

Well of course "most people" don't know what a C-style ternary operator looks like, but I hope that most software developers do... Although I suppose it's possible to avoid contact if you use languages that aren't very C-like.

I'm hard-pressed to think of a language I've ever used (other than maybe BASIC) that doesn't support the ternary operator.

Then again, this guide seemed very dumbed down, almost as if its intended target wasn't programmers... for a review of a programmer language...
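For reference, the operator in question looks the same in Swift as it does in C (a trivial sketch):

Code:

let score = 72
let verdict = score >= 50 ? "pass" : "fail"   // "pass"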

I did some elementary programming back in school (Eiffel, BASIC and Visual Basic, Pascal and Delphi). I remember the theoretical fundamentals, in that I can figure out what it is I'm trying to do (so I can write pseudocode that references arrays, ifs/block ifs, loops, functions, procedures, etc.), but remembering the actual specifics for each language is a bit hit and miss (I could probably do something elementary with a bit of Google).

I want to get into indie development for OS X/iOS. I'm looking for a hobby that might have the odd sale every now and again, not a career in coding.

Am I better grabbing an Objective-C book now and learning that, or waiting for Swift, or doing something else?

Can anyone recommend a book that hits the sweet spot: I'm not completely clueless and don't need to be stepped through theoretical examples of what loops are for the billionth time, but I am a syntax newbie?

Well, "static" is really a bad word for "const" as well.In C world, it usually means static storage.Or a symbol which isn't exported from a binary object, which is another terrible use for "static".

Oh, I completely agree. Static is a horrible word, and it's even worse that it can appear before or after, but it does something concrete. Static is fixed in space, const is fixed in time, therefore static const (or const static) is fixed in space and time.

You have read an entire 550-page document in less than 72 hours? Impressive, but for me personally a bit hard to believe.

If you've studied several programming languages before, none of the concepts are new in Swift and you can casually skim through it in 30 minutes to see what they have done. I don't think you need to deeply study this unless you have some unique need, like writing your own Swift compiler or tools.

Quote:

When Apple inherited Objective-C, it immediately set about giving developers an alternative in the form of the Carbon libraries, since these enabled a more traditional approach to Mac development.

That's not quite the way I remember it. I went to WWDC in 2000 and 2001, and even though they were trying to reassure Carbon devs that they wouldn't be left behind in the dust, there was a clear "Cocoa is the future" vibe.

All the cool, shiny new stuff was in Cocoa. Carbon was mainly used to allow access to legacy APIs that came from the classic Mac OS. For example, for a long time there were no Cocoa APIs for QuickTime.

Then we'd reference it with Planets.Mercury, Planets.Jupiter, etc. so we're not concerned with the actual value that each planet represents. That's the whole point of enums: a simple to compare/use value with an easy to read definition.

And if you *really* wanted the planet that had the integer value 6, you'd just cast 6 to the enum type.
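Something like this minimal sketch (our own illustration; the names and exact syntax are just one way to write it):

Code:

enum Planets: Int {
    case Mercury = 1, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune
}

let favorite = Planets.Jupiter      // compared and passed by name, not number
let sixth = Planets(rawValue: 6)    // Optional(Planets.Saturn), for when you really need 6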

This is definitely a good thing. One of my issues with Objective-C was actually when I hit the plain old C part. Not that the C part was hard; it just required me to shift how I worked with the language, which was jarring when I was in the flow.

The other nice thing about Swift (which no one is talking about) is that it's going to be used for more than just coding iOS apps. For example, it's included with LLDB, so you can write debugging extensions with it (currently you have to use Python, which is a pain). From the keynote, I wouldn't be surprised if there will be a server component written in it as well. That way, you can write your entire application stack in one language. That would be a huge win.

That said, I will miss Objective-C, especially the brackets. It was different in a good way.

Quote:

Am I better grabbing an Objective-C book now and learning that, or waiting for Swift, or doing something else?

Frankly, I'd advise going with a more general programming language, such as C++ or Java. Swift as a first language is an insanely bad idea right now.

If you're new to programming, Googling 'how do I X' is invaluable. You'll get a plethora of results for C++ or Java, but absolutely nothing for Swift (in fact, since there are two languages that go by 'Swift', you're likely to get results for the wrong one).

In addition, those two languages have plenty of published works that you can choose to reference; Swift does not.

Quote:

But if you don't happen to remember that Saturn is the sixth planet from the Sun—or happen to be working with a list that's less memorable than planets—you'll find yourself wasting time as you try to figure out what number you should expect.

Maybe I'm one of the dinosaurs, but one of Pascal's GREAT aspects was that once you created an enumeration, it served quite well as both a list of programmer constants and an index to arrays.

Code:

PlanetSunDist: ARRAY [Planet] OF REAL

would store a useful attribute and you never worried about whether Saturn was 6; it was just "Saturn." PlanetSunDist[6] was not risky; it was impossible. IIRC, Borland made

Code:

FOR p:= Planet.min TO Planet.max ...

duck soup.

Yes, this gets strained when you try to extend your enumerations or store polymorphic elements, but when you don't need anything you couldn't do in C or FORTRAN, it made for extremely self-documenting, easy-to-write, and very efficient code.
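Swift doesn't let you index an array by an enum the way Pascal did, but a dictionary keyed by the enum gets close. A sketch (the names and distances are just illustrative):

Code:

enum Planet { case Mercury, Venus, Earth }

// Simple enum cases can serve as dictionary keys,
// so the lookup is by name rather than by a numeric index.
let sunDistAU: [Planet: Double] = [.Mercury: 0.39, .Venus: 0.72, .Earth: 1.0]

let d = sunDistAU[.Earth]   // Optional(1.0); "was Saturn 6?" never comes up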

Quote:

If you choose, you can write code in Swift pretty tersely, which should make things easier for developers.

Though nobody ever accused Pascal of being excessively terse, I never understood why people who code for hours per day didn't naturally acquire the typing speed that makes verbosity cheap, and enjoy the fact that their workmates' efforts were more understandable.

Yes... it can figure out what 'var' means at compile time based on what's being assigned to it. I think it's interesting that they chose the keyword "let" rather than (what is so common elsewhere) "const"... but then, it's also par for the course.
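For instance (a minimal sketch of the inference at work):

Code:

var count = 7          // inferred as Int
var name = "seven"     // inferred as String
let pi = 3.14159       // inferred as Double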

Math geeks, Lisp fans, the need to be different, or just wanting to save two keystrokes. It's a minor taste issue.

Quote:

In a nice touch, you can use scientific notation to assign values to floating point variables.

and this:

Quote:

In most languages, you have to explicitly keep multiple case statements from executing, a common source of errors. With Swift, you have to explicitly tell the compiler if you want to fall through to the next case statement.
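Both are easy to show in one short sketch (illustrative values, not code from the guide):

Code:

let avogadro = 6.022e23   // scientific notation; inferred as a Double

let n = 3
switch n {
case 3:
    print("three")
    fallthrough           // falling through is an explicit opt-in in Swift
case 4:
    print("reached only via fallthrough")
default:
    break
}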

Quote:

I want to get into indie development for OS X/iOS. I'm looking for a hobby that might have the odd sale every now and again, not a career in coding.

Objective-C isn't going away anytime soon, so there is still value in learning the language. If you haven't programmed in a long time, I highly suggest "Head First Java," as that will teach both Java and the concepts of object-oriented programming. Objective-C and Java have a lot in common, so it's not a wasted trip.

The other book that I recommend is called "The iOS Apprentice". This book will teach you iOS development through the process of creating an app. Just a bit of disclosure: I'm an editor for the book's publisher (raywenderlich.com), but it's a great read for starting out in iOS development. We actually give away the first section for free, which is around 140 pages.

But really, I highly suggest you start with Head First, as they lay a great foundation. They do have an iOS book, but I think you'll learn more about programming in general with their Java book.