Swift For Application Development

I came across this blog post by Brent Simmons earlier today, and I found myself vehemently disagreeing with some of what Brent wrote, which doesn’t happen often.

He says:

> Maybe Swift is faster than Objective-C, or will be. But that also hardly matters.

I completely agree.

But performance is just one part of Swift’s value. Swift’s biggest value add (in my view, anyway) is correctness. A world where you could prove that a program was correct would be a much better place than the one we’re in now. We’re a long way from that in this industry, but every step we take towards it is a step in the right direction. Swift is one such step. If you write an idiomatic¹ Swift program that compiles, you can be a lot more certain it’s correct than you could be of a similar Objective-C program.
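To make that concrete, here’s a minimal sketch (the names are mine, not from either post) of what “compiles, therefore more likely correct” means in practice. The optional return type records that a lookup can fail, and the compiler refuses to build the program until the failure path is handled:

```swift
struct User {
    let name: String
}

// A hypothetical lookup: the optional return type (`User?`) records
// that the user may be missing, and callers must deal with that
// possibility before the program will compile.
func findUser(id: Int, in users: [Int: User]) -> User? {
    return users[id]
}

let users = [1: User(name: "Brent")]

// `findUser(id: 2, in: users).name` would not compile; we have to
// unwrap first, so the "missing user" path is handled up front.
if let user = findUser(id: 2, in: users) {
    print(user.name)
} else {
    print("no such user")
}
```

The equivalent Objective-C would happily compile, and the missing-user case would surface (or not) at run time.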

> Yes, I realize that would have meant pushing some errors from compile time to run time, in exactly the same way Objective-C does right now.
>
> But I don’t care — I’d take that trade-off every single day of the week.

That’s a trade-off I’d very rarely make. Detecting errors at compile time is a massive gain. Now, is it a bit more cumbersome to write in Swift than in Python²? Absolutely!

However, in my experience, it is really hard to maintain even a moderately large project in Python with a team of people. Most non-trivial Python projects I’ve seen have a ridiculous number of unit tests to check for things like type safety and nullability anyway. Without those unit tests, it is easy to inadvertently make mistakes and not detect them. Instead of writing all those unit tests by hand, you might as well help the compiler out a bit, and eliminate those classes of error with no further effort.
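Here’s a small sketch (again with made-up names) of the kind of check a Python test suite often encodes by hand, expressed instead as a Swift signature that the compiler enforces for free:

```swift
// In a dynamic language you'd write a unit test asserting that
// `total` only ever receives numbers, and never receives nil.
// In Swift, the signature itself is that test:
func total(of prices: [Double]) -> Double {
    return prices.reduce(0, +)
}

// total(of: [1.0, "2.0"])  // rejected at compile time: wrong element type
// total(of: nil)           // rejected at compile time: nil isn't [Double]
print(total(of: [1.5, 2.5]))  // 4.0
```

The two commented-out calls are exactly the cases a hand-written test suite would have to cover; here they never make it past the compiler.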

> In Swift, I sometimes feel like I’m filling out a form in triplicate before I can use a variable, or faxing various department heads before I can call a method.

I’ll forgive the hyperbole, and instead ask: is Swift sometimes more verbose than it needs to be? Sure. I’m confident that will get fixed over time, though. There is, however, a flip side to this. We write in high-level programming languages instead of assembly for a reason: we’re writing code that’s intended to be read by other human beings, not just machines. If the programming language can help convey the intent of the original programmer more clearly to those who follow, I’m all for it. I could not possibly count the number of times I’ve had to ask myself whether I could set a value to None safely when writing Python. Every time I do it, I have to look it up to be sure. In Swift, the answer to that question is immediately obvious.

> We’ve also given up simplicity (assuming you know C, and you should, the “Objective” part is pretty small) for a much more complex language.

He’s absolutely right about this, and there’s no debating the point. Swift is way more complex than C, and has a much larger surface area. It is a lot harder to learn all of Swift than all of C. However, you don’t need to know all of Swift to benefit from it. For instance, you can gain from Swift even if you don’t understand pattern matching, or what a PAT (protocol with associated types) is, or how it works. I’m also hoping that with better education (from both Apple and the community) this becomes less of a problem as time goes on.
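For what it’s worth, pattern matching is one of the features that pays for its complexity quickly. A small sketch (the enum and names are mine): a `switch` over an enum with associated values, where the compiler insists every case is covered before the code compiles.

```swift
// An enum with associated values, consumed with `switch`: forget a
// case, and the program won't compile.
enum LoadResult {
    case success(body: String)
    case failure(code: Int)
}

func describe(_ result: LoadResult) -> String {
    switch result {
    case .success(let body):
        return "loaded \(body.count) characters"
    case .failure(let code) where code == 404:
        return "not found"
    case .failure(let code):
        return "failed with code \(code)"
    }
}

print(describe(.failure(code: 404)))
```

There is no C analogue for the exhaustiveness check: a C `switch` over an int will silently fall through a missing case.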

¹ I’m the first one to concede that nobody really knows what exactly idiomatic Swift is yet. However, as long as you’re not telling the compiler to shut up when it tries to help out, I think you’re good. For starters, don’t litter your code with !, use let unless you have a really good reason to use var, don’t make everything a reference type, etc.

² I haven’t written anything other than small toy projects in Objective-C in a long time, so I’m using Python as an example.