Thursday, April 21, 2016
“Performance These Days” ↦
Brent Simmons weighs in on Swift’s performance as a programming language:
Maybe Swift is faster than Objective-C, or will be. But that also hardly matters.
[…]
Instead of language performance the focus should be on developer productivity. Developer performance. That’s the thing that counts these days.
Of course, that’s the argument that proponents of interpreted languages have been using for some time now.
But I think Swift otherwise gives and takes away — we get type inference and all the lovely things I mentioned before, but we end up fighting the type system, standing on our heads to deal with optionals, and working with a language that’s much larger (demonstrably) than is needed for writing great apps. It solves a whole bunch of problems that didn’t need solving (for app-writing).
I tend to agree. It seems there’s been a trend lately—embodied by Swift, Rust, and Go—towards languages whose type system is strong, static, and inferred (where possible). The implicit promise is that you can have the best of both worlds: the “safety” of strong, static typing combined with the simplicity of not always having to declare types. But strong, static typing is still what it has always been: fiddly. It significantly reduces flexibility for the sake of catching certain classes of programming errors at compile time¹.
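As a minimal sketch of that trade-off (my own example, not from either post): Swift infers types for you, but the static type system still asserts itself the moment you try to mix them.

```swift
// Type inference gives the brevity of a dynamic language...
let count = 42          // inferred as Int
let ratio = 0.5         // inferred as Double

// ...but the static type system still asserts itself: mixing the two
// without an explicit conversion is a compile-time error.
// let bad = count + ratio   // error: binary operator '+' cannot be
//                           // applied to operands of type 'Int' and 'Double'
let sum = Double(count) + ratio   // 42.5
print(sum)
```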
Marco Arment adds his thoughts:
I don’t know much Swift yet. But I’ve felt since its introduction that while it seems like a good language overall, it feels more like a language designed by C++ enthusiasts to replace C++, rather than being particularly optimized for 99% of what it’ll really be used for: making high-level mobile and PC apps.
[…]
The idea of one language to serve all roles, high-level to low-level, is an interesting thought challenge, but I don’t think it could exist.
-
It’s worth noting that not all of the “errors” that strong, static typing catches at compile time are actually errors. Swift’s optionals represent an implicit admission of this by allowing developers to use nil in variables that would otherwise require a value of a different type. But there are plenty of other cases where type may not matter as much. Many math operations, for example, should be just as applicable to floating point numbers as they are to integers. ↩