Concurrency in Swift: One possible approach

spearo77 | 188 points

I'm surprised that async/await is the preferred idiom.

Having worked with async/await in Node.js a lot, it is of course a significantly better solution than plain promises, but it is also quite invasive; in my experience, most async code is invoked with "await". It's rare to actually need to handle it as a promise; the two main use cases where you want to handle the promise as a promise are when doing something like a parallel map, or when you need to deal with old callback-style code, where an explicit promise needs to be created because the resolve/reject functions must be invoked as a result of an event or callback.
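A minimal TypeScript sketch of those two cases (the worker function and names are hypothetical):

```typescript
// Hypothetical async worker -- stands in for any I/O-bound call.
async function fetchLength(word: string): Promise<number> {
  return word.length;
}

// Case 1: parallel map -- start every call without awaiting,
// then handle the promises as promises via Promise.all.
async function parallelMap(words: string[]): Promise<number[]> {
  const pending = words.map((w) => fetchLength(w)); // no await: all in flight at once
  return Promise.all(pending);
}

// Case 2: wrapping callback-style code -- an explicit promise whose
// resolve function is invoked from inside the callback.
function delay(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}
```

Everywhere else, a plain `await` on the call suffices, which is the point about how rarely the promise itself needs handling.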

Would it not be better to invert this -- which is the route Erlang and Go went -- and make it explicit when you're spawning something async where you don't want to deal with the result right away? In Go, you just use the "go" keyword to make something async. So the caller decides what's async, not the callee. If callees arbitrarily decide whether to be async or not, a single async call ends up infecting the whole call chain: every caller needs to be marked "async" unless it explicitly handles the promise/continuation without "await".

lobster_johnson | 7 years ago

Async/await also needs an accompanying idiom for cancellation. In practice, cancellation happens a lot because the latency of async operations is unbounded. Without thorough support for it, async/await syntax is probably somewhat usable on the server side, but it's hardly applicable on the client side (where that unbounded latency matters). C#, on the other hand, did go through the pain and added cancellation support throughout its standard library's async APIs.
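A hedged TypeScript sketch of that kind of cancellation support, using the Web/Node `AbortSignal` in the role C#'s `CancellationToken` plays (all names here are illustrative):

```typescript
// A cancellable sleep: the async operation observes an AbortSignal
// so the caller can bound its otherwise unbounded latency.
function sleep(ms: number, signal?: AbortSignal): Promise<void> {
  return new Promise((resolve, reject) => {
    if (signal?.aborted) return reject(new Error("cancelled"));
    const timer = setTimeout(resolve, ms);
    signal?.addEventListener("abort", () => {
      clearTimeout(timer);            // release the pending work
      reject(new Error("cancelled")); // surface cancellation to the awaiter
    });
  });
}

// The caller imposes a deadline on a slow async operation.
async function boundedCall(): Promise<string> {
  const ctl = new AbortController();
  setTimeout(() => ctl.abort(), 50); // cancel after 50 ms
  try {
    await sleep(10_000, ctl.signal); // stand-in for a slow async call
    return "done";
  } catch {
    return "cancelled";
  }
}
```

The key design point is that the token/signal has to be threaded through every async API by convention, which is exactly the library-wide effort C# went through.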

liuliu | 7 years ago

As someone who doesn't develop for the Apple ecosystem, I can't quite find one place that articulates well what the Swift development philosophy is and what it brings to the table besides just being modern language with shims for interacting with legacy Apple APIs. Why would I use Swift on Linux?

Most of the posts that I see related to Swift are RFCs evaluating solutions to problems in other languages. I rarely get to see the actual solutions being integrated into the language. Is this just a result of HN readers caring more about language design than learning to leverage the Swift language?

jzelinskie | 7 years ago

Feels like the elephant in the room is that the description never once mentions Futures, instead jumping straight to the async/await keywords and the actor model.

As evidenced by C#, you can't avoid leaking the type signature of async operations if you actually support generic programming -- so while async/await is a nice ergonomics improvement, it only adds complexity to the actual concurrency model. Go enthusiasts out there will appreciate that Go solves this by refusing to support PROGRAMMABLE generic abstractions at all (looking at you, channels and map).
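For illustration, a TypeScript sketch of that leak: any generic combinator over async operations must carry the `Promise` wrapper in its signature (names are hypothetical):

```typescript
// A generic map over async producers. The async-ness of `f` cannot be
// hidden: Promise<B> appears in the parameter type and Promise<B[]> in
// the return type, so it propagates through every generic signature.
async function mapAsync<A, B>(
  xs: A[],
  f: (a: A) => Promise<B>,
): Promise<B[]> {
  const out: B[] = [];
  for (const x of xs) {
    out.push(await f(x)); // sequential for simplicity
  }
  return out;
}
```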

Referencing the actor model and making it first class is interesting, but probably a mistake. Actors are hard to reason about because they're so flexible. Pony is a good recent attempt at combining static types with actors, but they didn't put performance into the "non-goals" section of their language spec.

If you want task-level concurrency that plays nice with your type system, you have to start with Scala and work backward through the alternative implementation choices you're going to make, because it checks all the boxes under "goals" and ALSO has a very mature actor-model implementation that doesn't require promoting actors to keyword status in the language.

haimez | 7 years ago

And the supporting Swift pull request is https://github.com/apple/swift/pull/11501

spearo77 | 7 years ago

Maybe it's time for Swift to decide whether it's going to stick to multicore systems or actually leave this single-box mindset and enable distributed-systems programming. I think designing for distributed systems first could lead to a lot of right choices from the beginning, and it would still allow for various optimizations later to get the best performance from multicore. The alternative is not very promising and leaves a lot of room for really stupid mistakes. Either way, I'm glad to see the actor model getting more traction.

zzzcpan | 7 years ago

Definitely some cool ideas here! The motivation seems a little high-level to me, though. Language features ought to help you solve concrete, real-world problems. I can understand what problem async/await solves -- there's a great example with code -- but the actor stuff isn't as clear-cut.

panic | 7 years ago

Software interrupts are the right answer to the nonsensical concurrency hype. Sometimes old is gold.

Wrapping specialized interrupt handlers into some higher-level API, such as AIO, is the right way.

Async/await is a mess. Concurrency cannot be generalized to cover all possible cases; it must have specialization.

The engineers of old who created the early classic OSes were bright people, contrary to current hipsters.

Once popularized, flawed abstractions and APIs such as pthreads stick around (only idiots would accept shared stacks and signals).

Look at how an OS implements concurrency and wrap it in a higher-level API. That's it.

lngnmn | 7 years ago

weitzj | 7 years ago

> the speed of light and wire delay become an inherently limiting factor for very large shared memory systems

Wow, cool to see the actual speed of light mentioned as a factor -- never seen that before.

hellofunk | 7 years ago

Doesn't Chris work at Google now? Where does he find the time to just go off and implement the Actor Model? I assume this was done when he was still at Apple?

jorblumesea | 7 years ago

Must be crazy for Apple to have such an influential voice on Swift and LLVM working for its archenemy.

tambourine_man | 7 years ago

Correct me if I'm wrong, but isn't this what C# has? I recall writing async and await when I did a Windows Phone app many years ago.

perfectstorm | 7 years ago

Any such model MUST show what error handling will look like. Otherwise you end up with the "log to console and ignore" approach so prevalent in Android SDK examples, which people _will_ copy unchanged.
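As a sketch of what showing error handling could look like with async/await (a hypothetical failing call, in TypeScript): failures propagate as exceptions and must be caught explicitly, rather than vanishing into a console log.

```typescript
// Hypothetical async call that can fail, standing in for any network request.
async function mightFail(fail: boolean): Promise<string> {
  if (fail) throw new Error("network down");
  return "ok";
}

// With async/await the failure path is ordinary control flow: the caller
// either recovers or lets the error propagate up the await chain -- it
// cannot be silently "logged and ignored" without an explicit choice.
async function handled(fail: boolean): Promise<string> {
  try {
    return await mightFail(fail);
  } catch (e) {
    return `recovered: ${(e as Error).message}`;
  }
}
```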

0xbear | 7 years ago