Concurrency in iOS: serial and concurrent queues in Grand Central Dispatch (GCD) with Swift 3

[Download the full Swift Xcode project from GitHub.]
[Download the full Objective-C Xcode project from GitHub.]


UPDATE: I’ve updated this article for Swift 4, learned a few new tricks, and taken advantage of the Swift 4 compiler’s “intelligence.” Please check out the new version as it’s more comprehensive and detailed, and my source code has been highlighted and commented to better help you understand the sometimes confusing concept of parallelism. There’s a brand new Xcode companion project, too.


Today, I’m going to start answering some of the concurrency questions I asked you to ponder in yesterday’s post entitled “Concurrency in iOS — Grand Central Dispatch (GCD) with Swift 3.” Specifically, I’m going to write some code in Swift 3 and Objective-C showing you the difference between serial and concurrent queues. But before coding, we’ll talk about concurrency in general, the terminology used in discussing concurrency (threads, processes, and tasks), the difference between the terms “concurrent” and “parallel,” the differences between serial and concurrent queues, and the differences between synchronous and asynchronous methods/functions, and finally we’ll wrap up with some more definitions you need to know.


Let’s refresh our memories about why concurrency is needed in iOS app programming. How would you write an iOS app of even modest complexity today without using concurrency? If all your code ran sequentially, you’d end up with unhappy users. Suppose your app performs statistical computations, analyzes big data, performs image transformations (like sharpening, changing tint, applying filters), or allows downloads of large files. You want your app’s UI to be responsive. You want your app always to be able to respond to user events like taps and swipes — gestures — and possibly continue running some type of process if the user switches to another app. Let’s start with some basic definitions.

Concurrency
According to Apple, “Concurrency is the notion of multiple things happening at the same time.” In other words, I can develop an app that can perform certain tasks simultaneously, as in yesterday’s post, in which I downloaded very large image files, allowed the user to keep pressing a button, and kept updating a UIProgressView. While one task was performing the heavy-duty work of downloading files, the user interface (UI) remained responsive. But we need to define some common terminology before continuing on to topics like clarifying the definition of “concurrency.”

Terminology: threads, processes, and tasks
Again, according to Apple:

  • The term thread is used to refer to a separate path of execution for code. The underlying implementation for threads in OS X is based on the POSIX threads API.
  • The term process is used to refer to a running executable, which can encompass multiple threads.
  • The term task is used to refer to the abstract concept of work that needs to be performed.

While I could introduce you to threads, using a technology like Grand Central Dispatch is generally so much easier and less confusing. You may need the level of control afforded by threads one day, but I doubt you will very soon. From Apple:

… Although threads have been around for many years and continue to have their uses, they do not solve the general problem of executing multiple tasks in a scalable way. With threads, the burden of creating a scalable solution rests squarely on the shoulders of you, the developer. You have to decide how many threads to create and adjust that number dynamically as system conditions change. Another problem is that your application assumes most of the costs associated with creating and maintaining any threads it uses. …

One of the technologies for starting tasks asynchronously is Grand Central Dispatch (GCD). This technology takes the thread management code you would normally write in your own applications and moves that code down to the system level. All you have to do is define the tasks you want to execute and add them to an appropriate dispatch queue. GCD takes care of creating the needed threads and of scheduling your tasks to run on those threads. Because the thread management is now part of the system, GCD provides a holistic approach to task management and execution, providing better efficiency than traditional threads. …
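Apple’s description boils down to very little code. Here’s a minimal sketch of defining a task and adding it to a dispatch queue (the queue label and the message are placeholders I made up):

```swift
import Dispatch

// Create a queue; GCD creates and manages the underlying threads for us.
let queue = DispatchQueue(label: "com.example.worker")  // placeholder label

var message = ""

// Define a task (a closure) and add it to the queue for execution.
queue.async {
    message = "task finished"
}

// Rendezvous with the queue so the task is done before we read `message`.
queue.sync {}
print(message)  // "task finished"
```

Notice that we never create, start, or join a thread; we only describe the work and pick a queue for it.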

Concurrent versus parallel
For most of us, we just need to know that our code supports “the notion of multiple things happening at the same time.” GCD hides these details from us for good reason. But if you really want to know:

… In many fields, the words parallel and concurrent are synonyms; not so in programming, where they are used to describe fundamentally different concepts.

A parallel program is one that uses a multiplicity of computational hardware (e.g., several processor cores) to perform a computation more quickly. The aim is to arrive at the answer earlier, by delegating different parts of the computation to different processors that execute at the same time.

By contrast, concurrency is a program-structuring technique in which there are multiple threads of control. Conceptually, the threads of control execute “at the same time”; that is, the user sees their effects interleaved. Whether they actually execute at the same time or not is an implementation detail; a concurrent program can execute on a single processor through interleaved execution or on multiple physical processors. …

Serial versus concurrent queues (deterministic versus nondeterministic)
In GCD, we submit tasks — blocks of code — to a queue for eventual execution. GCD provides “dispatch queues” to which tasks are submitted, and which manage and execute those tasks. Think of a dispatch queue as a line of customers waiting to place orders at a fast food joint. Think of each person as a task, i.e., each customer will place an order, and they are always served in first-in, first-out (FIFO) order. GCD provides serial and concurrent queues. Before specifically defining these queue types, let me say that serial queues are deterministic; in other words, they are exactly predictable and linear, with no randomness involved. While a serial queue executes its tasks one at a time, in the order submitted, those tasks are still running in the background. A serial queue may be slower than a concurrent one, but it’s still in the background.

Concurrent queues are stochastic, or nondeterministic, meaning that their behavior involves inherent randomness, i.e., “The same set of parameter values and initial conditions will lead to an ensemble of different outputs.” You’ll soon see why I mentioned these tidbits. While tasks are started on a concurrent queue in the order you submit them, some or even most of them will probably execute simultaneously, so the order in which they finish is unpredictable.

Formally, according to Apple:

Dispatch queues are an easy way to perform tasks asynchronously and concurrently in your application. A task is simply some work that your application needs to perform. … A dispatch queue is an object-like structure that manages the tasks you submit to it. All dispatch queues are first-in, first-out data structures. Thus, the tasks you add to a queue are always started in the same order that they were added. GCD provides some dispatch queues for you automatically, but others you can create for specific purposes. …

Serial queues (also known as private dispatch queues) execute one task at a time in the order in which they are added to the queue. The currently executing task runs on a distinct thread (which can vary from task to task) that is managed by the dispatch queue. Serial queues are often used to synchronize access to a specific resource. …

Concurrent queues (also known as a type of global dispatch queue) execute one or more tasks concurrently, but tasks are still started in the order in which they were added to the queue. The currently executing tasks run on distinct threads that are managed by the dispatch queue. The exact number of tasks executing at any given point is variable and depends on system conditions. …

The main dispatch queue is a globally available serial queue that executes tasks on the application’s main thread. This queue works with the application’s run loop (if one is present) to interleave the execution of queued tasks with the execution of other event sources attached to the run loop. Because it runs on your application’s main thread, the main queue is often used as a key synchronization point for an application. …

Asynchronous versus synchronous
When you submit a task — block of code — to a GCD queue, you need to be aware of what happens after submission. You can write code so that when a task is submitted to a queue, your code must wait until the task completes (this is synchronous). On the other hand, you can write your code so that when you submit a task to a queue, your code keeps on executing regardless of the submitted task’s execution (this is asynchronous). Here’s a great definition:

… A synchronous function returns only after the completion of a task that it orders.

An asynchronous function, on the other hand, returns immediately, ordering the task to be done but not waiting for it. Thus, an asynchronous function does not block the current thread of execution from proceeding on to the next function. …
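That definition maps directly onto GCD’s `sync` and `async` methods. A minimal sketch, using a private serial queue with a made-up label:

```swift
import Dispatch

let queue = DispatchQueue(label: "com.example.demo")  // placeholder label; serial by default
var log: [String] = []

// sync: the caller blocks until the submitted task completes.
queue.sync { log.append("task 1") }
log.append("after sync")  // guaranteed to run after "task 1"

// async: the caller continues immediately, without waiting for the task.
queue.async { log.append("task 2") }

// We rendezvous with the queue before reading `log` again; because the
// queue is serial and FIFO, "task 2" must finish before this sync block runs.
queue.sync { log.append("after async") }

print(log)  // ["task 1", "after sync", "task 2", "after async"]
```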

It should be apparent from this discussion why we only update the UI on the main thread, but if it’s not clear to you:


Application user interfaces are always single-threaded, even in multi-threaded devices – there’s only one representation of the screen and any changes to what is displayed need to be coordinated through a single ‘access point’. This prevents multiple threads from trying to update the same pixel at the same time (for example)!

Your code should only make changes to user interface controls from the main (or UI) thread. Any UI updates that occur on a different thread (such as a callback or background thread) may not get rendered to the screen, or could even cause a crash. …
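In GCD terms, the rule is: do the heavy work on a background queue, then dispatch back to the main queue before touching any UI control. A sketch of the pattern, assuming an iOS app in which `imageView` and `url` are defined elsewhere:

```swift
import UIKit  // assumes this runs inside an iOS app

// Hypothetical helper: download image data, then update the UI safely.
func loadImage(from url: URL, into imageView: UIImageView) {
    URLSession.shared.dataTask(with: url) { data, _, _ in
        // This completion handler runs on a background queue.
        guard let data = data, let image = UIImage(data: data) else { return }
        DispatchQueue.main.async {
            // Hop back to the main queue before touching any UI control.
            imageView.image = image
        }
    }.resume()
}
```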

GCD code in Swift 3 and Objective-C
You can choose between serial and concurrent GCD queues to your advantage. If you have a set of tasks, for example a bunch of image downloads that you need to run in the background, but don’t care about the order in which the downloads occur, use a concurrent queue. If you have a set of CPU-intensive tasks that you’d like to run in the background, but that must be run in a specific order, use a serial queue. Since a serial queue executes tasks one at a time, in the order you specify, serial queues are great for protecting shared resources; no two tasks can access a given resource at the same time. Serial queues are also great for tasks that have interdependencies and must be run in a specific order. Think of building a house: you can’t put up the roof until the house’s foundation has been laid and at least one floor with walls has been constructed.
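The shared-resource case is worth a quick sketch. Here, 100 tasks on a concurrent queue all increment one counter, but every increment is funneled through a serial queue, so no two increments can overlap (queue labels are placeholders):

```swift
import Dispatch

// A serial queue acts like a lock around the shared counter.
let counterQueue = DispatchQueue(label: "com.example.counter")  // placeholder label
var counter = 0

// Simulate many tasks racing to update the shared resource.
let workers = DispatchQueue(label: "com.example.workers", attributes: .concurrent)
let group = DispatchGroup()
for _ in 1...100 {
    workers.async(group: group) {
        counterQueue.sync { counter += 1 }  // one increment at a time
    }
}
group.wait()
print(counter)  // always 100; without the serial queue, increments could be lost
```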

Notice yesterday that I submitted my image file download tasks on a concurrent queue using Swift 3 (see highlighted lines of code 9, 10, 11, 12):

Note that the ordering of image download tasks is always stochastic (random) when using a concurrent queue. Download my companion project and see for yourself:
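To see the effect without hitting the network, here is a self-contained sketch in which the downloads are simulated by timed sleeps (the image names, queue label, and delays are all made up). The four tasks start in FIFO order, but they finish in whatever order the system schedules them:

```swift
import Dispatch
import Foundation

// Simulated downloads: longer delays stand in for larger image files.
let delays = [("image1", 0.04), ("image2", 0.03), ("image3", 0.02), ("image4", 0.01)]

let queue = DispatchQueue(label: "com.example.downloads", attributes: .concurrent)
let group = DispatchGroup()
let lock = NSLock()  // protects `finished` from simultaneous appends
var finished: [String] = []

for (name, delay) in delays {
    queue.async(group: group) {
        Thread.sleep(forTimeInterval: delay)  // pretend to download
        lock.lock(); finished.append(name); lock.unlock()
    }
}
group.wait()

// All four tasks were *started* in FIFO order, but with these delays
// "image4" will usually *finish* first. Only the set of results is guaranteed.
print(Set(finished) == Set(["image1", "image2", "image3", "image4"]))  // true
```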

Today I wrote code which submits my image file download tasks on a serial queue using Swift 3. Note that while a serial queue executes its tasks one at a time, in the order submitted, it’s still running in the background. It may be slower than a concurrent queue, but it’s still in the background. See highlighted lines of code 3, 4, 5, 13, 14, 15, 16 below:

Note that the ordering of image download tasks is always deterministic (the same) when using a serial queue. Download my companion project and see for yourself:
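The serial-queue version of the sketch above needs only one change: drop the `.concurrent` attribute when creating the queue (the label and image names are again made up). Because tasks never overlap, no lock is needed, and the completion order always matches the submission order:

```swift
import Dispatch

// Without the .concurrent attribute, a DispatchQueue is serial.
let serialQueue = DispatchQueue(label: "com.example.serialDownloads")  // placeholder
let group = DispatchGroup()
var finished: [String] = []

for name in ["image1", "image2", "image3", "image4"] {
    serialQueue.async(group: group) {
        // Tasks run one at a time in FIFO order, so no lock is required.
        finished.append(name)
    }
}
group.wait()
print(finished)  // always ["image1", "image2", "image3", "image4"]
```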

Here’s the Objective-C equivalent of my Swift code using a concurrent queue:

Here’s the list of URLs to images to be downloaded:

Note that the ordering of image download tasks is always stochastic (random) when using a concurrent queue:

Here’s the Objective-C equivalent of my Swift code using a serial queue:

Note that the ordering of image download tasks is always deterministic (the same) when using a serial queue:

Concurrency glossary
I found an Apple glossary which contains terms to keep in mind as we continue our exploration of concurrency in iOS. Go to the page and read all the definitions:

critical section A portion of code that must be executed by only one thread at a time. …

main thread A special type of thread created when its owning process is created. When the main thread of a program exits, the process ends.

mutex A lock that provides mutually exclusive access to a shared resource. A mutex lock can be held by only one thread at a time. Attempting to acquire a mutex held by a different thread puts the current thread to sleep until the lock is finally acquired. …

Wow. We’ve covered a lot of ground regarding iOS concurrency — and we’ve only scratched the surface. Apple has a more robust concurrency technology based on classes like NSOperation and NSOperationQueue. We’ve only just touched on concepts like critical sections, mutexes, and deadlock. Please check in regularly for more posts. I hope you enjoyed today’s and yesterday’s posts. Please leave a comment if you have questions or feedback. Thanks!

[Download the full Swift Xcode project from GitHub.]
[Download the full Objective-C Xcode project from GitHub.]

Author: Andrew Jaffee

Avid and well-published author, software engineer, designer, and developer, now specializing in iOS mobile app development in Objective-C and Swift, but with a strong background in C#, C++, .NET, JavaScript, HTML, CSS, jQuery, SQL Server, MySQL, Oracle, Agile, Test Driven Development, Git, Continuous Integration, Responsive Web Design, blah, blah, blah ... Did I miss any fad-based catch phrases? My brain avatar was kindly provided by http://icons8.com under a Creative Commons Attribution-NoDerivs 3.0 Unported license.
