
A Guide on How to Eliminate Thread Explosions in iOS: GCD and Swift Concurrency

2025/12/02 10:53

Introduction

Thread explosion occurs when the system spawns far more threads than it can schedule efficiently, causing performance degradation and memory overhead. In this article, we explore how to eliminate thread explosion and how Swift Concurrency helps prevent it.

Thread Explosion in GCD

Foundation

The system doesn’t document exactly how many threads GCD may create. Based on the WWDC session “Swift Concurrency: Behind the Scenes,” we can conclude that the limit is roughly 16 threads per CPU core.
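As a quick sanity check for the per-core estimate above, you can read the active core count at runtime; a minimal sketch (the “× 16” ceiling is the article’s estimate, not an API guarantee):

```swift
import Foundation

// The number of CPU cores currently available to the scheduler.
// The per-core thread estimates discussed here scale with this value.
let cores = ProcessInfo.processInfo.activeProcessorCount
print("Active cores: \(cores), estimated thread ceiling: \(cores * 16)")
```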

Let’s consider the following code:

```swift
import Foundation

let queue = DispatchQueue(
    label: "com.nsvasilev.concurrent-queue",
    attributes: .concurrent
)

for _ in 0...127 {
    queue.async {
        sleep(5)
    }
}
```

The queue is a concurrent queue, and it schedules 128 tasks concurrently without limiting the number of active threads. Each task simulates a real-world heavy operation. Due to concurrent execution, the system might spawn numerous threads, causing performance degradation and increased CPU usage.

The output is separated into two groups, each containing 64 elements, due to a thread limit of 64. These 64 operations can be executed in any order because we are dealing with a concurrent queue.

Grand Central Dispatch (GCD) doesn’t have a built-in mechanism to prevent thread explosion. Next, we examine thread explosion in concurrent and serial GCD queues.
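That said, for the specific fork-join case above, `DispatchQueue.concurrentPerform` offers a narrow escape hatch: it runs a fixed number of iterations on the global concurrent queue while bounding parallelism to roughly the active core count. A minimal sketch (the iteration count and the counter are illustrative):

```swift
import Foundation

let lock = NSLock()
var completed = 0

// concurrentPerform blocks the caller until all 128 iterations finish,
// and it keeps the number of worker threads near the active core count
// instead of spawning one thread per pending task.
DispatchQueue.concurrentPerform(iterations: 128) { _ in
    // Stand-in for a small unit of work; record each completion.
    lock.lock()
    completed += 1
    lock.unlock()
}

print("Completed: \(completed)") // 128
```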

Deadlocks in Concurrent Queues

Thread explosion may lead to deadlocks in concurrent queues. Let’s consider the following example:

```swift
import Foundation

let queue1 = DispatchQueue(label: "com.nsvasilev.concurrent-queue1", attributes: .concurrent)
let queue2 = DispatchQueue(label: "com.nsvasilev.concurrent-queue2", attributes: .concurrent)
let dispatchSemaphore = DispatchSemaphore(value: 0)

(0..<64).forEach { _ in
    queue1.async {
        dispatchSemaphore.wait()
    }
}

(0..<64).forEach { _ in
    queue2.async {
        dispatchSemaphore.signal()
    }
}
```

It may not be obvious that this code will cause a deadlock. The first concurrent queue schedules 64 tasks, each waiting on a semaphore. In this particular case, the thread limit is 64, meaning all available threads are occupied by these tasks. However, none of the tasks can proceed because they are all blocked, waiting for a signal from the second queue. Meanwhile, the second queue is also trying to run its tasks, which involve signaling the semaphore.

[Figure: Deadlock]

Since all threads are blocked by the first queue’s wait() calls, none of the signal() blocks from queue2 can run, resulting in a deadlock in which both queues wait on each other indefinitely.

There are three possible solutions.

  1. Use OperationQueue to limit simultaneous tasks.

    ```swift
    import Foundation

    let operationQueue = OperationQueue()
    operationQueue.maxConcurrentOperationCount = 5
    operationQueue.qualityOfService = .background

    let queue = DispatchQueue(label: "com.nsvasilev.concurrent-queue2", attributes: .concurrent)
    let dispatchSemaphore = DispatchSemaphore(value: 0)

    (0..<64).forEach { _ in
        operationQueue.addOperation {
            dispatchSemaphore.wait()
        }
    }

    (0..<64).forEach { _ in
        queue.async {
            dispatchSemaphore.signal()
        }
    }
    ```

  2. Use DispatchSemaphore to limit simultaneous tasks.

    ```swift
    import Foundation

    let queue = DispatchQueue(label: "com.nsvasilev.concurrent-queue", attributes: .concurrent)
    let dispatchSemaphore = DispatchSemaphore(value: 3)

    (0..<128).forEach { _ in
        dispatchSemaphore.wait()
        queue.async {
            dispatchSemaphore.signal()
        }
    }
    ```

DispatchSemaphore gates submission to the queue and does not allow more than three operations to run at a time.

  3. Use Swift Concurrency to prevent thread explosion.

Swift Concurrency can manage thread explosion. We will explore how it prevents thread explosion later in this article.
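Before moving on, the DispatchSemaphore pattern from solution 2 generalizes into a small reusable wrapper, so callers cannot forget the wait/signal pairing. A hedged sketch (the `LimitedQueue` name and its label are illustrative, not from the original article):

```swift
import Foundation

// Illustrative helper: runs submitted blocks on a concurrent queue while
// never allowing more than `limit` of them to be in flight at once.
final class LimitedQueue {
    private let queue: DispatchQueue
    private let semaphore: DispatchSemaphore

    init(label: String, limit: Int) {
        queue = DispatchQueue(label: label, attributes: .concurrent)
        semaphore = DispatchSemaphore(value: limit)
    }

    func run(_ work: @escaping () -> Void) {
        semaphore.wait()            // blocks the submitter when at the limit
        queue.async {
            work()
            self.semaphore.signal() // frees a slot when the work finishes
        }
    }
}

let limited = LimitedQueue(label: "com.example.limited", limit: 3)
let group = DispatchGroup()
let lock = NSLock()
var done = 0

for _ in 0..<10 {
    group.enter()
    limited.run {
        lock.lock(); done += 1; lock.unlock()
        group.leave()
    }
}
group.wait()
print("Finished: \(done)") // 10
```

The trade-off is the same as in solution 2: the submitting thread blocks in `wait()`, so this pattern should not be driven from the main thread of a UI app.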

Deadlocks in Serial Queues

Deadlocks can occur in serial queues as well. The overall GCD thread pool, shared by concurrent and serial queues, is capped at 512 threads.

```swift
import Foundation

let dispatchSemaphore = DispatchSemaphore(value: 0)

for _ in 0...511 {
    let queue1 = DispatchQueue(label: "com.nsvasilev.concurrent-queue1")
    queue1.async {
        dispatchSemaphore.wait()
    }
}

for _ in 0...511 {
    let queue2 = DispatchQueue(label: "com.nsvasilev.concurrent-queue2")
    queue2.asyncAfter(deadline: .now() + 1.0) {
        dispatchSemaphore.signal()
    }
}
```

[Figure: Serial Queues Deadlock]

The first for-loop creates 512 serial queues whose blocked tasks consume all available threads. As a result, no threads remain to execute the second loop’s queue operations, and the program deadlocks.

Swift Concurrency

Swift Concurrency prevents thread explosion. In this part, we explore how it manages tasks efficiently and avoids creating excessive threads.

Priorities

In Swift Concurrency, we have only three task priorities: .userInitiated, .utility, .background.

  • .userInitiated is a high-priority task for user-initiated actions.
  • .utility is a medium-priority task for background processing.
  • .background is for low-priority tasks that can run in the background.

Let’s consider some examples:

```swift
// A high-priority task for a user-initiated action.
Task(priority: .userInitiated) {
    await loadSomeData()
}

// A medium-priority task for background processing.
Task(priority: .utility) {
    await processDataInBackground()
}

// A low-priority task that can run in the background.
Task(priority: .background) {
    await performBackgroundCleanup()
}
```

In this example, loadSomeData() is assigned the highest priority with .userInitiated, indicating that it is a user-focused task requiring prompt execution. Background tasks, such as cleanup, are set to .background priority, allowing them to run without blocking critical operations.

How Swift Concurrency Manages Threads

Let’s revisit the cases from the previous section, this time using Swift Concurrency. Each task performs a “heavy” operation, and we will run it with different priorities.

Tasks With the Same Priority Level

Let’s consider the following example:

```swift
func runTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds)
    }
}

for _ in 0...127 {
    runTask(seconds: 2)
}
```

We can see that only 6 tasks are performed simultaneously, which is equal to the number of CPU cores on my device.

[Figure: User Initiated Tasks]

If you pause the execution, you may get a clearer picture of what happens behind the scenes.

[Figure: The Number of Tasks]

We may notice that all of these operations run inside com.apple.root.user-initiated-qos.cooperative, a cooperative queue that limits the number of threads so it doesn’t exceed the number of CPU cores.

Based on this observation, it’s clear that Swift Concurrency prevents thread explosion and doesn’t create more threads than there are CPU cores on the device.
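The cooperative pool works best when tasks suspend instead of blocking. A minimal sketch of the same 128-task workload using a task group, where `Task.sleep` (a suspension, not a thread block, and a stand-in here for real async work) lets one thread per core service all 128 children:

```swift
import Foundation

// 128 child tasks, but Task.sleep suspends rather than blocks, so the
// cooperative pool never needs more than about one thread per core.
let total = await withTaskGroup(of: Int.self) { group in
    for _ in 0..<128 {
        group.addTask {
            try? await Task.sleep(nanoseconds: 10_000_000) // 10 ms of "work"
            return 1
        }
    }
    return await group.reduce(0, +)
}

print("Finished tasks: \(total)")
```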

Tasks With All Priority Levels at Once

Let’s look at another example. In this case, we’ll run tasks with different priorities and observe what happens.

```swift
func runUserInitiatedTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds)
    }
}

func runUserUtilityTask(seconds: UInt32) {
    Task(priority: .utility) {
        print("Utility: \(Date())")
        sleep(seconds)
    }
}

func runUserBackgroundTask(seconds: UInt32) {
    Task(priority: .background) {
        print("Background: \(Date())")
        sleep(seconds)
    }
}

for _ in 0...127 {
    runUserInitiatedTask(seconds: 2)
}
for _ in 0...127 {
    runUserUtilityTask(seconds: 2)
}
for _ in 0...127 {
    runUserBackgroundTask(seconds: 2)
}
```

As we can see, the .utility and .background queues are limited to one thread each when a higher-priority queue (.userInitiated) has work pending.

[Figure: The Number of Tasks]

In this specific scenario, the maximum number of threads is 8, which matches the six user-initiated threads plus one each for the two lower-priority tiers.

The Last Example

What if we add some delays before starting each group of tasks?

In this example, we demonstrate how introducing delays before starting each group of tasks can impact their execution. We have three types of tasks, each with a different priority: .background, .utility, and .userInitiated. Each task sleeps for a specified duration, and tasks are executed in three groups, with a 2-second delay between each group.

```swift
func runUserBackgroundTask(seconds: UInt32) {
    Task(priority: .background) {
        print("Background: \(Date())")
        sleep(seconds)
    }
}

func runUserUtilityTask(seconds: UInt32) {
    Task(priority: .utility) {
        print("Utility: \(Date())")
        sleep(seconds)
    }
}

func runUserInitiatedTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds)
    }
}

for _ in 0...127 {
    runUserBackgroundTask(seconds: 2)
}
sleep(2)
for _ in 0...127 {
    runUserUtilityTask(seconds: 2)
}
sleep(2)
for _ in 0...127 {
    runUserInitiatedTask(seconds: 2)
}
```

We can see that all three tiers (background, utility, and user-initiated) are running tasks on multiple threads simultaneously. Interestingly, if the lower-priority queue is started first and given some time to run, the higher-priority queue doesn’t seem to negatively impact the performance of the lower-priority queue.

This suggests that the system is capable of handling tasks with different priorities efficiently, maintaining a balance even when tasks run concurrently.

Conclusion

While GCD doesn’t inherently prevent thread explosion, it offers tools for managing concurrency, such as dispatch queues, groups, and semaphores. However, thread explosion can occur if tasks are dispatched incorrectly or in excess. In contrast, Swift Concurrency improves upon this with a more structured and efficient approach to concurrency management.

By prioritizing tasks and limiting concurrency based on available resources, Swift Concurrency reduces the risk of thread explosion and optimizes performance, ensuring that multiple tasks can run concurrently without overwhelming the system.
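As a closing sketch, the DispatchSemaphore(value: 3) pattern from earlier translates into structured concurrency as a task group that keeps at most a fixed number of child tasks in flight (the `process` function, its `maxConcurrent` parameter, and the doubling work are all illustrative):

```swift
// Keep at most `maxConcurrent` child tasks in flight: a structured
// analogue of limiting work with DispatchSemaphore(value: 3).
func process(_ items: [Int], maxConcurrent: Int) async -> Int {
    await withTaskGroup(of: Int.self) { group in
        var submitted = 0

        // Seed the group with the first batch of tasks.
        while submitted < min(maxConcurrent, items.count) {
            let item = items[submitted]
            group.addTask { item * 2 } // stand-in for real async work
            submitted += 1
        }

        var sum = 0
        // Each completed child frees a slot for the next item.
        while let value = await group.next() {
            sum += value
            if submitted < items.count {
                let item = items[submitted]
                group.addTask { item * 2 }
                submitted += 1
            }
        }
        return sum
    }
}

let result = await process(Array(1...10), maxConcurrent: 3)
print("Sum of doubled items: \(result)") // 2 * (1 + ... + 10) = 110
```

Unlike the semaphore version, nothing here blocks a thread while waiting: the group suspends, and cancellation propagates to the children automatically.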

