
A Guide on How to Eliminate Thread Explosions in iOS: GCD and Swift Concurrency

Introduction

Thread explosion occurs when the system spawns far more threads than it has CPU cores, causing performance degradation and memory overhead. In this article, we explore how to eliminate thread explosion and how Swift Concurrency helps prevent it.

Thread Explosion in GCD

Foundation

The system doesn’t document an exact limit on how many threads GCD will create. Based on the WWDC session “Swift Concurrency: Behind the Scenes,” we can conclude that GCD may spawn roughly 16 threads per CPU core.

Let’s consider the following code:

import Foundation

let queue = DispatchQueue(label: "com.nsvasilev.concurrent-queue", attributes: .concurrent)

for _ in 0...127 {
    queue.async {
        sleep(5)
    }
}

The queue is concurrent, and it schedules 128 tasks without limiting the number of active threads. Each task simulates a real-world heavy operation by blocking its thread for 5 seconds. Because the blocks sit blocked rather than finishing quickly, the system spawns more and more threads to keep the queue serviced, causing performance degradation and increased memory and CPU usage.

The output is separated into two groups of 64 elements each, due to a thread limit of 64: the second group can only start once tasks from the first group finish and free their threads. Within each group, the operations can execute in any order because we are dealing with a concurrent queue.
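To see this cap on your own machine, one rough sketch (with a hypothetical queue label and ad-hoc bookkeeping, not part of the original example) is to record which distinct threads the blocks land on:

import Foundation

let observedQueue = DispatchQueue(label: "com.example.observed-queue", attributes: .concurrent) // hypothetical label
let lock = NSLock()
var seenThreads = Set<String>()

for index in 0...127 {
    observedQueue.async {
        lock.lock()
        seenThreads.insert("\(Thread.current)")
        print("task \(index) started, distinct threads so far: \(seenThreads.count)")
        lock.unlock()
        sleep(5) // simulate heavy, thread-blocking work
    }
}

The printed count should climb to roughly 64 and then plateau, because later blocks reuse threads freed by earlier ones.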

Grand Central Dispatch (GCD) doesn’t have a built-in mechanism to prevent thread explosion. Next, we examine how it can lead to deadlocks in concurrent and serial GCD queues.

Deadlocks in Concurrent Queues

Thread explosion may lead to deadlocks in concurrent queues. Let’s consider the following example:

import Foundation

let queue1 = DispatchQueue(label: "com.nsvasilev.concurrent-queue1", attributes: .concurrent)
let queue2 = DispatchQueue(label: "com.nsvasilev.concurrent-queue2", attributes: .concurrent)
let dispatchSemaphore = DispatchSemaphore(value: 0)

(0..<64).forEach { _ in
    queue1.async {
        dispatchSemaphore.wait()
    }
}

(0..<64).forEach { _ in
    queue2.async {
        dispatchSemaphore.signal()
    }
}

It may not be obvious that this code will cause a deadlock. The first concurrent queue schedules 64 tasks, each waiting on a semaphore. In this particular case, the thread limit is 64, meaning all available threads are occupied by these tasks. However, none of the tasks can proceed because they are all blocked, waiting for a signal from the second queue. Meanwhile, the second queue is also trying to run its tasks, which involve signaling the semaphore.

[Image: Deadlock]

But since all threads are blocked by the first queue’s wait() calls, none of the signals from queue2 can be processed, resulting in a deadlock where both queues wait on each other indefinitely.

There are three possible solutions.

  1. Use OperationQueue to limit simultaneous tasks.

    import Foundation

    let operationQueue = OperationQueue()
    // Cap the number of operations that may run (and block) at the same time.
    operationQueue.maxConcurrentOperationCount = 5
    operationQueue.qualityOfService = .background

    let queue = DispatchQueue(label: "com.nsvasilev.concurrent-queue2", attributes: .concurrent)
    let dispatchSemaphore = DispatchSemaphore(value: 0)

    (0..<64).forEach { _ in
        operationQueue.addOperation {
            // At most 5 operations block here at once, so threads remain free
            // to run the signaling blocks below and every operation completes.
            dispatchSemaphore.wait()
        }
    }

    (0..<64).forEach { _ in
        queue.async {
            dispatchSemaphore.signal()
        }
    }


  2. Use DispatchSemaphore to limit simultaneous tasks.

    import Foundation

    let queue = DispatchQueue(label: "com.nsvasilev.concurrent-queue", attributes: .concurrent)
    // A counting semaphore with an initial value of 3 allows at most 3 blocks in flight.
    let dispatchSemaphore = DispatchSemaphore(value: 3)

    (0..<128).forEach { _ in
        dispatchSemaphore.wait() // pauses the loop once 3 dispatched blocks haven't yet signaled
        queue.async {
            dispatchSemaphore.signal()
        }
    }


The DispatchSemaphore gates access to the queue and does not allow more than 3 blocks to be in flight at a time.
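As a side note, in real code the work usually happens inside the dispatched block and the semaphore is signaled only when that work finishes. A minimal sketch of that shape, with a hypothetical doWork function and queue label:

import Foundation

let workQueue = DispatchQueue(label: "com.example.limited-queue", attributes: .concurrent) // hypothetical label
let gate = DispatchSemaphore(value: 3)

func doWork(_ index: Int) { sleep(1) } // hypothetical heavy, thread-blocking work

for index in 0..<128 {
    gate.wait()            // the loop pauses while 3 blocks are still running
    workQueue.async {
        doWork(index)
        gate.signal()      // free a slot only when the work is actually done
    }
}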

  3. Use Swift Concurrency to prevent thread explosion.


Swift Concurrency avoids thread explosion by design. We will explore how it does this later in this article.
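As a preview, here is a minimal sketch of the same 128-task workload expressed with a task group; heavyOperation is a hypothetical stand-in for real async work, and the cooperative thread pool, rather than a semaphore, bounds how many child tasks run at once.

import Foundation

// Hypothetical stand-in for real work; it suspends instead of blocking a thread.
func heavyOperation(_ index: Int) async {
    try? await Task.sleep(nanoseconds: 2_000_000_000) // ~2 seconds
}

func processAll() async {
    await withTaskGroup(of: Void.self) { group in
        for index in 0..<128 {
            // Child tasks are scheduled onto the cooperative pool, which is sized
            // to the CPU core count, so 128 tasks never mean 128 threads.
            group.addTask { await heavyOperation(index) }
        }
    }
}

// Usage (from an async context): await processAll()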

Deadlocks in Serial Queues

Deadlocks can occur in serial queues as well. Across both concurrent and serial queues, GCD’s total thread pool is capped at 512 threads.

import Foundation

let dispatchSemaphore = DispatchSemaphore(value: 0)

for _ in 0...511 {
    let queue1 = DispatchQueue(label: "com.nsvasilev.concurrent-queue1")
    queue1.async {
        dispatchSemaphore.wait()
    }
}

for _ in 0...511 {
    let queue2 = DispatchQueue(label: "com.nsvasilev.concurrent-queue2")
    queue2.asyncAfter(deadline: .now() + 1.0) {
        dispatchSemaphore.signal()
    }
}

[Image: Serial Queues Deadlock]

The first for-loop creates 512 serial queues, each with a block that waits on the semaphore, consuming all available threads. As a result, there are no threads left to run the second loop’s blocks, which are the only ones that would signal the semaphore, so the program deadlocks.
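One way to avoid exhausting the pool in this example (a sketch with hypothetical queue labels, not the only possible fix) is to reuse a fixed pair of serial queues instead of creating one per iteration, so the whole workload needs only two threads:

import Foundation

let dispatchSemaphore = DispatchSemaphore(value: 0)
// One serial queue per role keeps the thread count flat regardless of the iteration count.
let waitQueue = DispatchQueue(label: "com.example.wait-queue")     // hypothetical label
let signalQueue = DispatchQueue(label: "com.example.signal-queue") // hypothetical label

for _ in 0...511 {
    waitQueue.async { dispatchSemaphore.wait() }
}

for _ in 0...511 {
    signalQueue.asyncAfter(deadline: .now() + 1.0) { dispatchSemaphore.signal() }
}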

Swift Concurrency

Swift Concurrency prevents thread explosion; in this part, we take a look at how it manages tasks efficiently and avoids creating excessive threads.

Priorities

Swift Concurrency defines several task priorities; the three most commonly used are .userInitiated, .utility, and .background.

  • .userInitiated is a high priority for user-initiated actions.
  • .utility is a medium priority for background processing.
  • .background is a low priority for tasks that can run in the background.

Let’s consider some examples:

// The high-priority task for user-initiated actions.
Task(priority: .userInitiated) {
    await loadSomeData()
}

// The medium-priority task for background processing.
Task(priority: .utility) {
    await processDataInBackground()
}

// The low-priority task that can run in the background.
Task(priority: .background) {
    await performBackgroundCleanup()
}

In this example, await loadSomeData() is assigned the highest priority with .userInitiated, indicating that it is a user-focused task requiring prompt execution. Background tasks, such as cleanup, are set to .background priority, allowing them to run without blocking critical operations.

How Swift Concurrency Manages Threads

Let’s revisit the earlier GCD cases, this time using Swift Concurrency. Each task performs a “heavy” operation, and we will run the tasks with different priorities.

Tasks With the Same Priority Level

Let’s consider the following example:

import Foundation

func runTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds) // blocks the cooperative thread to simulate heavy work
    }
}

for _ in 0...127 {
    runTask(seconds: 2)
}

We can see that only 6 tasks are performed simultaneously, which is equal to the number of CPU cores on my device.

[Image: User Initiated Tasks]

If you pause the execution, you may get a clearer picture of what happens behind the scenes.

[Image: The Number of Tasks]

We may notice that all of these operations are performed inside com.apple.root.user-initiated-qos.cooperative, which limits the number of threads so that it doesn’t exceed the number of CPU cores.

Based on this observation, it’s clear that Swift Concurrency prevents thread explosion and doesn’t create more threads than there are CPU cores on the device.
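To compare this against your own hardware, you can print the active core count; the runtime doesn’t expose the cooperative pool’s width directly, so this is only an indirect check:

import Foundation

// The cooperative thread pool is sized to roughly this many threads.
print("Active CPU cores: \(ProcessInfo.processInfo.activeProcessorCount)")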

Tasks With All Priority Levels at Once

Let’s look at another example. In this case, we’ll run tasks with different priorities and observe what happens.

import Foundation

func runUserInitiatedTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds)
    }
}

func runUserUtilityTask(seconds: UInt32) {
    Task(priority: .utility) {
        print("Utility: \(Date())")
        sleep(seconds)
    }
}

func runUserBackgroundTask(seconds: UInt32) {
    Task(priority: .background) {
        print("Background: \(Date())")
        sleep(seconds)
    }
}

for _ in 0...127 {
    runUserInitiatedTask(seconds: 2)
}

for _ in 0...127 {
    runUserUtilityTask(seconds: 2)
}

for _ in 0...127 {
    runUserBackgroundTask(seconds: 2)
}

As we can see, the .utility and .background queues are limited to 1 thread when there’s a higher-priority queue (.userInitiated).

[Image: The Number of Tasks]

In this specific scenario, the maximum number of threads is 8.
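If you want to confirm which priority a task actually runs at in experiments like this, you can log Task.currentPriority inside the task body. A small, optional variation on the functions above (runInstrumentedTask is a hypothetical helper):

import Foundation

func runInstrumentedTask(seconds: UInt32, priority: TaskPriority) { // hypothetical helper
    Task(priority: priority) {
        // Task.currentPriority reflects the priority the task is actually running at,
        // which may be higher than requested if the runtime escalates it.
        print("requested \(priority), running at \(Task.currentPriority): \(Date())")
        sleep(seconds)
    }
}

runInstrumentedTask(seconds: 2, priority: .background)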

The Last Example

What if we add some delays before starting each group of tasks?

In this example, we demonstrate how introducing delays before starting each group of tasks can impact their execution. We have three types of tasks, each with a different priority: .background, .utility, and .userInitiated. Each task sleeps for a specified duration, and tasks are executed in three groups, with a 2-second delay between each group.

import Foundation

func runUserBackgroundTask(seconds: UInt32) {
    Task(priority: .background) {
        print("Background: \(Date())")
        sleep(seconds)
    }
}

func runUserUtilityTask(seconds: UInt32) {
    Task(priority: .utility) {
        print("Utility: \(Date())")
        sleep(seconds)
    }
}

func runUserInitiatedTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds)
    }
}

for _ in 0...127 {
    runUserBackgroundTask(seconds: 2)
}

sleep(2)

for _ in 0...127 {
    runUserUtilityTask(seconds: 2)
}

sleep(2)

for _ in 0...127 {
    runUserInitiatedTask(seconds: 2)
}

We can see that all three queues (background, utility, and user-initiated) are running work on multiple threads simultaneously. Interestingly, if the lower-priority queue is started first and given some time to run, the higher-priority queue doesn’t seem to negatively impact the performance of the lower-priority queue.

This suggests that the system is capable of handling tasks with different priorities efficiently, maintaining a balance even when tasks run concurrently.

Conclusion

While GCD doesn’t inherently prevent thread explosion, it offers tools for managing concurrency, such as dispatch queues, operation queues, and semaphores. However, thread explosion can occur if tasks are dispatched incorrectly or in excess. In contrast, Swift Concurrency improves upon this by offering a more structured and efficient approach to concurrency management.

By prioritizing tasks and limiting concurrency based on available resources, Swift Concurrency reduces the risk of thread explosion and optimizes performance, ensuring that multiple tasks can run concurrently without overwhelming the system.
