DispatchQueue is an object that manages the execution of tasks serially or concurrently, either on the main thread or on a background thread.
Serial Custom DispatchQueue
A custom serial DispatchQueue ensures that tasks are executed one at a time, in the order they are added.
let queue = DispatchQueue(label: "com.avii.example")
Using sync on Custom Serial Queue
Definition: When using sync, the calling thread is blocked until the submitted block completes.
Purpose: Ensures tasks are performed sequentially and allows for thread-safe access to shared resources.
The example below demonstrates how sync can be used to prevent race conditions.
class ThreadSafeExample {
    private(set) var balance = 100
    private let queue = DispatchQueue(label: "com.avii.example")

    func decrement(by value: Int) -> String {
        queue.sync {
            if balance >= value {
                Thread.sleep(forTimeInterval: 4)
                balance -= value
                return "Success"
            }
            return "Insufficient Balance"
        }
    }

    func get() -> Int {
        balance
    }
}

let obj = ThreadSafeExample()

DispatchQueue.global().async {
    print(obj.decrement(by: 100))
    print(obj.balance)
}

DispatchQueue.global().async {
    print(obj.decrement(by: 50))
    print(obj.balance)
}
As mentioned above, the DispatchQueue initializer creates a serial queue by default.
let queue = DispatchQueue(label: "com.avii.example")
Keep in mind that multiple DispatchWorkItems submitted to it will execute sequentially.
For example, in the sketch below, we print numbers inside a DispatchWorkItem block at one-second intervals, and we also schedule a block to run after 3 seconds that cancels the printing work item.
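A minimal sketch of that setup (the loop bound and timings are illustrative):

import Foundation

let queue = DispatchQueue(label: "com.avii.example")

// Work item that prints a number every second. Cancellation is cooperative,
// so the block checks isCancelled and exits early once cancel() has been called.
var printItem: DispatchWorkItem!
printItem = DispatchWorkItem {
    for i in 1...10 {
        if printItem.isCancelled { return }
        print(i)
        Thread.sleep(forTimeInterval: 1)
    }
}

queue.async(execute: printItem)

// Schedule the cancellation on another queue after 3 seconds, so it is not
// stuck behind printItem on the serial queue.
DispatchQueue.global().asyncAfter(deadline: .now() + 3) {
    printItem.cancel()
}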
DispatchGroup
DispatchGroup is used to manage the execution of multiple tasks concurrently; it is useful when you need to aggregate the results of those tasks before proceeding further.
DispatchGroup is not the best choice when one task depends on another.
Initialization:
let group = DispatchGroup()
Entering the Group:
You need to call group.enter() before starting an asynchronous task.
group.enter()
DispatchQueue.global().async {
    // Perform your async task here
    group.leave() // Call leave() when the task is completed
}
Waiting for All Tasks to Complete:
There are two ways to wait for the tasks to complete:
Synchronous Wait:
group.wait() // This blocks the current thread until all tasks in the group have completed
Asynchronous Wait:
group.notify(queue: .main) {
    // This block is executed when all tasks in the group have completed.
    print("All tasks are completed")
}
What problem does the concurrent DispatchQueue barrier flag solve?
Problem:
Suppose there is a resource that is shared between multiple threads. One thread is performing a write operation, which is time-consuming and can take a few seconds to complete. If another thread tries to read that resource at the same time, it will get an unpredictable result (such as a mismatch in the number of items).
To solve this problem, you might think of using a serial queue: wrap the read and write operations in the queue's sync/async blocks so that read operations wait until the write operation completes.
Let's proceed with an example:
final class Messenger {
    static let shared = Messenger()
    private init() {}

    private var array = [String]()

    // Creating a serial queue
    private let queue = DispatchQueue(label: "com.avii.mt")

    // Simulating the posting of a message using Thread.sleep.
    func postMessage(_ message: String) {
        queue.async { [weak self] in
            Thread.sleep(forTimeInterval: 4)
            self?.array.append(message)
        }
    }

    func getLastMessage(_ completion: @escaping ((String?) -> Void)) {
        queue.async { [weak self] in
            completion(self?.array.last)
        }
    }
}

// Appending operation using #Thread 1
DispatchQueue.global().async {
    Messenger.shared.postMessage("M1")
}

// Reading operation using #Thread 2
DispatchQueue.global().async {
    Messenger.shared.getLastMessage { message in
        print(message ?? "nil")
    }
}
In the above example, we used a serial queue.
But this approach slows down read operations: reads now execute sequentially, with one read operation having to wait for another to complete.
In that case, the barrier flag on a concurrent queue can help us solve this problem.
Using this flag, reads can run concurrently with one another, while still being prevented from happening until the write operation has completed.
Barrier flag definition from Apple:
When submitted to a concurrent queue, a work item with this flag acts as a barrier. Work items submitted prior to the barrier execute to completion, at which point the barrier work item executes. Once the barrier work item finishes, the queue returns to scheduling work items that were submitted after the barrier.
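A sketch of how the Messenger above could be rewritten with a concurrent queue and a barrier write (the queue label is illustrative; the 4-second sleep still simulates a slow write):

final class Messenger {
    static let shared = Messenger()
    private init() {}

    private var array = [String]()

    // Concurrent queue: reads can run in parallel with one another.
    private let queue = DispatchQueue(label: "com.avii.mt", attributes: .concurrent)

    func postMessage(_ message: String) {
        // The barrier block waits for in-flight reads to finish, runs alone,
        // and only then lets the reads submitted after it proceed.
        queue.async(flags: .barrier) { [weak self] in
            Thread.sleep(forTimeInterval: 4)
            self?.array.append(message)
        }
    }

    func getLastMessage(_ completion: @escaping ((String?) -> Void)) {
        queue.async { [weak self] in
            completion(self?.array.last)
        }
    }
}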
Operation
An abstract class that represents the code and data associated with a single task.
Because the Operation class is an abstract class, you do not use it directly but instead subclass or use one of the system-defined subclasses (e.g., BlockOperation) to perform the actual task.
Despite being abstract, the base implementation of the Operation class includes significant logic to coordinate the safe execution of your task.
An operation object is a single-shot object - that is, it executes its task once and cannot be used to execute it again.
You typically execute operations by adding them to an operation queue.
If you do not want to use an operation queue, you can execute an operation yourself by calling its start() method.
Asynchronous Versus Synchronous Operation
You can design your operation to execute in a synchronous or asynchronous manner.
Operation objects are synchronous by default. In a synchronous operation, the operation object doesn't create a separate thread on which to run its task. When you call the start() method of a synchronous operation directly from your code, the operation executes immediately in the current thread.
By the time the start() method of such an object returns control to the caller, the task itself is complete.
How to execute a task asynchronously
You can add an operation to an OperationQueue to execute it asynchronously. The queue ignores the isAsynchronous property and always calls the start() method from a separate thread.
When you call the start() method of an asynchronous operation, that method may return before the corresponding task is complete.
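A small sketch contrasting the two styles (the block contents are illustrative): calling start() directly runs the block on the current thread, while adding an operation to an OperationQueue hands execution to a queue-managed thread.

import Foundation

let syncOp = BlockOperation {
    print("Direct start() runs on:", Thread.current)
}
// Executes immediately on the current thread; start() returns only
// after the block has finished.
syncOp.start()

let queuedOp = BlockOperation {
    print("Queued operation runs on:", Thread.current)
}
let operationQueue = OperationQueue()
// The queue calls start() for us from a separate thread, so this call
// returns before the operation has necessarily finished.
operationQueue.addOperation(queuedOp)
operationQueue.waitUntilAllOperationsAreFinished()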
BlockOperation
An operation that manages the concurrent execution of one or more blocks.
The BlockOperation class is a concrete subclass of the Operation class that manages the concurrent execution of one or more blocks.
You can use this object to execute several blocks at once without having to create separate operation objects for each.
When executing more than one block, the BlockOperation itself is considered finished only when all of the blocks have finished executing.
addExecutionBlock(_:) -> Adds the specified block to the receiver's list of blocks to perform.
Calling this method while the receiver is executing or has already finished causes an NSInvalidArgumentException to be thrown.
executionBlocks: [@Sendable () -> Void] { get } -> The blocks associated with the receiver.
The blocks in the array are copies of those originally added using the addExecutionBlock(_:) method.
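A short sketch of a BlockOperation running several blocks (the block contents are illustrative):

import Foundation

let blockOperation = BlockOperation {
    print("Block 1")
}
// Additional blocks; they may run concurrently with one another.
blockOperation.addExecutionBlock { print("Block 2") }
blockOperation.addExecutionBlock { print("Block 3") }

print("Block count:", blockOperation.executionBlocks.count) // 3

// The operation is finished only once all three blocks have finished.
blockOperation.start()
print("isFinished:", blockOperation.isFinished) // true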
OperationQueue:
A queue that regulates (maintains) the execution of operations.
An operation queue invokes its queued operation objects based on their priority and readiness.
If all of the queued operations have the same queuePriority and their isReady property returns true, the queue invokes them in the order you added them. Otherwise, the queue always invokes the operation with the highest priority relative to the other ready operations.
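A quick sketch of how queuePriority affects ordering (the single-operation limit and the suspension are there only to make the ordering observable):

import Foundation

let queue = OperationQueue()
queue.maxConcurrentOperationCount = 1 // run one operation at a time
queue.isSuspended = true              // hold scheduling until both are enqueued

let low = BlockOperation { print("low priority") }
low.queuePriority = .low

let high = BlockOperation { print("high priority") }
high.queuePriority = .high

queue.addOperation(low)
queue.addOperation(high)
queue.isSuspended = false

queue.waitUntilAllOperationsAreFinished()
// Typically prints "high priority" first, even though `low` was added first.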
Respond to Operation Cancellation
Finishing its task doesn't necessarily mean that the operation performed that task to completion; an operation can also be cancelled.
For currently executing operations, the operation object's work code must check the cancellation state, stop what it is doing and mark itself as finished.
For operations that are queued but not yet executing, the queue must still call the operation object's start method.
cancelAllOperations(): Cancels all queued and executing operations. This method calls the cancel() method of every operation in the queue.
Calling the OperationQueue's cancelAllOperations() method does not automatically remove operations from the queue or stop those that are currently executing; the operation object itself must check for cancellation.
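A sketch of cooperative cancellation (the loop and timings are illustrative): the work code checks isCancelled on every step and returns early once cancelAllOperations() has been called.

import Foundation

let queue = OperationQueue()
let operation = BlockOperation()
operation.addExecutionBlock { [unowned operation] in
    for i in 1...10 {
        // Cooperative cancellation: stop as soon as cancel() has been called.
        if operation.isCancelled { return }
        print("step \(i)")
        Thread.sleep(forTimeInterval: 1)
    }
}
queue.addOperation(operation)

// Cancel everything after ~3 seconds; the block above notices the flag
// on its next isCancelled check and exits early.
DispatchQueue.global().asyncAfter(deadline: .now() + 3) {
    queue.cancelAllOperations()
}
queue.waitUntilAllOperationsAreFinished()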
Managing Dependencies
You can make an operation dependent on the completion of another operation using the addDependency(_:) method of the Operation class, as shown in the example below.
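The example assumes simple Employee and Department helpers along these lines (hypothetical types that just log their work, matching the output shown further down):

import Foundation

struct Employee {
    func syncOfflineEmployeeRecords() {
        print("Starting Employee sync operation")
        Thread.sleep(forTimeInterval: 1) // simulate work
        print("Finished Employee sync operation")
    }
}

struct Department {
    func syncOfflineDepartmentRecords() {
        print("Starting Department sync operation")
        Thread.sleep(forTimeInterval: 1) // simulate work
        print("Finished Department sync operation")
    }
}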
let operationQueue = OperationQueue()

let bo1 = BlockOperation {
    let employee = Employee()
    employee.syncOfflineEmployeeRecords()
}

let bo2 = BlockOperation {
    let department = Department()
    department.syncOfflineDepartmentRecords()
}

/* bo2 is dependent on bo1, so the execution of bo2 will only
 * start after the completion of bo1.
 */
/*
 * Make sure you configure your operations' dependencies before adding them to the OperationQueue.
 */
bo2.addDependency(bo1)

operationQueue.addOperation(bo2)
operationQueue.addOperation(bo1)

print("Finished scheduling tasks")
Output:
Finished scheduling tasks
Starting Employee sync operation
Finished Employee sync operation
Starting Department sync operation
Finished Department sync operation
You can achieve the same output using a Custom Serial DispatchQueue.
As we know, a custom DispatchQueue created without the .concurrent attribute is serial, so it executes its tasks one at a time and waits for the first block to complete before starting the next.
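A sketch of the equivalent ordering with a serial queue (reusing the hypothetical Employee and Department helpers from above; the queue label is illustrative):

import Foundation

let serialQueue = DispatchQueue(label: "com.avii.sync")

serialQueue.async {
    Employee().syncOfflineEmployeeRecords()
}
// This block runs only after the employee sync block above has completed.
serialQueue.async {
    Department().syncOfflineDepartmentRecords()
}
print("Finished scheduling tasks")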