Mastering Concurrency in iOS - Part 1 (Concurrency, GCD Basics)

Video Statistics and Information

Captions
I have been receiving many comments and messages on LinkedIn asking me to cover this topic for a long time. Because there is so much to discuss, I was not sure how to do it, but I have finally been able to structure the content. Before starting, let me explain how we are going to cover concurrency. The topic is very wide, and I want to discuss everything in depth, which cannot be done in one video of 15 or 20 minutes, so we will be doing a series. At this point I am not sure how many videos the content will be broken into, but I have a strong feeling that after watching the complete series you will have a very strong grip on concurrency: you will be able to answer the code-snippet and output-based questions that are generally asked in interviews, and you will be able to deal with complex problems, data-inconsistency issues, threading issues, and weird crashes.

Let's have a look at the contents of this series. We will start with what concurrency is, why we need it, and concurrency versus parallelism, because these two terms are very confusing. To understand them we will look at OS fundamentals like time slicing and context switching. Then we will see how concurrency is achieved. We all know the answer is multithreading, but we will see how multithreading is done in iOS: all the possible ways, their pros and cons, and the recommended ones, for example GCD, OperationQueue, and the modern concurrency introduced in Swift 5.5 using async/await, tasks, task groups, and so on. We will understand the difference between synchronous and asynchronous, the difference between serial and concurrent, and then the difference between serial and synchronous, because these terms are also very confusing: people treat synchronous as serial and asynchronous as concurrent, so we will go into detail and clear that up.

Then we will come to queues: serial queues, concurrent queues, system-provided queues, and quality of service, why QoS was introduced by Apple and what the need for it is. After that, custom queues and how to make our own, along with some advanced topics like target queues and attributes. Then a whole bunch of things related to dispatch: dispatch groups, dispatch work items, dispatch barriers, semaphores, locks, etc. We will also go through Operation and OperationQueue, and then the new concurrency features, tasks, task groups, async/await. Towards the end of the series we will have a separate video on interview questions, and I am very sure you will be able to answer all the questions discussed there, because in the earlier videos we will have covered the concepts in depth.

Now let's start with the very basic question: what is concurrency? You might already know the answer, but I still recommend watching this first video, because it is the foundation; we will discuss the core concepts that will eventually help us understand the complex topics. As per the dictionary, the literal meaning of concurrency is "the fact of two or more events or circumstances happening or existing at the same time." Take this telephone operator: it appears that he is dealing with two phone calls at the same time. So when it appears
that two or more events are happening at the same time, we say that the phenomenon is concurrency, or that the events are taking place concurrently.

In terms of programming, concurrency is the execution of multiple instruction sequences at the same time, because that is how work gets done: through instruction sequences. So when multiple instruction sequences are being executed at the same time, concurrency is happening. For example, if we have Task 1 and Task 2 and both are being done at the same time, or at least it appears that both are taking place at the same time, we can say the tasks are being executed concurrently.

Now, what is parallelism? In concurrency too, it appears that two tasks are taking place at the same time, so what is the difference? The dictionary says the literal meaning of parallelism is "the state of being parallel or of corresponding in some way." To understand this better, look at this super mom: it is not just appearing that she is doing six jobs at the same time, they are actually taking place at the same time. The difference with concurrency is that the telephone operator only appeared to be responding to two calls at once; actually he was answering one call at a time. In parallelism, multiple processing elements are used simultaneously to solve a problem, so it involves more than one resource: the problem is broken down into instructions, and the instructions are executed simultaneously using multiple resources. So here we have two tasks, and the two tasks are genuinely taking place at the same time, unlike concurrency.

If we compare concurrency with parallelism: concurrency is like having one counter serving two queues, so one queue is served at one point in time and then a person from the second queue is served, while parallelism is having two counters for two queues, so both queues are processed at the same time. Of course parallelism gives us more speed, but it involves more resources, and because we cannot go on spending on resources, we need some way through which concurrency can be achieved.

There are two concepts through which concurrency is achieved: time slicing and context switching. There are other concepts too, the time quantum, latency, scheduling algorithms, interrupts, but these two give us a clear picture of how concurrency is achieved in computer systems. Context switching means switching the context, and the context, in a nutshell, is the collection of values which needs to be loaded into the various registers, the stack pointer register, the program counter register, and others, in order to start or resume the execution of a thread. Time slicing is the period the scheduler grants to each thread before preemption occurs, and preemption is nothing but pausing the execution. So if a thread is being executed and is paused after a certain duration, that duration is the time slice, and the context switch takes place after it.

Let's understand this better. Here you see that Task 1 is executed and then paused, then Task 2 is executed; after a certain duration Task 2 is paused and Task 1 is executed again. That is how multiple tasks are executed, and it appears that the tasks are being executed at
the same point in time: because the context switch happens very frequently, it creates an illusion that multiple tasks are being executed at the same time, although they are not.

Now let's take an example with three processes, P1, P2, and P3, where P1 takes 1 ms to complete, P2 takes 4 ms, P3 takes 5 ms, and the time quantum is 2 ms. At time t = 0 all three are in the queue, and because it is a queue, tasks are executed in first-in-first-out order, so the execution of P1 starts first. P1 runs for its slice, then the context switches to P2, which runs from t = 2 to 4, then the context switches again and P3 runs. That is how P1, P2, and P3 all make progress. When the next slice comes around, the scheduler does not switch back to P1, because P1 needed only 1 ms of work and has already finished; the context switches to P2 instead, and P2 resumes. Then P3 gets a slice, and then the context switches back once more; P2 needed 4 ms in total, which has already been done, so the remaining slices go to P3. That is how context switching and time slicing work, and that is how concurrency is achieved. It was important to understand this because it is what happens under the hood: although we will be using GCD, OperationQueue, and the other higher-level APIs, we should know how concurrency is actually achieved and what time slicing, context switching, the time quantum, and the schedulers are doing. There is a lot more you can read about scheduling algorithms on the internet, but this is the basic idea.

Now let's see the problems associated with concurrency. From my experience I can say there is really only one problem, because of which concurrency becomes a genuinely tough topic for developers, and that is data inconsistency. Data inconsistency is an easy term to understand, and if we talk about specifics the problems are divided into categories like the phantom read problem, the lost update problem, and a few more, but eventually all of them can be treated as data inconsistency, and that is the one problem at the root of concurrency issues. All those crashes you have seen, the weird issues, the data glitches, all of them come from data inconsistency somewhere underneath. We will look at concurrency in a way that lets you identify the root cause and resolve these problems.

Now let's see how concurrency is achieved in iOS. It is needless to mention that concurrency is achieved through multithreading; I am expecting that you have at least heard of the term. Multithreading is how we achieve concurrency on any platform, in any language you have worked with. So the question is: how is multithreading used in iOS, and what APIs are available? One possible way of achieving multithreading is to create the threads manually: you can create multiple threads and assign some tasks to each of
them, and when all those threads execute simultaneously, you can say that you have achieved multithreading. We will take a look at it and understand the problems with it, but besides this we have other APIs, for example Grand Central Dispatch, operation queues, and the modern APIs introduced in Swift; we will look at those later in the series.

For now, let's understand the benefits of creating threads manually. Of course it is not the recommended way, and we will discuss the cons in a minute, but on the pro side, it is the rawest approach to multithreading and it gives us the most control and customization: we can start a thread, cancel it, change its stack size. We are dealing directly with the thread API; there is no abstraction layer above it, and the entire control is in the developer's hands. But with more power comes more responsibility, as we will see in a minute.

First, let's see how to create a thread in Xcode. I have a saved code snippet here for a custom thread. We created a class CustomThread with a method createThread and a method threadSelector; the names are not essential. The line to focus on is the one where we create a Thread using its initializer, passing target as self and selector as the method we created, threadSelector, with the object parameter as nil. "Custom thread in action" is the print statement we put inside threadSelector so that we know when the thread actually runs. If we run this in a playground we see the output "Custom thread in action", which shows that our thread started executing and the print statement ran.

Apart from this, there are a couple of other things we can control: when the execution starts, whether it starts with a delay, how to cancel the thread, how to change the stack size, and so on. They are all easy to look up on the internet, but for now we will discuss only this much about creating threads manually, because in practice, in real projects, you are almost never going to create a thread manually. It is a bad practice, and it comes with a whole set of cons. When we deal with the Thread API directly, a bunch of problems arise. First, the responsibility of managing the threads falls on us: when system conditions change, we need to handle the threads ourselves, and because there is no controlling abstraction layer over them, no one else will. Then there is deallocation of resources when tasks finish: the autorelease pool and ARC are not in play here, so we have to handle that too. Then come memory leaks: if memory management is not done properly, leaks happen and eventually crashes follow. Maintaining the order of execution is also a major problem: when you are dealing with multiple tasks at the same time, each using a separate thread, with a common resource being accessed, some tasks finishing early and some taking time, maintaining the order of execution and the access to the shared resource becomes a mess. So it is not recommended to deal with the Thread API directly; instead, there
are other APIs provided by Apple which take care of all this mess: we just delegate our tasks, and the threads and their management are handled for us. The first one is Grand Central Dispatch, commonly referred to as Dispatch. Apple's documentation says that Dispatch is a framework that executes code concurrently on multicore hardware by submitting work to dispatch queues managed by the system. That is not very easy to digest, so let's break it down and understand it in parts.

Basically, GCD is a queue-based API that allows us to execute closures on a FIFO (first-in, first-out) basis, the principle of a queue. We submit tasks, the tasks go into a queue, and GCD takes care of that queue so that our submitted tasks are processed in first-in-first-out order. The actual execution is done by a worker pool: a pool of threads that executes the tasks. Assume you have a task queue to which you submit the work to be executed, and a pool of threads, some of which are already executing tasks while one is available for the next task. When a thread in the worker pool becomes available, a task is taken out of the task queue, in FIFO order of course, and handed to that thread. Execution takes place on the thread, the task's state changes to running, and when the execution completes the task moves to the completed state, or rather you get the callback that your task has finished. That is the basic idea of a worker pool: there is a task queue and a pool of worker threads; tasks are dequeued in FIFO order and given to whichever thread is available, execution happens, and you get the necessary callbacks.

So where does GCD come into the picture? What is the role of Dispatch, the framework created by Apple? GCD decides which thread will be used to execute the task. All we need to do is submit our task to GCD, and GCD takes care of everything: which thread is available, which thread this particular job should run on, depending on priority, quality of service, and so on, which we will see later. The developer is no longer responsible for executing the task; GCD chooses an appropriate dispatch queue, and the task gets executed.

Now you may be wondering what a dispatch queue is, since this is the first occurrence of the term in this video. Going ahead we will discuss dispatch queues in detail, because they are the whole basis of concurrency in iOS, but in layman's terms a dispatch queue is an abstraction layer over the queue of tasks. GCD manages a collection of dispatch queues to which we submit tasks, and it decides which thread from the worker pool should do each job. GCD either executes the work serially or concurrently, but tasks are always dequeued in FIFO order, because at the end of the day it is a queue, and a queue works on a FIFO basis. The order in which tasks are submitted to the queue is the order in which they are picked up; the execution itself can then vary between serial and concurrent, as we will see in detail, but GCD ensures that the tasks are picked
up in FIFO fashion.

So GCD maintains a collection of dispatch queues to which work is submitted, but how do we submit our work to a dispatch queue? We do it in the form of closures, or blocks, and those submissions can be synchronous or asynchronous. It can be confusing what "synchronously submitting a block" means, or how it differs from submitting asynchronously, so for now understand the distinction as order of execution versus manner of execution. The order of execution decides whether tasks are picked up serially or concurrently, while the manner of execution decides how the job is executed: whether it blocks the execution already happening on the current thread, or lets that continue while the new work is handled separately. This is the whole difference between synchronous versus serial and asynchronous versus concurrent. Don't worry if this isn't clear yet; we will see it in more detail with code snippets and their outputs to make it crystal clear. I am just pointing you in the right direction to think along.

Let's see the difference between synchronous and asynchronous, the manner of execution I was referring to. When we say "execute a task synchronously", or we submit a task to a dispatch queue synchronously, it means: block the current execution on the current thread until this job I am submitting to the dispatch queue is completed. That is the meaning of synchronous: a manner of execution in which we say, until this particular job of mine is executed, do not execute anything else that was running. Asynchronous says the opposite: continue the current execution, keep doing it, while this new job of mine is executed somewhere else at the same time, or maybe at some later point in time, but do not block the current execution. And because the current execution is not blocked, control returns immediately from the call.

Now let's understand serial versus concurrent. It is very important to understand these fundamentals, because they will help you solve those complex problems and predict the output of the code snippets asked in interviews. Take your time; pause the video or listen to it again if you need to, but make sure you understand this completely, so that the advanced topics built on top of it become easy for you. A serial queue means one task at a time, as simple as that, no complexity involved. Synchronous was a manner of execution that said: block the current execution and execute this block I have submitted. That block was submitted to a queue; if it is a serial queue, the queue executes one task at a time. Serial simply means one task at a time, while concurrent means multiple tasks at a time. But don't get it wrong: even when you have submitted tasks to a concurrent queue, wanting them to execute concurrently, they are dequeued serially, because they were submitted to a queue, and the whole concept of a queue rests on FIFO, first in, first out. The tasks would execute
simultaneously, they would execute concurrently, but the dequeuing happens serially; that is the principle of a queue.

To understand this better, refer to these two images. In the serial case you see Task 1, Task 2, and Task 3 being executed serially, one task at a time. In the concurrent case multiple tasks are executing, but they were dequeued serially: each task starts after the previous one started, while completion can happen out of order. In this example Task 3 completed before Task 2, even though Task 2 started executing first; that is a separate matter altogether. The point to understand is that serial means one task at a time, concurrent means multiple tasks at a time, and both of these are different from synchronous and asynchronous.

There are two key takeaways from this discussion. First, serial or concurrent affects the destination queue, the queue to which your task has been submitted: it determines how execution takes place there, serially or concurrently. Second, sync and async affect the current thread from which the dispatch is taking place: when you dispatch a task, whether the execution already happening on the current thread should stop or keep going is decided by sync versus async.

Now let's see a little bit of code. I have a code snippet here; based on our discussion of serial versus concurrent and sync versus async, and the understanding we have developed about dispatch queues, try to predict its output. You can pause the video for a while and give it a thought, then validate your answer and your thought process. If I run it in the playground, we see the output 4, 5, 6, 0, 1, 2, 3, 9. Let's understand why. The first loop, the one responsible for printing 0, 1, 2, 3, is dispatched on DispatchQueue.main. We haven't discussed DispatchQueue.main yet, but for now just understand that the main queue is a serial queue; we will go into its details later. So we have submitted that block of code to be executed asynchronously on a serial queue. Recall our discussion of asynchronous: it means do not block the current execution; the newly submitted code will be executed separately. By that logic we move on and execute the next block, the loop over 4 through 6. That loop is not associated with any dispatch queue or any sync/async call, so it executes directly, and 4, 5, 6 are printed; that is why they are the first three numbers in the output. The next block, which prints 9, is again dispatched asynchronously on the same serial queue, DispatchQueue.main. A serial queue runs one task at a time, and because execution happens in FIFO order, the block submitted first runs first; that is why we see 0, 1, 2, 3 and only then 9. That justifies the output, and I hope your prediction and thought process matched.

That was pretty much it for this first part of the series on concurrency. In the next part we will discuss dispatch queues in detail: the main queue, system-provided queues, custom queues, quality of service, why Apple introduced QoS, how execution happens, and some advanced parameters for creating custom queues, for example autorelease frequency, target queues, and things around that. If you like my videos and my content, you can consider subscribing to the channel, and you can support me through the Super Thanks button available down there. The next part of this video series will be available very soon; till then, take good care of yourself, happy coding, and stay safe.
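The manual-thread snippet described in the captions is not reproduced verbatim in the transcript, so here is a minimal sketch of the same idea. The class and method names mirror the ones mentioned (CustomThread, createThread), but the block-based Thread initializer is used in place of the target/selector pair so the code also runs outside Xcode, and the stackSize line illustrates the kind of raw control the video talks about; treat the details as assumptions, not the exact on-screen code.

```swift
import Foundation

// Sketch of the manual Thread example from the captions (names approximated).
final class CustomThread {
    private(set) var didRun = false

    func createThread() -> Thread {
        // Block-based initializer; the video uses Thread(target:selector:object:),
        // which needs the Objective-C runtime, so this is a portable stand-in.
        let thread = Thread { [weak self] in
            print("Custom thread in action")
            self?.didRun = true
        }
        thread.stackSize = 1 << 20   // raw control: a 1 MB stack, set before start
        thread.start()               // manually created threads must be started
        return thread
    }
}

let custom = CustomThread()
let thread = custom.createThread()
// Thread has no join API; for this sketch we poll until the thread finishes.
while !thread.isFinished { Thread.sleep(forTimeInterval: 0.01) }
print(custom.didRun)
```

As the captions note, this raw control is exactly what makes manual threads error-prone: start, cancellation, stack size, and result hand-off are all the developer's responsibility.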
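To make the serial-versus-concurrent and sync-versus-async distinctions from the captions concrete, here is a small sketch; the queue labels are made up for illustration. `sync` blocks the caller until the submitted closure finishes, so its results are fully ordered; `async` returns immediately, and on a concurrent queue the closures may run simultaneously, so completion order is not guaranteed.

```swift
import Foundation

// Hypothetical queue labels, purely for illustration.
let serialQueue = DispatchQueue(label: "com.example.serial")
let concurrentQueue = DispatchQueue(label: "com.example.concurrent",
                                    attributes: .concurrent)

// sync: the current thread is blocked until each closure completes,
// so the results arrive in a fully determined order.
var serialResults: [Int] = []
serialQueue.sync { serialResults.append(1) }
serialQueue.sync { serialResults.append(2) }
print(serialResults)              // [1, 2]

// async: control returns immediately; a DispatchGroup lets us wait for
// completion. The closures may run simultaneously on a concurrent queue,
// so we funnel the appends through the serial queue to avoid a data race.
var concurrentResults: [Int] = []
let group = DispatchGroup()
for i in 0..<3 {
    concurrentQueue.async(group: group) {
        serialQueue.sync { concurrentResults.append(i) }
    }
}
group.wait()                      // block until all three closures finish
print(concurrentResults.sorted()) // [0, 1, 2] — completion order may vary
```

Note that the dispatches to the concurrent queue are still dequeued in FIFO order, exactly as the captions describe; only their execution overlaps.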
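The playground snippet whose output the captions walk through (4 5 6 0 1 2 3 9) is not shown in the transcript, so the loop bounds below are inferred from that output; treat them as assumptions. The original uses DispatchQueue.main, which in a playground drains only after the top-level code finishes; to reproduce that behavior deterministically in a plain script, this sketch stands in a suspended private serial queue for the busy main thread.

```swift
import Foundation

// Stand-in for DispatchQueue.main: suspended while "top-level code" runs.
let mainStandIn = DispatchQueue(label: "com.example.main-stand-in")
mainStandIn.suspend()

var output: [Int] = []
let group = DispatchGroup()

mainStandIn.async(group: group) {     // queued first, like the 0...3 loop
    for i in 0...3 { output.append(i) }
}
for i in 4...6 { output.append(i) }   // top-level code runs right away: 4 5 6
mainStandIn.async(group: group) {     // queued second, runs after 0...3
    output.append(9)
}

mainStandIn.resume()                  // "top-level code done": drain the queue
group.wait()                          // wait for both queued blocks
print(output)                         // [4, 5, 6, 0, 1, 2, 3, 9]
```

Because the queue is serial and suspended until the direct loop has finished, the two async blocks run strictly in FIFO order afterwards, reproducing the 4 5 6 0 1 2 3 9 ordering the captions explain.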
Info
Channel: iCode
Views: 34,113
Keywords: concurrency, ios, swift, gcd, grand central dispatch, icode, pallav, mobile, android, multithreading, parallelism, threads, data race
Id: X9H2M7xMi9E
Length: 25min 28sec (1528 seconds)
Published: Sun Jul 17 2022