Mastering Concurrency in iOS - Part 2 (Dispatch Queues, Quality of Service, Attributes)

Video Statistics and Information

Captions
Hey guys, welcome to my channel iCode. I am Pallav, and this is the second video of our series, Mastering Concurrency in iOS, where we are trying to understand concurrency and everything around it in real depth.

In the first video we covered what concurrency is, why it exists, how it is achieved, and how it differs from parallelism. We looked at concepts like time slicing and context switching and how they help in achieving concurrency, and then we started with concurrency in iOS: how to create a thread manually, what the recommended ways are, GCD, dispatch queues, concurrent queues versus serial queues and the differences between them, and a very close look at how serial is different from synchronous and concurrent is different from asynchronous. If you haven't watched the first part, in which all of this was covered, here's the link.

In this second video we dive even deeper into dispatch queues. We'll see what the main queue is, what the system-provided global concurrent queues are, and how to create custom queues, and we'll go through the parameters used for creating custom queues — attributes, autorelease frequency, target queues — in detail. We'll also try to understand quality of service: why Apple introduced it in the first place, what the problems were, and how QoS helps in solving them. So let's get started.

In the previous video we saw that a dispatch queue is an abstraction layer to which tasks are submitted. GCD maintains a collection of these dispatch queues, and the work submitted to them is actually executed on a pool of worker threads. Now let's see the different types of dispatch queues. Essentially there are three: the main queue, the global concurrent queues, and custom queues. There are some similarities and some differences between them, so let's try to understand each one in detail.

The main queue is one of the system-created queues, it is serial in its nature of execution, and it uses the main thread. The first two points are not difficult: a serial queue executes one task at a time and the main queue does the same, and it belongs to the set of queues the system creates for us. The third point — that it uses the main thread — is what differentiates the main queue from every other queue: it is the only queue that uses the main thread. No other queue, whether system-provided or user-created, gets access to the main thread. And because UIKit is tied to the main thread, this creates a whole set of differences. We'll do a very detailed analysis of why UIKit is tied to the main thread, what could go wrong if something like UIKit were tied to a background thread, why the UI must be updated on the main thread only, why UI freezes happen, and all the concepts around that — the main run loop, the view draw cycle, the refresh rate — but that will come in the video covering concurrency interview questions, because all of these points are closely interrelated and form a natural chain of follow-up questions.
It's better to understand them as questions and answers, so it not only clears the concept but also helps you answer those questions in interviews. For now, just remember: the main queue is provided by the system, it is one of the system-created queues, it is serial in its nature of execution, it uses the main thread, and because UIKit is tied to the main thread, we perform all UI-related operations on the main queue. Everything around this will be discussed later in this series.

Now let's move to the second type of queue: the global concurrent queues. These are also system-provided queues — the system creates them, which is what distinguishes them from user-created queues. They are concurrent in their nature of execution, unlike the serial main queue we just saw. Third, they do not use the main thread. There is a very common point of confusion here: can a global concurrent queue use the main thread in any possible way? One thing that feeds the misconception is the userInteractive quality of service — we'll see what quality of service and userInteractive mean shortly — but if you have any doubt about concurrent queues, make this clear to yourself now: global concurrent queues do not use the main thread at all. And fourth, the priorities of these queues are decided by quality of service. There are a number of system-provided global queues, and we can dispatch our tasks on any of them; how those tasks are executed, and with what priority, is decided by QoS, which we'll come to in a minute.

Before that, let's prove the third point — that the global concurrent queues do not use the main thread in any scenario. Let's jump to Xcode and write a small piece of code to see which thread each queue uses. I'll start with DispatchQueue.main.async and print something inside it. There's a property called isMainThread through which we can check whether the current thread of execution is the main thread, so the print statement says: if Thread.isMainThread, print "execution on main thread", otherwise "execution on some other thread". This block runs on DispatchQueue.main, and since the main queue uses the main thread, we expect the first message. Now let's create one more: DispatchQueue.global().async — global() is how we access the system-provided global concurrent queues — with the same check but a different print statement, say "execution on global concurrent queue", so it's easy to differentiate. Let's give it a run.
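Since the on-screen code isn't visible in the captions, here is a minimal sketch of the kind of snippet being described. The exact strings are my own, and in a playground you may need to enable indefinite execution for the async blocks to finish:

```swift
import Foundation

DispatchQueue.main.async {
    // The main queue always runs on the main thread.
    print(Thread.isMainThread ? "execution on main thread"
                              : "execution on some other thread")
}

DispatchQueue.global().async {
    // A system-provided global concurrent queue: the check is expected to fail here.
    print(Thread.isMainThread ? "execution on main thread"
                              : "execution on global concurrent queue")
}

// Tried a little later in the video: even with the highest quality of service,
// the global queue still does not use the main thread.
DispatchQueue.global(qos: .userInteractive).async {
    print(Thread.isMainThread ? "execution on main thread"
                              : "execution on global concurrent queue")
}
```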
In the second case, where we used the global concurrent queue, the output was "execution on global concurrent queue": the ternary check on Thread.isMainThread failed, so the second message was printed. In the first case, on DispatchQueue.main — the main queue, and eventually the main thread — we got "execution on main thread", because Thread.isMainThread was true. This proves that the global concurrent queues do not use the main thread.

Now for the point most developers get confused about: the userInteractive quality of service. Don't worry if you don't yet know what quality of service or userInteractive is — we'll see that in a minute — but just in case you do know and it confuses you, let's pass the QoS here, global(qos: .userInteractive), and run it again. We still see "execution on global concurrent queue". So even with userInteractive, it did not print "execution on main thread", which shows that no matter which quality of service you use or how you access the global concurrent queues, the work is never executed on the main thread. (There are some advanced concepts that can change the observed behaviour in certain scenarios — they involve target queues and a couple of other things — and we'll see them after understanding target queues later in this video.)

Now, what is quality of service? We've been mentioning it for the past few minutes — that userInteractive and QoS in general affect execution and priority — so let's see what it actually is and why Apple introduced it in the first place. After all, everything we've seen so far — threading, multithreading, time slicing, context switching — was working perfectly fine without it.

To understand that, let's first see how an application is actually launched and what happens under the hood. Treat this as the sandbox of our application: everything related to our app takes place inside this box. When we tap the app icon, the system launches our application — it creates the main thread, and on the main thread the application actually starts. UIKit is tied to the main thread, the AppDelegate is invoked, all the necessary AppDelegate methods are called, and then our application starts receiving events. The main thread waits for events, for user interaction, so that the rest of the application can run the way it is supposed to.

Now imagine a scenario where your AppDelegate requires some data from a database. You call the necessary database methods to fetch that data, and you do it from the AppDelegate. Until that work completes, your main thread is busy performing it and is in no state to listen for any events: if the user taps anything on the screen, tries to perform any interaction, your app won't respond. Once you get the data from the database, once the callbacks come in, the AppDelegate updates the UI with that data, and only then is the main thread back in a state to listen for events.
Now, if all of this happens in a fraction of a second, the user won't even notice — it will look fine, and even you as a developer won't notice. But if the database takes some time, maybe two to five seconds or more, you will realise that your application has become unresponsive, and to solve that you'll reach for multithreading — that is exactly why multithreading exists. What you do is break the chain of calling the database directly from the AppDelegate and introduce one more queue — a user-created queue, or maybe a global concurrent queue — which takes care of making the necessary database calls. That queue receives the callbacks from the database while your main thread stays free for processing and listening to events; once the data arrives, you update the UI, and again the main thread is free for user interaction. That way you have solved the UI freeze, the unresponsiveness of the application.
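Here is a hedged sketch of that pattern. The function names (fetchRecordsFromDatabase, updateUI) are hypothetical stand-ins, not APIs from the video or any real framework:

```swift
import Foundation

// Pretend this is the slow database call that was freezing the UI.
func fetchRecordsFromDatabase() -> [String] {
    Thread.sleep(forTimeInterval: 2)               // simulate a 2-second query
    return ["record-1", "record-2"]
}

// UIKit work must happen on the main thread.
func updateUI(with records: [String]) {
    assert(Thread.isMainThread)
    print("updating UI with \(records.count) records")
}

func loadData() {
    // Do the slow work on a global queue so the main thread stays free for events.
    DispatchQueue.global(qos: .userInitiated).async {
        let records = fetchRecordsFromDatabase()
        // Hop back to the main queue for the UI update.
        DispatchQueue.main.async {
            updateUI(with: records)
        }
    }
}
```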
So far so good. Now let's see where the problem actually is. Remove everything else and focus on just these two threads: your main thread, and the background thread created by GCD via the dispatch queue. Imagine you have a single core. That's not the case these days — we have very advanced quad-core, hexa-core, or octa-core processors with many cores for executing many tasks in parallel — but just for simplicity, imagine the device running your application has one single core. On that single core you need to execute the tasks dispatched on these two threads: the main thread and the GCD background thread. So which one do we execute?

Your answer would be that the time slicing and context switching from the first video come into play: preemption happens, the task on the main thread executes, then the context switches and the task on the GCD thread executes, preemption happens again, the context switches back — and that's totally fine, because that is what actually happens. But gone are the days when it was that simple. To understand the problem better, add some complexity and think of real multitasking: yours is not the only application on the device. While your app is deciding between its two threads, the user is also listening to music, browsing in Safari, watching YouTube, or doing anything else the device allows. Now think of the processor: it is taking care of the tasks of many processes, context switching across the threads of all of them, doing an enormous amount of computation at the same point in time — so much that the processor heats up, and eventually the other components of the device are affected.

To counter this we used to have a fan placed over the processor. If you remember desktop CPUs, there was a fan installed on the motherboard right over the processor, and older laptops had one too; its job was to dissipate the heat and keep the processor cool. But today's iPhones, iPads, and MacBooks are so slim that this kind of active cooling is largely gone — yet we still run just as many applications and do just as much heavy processing. So how is it managed? That is where quality of service comes into play.

Quality of service tells the system how it should utilise the resources required for your job to be executed. What does "utilise the resources" mean? For any job to execute, the system has to settle a number of factors: which thread it should run on, whether access to the I/O stream is needed, timer coalescing, the scheduling priority — how long your job should run on that thread before preemption happens — and several other parameters that must be set to particular values for the processor to work optimally. Because we cannot set each of these factors ourselves for every job we dispatch, GCD gives us one single abstract parameter, quality of service, that decides the values of all of them. We can sit back and relax: all we need to do is specify which quality of service we want for a particular job, and everything else is taken care of by GCD.

Essentially there are four quality-of-service classes that should be used when dispatching tasks onto the global concurrent queues. The first is userInteractive, which should be used for animations and other directly user-facing work. That doesn't mean we stop using the main queue — the main queue is the recommended way of dealing with UIKit and user interaction — so why was a userInteractive quality of service provided at all? That is a genuinely debatable point: go through the Apple forums or the Stack Overflow discussions and you'll find many angles and different reasoning behind using it. We'll discuss that in the interview questions-and-answers video; for now, let's settle on this: userInteractive is for animations and other user-facing jobs. Next comes userInitiated, which should be used where immediate results are required — for example, you're scrolling a table view and data is needed for the next set of cells coming into the viewport. Whenever data is required that will immediately affect UI rendering, and the work was initiated by the user, userInitiated should be used.
Utility is used for long-running tasks. Think of downloading something where the user is aware of the progress: it takes a while, but it isn't so high-priority that it affects the user experience. And background is for work that is not visible to the user at all — creating backups, for example, or restoring something from your servers.

If you still have doubts about which class to use when, ask yourself a few questions and decide based on the answers. Is updating the UI involved? If yes, go for userInteractive. Is the data required for a seamless user experience? If yes, go for userInitiated. Is the user aware of the progress — like a download, which doesn't affect the experience but whose progress is visible? If yes, go for utility. And is the user even aware that the task is happening at all? If no, go for background.

Now for priority — how the system allocates resources to each of these classes. It decreases in this order: userInteractive has the highest priority, then userInitiated, then utility, then background. Apart from these four there are two other values: default and unspecified. Default falls between userInitiated and utility, while unspecified represents the absence of QoS information and, priority-wise, is treated as the lowest.

Now let's jump to Xcode and see all of this in action. Here's a code snippet — try to predict the output. Let me walk you through it: we are dispatching two jobs onto the system's global concurrent queues, each a loop printing numbers. For the first we specify the quality of service as background and dispatch asynchronously; for the second the quality of service is userInteractive, and it too is dispatched asynchronously. If you want, pause the video for a while, give it a thought, and try to predict the output.
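Here is a sketch of that snippet as I understand it from the narration — the exact ranges and print formatting are assumptions:

```swift
import Foundation

DispatchQueue.global(qos: .background).async {
    for i in 11...21 {
        print("background:", i)
    }
}

DispatchQueue.global(qos: .userInteractive).async {
    for i in 0...10 {
        print("userInteractive:", i)
    }
}

// The interleaving is different on every run, but the .userInteractive loop
// tends to finish before the .background one, because it is given more
// scheduling priority and resources.
```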
Let's work it out. For the first block we use a global queue with QoS background — a concurrent queue — dispatched asynchronously, and remember what we discussed in the first video: async means do not stop the current execution, move to the next block of code. So this block will be dispatched to run on some other thread at some later point, and control moves on. The second block also goes to a concurrent queue, with quality of service userInteractive, again asynchronously. By this we can already say that we cannot predict the exact output: the two pieces of code will be executed asynchronously on two different threads we know nothing about, so the interleaving is unpredictable.

But let's try something else: can we predict which one will finish first? Both tasks take almost the same time — one prints the numbers in the range 11 to 21, the other 0 to 10 — so they are essentially the same job with the same count of numbers. From the discussion we just had, userInteractive has the highest priority, which means that even though both tasks execute asynchronously on two different threads, there is a strong chance that the userInteractive job will finish first, because more resources will be allocated to it. Let's give it a run. And indeed, 10 — the last number of the userInteractive range — got printed before 21. The numbers are interleaved, they're not in order: after 0 we have 11, then 12 — yet 10 definitely comes before 21. So the task dispatched with userInteractive finished before the one dispatched with background. Don't take me wrong, though: it depends on the complexity of the task. If one job is downloading an image and the other a movie, it's a different matter altogether; we can compare here only because both tasks need roughly the same resources. Let's run it a couple more times to check whether our assumption holds: every time the output interleaves differently, but 10 always comes before 21. This shows that when we use userInteractive as the quality of service, the work gets more resources — more priority, essentially. So that was quality of service.

We'll see more code snippets about serial and concurrent queues and synchronous and asynchronous execution, but before that, let's look at custom, user-created queues: how they are created and what parameters the initializer takes. A custom queue is a DispatchQueue, and its initializer parameters are label, qos, attributes, autoreleaseFrequency, and target. The label is the name of the queue, an identifier provided at initialization that can be used later for debugging: when a crash happens and you want to check which queue was in action at that point, the label gives you the queue's name so you can debug accordingly. qos we have just covered in detail. Now let's look at attributes.
The attributes parameter can take three values: concurrent, initiallyInactive, or both. What do they mean? concurrent is used for creating a concurrent queue — by default custom queues are serial, so if you want a concurrent queue, you specify the attribute as concurrent and the queue you get back is concurrent. initiallyInactive is used when you don't want the tasks dispatched to that queue to start executing immediately; later, when you do want execution of those tasks to start, you call activate() on the dispatch queue object and the execution begins. And you can use both: if you want a concurrent queue that is initially inactive, pass both values in an array and you get a concurrent queue which is initially inactive. So that's attributes — quite easy to understand.
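A quick sketch of those attribute combinations (the labels are illustrative):

```swift
import Foundation

// No attributes: a serial queue, which is the default.
let serialQueue = DispatchQueue(label: "com.example.serial")

// A concurrent queue.
let concurrentQueue = DispatchQueue(label: "com.example.concurrent",
                                    attributes: .concurrent)

// A concurrent queue that is initially inactive: work can be dispatched to it,
// but nothing runs until activate() is called.
let inactiveQueue = DispatchQueue(label: "com.example.inactive",
                                  attributes: [.concurrent, .initiallyInactive])

inactiveQueue.async {
    print("this runs only after activate()")
}

// Later, when we actually want the queued work to start:
inactiveQueue.activate()
```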
The next thing is the target queue. (You'll have noticed there's one more parameter, autoreleaseFrequency — we'll come back to it after target queues, because understanding it requires the concept of a target queue.) Target queues are a slightly advanced topic, I feel: whenever I've asked candidates about them in interviews, they've struggled a bit, so let's try to understand this in detail.

The target queue is the queue on which the actual execution happens. When you create a custom queue, you might think that whatever job you dispatch to it will be executed on that queue, but behind the scenes it is actually executed on some other queue — the target queue. The system has a fixed set of queues, and all the jobs you dispatch, whether on the global concurrent queues or on your custom-created queues, are executed on that fixed set of target queues. You can definitely change the target queue of your custom queues if you want to change their behaviour — that provision is always there — but the essential point to understand is that whatever you dispatch on your custom queues is actually executed somewhere else, and that somewhere is the target queue.

Then what about priority, given that we've spent the last ten to fifteen minutes discussing quality of service? Any queue that you create inherits its priority from the queue on which the actual execution takes place — its target queue. And what if we do not specify any priority at all? In that case the default-priority global queue is used. Remember that besides the four QoS classes there were two more, default and unspecified; the default-priority global queue effectively provides the priority of your custom queue when you don't specify any QoS.

You might be wondering why the concept of a target queue exists at all. The thing is, if the system gave developers the freedom and flexibility to create as many truly independent queues as they wanted, they would sooner or later mess up the system's resources. To keep that under control, there is a fixed number of queues on which work is actually executed, while you can still create custom queues for your own flexibility, to manage things your way.

So where can we actually use target queues? The honest answer is that you learn their uses with experience, but here is one easy use case. Say we have two queues, A and B, and assume both are serial: on A, task 1 executes and then task 2; on B, task 3 executes and then task 4. Each queue is serial independently — task 3 can run after task 1, or at the same time as task 1, because it is on a different queue — while 3 and 4 are serial with respect to each other, as are 1 and 2. But what if we want all four tasks to execute serially, i.e. we want to make queues A and B serial with respect to each other? We can introduce a queue T, a serial queue, and set it as the target queue of both A and B. Whatever tasks are dispatched on A or B will ultimately be executed on T, and because T is a serial queue, it makes A and B serial to each other, so the tasks execute 1, 2, 3, and then 4 — see the sketch below. Another use case is changing the priority of tasks: if you have a custom queue and want the jobs dispatched to it to execute at a lower priority, you can set its target to a background or utility queue, or if you want them at a higher priority, target a user-initiated queue. Changing priority via the target queue is another of its use cases.
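A minimal sketch of that shared-target idea (labels are illustrative):

```swift
import Foundation

let t = DispatchQueue(label: "com.example.t")               // serial target
let a = DispatchQueue(label: "com.example.a", target: t)    // serial, targets t
let b = DispatchQueue(label: "com.example.b", target: t)    // serial, targets t

a.async { print("task 1") }
a.async { print("task 2") }
b.async { print("task 3") }
b.async { print("task 4") }

// Everything dispatched on a and b ultimately executes on the serial queue t,
// so the tasks run strictly one at a time — here, 1, 2, 3, 4.
```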
For a better understanding, let's jump to Xcode and see target queues in action. Here's the code snippet; let me walk you through it first, and then you can try to predict the output. We have a queue labelled "a"; nothing is specified explicitly, so by default it is a serial queue — unless we specify the attribute as concurrent, a queue is serial. Then we have a queue labelled "b", which we explicitly make concurrent, and whose target we set to queue a. Any dispatch queue can target any other dispatch queue — just make sure you don't create a cyclic dependency. So the concurrent queue b targets the serial queue a. On queue a we asynchronously dispatch a block that prints the numbers 0 to 5, and another asynchronous block that prints 6 to 10. Two more blocks are dispatched asynchronously to queue b — which, note, we explicitly declared concurrent — printing 11 to 15 and 16 to 20. You can pause the video for a while and try to predict the output.

Let's work out the easy part first. Queue a is serial, and both of its blocks were dispatched asynchronously; because a is serial, one block runs at a time, so it prints 0 1 2 3 4 5, then the next block in the queue, 6 7 8 9 10. Then the two blocks on b: b is a concurrent queue, so it appears they should execute in parallel, and asynchronously of course. But the point to notice is that b's target was set to a, and as we discussed, the behaviour of the target queue is inherited — whatever is dispatched to b is actually executed on a, which is a serial queue — so b will also act as a serial queue. These two blocks therefore also execute one at a time, and the output for them is 11 to 15 and then 16 to 20. Give it a run, and the output is exactly that: 0 to 5, then 6 to 10, 11 to 15, and 16 to 20.
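Since the snippet itself isn't in the captions, here is a reconstruction of roughly what was shown, based on the narration:

```swift
import Foundation

let a = DispatchQueue(label: "a")                        // serial by default
let b = DispatchQueue(label: "b",
                      attributes: .concurrent,
                      target: a)                         // concurrent, but targets a

a.async { for i in 0...5   { print(i) } }
a.async { for i in 6...10  { print(i) } }
b.async { for i in 11...15 { print(i) } }
b.async { for i in 16...20 { print(i) } }

// Although b is declared concurrent, its work actually executes on its target a,
// which is serial, so all four blocks run one after another:
// 0...5, then 6...10, then 11...15, then 16...20.
```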
So all four blocks executed in a serial manner, even though b was a concurrent queue, because its target was set to a, which is serial. That is how we can use target queues.

One thing to remember about target queues: if you do not set the target at the time of initialization, you cannot simply change it later. There is a setTarget(queue:) method for changing the target, but calling it on an already activated queue results in a crash. Let's try it: create b without specifying a target, and after initialization call b.setTarget(queue: a). If we run this, we get a runtime error, because we cannot change the target of an already activated queue. If you don't want to set the target at initialization but do want to change it later, the queue must not be activated yet — for that we can use the initiallyInactive attribute. Let's do that: alongside concurrent, add initiallyInactive. Because the queue is not activated yet, we can change the target, and after that we call b.activate() — the queue becomes active only once that line executes, and we changed the target before it. To see it working, let's also dispatch something on b, say printing "testing activation", and run it: the runtime error we encountered earlier doesn't happen this time, because we changed the target while the queue was not yet active. This is something to keep in mind while changing targets or thinking about target queues.

Now the last remaining parameter is autoreleaseFrequency. It is a constant that tells the frequency with which the dispatch queue autoreleases objects. I'm assuming you know about ARC and that it takes care of memory management, the autorelease of objects and resources — autoreleaseFrequency is related to that. It describes how the autorelease pool is managed by the dispatch queue, and essentially we get three options: inherit, workItem, and never. With inherit, the behaviour is inherited from the target queue — which is exactly why we understood target queues first: whatever autorelease behaviour the target queue has is inherited. With workItem, an individual autorelease pool is set up for each work item on that dispatch queue — pushed when execution starts and popped when it completes, in very layman's terms. And with never, a separate autorelease pool is never set up for that dispatch queue. I'll focus more on this when we discuss memory management and ARC, but for now it's good to understand that there are three options for autoreleaseFrequency, which controls how the dispatch queue autoreleases the resources used by the jobs dispatched on it.
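A small sketch of that activation behaviour, with the autoreleaseFrequency parameter thrown in (labels and strings are illustrative):

```swift
import Foundation

let a = DispatchQueue(label: "a")

// Created inactive, so its target may still be changed after initialization.
// Calling setTarget(queue:) on an already *active* queue crashes at runtime.
let b = DispatchQueue(label: "b",
                      attributes: [.concurrent, .initiallyInactive],
                      autoreleaseFrequency: .workItem)   // its own pool per work item

b.setTarget(queue: a)      // allowed: b has not been activated yet

b.async {
    print("testing activation")
}

b.activate()               // only now does the queued work start running
```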
Before summing up this video, let's jump back to Xcode and work through some other code snippets that give us insight into the behaviour of dispatch queues. This is going to be very interesting. I have some saved snippets here, taken from the articles by Aaina Jain — if you want to follow those articles, I'll mention the link in the description; do read them, she has covered concurrency in a very detailed and very good manner. Since she has already covered these concepts with these examples, I'm not reinventing the wheel by writing the same code for the same thing; let's just understand her examples.

Let's go with the serial queue used asynchronously first. Let's see what's happening in the code and then try to predict the output. There's a method that dispatches work asynchronously on a serial queue, and above it the serial queue itself, declared with a reverse-domain label like "com.queue.serial" — using reverse-domain labels is good practice. Then the method is called. Inside it, a loop iterates from 1 to 3, and on each iteration we dispatch a block asynchronously on the serial queue. Again, when we dispatch something asynchronously it does not block the current execution, so control moves to the next piece of code: another block dispatched asynchronously on the same serial queue (these two will still execute serially relative to each other), and finally the last line, which prints "last line in playground".

Hit the play button and check the output: first we get "task running in other thread". Look inside the dispatched block: there's a check on the thread of execution — if it is the main thread, "task running in main thread" is printed, otherwise "task running in other thread". This is our custom queue, and we saw that neither global nor custom queues use the main thread, which is why "task running in other thread" was printed. (There's a related concept we'll see in a minute, but for now let's stick to our basics: the main thread is not used by custom-created queues or global concurrent queues.) The block then downloads an image, which takes some time, and afterwards prints a counter: "1 finished downloading". Meanwhile, because the block was dispatched asynchronously and runs on a different thread, control has already proceeded, so "last line in playground" got printed. Then "1 finished downloading", then "task running in other thread" again for the second iteration, "2 finished downloading", "task running in other thread", "3 finished downloading" — the block ran three times because of the loop. And because this is a serial queue, the second dispatched block — the one printing 0 1 2 3 with the star sign — executed only after the loop's blocks completed. This justifies the output: whatever we've understood about dispatch queues lets us predict it and explain the reason for each print statement. That was the example of something being executed on a serial queue in an asynchronous manner.
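A rough reconstruction of that "serial queue, async" snippet from the narration — the method and label names are guesses, and the image download is replaced by a sleep so the sketch stays self-contained:

```swift
import Foundation

let serialQueue = DispatchQueue(label: "com.queue.serial")

func doAsyncTaskInSerialQueue() {
    for counter in 1...3 {
        serialQueue.async {
            print(Thread.isMainThread ? "task running in main thread"
                                      : "task running in other thread")
            Thread.sleep(forTimeInterval: 1)      // stand-in for downloading an image
            print("\(counter) finished downloading")
        }
    }
}

doAsyncTaskInSerialQueue()

serialQueue.async {
    for i in 0...3 { print("*", i) }
}

print("last line in playground")
```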
Now let's move to the next snippet: the serial queue used synchronously. It's almost the same as the previous one, but the difference is that here the loop's block is dispatched with sync, while async is still used for the other block. This changes a lot of things — I suggest you pause the video, think about the output, and then check what actually happens. Hit the play button, and the very first line of the output, "task running in main thread", challenges the entire understanding we have developed so far. We discussed that the main thread is used by the main queue only, and that the other queues use some other background thread; this is a custom dispatch queue, so it should have used a background thread, the isMainThread condition should not have passed, and "task running in main thread" should never have been printed. So what went wrong — what happened differently?

The justification is this: when we dispatch some block of work synchronously to a dispatch queue, and that sync call blocks the main thread — whatever was executing on the main thread cannot continue because of the sync — then the system may or may not use the main thread to execute the task you just dispatched. Let's go through it slowly. Whenever you use a sync dispatch with GCD and it hinders the execution of the main thread, the main thread sits idle until your block completes; the system reasons that, since the main thread is idle anyway because of the sync, it might as well execute this block of code on the main thread itself. In those cases the system tries to utilise the main thread. This seems to contradict all the definitions and standard descriptions of these APIs provided by Apple, but the important words are "may or may not". Because this is being executed in a playground where nothing else is going on — no other threads created, no background processes working — the system finds the main thread blocked and idle and dispatches the job onto it; in an actual project this can turn out differently, but in this scenario that is exactly why "task running in main thread" appears in the output.
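The sync variant, again as a sketch under the same assumptions as the previous one — only the dispatch inside the loop changes from async to sync:

```swift
import Foundation

let serialQueue = DispatchQueue(label: "com.queue.serial")

func doSyncTaskInSerialQueue() {
    for counter in 1...3 {
        serialQueue.sync {                        // sync instead of async
            // With the main thread blocked by the sync call, the system may
            // run this block on the main thread itself, so this can print
            // "task running in main thread".
            print(Thread.isMainThread ? "task running in main thread"
                                      : "task running in other thread")
            Thread.sleep(forTimeInterval: 1)      // stand-in for the image download
            print("\(counter) finished downloading")
        }
    }
}

doSyncTaskInSerialQueue()

serialQueue.async {
    for i in 0...3 { print("*", i) }
}

print("last line in playground")
```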
The rest of the output is pretty easy to understand: "1 finished downloading" when the image download completes, "task running in main thread" again — we just went through the justification for that — then "2 finished downloading", and so on. The other block was dispatched asynchronously, so "last line in playground" got printed, and then 0 1 2 3 as expected. That was the serial queue with a synchronous dispatch.

The third snippet is the concurrent queue used asynchronously. It's the same code, the only difference being that we have introduced the attribute and explicitly made the queue concurrent, with the block dispatched asynchronously. Looking at the output: because the queue is concurrent, multiple tasks execute at the same point in time, which justifies "task running in other thread" — and because the block was dispatched asynchronously, it did not block the execution happening on the main thread. The main thread is not idle, it is working on something else, and hence "task running in other thread" is printed, and printed again for the other iterations, as the previous justification predicts. "last line in playground" appears too, since the other block was also dispatched asynchronously on the concurrent queue, so it started after the first one; but on a concurrent queue with asynchronous dispatch we cannot guarantee when each task will finish, which is why 0 1 2 3 got printed where it did and the downloads finished in the order 2, 3, 1. That justifies this output.

And now the last piece of code: the concurrent queue used synchronously. The same story again, the only difference being that this block is dispatched to the concurrent queue synchronously — and you're thinking it right: "task running in main thread". This is what I meant when I said you're thinking it right. We were in a playground, we used a synchronous dispatch on a queue, that blocked the execution of the main thread, the main thread was idle, and the system thought of dispatching this particular block onto the main thread, hence "task running in main thread". The rest of the output is easy to understand: "1 finished downloading", "task running in main thread", and so on; the other block was dispatched asynchronously, so control jumped ahead and "last line in playground" got printed before it, followed by 0 1 2 3.

This was a very interesting concept to understand: at times it may appear that your job will be executed on some background thread, while it can actually end up running on the main thread. It is subject to many factors and conditions — it may or may not happen, the system takes care of that — but it is important to be aware of it.

That's pretty much it for the second video in the Mastering Concurrency in iOS series. In the next video we'll try to understand dispatch in even more detail: DispatchWorkItem, dispatch semaphores, dispatch groups, barriers, locks, and those kinds of things. The next video will be available very soon, and if you like my content, if you like my videos, you can consider subscribing to the channel. Till then, take good care of yourself, happy coding, and stay safe.
Info
Channel: iCode
Views: 23,149
Keywords: concurrency, ios, swift, gcd, grand central dispatch, icode, pallav, mobile, android, multithreading, parallelism, threads, data race, target queue, main thread, main queue, qos, quality of service
Id: yH0RBTdNi3U
Length: 45min 47sec (2747 seconds)
Published: Sat Jul 23 2022