Asynchronous Non-Blocking Microservices Tutorial in Spring Boot with Java Code Example for Beginners

Captions
Hello, good morning friends, welcome back to your favourite channel, Code One Digest. Today in this video I'll explain what non-blocking, asynchronous microservices are. Yes friends, you would have heard this term; it is very trending these days, and everybody wants their microservices to be non-blocking and asynchronous. What does that mean, and how can you create non-blocking asynchronous microservices using Spring Boot? There are frameworks like Node.js where you can build non-blocking asynchronous microservices very easily, but people who are comfortable and familiar with Spring Boot want to know how to achieve similar non-blocking, asynchronous behaviour using the Spring Boot framework. As we all know, most of the microservices we build in Spring Boot use REST controllers, so we would like to know how to make those microservices asynchronous and non-blocking in Spring Boot. Today I am going to show you that, so stay till the end of this video; it is going to be very informative and exciting, and there is a lot to learn, friends.

So friends, here is the agenda of this video. I know it is a little long, but we will cover it quickly, and by the end of this video you will be able to build asynchronous, non-blocking microservices using the Spring Boot framework. First I will explain what asynchronous and non-blocking mean, so that we have a common understanding. Then I'll explain what traditional blocking microservices are and what problems we have with them, followed by the difference between blocking and non-blocking microservices and the challenges with blocking microservices. After that we will understand non-blocking microservices and how to build them, then the async annotations in the Spring Boot framework. I'll give you a Java code implementation of a non-blocking API using Spring Boot asynchronous annotations, then we'll look at the use cases and benefits of non-blocking microservices, and finally I'll summarize what we learned in this video.

Before we proceed, I request you to subscribe to this channel to grow the Code One Digest family. Friends, I am creating a lot of quality videos on programming, coding concepts, design patterns and design principles, cloud and container technologies, but I'm not getting subscribers, so please like, share and subscribe. Thank you.

Okay friends, before we proceed, let's understand what sequential and parallel processing are and why parallel processing is more efficient and faster than sequential processing. In computer engineering you would have heard about parallel processing, and we all know that parallel processing is faster than sequential processing. In traditional sequential processing, the program starts, then task one comes; task one is given to the CPU for processing, the CPU processes it and produces a result, then it moves on to the second task, processes it and produces a result. This is sequential processing.
Until task one is done, task two cannot be started or completed, so this flow is sequential and dependent. Now assume you have hundreds or thousands of tasks: it will take a long time, because if one task takes longer, all the other tasks get stuck, and they are processed strictly in sequential order. As of today there has been a revolution in hardware as well: we get quad-core and octa-core processors in our laptops and mobiles, so multiple processors are available. But a sequential flow always executes on a single processor, which means we are not using our hardware and resources efficiently; the other processors sit idle because we have not used parallel processing or multi-threading in our application. Hence we should code our applications using multiple threads and parallel-processing concepts so that we can engage all the resources available. If I start five threads, those five threads will try to acquire different CPUs, or they will share a single CPU in a round-robin fashion. If more resources are available, more resources are given to the threads; if the processors are occupied by some other application, the same processor is shared by multiple threads in round-robin fashion. Ultimately, in parallel processing, task one, task two and task three all execute in parallel and give us results, so parallel processing is faster than sequential processing. There are scenarios where we have to use sequential processing, for example when the output of task one becomes the input of task two; in those cases we have no choice. But wherever the tasks are independent and can be processed independently, we should definitely make our program efficient and use parallel processing and multi-threading to get the job done quickly. Are you with me, friends? We are going to use parallel processing with multiple threads.

Friends, just for your information, I have also created videos on Java multi-threading and Spring Boot multi-threading in the past. If you want to learn how to achieve multi-threading in Java and Spring Boot — how to create multiple threads and get your job done through them — go and watch those videos; the links are on your screen. I am going to use those concepts in this video as well, so they are very useful for building a basic understanding of multi-threading.
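To make the sequential-versus-parallel idea concrete, here is a minimal plain-Java sketch. It is not taken from the video's project; the three tasks and the pool size are invented for illustration. It runs the same independent tasks first one after another and then on an ExecutorService:

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelVsSequentialDemo {

    // A made-up task that simulates slow work such as a database or I/O call.
    static String slowTask(int id) throws InterruptedException {
        Thread.sleep(1000);   // pretend the work takes one second
        return "task " + id + " done on " + Thread.currentThread().getName();
    }

    public static void main(String[] args) throws Exception {
        // Sequential: total time is roughly the sum of the task times (~3 s here).
        long start = System.currentTimeMillis();
        for (int i = 1; i <= 3; i++) {
            System.out.println(slowTask(i));
        }
        System.out.println("Sequential took " + (System.currentTimeMillis() - start) + " ms");

        // Parallel: the same three independent tasks run on a small thread pool (~1 s here).
        ExecutorService pool = Executors.newFixedThreadPool(3);
        List<Callable<String>> tasks = Arrays.asList(
                () -> slowTask(1),
                () -> slowTask(2),
                () -> slowTask(3));
        start = System.currentTimeMillis();
        for (Future<String> result : pool.invokeAll(tasks)) {
            System.out.println(result.get());   // all three finished roughly in parallel
        }
        System.out.println("Parallel took " + (System.currentTimeMillis() - start) + " ms");
        pool.shutdown();
    }
}
```

On a multi-core machine the parallel half finishes in roughly the time of the single slowest task instead of the sum of all of them.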
Okay friends, now let's understand what asynchronous processing is and how it differs from synchronous processing. Say we have process A and process B. In synchronous processing, when process A calls process B for some input or result, the calling thread in process A waits for the response to come. Process B may be a database operation, an I/O operation or a network call, and the process A thread keeps waiting; only once that job is done can thread A take up some other task. That means we are wasting the thread's time for as long as it takes to get the response from process B.

In asynchronous processing, process A requests data from process B and doesn't wait for it to complete; it continues working on other tasks, and when the response comes back it accepts the response and sends it back to the user or performs further operations. We can achieve this today using callback features — there are multiple ways, but the most common approach in asynchronous programming is the callback: I issue the request, continue with my work, and when the response is ready I come back, take the response and process it. So that is the difference between synchronous and asynchronous processing: in asynchronous processing the thread is not waiting for the response; it can continue working on other tasks. Obviously the asynchronous approach is much more efficient, and the trending languages and frameworks such as Node.js and the JavaScript frameworks are all based on asynchronous programming. In this demo we are going to use both approaches — parallel processing via multi-threading and asynchronous programming — to build non-blocking microservices, and I'll show you how to do that. Are you with me, friends? If you have doubts, please put your query in the comment section and I'll try to reply. Before proceeding with the video I want to make sure you have understood the concepts of parallel processing and asynchronous processing.
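Here is a minimal plain-Java sketch of the callback style just described, using CompletableFuture; this particular class is my choice for illustration and is not necessarily what the video's project uses:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

public class AsyncCallbackDemo {

    // Simulates "process B": a slow database, network or file-I/O call.
    static String fetchFromProcessB() {
        try {
            TimeUnit.SECONDS.sleep(2);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "data from process B";
    }

    public static void main(String[] args) {
        // Process A fires the request asynchronously and registers a callback for the result.
        CompletableFuture<Void> pending = CompletableFuture
                .supplyAsync(AsyncCallbackDemo::fetchFromProcessB)          // runs on a worker thread
                .thenAccept(result -> System.out.println("Callback received: " + result));

        // Process A is not blocked; it keeps working on other tasks in the meantime.
        System.out.println("Process A continues with other work on " + Thread.currentThread().getName());

        pending.join();   // demo only: keep the JVM alive until the callback has run
    }
}
```

The thenAccept callback runs when the response is ready, while the calling thread has long since moved on to other work.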
Okay friends, now let's understand what happens in traditional blocking microservices — the microservices and web applications that we usually create. A request comes from the user and reaches a servlet thread; then the servlet, or controller, makes a call to a database to get the data. It fires a query on the database and tries to get the result set. All this time your servlet thread is blocked: it is not doing anything, just waiting for the response to come from the database, and the query may take a long time, so the servlet thread waits all that time. This is a very inefficient way of programming and coding. Imagine you are getting lakhs of requests every minute — what will happen? All those requests will pile up, your application will slow down, and the customer experience will be really bad; users will say your site or your APIs are not responding, because all the threads are sitting in a waiting state to get data from the database. Too many requests are coming in and the processing takes a long time because of the underlying database, I/O operation, network call or calls to other services. Hence we have to use non-blocking APIs, so that we do not block the servlet or controller thread — the very first thread that accepts the customer request should not be blocked. It should keep accepting requests from customers no matter how many are coming; that is how you make your application responsive and fast. A user should not feel that their request is taking a long time or is blocked. Obviously, in any application we always have a limited number of threads, so if the number of requests exceeds the number of threads, those requests will have to wait. We have to make sure the servlet thread is not waiting for data from the database and keeps serving customer requests. How can we do that? We are going to solve this problem using the non-blocking API concept, so stay with me; it is going to be very informative and interesting, friends.

Okay friends, now let's look again at the problem with traditional blocking microservices when a high number of requests comes in. Any web server — be it Tomcat, JBoss or any other server we use — supports a maximum number of threads; for Tomcat the default is 200. We can configure and increase it, but 200 is the default number of threads Tomcat can have at any point in time. As soon as we start the Tomcat server, it has 200 threads available in a pool to accept client requests concurrently. Now assume your site is a super-duper hit and you are getting lakhs of requests every minute — more than 200 requests every second. That means all your threads are occupied serving client requests and any additional client requests are left waiting. Thread 1 is occupied by user 1's request, thread 2 by user 2's request, thread 3 by user 3's request, because all these requests arrive at the same time, so 200 threads are blocked by 200 users. Then more requests arrive — say user 205 — and that request has to wait for a thread to become free. All the threads are busy, because each thread carries its request all the way into the business layer: the same thread T1 goes into the business layer, calls other microservices, calls the database or does file I/O operations, and that takes time. Until those operations finish and the data is returned to the user, thread 1 is completely blocked, busy getting data from the underlying system, be it other microservices, the repository layer or file I/O. Thread 1 is getting data from another microservice, thread 2 is blocked getting data from the database, thread 3 is blocked doing I/O. With more requests arriving at the same time, this architecture is not going to work; it will slow down your application and hurt the user experience, so you have to seriously rethink your architecture.
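As a side note on that 200-thread figure: it is Tomcat's default request-thread limit in Spring Boot, and it can be tuned in application.properties. The exact property name depends on the Spring Boot version; this is general Spring Boot configuration, not something shown in the video:

```properties
# Spring Boot 2.3 and newer
server.tomcat.threads.max=200
server.tomcat.threads.min-spare=10

# Older Spring Boot versions used:
# server.tomcat.max-threads=200
```

Raising this limit only postpones the problem described above; the non-blocking approach frees the request threads instead of simply adding more of them.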
Okay friends, in the previous slide we saw the problem when too many requests come in at the same time: blocking microservices get blocked, and all those requests get stuck. So what do we do differently in non-blocking microservices? We do not block our customer-facing threads — the servlet or controller threads — to perform internal jobs like calling other microservices, firing a query on the database or doing file I/O. As soon as a request reaches the business layer, the business layer frees the customer-facing thread; it does not block thread T1, but gets the job done with the help of a worker thread instead. We keep a worker thread pool of, let's say, a thousand threads; these are worker threads, not our main customer-facing threads. When the request reaches the business layer, we take a thread from that pool and ask the worker thread to do the job — calling other microservices, firing a query on the database, or doing an I/O operation. The worker thread does that work, while the main thread only takes the request from the customer and hands it to the business layer, which assigns the work to a worker thread. That means the customer-facing thread is never blocked: as soon as the request is handed over to a worker thread, the main thread is free to take requests from other users. Meanwhile the worker thread performs the job, and once the data is ready it is sent back to the same user using a callback. So no matter how many requests and users come in, the main thread's job is only to take requests and hand them to worker threads, and we have a thousand worker threads available to do the work. Doesn't that sound interesting? I will implement this architecture and give you a demo of how it works, and we will compare the blocking and the non-blocking endpoints to see how they perform, so stay with me till the end of this video.

Okay friends, let's understand what blocking and non-blocking microservices are and the differences between the two. A blocking API blocks the execution thread until a result is received, as I said, while a non-blocking API doesn't wait for the response to come from the backend layer and continues serving incoming requests. In a blocking service, a single thread serves the request end to end, from the controller to the business layer to the repository layer; in a non-blocking service, multiple worker threads perform the backend task — we have a pool of worker threads and we use them to get the job done. A blocking system becomes unresponsive if a few hundred requests arrive at the same time; a non-blocking system remains responsive even if thousands of requests arrive at the same time, because the main threads are always free — the main thread takes the request, gives it to a worker thread, and is immediately free again. Microservices written in Java Spring Boot are blocking by default; Spring WebFlux and Node.js are frameworks for building non-blocking microservices, but here I am showing you how to achieve similar non-blocking behaviour using the traditional Spring Boot framework. Also, friends, I have created videos in the past on Java multi-threading and Spring Boot multi-threading, so if you are interested, go and watch those videos; the links are on your screen. We are going to use those concepts here as well. Okay friends, now let me show you the Java code implementation of non-blocking microservices
using the Spring Boot framework and Spring Boot asynchronous annotations. Friends, I am sharing this code in my GitHub repository; the link is on the screen and also in the description section of this video, so you can download the project, run it locally, play with it and understand how it works.

Before we move to the code, let me explain the few annotations we are going to use in this project. @EnableAsync is a class-level annotation; it tells the Spring framework to enable running methods in separate threads. @Async is a method-level annotation; it tells Spring Boot to run that particular method in a separate thread, and the thread comes from a predefined thread pool configuration. @Configuration is a class-level annotation indicating that the class contains @Bean definition methods; the Spring container processes @Configuration classes first and gets those beans ready to be used by other components in the application. We are going to use these three annotations extensively in this demo, so I just wanted to touch on them first.

Now let me show you the code implementation I have done for the non-blocking microservice. This is a Maven-based Spring Boot project. In pom.xml you can see the Spring version; I am using Java 8, and there is nothing fancy here — just the basic dependencies plus Lombok for logging and out-of-the-box getters and setters. The structure has the usual src/main/java and resources folders. Inside resources there is application.properties, a configuration file where I define the port my microservice runs on, which is 8080, and the location of a test file that the microservice will read from and write to for the I/O operations.

If you open the Java code, I have created two controllers, one blocking and one non-blocking, because I want to compare the performance of the same calls going through a blocking controller and through a non-blocking controller. In the blocking controller I have defined a GET call that fetches a customer by name from the database by calling the customer service, a POST call to save a customer, a third call to read the contents of the file, and a fourth POST call to write into the file — four operations in total. Along the same lines, the non-blocking controller has the same four operations: a GET call to read customer data, a POST call to save customer data, and GET and POST calls to read and write the file respectively. Here the endpoint path is "non-blocking", while in the other controller it is "blocking". The non-blocking controller calls the async service as its business-layer class, whereas the blocking controller calls the customer service business-layer class. Let's look at the business layer.
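The actual controller classes are in the author's GitHub repository linked above. As a rough sketch of the shape they take — the class names, method names, paths and service stubs here are my assumptions, not necessarily what the repository uses — a blocking endpoint returns the result directly, while a non-blocking endpoint returns a CompletableFuture that a worker thread completes later:

```java
import java.util.concurrent.CompletableFuture;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Minimal stand-ins for the domain and business-layer beans described in the walkthrough.
class Customer {
    String name;
    Customer(String name) { this.name = name; }
}

interface CustomerService {                      // blocking business-layer service
    Customer getCustomerByName(String name);
}

interface AsyncCustomerService {                 // @Async business-layer service
    CompletableFuture<Customer> getCustomerByName(String name);
}

@RestController
@RequestMapping("/blocking")
class BlockingCustomerController {

    private final CustomerService customerService;

    BlockingCustomerController(CustomerService customerService) {
        this.customerService = customerService;
    }

    // The servlet thread is held until the database / I/O call finishes.
    @GetMapping("/customer/{name}")
    public Customer getCustomer(@PathVariable String name) {
        return customerService.getCustomerByName(name);
    }
}

@RestController
@RequestMapping("/non-blocking")
class NonBlockingCustomerController {

    private final AsyncCustomerService asyncService;

    NonBlockingCustomerController(AsyncCustomerService asyncService) {
        this.asyncService = asyncService;
    }

    // The servlet thread is released immediately; Spring MVC completes the
    // HTTP response from the worker thread when the future finishes.
    @GetMapping("/customer/{name}")
    public CompletableFuture<Customer> getCustomer(@PathVariable String name) {
        return asyncService.getCustomerByName(name);
    }
}
```

Returning a CompletableFuture (or DeferredResult) from a Spring MVC handler is what lets the servlet container release the request thread and complete the HTTP response later, once the future finishes.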
Now let me show you the configuration, because this is the core of our multi-threading. To achieve multi-threading in this project we have to define a configuration class, which I have named AsyncThreadPoolConfig; you can name it as per your convenience. It is annotated with @Configuration, which means a bean is created out of it, and that bean object will be available in the Spring container, by name, throughout the container's life cycle, so any component that needs it can refer to it by that name. We also add @EnableAsync, which enables asynchronous, multi-threaded method execution in the Spring Boot framework. In the bean method we prepare a thread pool so that our business layer can take threads from this pool to perform tasks: we use a ThreadPoolTaskExecutor and set the core pool size and max pool size, so that a number of threads are ready before the application starts serving requests. Because this is a configuration class, it gets prepared first, and then the controller, business-layer and repository-layer classes are created; once it is ready, a thread can be requested from this pool using the bean name. The core pool size is one thousand and the max pool size is one thousand, which means a thousand threads will be created.

Now let me show you the blocking service first, the customer service: when a request comes in, it uses the repository to get the customer by name, and it has an add-customer method to save a customer into the repository. Along the same lines we have a file service that can read from and write to the file. What is different in the async service? If you notice, on these methods we are using the @Async annotation. That means as soon as a request comes in and this method is invoked, it executes in a separate worker thread: those few lines run on a worker thread. As soon as control reaches this method of the async service, a new thread is pulled from the thread pool I showed you earlier and the method executes on that thread. In the same way, saving a customer executes in a new thread, reading the file executes in a new thread, and writing the file executes in a new thread. I am also printing the thread name so we can see which thread executes each call, because we have a thousand threads. The domain objects are just Customer and the file data, nothing fancy, and then there is the repository, which connects to MongoDB; I have a MongoDB configuration in the configuration file that connects to my local MongoDB.
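Again, the real classes are in the linked repository. The following is a minimal sketch of what such a configuration class and @Async service can look like, assuming the pool sizes quoted in the video and using hypothetical class, bean and property names:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executor;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
import org.springframework.stereotype.Service;

@Configuration
@EnableAsync                                     // enables @Async method execution in the container
class AsyncThreadPoolConfig {

    @Bean(name = "workerThreadPool")
    public Executor workerThreadPool() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(1000);          // pool sizes quoted in the video
        executor.setMaxPoolSize(1000);
        executor.setThreadNamePrefix("worker-");
        executor.initialize();
        return executor;
    }
}

@Service
class AsyncFileService {

    @Value("${test.file.path:/tmp/test.txt}")   // hypothetical property name for the test file
    private String testFilePath;

    // Runs on a "worker-*" thread from the pool above, not on the servlet thread.
    @Async("workerThreadPool")
    public CompletableFuture<String> readFile() {
        System.out.println("Reading file on " + Thread.currentThread().getName());
        try {
            String content = new String(Files.readAllBytes(Paths.get(testFilePath)));
            return CompletableFuture.completedFuture(content);
        } catch (IOException e) {
            CompletableFuture<String> failed = new CompletableFuture<>();
            failed.completeExceptionally(e);
            return failed;
        }
    }
}
```

One caveat: the controller has to return the CompletableFuture itself, as in the earlier controller sketch; calling get() or join() on it inside the controller would block the servlet thread again and throw away the benefit.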
Okay friends, now that we have seen all the classes and understood the pom.xml, the configuration file, the configuration class, the controller classes and the service-layer classes, it's time to run the application and see what happens. As the application builds, it compiles the code and runs the test cases; once it is up and running we will hit the API endpoints — both the blocking and the non-blocking ones — and compare the performance. The application is running now, so let me show you Postman, where I have created requests to test the blocking and non-blocking APIs: a blocking get-customer call, a blocking add-customer POST call, a blocking file-read call and a blocking file-write call, and the same four operations — get customer, post customer, read file and write file — as non-blocking calls.

Let me first call the blocking get-customer API and see what happens. It executed in about 1,237 milliseconds. Now let me hit the non-blocking get-customer API; in the non-blocking case the job is handed over to a worker thread, freeing the main thread, and it executes in about 146 milliseconds, versus almost 1.2 seconds for the blocking call — a huge difference. In the same way, let's save a customer using the blocking API: that takes around 745 milliseconds, while saving a customer through the non-blocking API completes in about 19 milliseconds. Now let me read a file with the blocking API — the same file from the same location — and it takes 538 milliseconds, while the non-blocking file read takes 23 milliseconds. Writing the same content into a file takes around 531 milliseconds with the blocking API and 14 milliseconds with the non-blocking API. So for every blocking call, one of our main threads is blocked to perform the I/O operation, but when the main thread hands the task over to a worker thread, the call returns much faster.

And this is the part where I need one minute of your attention, friends: this non-blocking concept matters when you have a serious volume of requests coming in every second. It is not for four or five requests per second; it is for scenarios where you are getting thousands of requests every second, lakhs or millions of requests every minute. That is when you should go for this approach; otherwise it is not going to give you any real advantage. This architecture is not for 100 or 200 requests a minute — it is for high-traffic situations, so remember that. Are you with me? I hope this is clear.

Okay friends, now let's understand where to use non-blocking microservices. As I said, use them when you have too many requests coming in at the same time — when the application you are creating, or your existing application, is expecting a very large number of requests every minute or every second. You can use this approach during peak season, when you see a rise in requests around a particular festival or holiday period, to serve more user traffic. If your application serves one lakh users every day and you can see there is going to be a surge in your user base, that is when you may want to upgrade your application to use non-blocking APIs. Use this approach if you foresee your request traffic increasing in the coming years as your business grows; in that case you
should think of upgrading your application to use non-blocking microservices rather than blocking microservices.

Okay friends, now that we have seen where and when to use non-blocking microservices, let's understand their advantages. Non-blocking microservices help in serving many requests arriving at the same time. When you have a scenario with a surge of traffic at a particular moment — a very familiar example is booking a train ticket on the IRCTC website, where reservations open at a specific time — consider this implementation, because too many users hit the site at the same moment to perform the operation. Non-blocking microservices keep the system responsive: if you remember, around ten years ago the IRCTC website used to freeze during Tatkal ticket booking, because too many requests were coming in and too many people wanted to book tickets at the same time. This solution is made exactly for that kind of scenario. Non-blocking microservices also help in handling a sudden surge in traffic: when you expect a spike at a specific time on a specific day, you can use this approach.

Okay friends, let me summarize what we learned in this video today. I explained what asynchronous processing and parallel processing are; then I explained what traditional blocking microservices are and the problems with them; we saw the difference between blocking and non-blocking microservices and the challenges with blocking microservices; and we understood the solution using non-blocking microservices. I explained the Spring Boot annotations we use as part of the solution, and I walked you through the Java code implementation of non-blocking APIs using Spring Boot asynchronous annotations — the controller, service, configuration and repository-layer classes — which you can download from the GitHub repository and play with. Then I explained the scenarios in which you should use non-blocking APIs and their advantages.

Friends, if you liked this video, give it a thumbs up and subscribe to the channel. If you are new to the Code One Digest channel, I create a lot of quality videos on the Java framework, the Spring framework, container technologies, design patterns and design principles, so do subscribe to grow the Code One Digest family. Click on the bell icon for the latest video notifications, and do not forget to share this video with your friends and colleagues — this is very useful information for students, beginners and software engineers. I am putting a lot of effort into creating this content, so please help me grow the Code One Digest family by subscribing to the channel for the latest programming and technology videos. Thank you.
Info
Channel: codeonedigest
Views: 8,285
Keywords: spring boot, java, microservices, spring boot tutorial, spring boot tutorial for beginners, spring boot full course, asynchronous api, asynchronous api in spring boot, asynchronous api calls java, asynchronous api vs synchronous api, non blocking api, non blocking api in spring boot, non blocking api calls java, non blocking api java, non blocking api vs synchronous api, api, asynchronous microservice, asynchronous microservice in spring boot, asynchronous microservice java
Id: utMoWx1XcrE
Length: 36min 19sec (2179 seconds)
Published: Mon Apr 10 2023