Three Concepts Every Node.js Developer Should Understand

Captions
Hey everyone, today I want to talk about three concepts that I believe every Node.js developer should have in mind whenever they're writing code they intend to scale well and perform well. For each of these concepts we'll discuss them briefly and then jump into some code to see them in action, so feel free to skip around as you'd like. If you're interested in learning more than this video covers, be sure to check out my NestJS microservices course, where you can learn how to build and deploy production-grade Node.js microservices.

Okay, so I'm going to use NestJS as a backend framework to review these Node.js concepts. Using NestJS isn't mandatory; however, it lets us easily expose HTTP traffic in our application in a nice, standardized way. So let's get started. If you don't have the Nest CLI, you can install the latest version with npm install -g @nestjs/cli. Now that we have the NestJS CLI, we can initialize a new project with nest new followed by a name — you can call it whatever you want; I'll call it nodejs-concepts for this lecture, and I'll use pnpm as my package manager. Go ahead and let it finish installing dependencies.

Now that we've installed these dependencies, we'll simply cd into the nodejs-concepts folder, and you can see it's telling us we can run pnpm run start to start our server. Let's run pnpm run start:dev instead, which starts the development server, watches for changes, and recompiles when we change our code. You can see our Nest app has started successfully, so let's open up Postman and try sending a sample request to this backend server before we review the boilerplate code that makes this work. In Postman I'll launch a GET request at http://localhost:3000, which is the default port for this NestJS server, as we'll soon see, and you should see a Hello World response coming back.
Let's take a look at the code inside this server so we can understand it a little better. This is a default NestJS project; importantly, the src directory is where all of our code lives. We have this main.ts file, which is actually bootstrapping the application, creating our app from the root AppModule, and listening for HTTP traffic on port 3000 — which is why we're able to send it GET requests. In the AppModule we define a controller and a service. The controller is where we define our HTTP routes; in this case we have a GET route, and this is the Hello World response that came back on that GET request. If we look at AppService.getHello, which the controller is calling, we simply return this Hello World string. So this is just the boilerplate NestJS project out of the box. Now that we're able to send HTTP traffic, let's dig deeper into some of these Node.js concepts.

Okay, so the first Node.js concept I want to review is the difference between blocking and non-blocking code. Node.js is single-threaded — it runs JavaScript on a single thread — so if we write code that blocks this one thread, Node.js won't be able to do any other work: it won't be able to respond to HTTP requests or do anything else while that main thread is blocked. The way Node.js is able to scale so well and handle so much I/O on this one thread is through the concept of the event loop. You can think of the event loop as a way for us to specify what happens when an event occurs. In the browser you're familiar with providing an event handler for when a user clicks a button, for example; you can think of the event loop as something very similar. It's essentially a loop that's always running, checking to see if it needs to execute
any of the callbacks we define in our Node.js code. The event loop is the reason we're able to scale so much on a single thread. On a typical request, for example, 95% of the time might be spent waiting for some database I/O to occur — we're fetching data from a database, and most of that time we're simply waiting on the database to respond. The idea with Node.js is that we don't block the main thread while we're waiting; instead, we associate that database call with a callback, and once the network request completes, Node.js executes that callback code. The callback runs only when the database call has completed, so we never block the main thread.

This is actually very similar to how modern CPUs are scheduled: a program only runs on the CPU for a specified amount of time, and as soon as it needs to stop and wait for some I/O, like a network call, it gets taken off the CPU and won't come back on until it's ready to do more work. This asynchronous model is exactly how Node.js operates on a single thread. Other programming languages like C, Java, and so on handle concurrency by using separate threads, and that introduces additional complexity. In Node.js we only have this one thread, and the event loop is constantly running, checking whether it needs to execute any of the callbacks we've associated with asynchronous code that may be waiting on I/O. If you're new to Node.js this can sound like a lot to take in, so I'll leave some links in the description to some great articles that explain how Node.js and the event loop actually work.

Now that we have a brief overview of what the event loop is, let's get back to blocking versus non-blocking code. Blocking code in Node.js is synchronous code that occupies the main Node.js thread. While it runs, the event loop can't run and check any callbacks from our asynchronous code, which means we can't do any other work until that blocking code finishes. Let's take a closer look with some real code so we can see what this looks like.

Back in our application, let's take a real-world look at blocking versus non-blocking code in Node.js and see how we can avoid blocking the main thread, so the event loop can keep running and our code keeps scaling and performing well — because if we block the event loop, we're going to see the consequences very soon. In our AppController, let's first create a new GET route; I'll call it blocking. This is where we'll implement some blocking code to see its effects. We'll have a function called blocking, and it will simply return this.appService.blocking(), so we'll go into the AppService and write a new function to actually implement this blocking code.

It's important to note that JavaScript in Node.js is synchronous by default, and it all happens on that one thread — each line of code executes one after the other. That's going to matter in this blocking function. I want to create a while loop that essentially spins for a few seconds so we can see the effect. To do that, I'll create a variable called now set to new Date().getTime(), the current time in milliseconds, and then a while loop that keeps running while new Date().getTime() is less than now plus the amount of time we want to run the loop for — I'll specify 10,000 milliseconds, which is 10 seconds. So this while loop spins for 10 seconds, showing the effect on the Node.js runtime, and after the loop we simply return an empty object. Let's save our code and head back to Postman to give it a try.

In Postman, launch a GET request at the /blocking route. You can see the request is running; it's going to take up to 10 seconds for that while loop to complete, and then we get a response back — you can see it took 10 seconds. So out of the box nothing looks bad here: we're running some long-running code and we do get a response in the end. However, to see the effects of this blocking code, open a second GET request at localhost:3000 targeting our Hello World route. On its own it responds instantly. Now execute the blocking request and then immediately send the Hello World request: you can see it's hanging — it doesn't return Hello World. As soon as the blocking call completes, the Hello World request completes as well. The reason is that the blocking call is occupying the one Node.js thread; it's not allowing the event loop to continue running and checking for other callbacks — for example, the callback associated with this Hello World request and response. This call completely blocks our Node.js code. Now, something else I want to show you: go to a new terminal and type in the top command.
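As a condensed sketch, here's what the blocking handler described above boils down to, written as plain Node.js rather than the NestJS service itself, and with a much shorter delay so it's easy to run:

```javascript
// Sketch of the blocking logic described above (plain Node.js, not the
// NestJS service). A short delay stands in for the video's 10 seconds.
function blocking(ms) {
  const now = new Date().getTime();
  // Synchronous busy-wait: the single Node.js thread spins here, so the
  // event loop cannot run any other callback until the loop exits.
  while (new Date().getTime() < now + ms) {}
  return {};
}

// Anything already scheduled on the event loop is delayed until we return:
setTimeout(() => console.log('timer fired'), 0);
blocking(200); // spins for ~200 ms
console.log('blocking returned'); // prints before 'timer fired'
```

Even though the timer was due after 0 ms, its callback can only run once the synchronous loop releases the thread — the same reason the Hello World request hangs behind the blocking route.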
If you're using Linux or Mac, top will tell you the CPU usage of your applications. Run the GET request for /blocking again and look at your CPU: you can see our Node app — this is our NestJS app here — is using the entire core assigned to it; it's at 100% CPU while this request is running. That again is a consequence of writing blocking code. While this blocking code runs, it's telling the CPU: I'm executing synchronous code and I need to occupy the CPU for the full 10 seconds my while loop is running. So it gets scheduled on the CPU 100% of the time, as you can see. There's no waiting, no asynchronous work where we tell the CPU "I can wait a little bit — go do something else and come back to me." That's why this code is blocking: it hogs the entire thread Node.js is assigned, and it's what we want to avoid when writing Node.js code.

Let's see how we can write an asynchronous version of this code, so we can tell the CPU: I've done a little bit of work, now you can go do other work while I'm waiting. Essentially, we'll implement an asynchronous version of this sleep. Create a new async function called nonBlocking; we want to do the same thing — wait 10 seconds — but in an asynchronous way, so we free up the CPU and free up the thread for the event loop to keep running and process other requests in the meantime. We'll return a new Promise — and a Promise in Node.js is really just syntactic sugar for callbacks — so we can tell Node.js and the event loop: here's some code you can execute when I'm ready. Into this Promise we pass a function with a resolve parameter, which is how we resolve the promise and tell Node.js we're ready to do something with a value. In this case I'll use setTimeout, which lets us execute some code after an amount of time: after 10 seconds — the same amount of time we wait in our blocking code — we resolve the call and return.

So now we've transformed this blocking sleep into an asynchronous sleep: instead of performing the work synchronously, we return a promise that schedules this code to resolve in 10 seconds. The big win is that we don't block for the full 10 seconds; we free up Node.js to do other work while the promise is pending. Let's prove this out by going back to our AppController and creating a GET route for non-blocking: we'll call it nonBlocking, make it async, and return this.appService.nonBlocking(). Let's give it a try in Postman. Change the route to /non-blocking and call it — you can see it's running, and while it's running we can go back to our other GET request and get a response back while the non-blocking call is still pending. The reason this works is that the non-blocking code isn't occupying CPU time on the thread: because it's asynchronous, we told Node.js "we have this callback to execute whenever the timer finishes; in the meantime you can do other things." We can also look at our CPU, and of course Node.js barely even registers, because we're not doing any CPU-intensive work while this runs — we do a little bit of setup and then immediately free up the CPU to do other things while the promise is pending. I hope this has shown you the difference between blocking and non-blocking code.
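Stripped of the NestJS wrapping, the non-blocking version is a promise-based sleep. Here's a minimal sketch with the delay as a parameter for convenience (the video hardcodes 10,000 ms):

```javascript
// Asynchronous sleep: returns a promise that setTimeout resolves later,
// so the thread stays free while the timer is pending.
function nonBlocking(ms = 10000) {
  return new Promise((resolve) => {
    setTimeout(() => resolve({}), ms);
  });
}

// While the promise is pending, other callbacks still run:
nonBlocking(100).then(() => console.log('non-blocking done'));
setTimeout(() => console.log('other work'), 10); // fires first
```

The 10 ms timer fires while the 100 ms sleep is still pending — the event loop keeps servicing other callbacks, which is exactly what the blocking version prevented.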
Really, you can think of it at a high level like this: synchronous code is blocking; asynchronous code is non-blocking — it allows the CPU to do other things and the event loop to keep running.

Okay, the next concept I want to review is the idea of concurrency in Node.js. As we've already seen, JavaScript execution in Node.js is single-threaded, so we don't have concurrency like we do in other languages such as C or Java, where we have multiple threads — everything happens on a single thread. When we say concurrency, what we're really talking about is the event loop's ability to execute callback functions after completing other work. The way I like to think about it is less about concurrency and more about parallelism — the ability to start multiple tasks at the same time and then let the event loop handle those tasks. To achieve this parallelism, we're going to look at using promises, so let's check it out.

Back in our AppController, let's first create a new function to demonstrate this code running in a non-parallel way. We'll create a new GET route called promises, with an async promises function that returns this.appService.promises().

Now, in the AppService, let's create a new async promises function. In this function I essentially want to loop over an array and do some asynchronous work with it. You can imagine a typical Node.js app where you have an array of values you have to enrich by sending them to some external service — you could loop over that array and send one request at a time, and that's what I want to demonstrate in this function: executing one promise at a time and seeing the effect of that. To do this, first create a results array where we'll keep track of all of our results. Then create a for loop that starts at zero, iterates 10 times, and increments i each time. Now we want to execute some asynchronous code in this loop, so let's create a private async sleep function that emulates some I/O — it's essentially just going to sleep and wait for us. We want to do this asynchronously, of course, so we don't block the event loop, so we simply return a new Promise with a resolve function. I also want to log something at the start of this sleep; to do that, I'll create a new private readonly logger at the top of our service — a new Logger from @nestjs/common. At the beginning we log that we're starting our sleep, and then we run a setTimeout similar to what we've done before: after a second — say our call took a second to reach some database or external service — we log "sleep complete" and resolve the promise with an empty object. So now that we have this dummy I/O, or sleep, we can actually use it in the loop: we push await this.sleep() to the results array, so we await the promise and, after its completion, push the result to the results array.
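In plain Node.js terms, the sequential version looks like the sketch below. The names follow the walkthrough; console.log stands in for the NestJS Logger, and the delay is a parameter (the video uses a full second per sleep):

```javascript
// Sleep helper emulating I/O, as described above.
function sleep(ms) {
  return new Promise((resolve) => {
    console.log('starting sleep');
    setTimeout(() => {
      console.log('sleep complete');
      resolve({});
    }, ms);
  });
}

// Sequential version: each iteration awaits one promise before the next
// starts, so 10 iterations take roughly 10 × ms in total.
async function promises(ms = 1000) {
  const results = [];
  for (let i = 0; i < 10; i++) {
    results.push(await sleep(ms));
  }
  return results;
}

// promises().then((res) => console.log(res.length)); // 10 results, ~10 s
```

Because of the await inside the loop, each sleep's timer only starts after the previous one has resolved — the "start sleep / sleep complete" pairs alternate one at a time in the logs.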
Then we simply return the results back to the caller. The key point is that we're doing this one at a time: each iteration awaits one promise before starting the next, and at the end we return our results array. Let's check out the effect: go back to Postman and execute a GET request for /promises. If we look at the logs in our terminal, you can see the iteration — start sleep, sleep complete, start sleep, sleep complete — and this keeps going for each of our 10 iterations, one at a time: starting, completing, pushing the result. Finally, we return our results array with 10 empty objects after 10 seconds.

That makes sense, but now I want to show you how we can execute these promises all at the same time and wait until they all resolve. Once they've all resolved we can return immediately, making this call a lot faster, because we execute all the promises in parallel — we're telling the event loop to go ahead and start all of them at once. Let's create a new GET route called promises-parallel, where we'll execute multiple promises at once and see the effect of this: it returns this.appService.promisesParallel().

Now let's implement this function. We'll have an async promisesParallel method that's very similar, so let's copy the existing code from promises. The big difference is that now I don't want to await each call one at a time; instead we push the promise itself into the array, which I'll rename to promises. So we're simply pushing the promise returned by the sleep method into this array. Then we can use the Promise.all function, which — as the documentation says — creates a promise that resolves with an array of results when all of the provided promises have resolved, or rejects if any of them reject. Essentially, it waits until every promise in the provided array has settled that way and then returns all of their resolved values, which is exactly what we want: execute all these promises at once and return as soon as they've all finished or one has errored out.

Let's see the effect of running this code. Execute the /promises-parallel route, and you can see it's already completed — it took only a single second. If we look at the logs, you get a really good idea of what's going on: we see the "start sleep" log statement, but all 10 of them execute at once, because we're starting all of these promises together — we're not awaiting each one like we were in the first loop. After we start the sleeps, we wait a second, and then the event loop comes back around, sees it has callbacks to execute, and runs them; they all complete around the same time, which you can see with the "sleep complete" logs. This way we're not waiting for each promise individually; we're grouping them together and letting them all run at once and finish at once.
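The parallel version, sketched the same way, pushes the pending promises and awaits them together with Promise.all (the sleep helper is the same idea as before, shown here without logging):

```javascript
// Same I/O-emulating sleep helper as in the sequential sketch.
function sleep(ms) {
  return new Promise((resolve) => setTimeout(() => resolve({}), ms));
}

// Parallel version: all 10 timers start immediately, so the total time is
// roughly one sleep rather than ten.
async function promisesParallel(ms = 1000) {
  const promises = [];
  for (let i = 0; i < 10; i++) {
    promises.push(sleep(ms)); // no await here: the timer starts right away
  }
  // Resolves with all 10 results once every promise has resolved,
  // or rejects as soon as any one of them rejects.
  return Promise.all(promises);
}
```

The only structural change from the sequential version is moving the await out of the loop — the loop body just collects pending promises, and Promise.all gathers the results.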
Now, technically, this isn't all happening at exactly the same time — it's just happening very close together, since Node.js runs everything in one thread and the event loop is, again, going around in a loop checking when these callbacks are ready. But you can see how much more performant this is: when you have an array of promises — essentially an array of asynchronous work to do — you don't want to process one promise at a time; you can group the promises together and execute them in parallel, which speeds things up a whole lot. It's important to note that these promises do have some memory overhead in our application, so we want to be careful about executing an unbounded array of promises all at once with Promise.all. Make sure you only do this concurrently on a bounded data set that you know won't overload your system. If you need a more controlled way to do this, there are libraries that help, like p-queue, which I'll leave a link to in the description if you're interested in checking it out.

All right, the last concept I want to review is horizontal scaling in Node.js. Node.js, as we know from this lecture, is single-threaded, so we effectively have a single core to work with in one Node.js app. What this means is that if we need to use more than a single core of CPU, we need a way to scale our application horizontally: running multiple instances of the application and load balancing requests between those instances. Of course, Node.js does have ways to use multiple threads under the hood, using worker threads, but that brings complexity along with it, so I want to review the horizontal scaling approach. To get a better look at this, I first want to install a new package that will let us do some load testing: run pnpm install --save-dev autocannon in the project. Autocannon is a CLI load-testing tool we can use to launch a lot of requests at our app.

In a new terminal window I've entered the command pnpm autocannon, which executes autocannon against http://localhost:3000/promises, our promises route. The -c flag sets the number of concurrent connections — in this case 10,000, which is obviously an insane number of requests to launch at our app. But we're not doing a lot of work per request here; if you were doing more CPU-intensive work in your application, you obviously wouldn't need this many requests to reach 100% CPU. In our case we're just going to launch a sheer number of requests at it. The -t flag makes sure we don't stop for any timeouts that occur, and -d is the duration, so we'll run it for 60 seconds. Run the autocannon command, and you can see our logs spitting out the sleep messages; if we go to top, we can see the Node process increasing in CPU usage — up to about 60% and more — and the kernel itself on my system is being maxed out because we're launching so many requests. If we pushed this further we'd run into a hard constraint: we can't push a single Node.js process beyond 100% of one core. So we really want a way to use more than 100% of one CPU in our Node app and scale it out, and to do that we'll use horizontal scaling: running multiple instances of our app and distributing requests to each instance, so each one can use up a core.

To scale our application horizontally we're going to take advantage of Docker and Kubernetes, so make sure you have Docker Desktop on your system and that you have Kubernetes enabled in it. If you don't have Docker Desktop
or Kubernetes, I'll leave a link in the description where you can download Docker Desktop and get a Kubernetes cluster running on it.

To get started, we first need to create a Dockerfile so we can containerize our app and build it into an image we can then deploy to a Kubernetes cluster. At the root of the project, create a new Dockerfile. I've pasted in a basic two-stage Dockerfile that will let us containerize the NestJS app we've already built. I'm not going to go in depth into this Dockerfile, because I've done that in several other videos — I'll leave a link in the description if you'd like to learn more about how it's built. Essentially, all you need to know is that these instructions tell Docker how to build our code so it can run inside a container, which is what we want in Kubernetes; at the end we have the node command that runs the app after all of our code has been compiled.

Now that we can build this image, we want to deploy it somewhere so we can host it and pull it into our Kubernetes cluster. In my browser I've gone to hub.docker.com, Docker Hub, which is a registry where we can host our Docker images. If you want to create your own repository to host your image, you simply click the Create Repository button and call it whatever you'd like; you can see it gives you instructions for tagging and pushing the image. You can do that yourself, or if you'd like, you can use the image I've already created and pushed up here, called nodejs-concepts. Let's see how we tag and push this image. First we have to build the Docker image: run docker build with a tag name — here we call it nodejs-concepts — the -f flag to specify the Dockerfile, and a dot for the build context, which is the root directory we're currently in. You can see it run through the steps to build the image so this code can be run and deployed in a container. Now that the image is built, run docker tag, which essentially renames the image: we give it the name of the repository — in my case my username, mguay22/nodejs-concepts — and a tag; I'll use latest, the default. Now that we've tagged it, copy the full image name and run docker push to push the image to our repository.

We're now ready to deploy this to Kubernetes locally so we can run multiple instances of the app. First, go to the root of the project and make a new directory called k8s, which is where we'll keep all of the Kubernetes code, and cd into it. Now I want to use Helm, which lets us manage our Kubernetes cluster more easily and makes it more portable. If you don't have Helm, I'll leave a link in the description where
you can grab it. Once you do have Helm, run helm create followed by the chart name — this creates a new chart called nodejs-concepts. Back in VS Code, in our k8s folder, we can see it's created this nodejs-concepts folder for us with some default files. Chart.yaml describes the metadata for the chart; values.yaml holds values we can use to override the chart's configuration, which we're not going to use, so we'll just delete its contents; and lastly we'll delete everything inside the templates folder. Again, this isn't an in-depth look at Helm or Kubernetes — if you want to learn more about that, I'll leave another link in the description where you can learn more from one of my other videos. For now we simply want to create the minimum number of templates necessary to run the application and scale it horizontally.

To do this easily, go back to the terminal, cd into the nodejs-concepts chart folder and then into templates. Here I want to use kubectl to generate the YAML we need to deploy our application. Run kubectl create deployment — a Deployment is the manifest describing how to run our containers — named nodejs-concepts, supplying the image, which we know is the Docker image we already tagged: mguay22/nodejs-concepts:latest, or your own repo if you pushed the image yourself. Then specify the port option so a port is exposed at 3000, and finally set --dry-run=client with YAML output. The dry run makes sure we don't actually create the deployment; it simply outputs the YAML, which we pipe to a deployment.yaml file. This creates the deployment.yaml with everything we need, already filled out for us, which is excellent. We can see the containers section with the image we're pulling, the port, and the name, so we know we're exposing our app on port 3000, which is just what we want, and by default there will be one replica — a single instance of this app.

Now that we have the Deployment, we also need to create a Service. A Service lets us expose the Deployment so we can send HTTP traffic to it, properly load balanced across all of the underlying instances. To do this we follow a similar pattern: create a new service of type NodePort. A NodePort service is one that allows external traffic into our Kubernetes cluster by opening a port on the local machine it's running on, which is exactly what we want. We specify TCP with the port we want to expose — in our case 3000 — and the target port on the container, also 3000. We give it the name nodejs-concepts, and again use --dry-run=client with YAML output, piped to a service.yaml file.

Looking at the generated service.yaml, the only thing we need to update is the name, which looks incorrect — let's name it nodejs-concepts. Now we have a service exposed on port 3000, targeting port 3000 in our underlying container, and the key bit here is the selector: we select pods with the app label nodejs-concepts, which corresponds to our deployment — if we scroll down we can see that label applied there, app: nodejs-concepts. So when we send requests to this NodePort service, it will automatically load balance them to any of the instances matching this selector, which is going to be the key to this horizontal scaling.

That's all we need to get started. Back in the terminal, cd to the root k8s folder and run helm install, which is how we actually deploy this Helm chart; call the release nodejs-concepts, and make sure to use a dot to specify the path, which in this case is the current directory. You can see here that it has been deployed. Run kubectl get pods to list the pods — the instances of the app running in our deployment — then copy a pod name and run kubectl logs against it, and we can see the app has started successfully, which is just what we want. Importantly, if we also run kubectl get service, we can see the NodePort service we created, and the key piece here is the node port itself: this node port is how we're actually going to talk to the service. This node port — 30102 in my case — targets port 3000 on our app and is bound to our local machine, so we can launch a GET request at this node port on localhost, execute the simple GET route, and get back Hello World, coming from our Kubernetes cluster, which is excellent.

Now I want to actually scale up our deployment horizontally. After running kubectl get pods, we can run kubectl scale deployment nodejs-concepts --replicas=5, which simply scales our app up to five running pods. Each of these pods is a new Node.js application with a single core available to it, and this is how we actually scale our Node.js app beyond a single core: we've added new instances, and when we send requests to the service it will distribute those requests to the underlying pods, allowing this compute to be scaled horizontally.

Let's go ahead and run a test to verify this. Go back to the terminal where we ran autocannon previously — press up until you find the command, or use Ctrl+R to search for autocannon. I simply want to change the port we're targeting: instead of localhost:3000, target the node port, 30102, and execute autocannon. You can see our test has started, and now, instead of top, I want to run docker stats to look at the stats of our Kubernetes pods, since Docker is still running our Kubernetes pods underneath the hood. As the test runs, the CPU usage of each pod increases, and it's about even, because we're sending requests to the NodePort service and it's load balancing those requests to each available Node.js pod.

I hope this demonstration of some core Node.js concepts has been useful for you. We've looked at how to write non-blocking code and make sure we don't block the event loop, which is key to Node.js's ability to scale to thousands of concurrent requests — it all has to do with executing asynchronous callbacks and not blocking the CPU, or the only thread Node.js has available to it. We've also looked at how to execute multiple promises in parallel to increase our throughput, and finally at how to scale our application horizontally to take advantage of more than a single core in a Node.js app. I really hope you've learned a lot in this video. Thanks so much for watching — I'll see you in the next one!
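For reference, the Deployment and NodePort Service generated in the walkthrough look roughly like this sketch — the image name, labels, and chart layout are the ones assumed above and may differ on your machine (a generated NodePort number like 30102 is assigned by the cluster, so it isn't pinned here):

```yaml
# k8s/nodejs-concepts/templates/deployment.yaml (sketch)
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nodejs-concepts
  labels:
    app: nodejs-concepts
spec:
  replicas: 1
  selector:
    matchLabels:
      app: nodejs-concepts
  template:
    metadata:
      labels:
        app: nodejs-concepts
    spec:
      containers:
        - name: nodejs-concepts
          image: mguay22/nodejs-concepts:latest
          ports:
            - containerPort: 3000
---
# k8s/nodejs-concepts/templates/service.yaml (sketch)
apiVersion: v1
kind: Service
metadata:
  name: nodejs-concepts
spec:
  type: NodePort
  selector:
    app: nodejs-concepts
  ports:
    - protocol: TCP
      port: 3000
      targetPort: 3000
```

The selector in the Service matching the pod label in the Deployment's template is the link that makes the load balancing across replicas work.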
Info
Channel: Michael Guay
Views: 10,249
Id: _cNIsBTg8HA
Length: 39min 27sec (2367 seconds)
Published: Tue Nov 14 2023