Serverless Doesn't Make Sense

Video Statistics and Information

Video
Captions Word Cloud
Reddit Comments
Captions
there are two things in this world that i don't understand girls and serverless and part of that could be for me just like not having a lot of experience with it because every time i dip my toe in the serverless water it is just freezing cold and i'm just not wanting to get started with it from the outside serverless looks very attractive i give it at least an 8 out of 10 and the part that i find most beautiful is that it can auto scale to exactly the load you have and you don't even have to think about servers if you have 20 people using your app on a daily basis and then all of a sudden your app just blows up on reddit and now 20 000 people are using your app no problem everything works just fine as more requests just like surge in there you're gonna have functions pop up left and right to handle all of them for you and you only pay for the number of calls that actually happen we'll just ignore the fact that your database died after 20 seconds under that load because you're on rds free tier in theory this sounds fantastic which is why i've been waiting for an excuse to try something with serverless that's a little bit more serious and the day finally came around where i wanted to switch how i was dynamically resizing images you know like where you pass like 300 width and 600 height to a image endpoint and it'll actually like resize the image on the fly before it sends it back to your browser i wanted that and i wanted that serverlessly or at least i thought it might be interesting to try with serverless and i had seen something that aws has where you can deploy a serverless image handler for you and it does exactly what i want the only thing is i avoid aws like the plague so i made the mistake of trying google cloud i know i know i should have known better but you know what you live you learn i know now i won't make the mistake again and for those of you that are unaware google's serverless option is called cloud functions and it is just kind of slow to put it lightly i created a very basic node.js function that used the sharp library to resize images i'll stick the code on the screen if i can find it so you can take a look at if you want to pause and check it out you can but i ended up trying that out and it took around two ish seconds to load and in 2020 that is kind of slow as molasses i figured a live reenactment would be helpful but i didn't have any molasses so i got the next best thing and you can tell here google cloud is really exerting itself here by how much the honey bottle is shaking but unfortunately its infrastructure has decayed into crystallized honey so it's not very forthcoming to process this request but it does indeed get in the end so i have to give it kudos for that at first i thought this is just what i deserve for using a turtle language like node.js and i was contemplating switching over to like golang see if it would be faster but the thing is i was using sharp and sharp has native bindings underneath the hood for like c or c plus plus or something so it should have been fast and it wasn't and after further investigation i found the true culprit was not node.js but it was cold start which if you're unaware is just a jargon term for when your function is called for the very first time and it takes a little bit to initialize and spin up and before your code actually gets executed google cloud has to like create a node process and parse like your dependencies and all that little jazz and so it takes a little bit of time in my case the cold start was consistently taking over a second and my actual code was only running for like 100 to 400 milliseconds depending on the size of the image i looked online to see if this was normal and found someone who documented the range for cold starts for google cloud and it's kind of standard for cloud functions to be this slow for node.js and other languages are actually even slower but if you compare this to aws lambda you find out google cloud functions just kind of suck for node.js the cold start for lambda is anywhere between 0.2 seconds to 0.6 seconds compared to cloud functions which were at 1 to 4 seconds in case you're wondering azure is just as bad here's the chart comparing lambda versus cloud functions versus azure and azure is kind of just as bad when it comes to cold start with cloud functions except it can randomly be way worse i was shocked to see lambda winning by such a big margin and it's not exactly competitive either so i have no idea who is using google cloud functions and azure like he he needs to switch asap and of course once i learned that it was a no-brainer to switch over to lambda so i took five minutes porting over my code and then the next 10 weeks configuring api gateway and then i was ready to give it a try also it's probably worth noting i didn't use that one click aws image serverless handler thing i mentioned earlier that basically spins up the entire infrastructure for you because i had my images on google cloud storage and not on s3 and that was actually for a very good reason because i tested out both and google cloud storage just gave me better upload speeds than s3 so i actually did want my images over there here are the metrics for calling my lambda function a few times the first column here is how long it actually took to execute the second column is just a rounded version that aws bills me at the third column is how much memory the function had and the fourth function or column is how much memory the function actually used i also tested increasing the memory because you can go up to three gigabytes for your function and i found two gigabytes was around the sweet spot for mine and you can see the performance is much better for this and the one thing i was a little curious about at first was why would i increase memory if we look at the memory used like my functions not using any memory but actually as you increase the memory of these functions it also increases the cpu which helps with cold start and also just helps with resizing the image itself i got excited when i saw these results because my goal is to get something on average to be less than 500 milliseconds and it does exactly that unfortunately my method for testing wasn't exactly very scientific and i think most of these calls if not all of them were warm function calls and i haven't really talked about that but after you call a function for the first time right we talked about that's going to be a cold start and it's going to initialize but then after that the function becomes warm for like 15 minutes or so and it doesn't have to re-initialize and so it's a lot faster and that's what we're seeing what i did next to get better information is to turn on tracing with lambda and this could be done through x-ray and so now i have basically traces of the lambda calls that i can look at and actually can see like what part of it was a cold start and which part of it was actually the function call being called so here you can see that this entire call took 900 milliseconds and the actual invocation this is my code running i believe is 243 milliseconds initialize for 441 milliseconds and then this bit right here that is the cold start time i believe um so you can see a lot of the time here to here is before my code is even being called and so it's actually closer to around i guess a second that it's spent initializing right here's another one and you can see the ones that are sub 500 milliseconds like this 200 you can see it's just the function call being called and there's like no initialization or it's very small amount so there's like no cold start there so the function is great when it is warm but when it cold starts it's a little slow for my liking i believe in practice if you have a consistent amount of requests that are just coming in and it's not like all at once or something and it's more of like this nice and even then you have less cold starts but that is something that i'm still needing to test out ws also has this thing called provisioned concurrency which you can pay extra to keep on x functions and keep them warm so let's say you have like 30 requests that happen per second and you know you're gonna get it then maybe you can provision like 30 functions that'll always be on and be ready and this is the part that i don't really understand because if you have consistent traffic or you know kind of around the number of requests that you're going to get why are you even using serverless in the first place why not i just use a regular server it seems to me that if your api is only good when the functions are warm and you're provisioning functions to make sure they're warm that you're kind of just missing the whole point behind serverless and that you can scale up or scale down at a moment's notice but the thing is to take advantage of that you have to be okay with cold starts and that's kind of tough because it kind of feels like you're just handicapped when it comes to performance i've seen a bunch of stuff that serverless experts recommend doing and not doing to reduce the effect of cold start and you can go down that path and you can try optimizing those things but i'm over here thinking on a whole nother level you know what i want i'll tell you what i want this is this is big brain time okay i want my serverless functions to never cold start not even once and next thing you know i'm not using serverless anymore i'm spinning up a thick kubernetes cluster getting a hundred percent warm requests and you know what it feels amazing not having to wait 500 milliseconds before your code can even execute or i have option two for you which is cloudflare workers which boasts on its website to have zero milliseconds of cold start time i did play around with creating a worker to resize images and i kind of got close to it working but it uses a different run time than lambda so there are some limitations you can't use node.js libraries that have native bindings like sharp so i ended up trying out this web assembly example to resize images that was kind of jank to be honest and it kept rebuilding the webassembly code for some reason so i wasn't too confident in it plus i saw on the pricing page that even if you start paying money that you can only get up to 50 milliseconds of cpu time per request which i didn't think would be enough for resizing images so cloudflare workers didn't seem like a good choice for my particular use case but it is something i want to try out in the future if i get something that fits in within the limitations because zero milliseconds of cold start time is something that i want in my life but there you go that's pretty much why i haven't been too keen to convert everything that i do over to serverless and i'm not quite sure what i'm gonna do with my image resizer on lambda if i'm just gonna keep it there what i might do is just leave it for a little bit and see under normal usage what percent of requests are cold starts versus warm and then i'm gonna be sticking a cdn in front of it anyway so it's okay if a lot of the requests are slow and yeah i mean there's a good chance that after a little bit me and serverless are just going to be breaking up for this project you
Info
Channel: Ben Awad
Views: 244,947
Rating: undefined out of 5
Keywords:
Id: AuMeockiuLs
Channel Id: undefined
Length: 10min 13sec (613 seconds)
Published: Thu Oct 15 2020
Related Videos
Note
Please note that this website is currently a work in progress! Lots of interesting data and statistics to come.