Hands-on with Rust: Async / Await | Rawkode Live

Captions
Hello and welcome to today's episode of Rawkode Live at Rawkode Academy. Today we're doing a hands-on session looking at how Rust and asynchronous programming work together. Before we do that, a little bit of housekeeping: please like, comment, share, and subscribe, all the things, on YouTube. You can always find this channel at rawkode.live. We also have membership options and courses where you can come and learn the complete guide to InfluxDB, so check out the different options available there, and feel free to jump into the Discord. We have around 500 people in there now talking all things cloud native, Kubernetes, and everything in between, so come and say hello; I look forward to meeting you. Okay, so because I am not a Rust person, well, I'm trying to be, but my asynchronous Rust is pretty poor, as Samuel is going to find out, we have a wonderful guest today. So hey, Samuel, how are you doing? Good thanks, and yourself? Yeah, I'm doing all right today. Awesome, it's always fun to be able to write a little bit of Rust for people, so it's going to be awesome. Yeah, I'm sure it's going to be great, I'm excited. Can you tell us a little bit about yourself and introduce yourself to the audience, please? Yeah, sure. I'm Samuel, as David mentioned. I work as a platform engineer at a company called Aerobotics, a company in Cape Town, South Africa, where I live, and we build software for the agriculture industry using a whole lot of nice buzzwords: drones, AI, machine learning, really cool stuff. I actually started off as a data scientist, surprisingly, and then about a year and a bit in I figured I enjoyed engineering more, so I moved over, and I've been doing Rust in my free time for roughly a year, so that's pretty much how I ended up here. I'm assuming, being originally a data scientist, you must be a Python developer as well, right? Yeah, I'm a Python person. So what
caught your attention about Rust? So at the time, this was last year, I was basically learning a bunch of programming languages. Last year was my first year as a full-time developer, and I was just learning languages because everyone says learning new languages makes you a better programmer. It started off with Haskell; that went terribly. Wow, jumping in at the deep end. Yeah, it wasn't a great time honestly, I couldn't really figure it out. So then I moved on to Clojure, which was a great experience, did a little bit of Go, and then at some point I was like, let me learn Rust, and I read the book. From there it just caught my attention as a really interesting language, and it also solved a lot of problems I had seen developers cause in Python, myself included. For instance, a function just returning None or some random value, and then it blows up in your face at runtime. There were situations like that, and I was like, you know, Rust has these guarantees that force you to program in a bit of a better way. That just stuck with me, and then I stumbled upon Awesome Rust Mentors, and from there found a bunch of people in the Rust community, and that was it for me. Since then I've pretty much been doing Rust in my personal time as much as possible. Yeah, I think the Rust community should be held up as an example of how to make a welcoming programming community; I'm always taken aback by how happy and charitable everyone is with their time and their knowledge. It doesn't matter what your problem is or what you're running up against, there always seems to be someone willing to help. A hundred percent, yeah. It's really amazing, and I think it's probably one of the biggest contributors to Rust doing so
well as a language, just because it's a difficult language to learn, and having a community there to support you through it actually makes a lot of difference. Yeah, I've been writing Rust for a little while now and still bang my head against the table quite often, so it is definitely difficult to learn, but I am writing much more functionally correct code because of the type system and the guarantees and all that. Awesome. All right, can you do us a favor and tell us a little bit about what we mean by asynchronous programming? Sure. So just for context, I started learning about async, I think, late last year; well, I started learning Rust last year, and it's been an interesting journey, but I guess the most exciting thing about async for me was that I could actually learn the details of it, so that's kind of how I got here. With async programming, the main idea is to solve a class of problems that allow your code or applications to be more scalable. You basically want to run concurrent software: you want to handle multiple tasks all at one time, and this is good for things like web servers, I guess that's the canonical use case, where you have multiple in-flight requests and you want to be able to handle them all. Obviously, if some request takes extremely long or your concurrency model isn't that great, you're going to have requests backlogged and then just start returning 500s.
So async programming came in to fill that void, as far as I understand, and the main use case, where it fits best, is with I/O-bound tasks: that's where most of the execution time is dominated by waiting. Think about making a database call, for instance. You'll do a bunch of work, make a database call, wait until it resolves and you get your information back, and then you can continue working. There's a big span of time spent waiting, and effectively what you want to do is maximize the amount of time the CPU is being utilized. You don't want this dead time where a request comes in and then nothing happens between when you make the request to the database and when you get the information back. Async programming fits in by allowing tasks to yield execution of the CPU at certain points and say: I've made a request, I'm going to be busy waiting, so something else can run in the meantime, and when that thing finishes, when I'm ready to run again, I'll let you know and then proceed. That just unlocked a lot of performance. So yeah, that's a pretty long explanation. No, that works really well. You know, when I first started working in code professionally, and I won't give you a year because that would show my age, I was working with Perl and PHP, languages which don't have any native concurrency or asynchronous primitives, and blocking on the request, where a request comes in and then you block until you've made some other external request and sent it back, is just never going to scale, right? Concurrency and asynchronous I/O. I've
seen, yeah, I've had quite a bit of experience with Django's sync model. You can think of it as basically a thread per request, so if any part of a request is really slow, all of a sudden you get this huge backlog of requests, because, as a crude example, if something took two seconds and your requests are coming in every millisecond or so, obviously things are going to go wrong. So async really fits that specific problem space and basically solves for it. Yeah, a lot of the people that subscribe to the channel may be familiar with Go, which has a relatively hands-off asynchronous model in that you never really need to worry about it; you just say run this thing in the background. Rust isn't quite like that, right? It's a bit more complicated. Yeah, I actually think Go's way of doing it, okay, I don't have very informed opinions here, but I do think it's quite a nice way of handling it, in the sense that it effectively does the same thing but behind the scenes, so you as a dev don't have to worry about actually writing async/await. I don't know which one's better ergonomically, but in Rust, and I think in Node.js, Python, and a few others, it's really explicit: you have to pepper your code with async and await and have your red or colored functions, as that essay says. The Rust model seems really similar to the Node one, at least I see a lot of similarities, even just in the syntax. For me the tricky part is always that you have to define your function as async, and then it has to be called from an asynchronous context; you can't just throw awaits around
anywhere, which in the Go world is unheard of, because in Go you literally just put go before a function call and that's it, it just does it. But with Node and with Rust the similarities are there, and I have to actually think up front about where I need the concurrency to happen. I think the first time I tried to write asynchronous Rust and realized I had to have this executor context and a main function that then delegated the execution, I was just like... yeah, it is definitely difficult, maybe a little bit overwhelming. For me that was my introduction to async programming in the first place, so I kind of jumped in at the deep end. You're going to make it easier for all of us today, hopefully. Yeah, as much as I've played around with async Rust so far, I've enjoyed it, though I've heard there are some pain points for a lot of people using it in production. But it's also still new, so it'll get there. All right, how do you want to kick this off? Shall we move over to the screen share and start writing some code? Yeah, let's do that, I think that'll be fun. All right, we can see VS Code, but you may have to bump the font size up a little bit. Okay, cool, let me do that. I think you can just Command-plus a few times, or change... yeah, that would work too. Let's see, 20... go to 24.
That's what I use. Okay, cool, I like it. I bet that's massive on your side though, right? Yeah, it's big, and it's funny because I normally use 10 or something ridiculous like that, so 24 is huge, but it's actually a great quality of life. I've got a hello from Javier in the chat. Hey! Cool, so I was thinking it would be cool to run through a bit of the basics of async, walk through some very beginner stuff, and break some assumptions we might have about it. Perfect. Cool, so firstly, one thing that's become a big deal in async is the executor. We've got Tokio and async-std, and they're not entirely compatible, so you basically have to choose one up front, and ecosystems are built around each one. For this session I'm going to use Tokio, purely because I know some stuff about Tokio and don't know too much about async-std; it's nothing against the async-std folks, Tokio is just basically all I know. So, quickly running through creating an asynchronous function: Tokio is the executor, and it's got an entry point, in this case this async fn main, and this is basically all we need to do to make code asynchronous. We can also increase the font of my... let's see, I think the size of the sidebar is okay, if that's what you're worried about; on the terminal, let's see, there we go, sweet. Cool, so this is basically all you need to do to make your code asynchronous to begin with. For the purposes of this one I'd prefer to use... so you can use a multi-threaded or single-threaded async runtime. What's going on now, what did I do? So yeah, you can have a multi-threaded or
single-threaded one. For the purposes of this, I'm going to recommend current_thread, which is just a single-threaded executor; I think it makes the model a bit easier to understand. I actually have this image here, from a blog post I was writing at some point that I never got around to finishing. Imagine we have tasks A and B: this portion is where a task is doing some work, and the part where it's empty is where it's waiting. Both of these tasks run on the same thread. Task A runs first, and when it gets to this yield point, it gives up control. In async, tasks are cooperatively scheduled, which just means that they themselves tell the executor when they're blocked and then give up the CPU. So it relies on having good actors: if you had a CPU-bound task and it never yielded the CPU, nothing else would be able to run until it finished processing. But here, task A runs, then it yields the CPU, and then task B starts running, all on the same thread. It's a bit easier to understand than having it on multiple threads; the multi-threaded version does the same thing, effectively. Cool, so this is basically all we need to have an async function, and if we cargo run this, hopefully it will compile and work. Yeah, I've heard the Rust compiler is really fast. [Laughter] Also, on a MacBook it's arguably not as fast as on other machines. Cool, so we've run this function and got a hello world out. Feels very synchronous, nothing too crazy; effectively, well, it's not, but effectively it's run in a synchronous fashion. Just to unpack this: this tokio::main macro with the current_thread flavor is
effectively the same as this; it's obviously a macro, and it does the same thing as, let's see if I can comment this out... so these two things are equivalent. Basically it creates a new runtime, and then it has this block_on method, and what that does, as it says here, is run the future and block until it's complete. So this here is your future, your root future. Futures are often made up of other futures, so in the case of, say, a web server, you'll have this outermost future and then all of your tasks running within that future, and it only finishes when this root, top-level future resolves. So we have this async block here which does the same thing, and this should effectively give us the same output... let's see... yeah, hello. We've come full circle and got pretty much the same thing. But now we can try to do some more useful stuff. Let's say we wanted to make a request using reqwest. reqwest is async, it uses Tokio underneath, and it's an HTTP client, so we can create a new client and, I think this will work, get... what is this, your organization? This .send actually returns a future, and then we've got to await that future to run it. One thing about Rust's async model is that futures don't run until they're polled, which is to say they do nothing; they're lazy by nature. They do nothing until you force them to run, or drive them to run, which is what this .await does. If you don't use that .await, nothing will happen. So if you create a
future and never await the result, that future will never run. We can actually test that out. Firstly, let's do something simple: in this case, the same thing, so we'd expect this to return a 200. I should definitely get a faster computer; it's not the computer, it's the compiler. Cool, so we got this 200 OK. But let's say we didn't run this, and instead just had send here and did nothing with it. Now you can actually see from rust-analyzer that this output is a future of some response or an error, and okay, now we're getting an error here, because you can't actually print a future; nothing will happen here, basically. Can I make a suggestion? Could we create a function that just touches an empty file on disk, then schedule that and not await it, and the file wouldn't be created, but if we did await it, the file would be created? Ah, we can do that; let me figure out how. I guess even just a print line statement would work: we wouldn't see the hello printed if the future, the asynchronous function, was the one doing the printing; we shouldn't see any output. Again it would be the same. Sorry, what? I was just saying your idea is better: if we do a print inside of the future, if we create a new function called, say, print_hello, it should never print hello if we don't await it. Oh, rather than trying to remember how to do the filesystem calls, because I definitely don't remember. Yeah, so let's say print_hello, it takes nothing, and then in this case you could say future equals... we would expect this "hello Rawkode" to run before this print line... yeah, so
if we do that, that's pretty much what should happen. Okay, so now it's telling us that we didn't use the future, just a warning. You can see this hello gets printed, but then if you add future.await, we should see "hello Rawkode" and then the hello. Right, so that's pretty much what we get by awaiting this future. An interesting thing, though, is that this is technically still just running synchronously: even though we've got async and Tokio running, there aren't multiple tasks running at the same time. We can change that; let's say we make some parallel requests, and some interesting things might come out here. So we take that same request again and duplicate it; we're going to make two HTTP requests. Let's call this response one and this response two. What will happen is that, still in this setup, this will still be synchronous. Yeah, it'll still basically be synchronous, because tasks aren't actually being spawned to run asynchronously; it will still run in the order it's written, even though this first result is an I/O task that has to wait for a response. And we can actually see that, right? Still in order, and if you run it multiple times it'll continue to be in order. So to get it to actually start using async properly, we can start spawning tasks. Tokio has this thing called spawn, which, I think, just takes, yeah, this spawn takes an async function, basically. Cool, all right, and so this
will spawn another task onto the executor, and these will run asynchronously, but we might also find some weird behavior coming out here, which could be quite interesting. So we have that, and then what we want here... cool. What did I do? That doesn't matter. Cool, so now we have these two tasks being spun up. spawn puts an asynchronous task onto Tokio, and these aren't blocked on, so because they get scheduled asynchronously, you would expect this "running async" print line to run first, and then whichever order they complete in. They might still complete in the order they're written, purely because the one request happened earlier than the other, but sometimes they don't; I had fun yesterday figuring that out. So if you run that, then... cool. This is actually an interesting outcome, and it has to do with what I was saying about having a root future, or a root task, with tasks spawned within it. What technically happened here is that the block_on function I showed earlier only waits for the top-level future to resolve. These two tasks have spun off, but the top-level future is basically just this, in essence, and when it got to the end, the top-level future finished, so the spawned tasks simply never complete. A quick way of handling this is to use tokio::join!, which basically waits for each task to finish before completing. spawn returns these things called join handles. Yeah, I think I've had this problem before: why is it not running? It's a very interesting one. So in this
case now, we've got this tokio::join!, and this will wait for the two of them to finish. I don't know which order they'll finish in, it can change, it could be the same order, but now we'd expect to see "running async" and then the statuses of these two requests being printed. Cool, there you go. So now you can see this is actually all running asynchronously, which is exactly what we wanted out of the system. Okay, can I summarize this, to make sure I understand it all correctly? Can you scroll up one line? There we go. So we have this Rust macro, tokio::main, where you're setting it to use a current_thread, or single-threaded, executor. That macro does the boilerplate you pasted earlier, which sets up a Tokio executor and blocks on an asynchronous function, which is in essence just our main function. Here you have two different requests, using two different clients, to the rust-lang website, which are spawned asynchronously via Tokio's spawn construct; that takes an asynchronous function, runs it in the background, and returns you a handle to the asynchronous task. And then at the bottom you're using a Tokio macro again, which lets us wait on all of the pending tasks we have by passing in the handles and letting it do its thing. Yeah, pretty much. Okay, I've got one question then: do we need the two clients, or could we create the client at a higher level than the function and pass it into the asynchronous functions? Or would you just try to isolate that as much as possible and have multiple clients? It's a bit of a difficult one, mainly because it's not extremely concrete in my mind yet, but in this case, when you spawn something, the task normally takes ownership of
those variables. In this case we want the client to be owned by a certain task, and I'm not entirely sure why, but a broad assumption: you want a task to own all of the variables it uses because tasks can also get moved across threads. In a multi-threaded context a task might start off on thread one but end up on thread three, for instance, so I'm pretty sure that's a big reason why it needs to own its variables. As for passing the same client into multiple tasks, I'm pretty sure it could be possible, but I don't have enough knowledge there. Yeah, I'll take that. I'm also curious about the join handle: is that a future itself? Can we await the join handle if we want to? I actually don't think so... I'm not sure... nah, it doesn't look like it. All right, we have a question in the chat as well from Russell, who's asking: is the lazy execution mechanism the same regardless of the executor behavior? For example, if you choose multiple threads rather than a single thread, does an async function only get called when awaited as well? Yeah, so that model, with futures or tasks only being run when they're awaited, is just a Rust language construct. How it's handled by your executor is an implementation detail; it's just how the language solved for that. And I think the reason it's built like that is for futures to be zero-cost. From what I've read, early on Rust actually had its own green-threading library, but that had a global cost: it put a performance cost on using Rust
whether you used that set of functionality or not. With lazy futures, no work actually happens up front; they compile down to state machines, so you can create them freely, and if you don't await them or have an executor, you pay no performance cost. Obviously they're then useless, but that's the premise. So I think it's got more to do with the design constraints that Rust imposes by having that zero-cost guarantee; that's why it's set up this way. Nothing happens automatically, whereas in Python, once you start the event loop, it just does its thing, and same with Node and others. Great, perfect. Cool, so that's pretty much it on the Tokio stuff. I was also thinking it would be cool to run through a bunch of stuff about futures themselves, just because I find that interesting: how they're created, what they do, all of that jazz. Let's do it. Sweet, give me a sec... cool. So obviously we've been talking a lot about futures and tasks, and the canonical definition is that a future represents some work, some computation, that will be completed in the future. In Rust it's a little bit different, purely because the future has to be polled, kind of manually, and we'll see that. So we have these async keywords, and basically async is just a way to denote a function that returns a future. If I write this async fn returning some number, you can see that it returns something that implements Future, and its output is
an integer. If you make it something else, like the unit type, you get a unit type output, and if you return a string, you'll get a string. So a future in Rust is basically something that implements the Future trait, and in the simple case we've got this future that just returns 16. People that are familiar with the Node ecosystem can just replace "future" with "promise" in their head; it's kind of the same thing. I think so, it kind of looks that way, but I'm not entirely sure, I don't have too much experience with JS. Yeah, I avoided that one. Cool, so we have this future, and again it obviously won't do anything until we await it. So a big question I had initially was: what exactly is going on with a future, and how does it run? We can do some cool stuff here. Let me just import a bunch of stuff; actually, let me copy and paste this because it'll be way easier. Cool, so we can make our own future; it's actually not that difficult to make a very boring one. Let's say we have this pending future. Are you going to tell me that futures are just a trait implementation? Technically, yeah, everything in Rust is always a trait. Great, traits are beautiful, they're really great, so it's no wonder. Cool, so we implement Future for this thing that we've called PendingFuture. This is basically the structure of the Future trait: we have to have this type Output, which is an associated type, and we're
just going to return basically nothing. So it's just an implementation of a poll function that checks to see whether the work is completed and then returns it? Yeah, pretty much. Nice. So what happens is that when polled, it will return this Poll object here, which can be either Poll::Pending or Poll::Ready; those are the two variants. And we're going to do something really interesting here: we're just going to return Poll::Pending and print "I've been polled". When the executor runs an awaited task, it basically starts polling the task for you, and there's a whole lot of detail there. So we can make a PendingFuture here, and the interesting thing is we'll see when it runs, whenever that is. Cool, so we've got this "I've been polled", and now our executor has actually hung: nothing's happening. The reason why is that, as I said earlier, we've got this outermost future, and the runtime is blocked on that outermost future, and the one we've created internally never completes. By returning Poll::Pending it never completes, and so the outermost future won't complete either; it just hangs, and nothing happens. I'm curious: it only polled once; I would expect it to be polled more than once. Is that something that's controlled by the implementation? Yeah, that's actually an interesting thing, and also a deliberate thing Rust did, and we can actually fix it. I saw this trick the other day; it's a good one, so I can explain it after. Okay, cool, so if you do this, I think this should work; in this case we should see a lot of output here, hopefully... yeah, there we
Right — okay, that's actually the behavior I expected by default, so maybe you can walk me through that one. Cool — that's basically the way Rust has set up the system. The idea is that when a task gets polled and it doesn't complete, you don't want to keep polling it all the time — you don't want this busy loop of "hey, have you finished? hey, have you finished?" with it saying "I haven't finished" every time. So what it does is set up these things called wakers. Wakers are basically a way to tell the executor, "hey, the thing I've been waiting for is now complete; I'm ready to be scheduled, so let me run." What happens is that your task runs, then it yields — it says Pending — and when it's ready to run again, it takes this wake object and calls wake, and that reschedules it back onto the executor. In this case with Tokio, because we didn't reschedule it, nothing happens — we polled once and it never tells Tokio again that it's ready to be polled, so nothing happens until you wake your task, or your future. Okay, thanks — that makes sense. Cool. So now we can also just return Ready, and this does what you'd expect: it prints "I've been polled" and then prints the result. So now we've got this future that does exactly what we wanted, and it's actually not that complicated, as we've seen — we've been able to create one without too much ceremony. Obviously real futures are way more complicated.
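The "trick" mentioned above — making a pending future get polled again — is to have it wake itself before returning `Poll::Pending`. Here's a small illustration of that pattern, again my own std-only sketch (the `CountDown` name and the deliberately naive spinning `block_on` are mine): the future calls `cx.waker().wake_by_ref()` each time it yields, which is what tells a real executor to reschedule it.

```rust
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};

// Returns Pending a fixed number of times, waking itself each time so the
// executor knows to poll it again, then completes.
struct CountDown(u32);

impl Future for CountDown {
    type Output = u32;
    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<u32> {
        if self.0 == 0 {
            Poll::Ready(0)
        } else {
            self.0 -= 1;
            // On a real executor (e.g. Tokio), omitting this wake means the
            // task is never polled again -- exactly the hang seen above.
            cx.waker().wake_by_ref();
            Poll::Pending
        }
    }
}

struct NoopWaker;
impl Wake for NoopWaker {
    fn wake(self: Arc<Self>) {}
}

// A deliberately naive block_on: it just re-polls in a loop, ignoring wakers.
// A real executor sleeps until the waker fires instead of spinning.
fn block_on<F: Future + Unpin>(mut fut: F) -> (F::Output, u32) {
    let waker: Waker = Arc::new(NoopWaker).into();
    let mut cx = Context::from_waker(&waker);
    let mut polls = 0;
    loop {
        polls += 1;
        if let Poll::Ready(out) = Pin::new(&mut fut).poll(&mut cx) {
            return (out, polls);
        }
    }
}

fn main() {
    let (_, polls) = block_on(CountDown(3));
    println!("polled {polls} times"); // 3 Pending polls + 1 Ready poll
}
```

This is why the stream's self-waking version printed output over and over: every poll immediately schedules the next one.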
And so that's pretty much it on creating your own futures, and wakers — I don't know if you have any more questions. I guess people wouldn't really be expected to implement futures on their own — it would probably be discouraged, right? Just use what's already provided? Yeah, pretty much. There probably are cases where it makes sense to create your own, but most likely you'd just use what's in the ecosystem or what's provided by the library itself. It's nice to see how it works all the way down, though. Hopefully people now have the knowledge that they can add the Tokio executor to their applications, that they can go and create futures or use functions that return futures, and that they know how to block and await on these things or let them run asynchronously. Just seeing that it's a trait implementation with a poll function on it was quite nice — I mean, I should have thought about it; of course it's Rust, of course it's going to be a trait — but seeing that is really cool, I like that a lot. It really is cool — I think they've done quite a good job of the implementation details, and it's pretty exciting to do it yourself, even something very small like this, because otherwise it can seem like crazy magic. It's nice to know that it's not the craziest amount of magic — you can do something very simple and build up your own thing, which is pretty cool. So I'd like to kind of wrap up — well, not wrap up, because this will still take some time — but it would be cool to run through this: Tokio has a tutorial on making your own executor, and I thought it would be cool to just walk through that, because
again, it gives a bit more context into how things are put together. Oh, definitely — super interesting. Cool, so we'll make a new file. I'm going to be jumping back and forth, copying code from that tutorial — it's on their website; if you go to tokio.rs they've got a Learn section. I encourage everyone who's interested in the topic to read through it, because it's really interesting. It's obviously a very simple executor, but it's not nearly as complicated as you would have thought, surprisingly. A simple example teaches, right? With a complex example I'm going to walk away more confused. Simple works well for me; I like simple. Yeah, same. Cool — give me a sec. Sweet. So off the bat, we'll start off with this executor and then build all the constructs going down from it. Let me just see if I have everything I need — yeah, cool. First thing we'll have is a task queue, and then — trying to give it a descriptive name — a send-to-executor-queue. We're just going to use crossbeam for this. I don't actually have a good description of crossbeam myself — let's see if the docs give us anything good. Yeah, that gets me every time with rust-analyzer: if you don't use the code, it doesn't autocomplete. Right, so: "tools for concurrent programming" — that's what crossbeam is. Okay, perfect, I know exactly what that is now; better than any description I could give you. So we're just
going to go with that. Cool, so we've got this crossbeam channel, and there are going to be some details here that I might just skip over, honestly. Awesome — it's starting to look a bit more like Go now. Yeah — Go and channels. So for this executor — we'll create the Task type next — the idea is that you have an executor, and in our case it has this task queue: the queue of tasks the executor has to run. When a task gets scheduled, it lands on the queue, and the executor goes through it to figure out what to run next. Then this send-to-executor-queue — I'm calling it that just because it makes it easier to understand exactly what it does — is the sending side of the channel. What we want is for a task to be able to send itself onto the task queue, so we're going to pass this sender down into the task itself, in essence; that's why it's there. Now we're adding some methods: creating a new executor, so a send-to-executor-queue and a task queue. This just creates the unbounded channel — there's no set capacity for it. Then we have this function called run, which is going to be our entry point: we'll call executor.run() and it should run all the tasks that are there. It takes tasks off the queue and polls them — here what we do is run through each task as it's received from the queue, and then poll that task until it resolves. And then the spawn function — for this one I actually need to figure out exactly why — well, this is just
in the tutorial, so I'm just copying it from there — typing it up so everyone can follow along. It's got these trait bounds: it's got to be Send, I think because the crossbeam channel needs things to be Send, and then this 'static — I'm not entirely sure, I'm not going to lie; that's something I haven't wrapped my head around entirely. I have a strong feeling about what it's for, but that's something I've yet to figure out completely. I just throw lifetimes around whenever things don't compile — "yeah sure, 'static, go for it, does it work?" [Laughter] So it seems like what an executor is, is a send and receive channel with a couple of functions: one to add stuff to the queue and one to run it. Yeah — when you break it down like this, it makes sense. It's really cool. Obviously this is very watered down, but nonetheless, that's basically the premise. So now we have our executor with the queue, and you can see here in the spawn function we pass in this send-to-executor-queue. A task, firstly, wraps a future and adds a little bit of functionality to it — state management, I guess — and then we pass in this sender, which the task will be able to use to reschedule itself onto the queue. So that's that; we can then go on to implementing our Task. I'm also just going to copy this in. All right, let me actually — will that use statement work without a mod statement? Sorry — you said use crate::task::Task, but without a mod; does that work? Yeah, I think it works as
well — as far as I know it should work; we'll see. If it doesn't, we can just add one. Cool, so we have this Task. A task is just going to be a future plus that send-to-executor-queue. So we've got this struct, it takes in a future, and the future's type is — it's just complicated, but basically: a Pin and a Box. Damn it. The only thing it's missing is Arc, and then it's got pretty much all the words I hate. [Laughter] So: the Mutex's job is to make it Sync, because the crossbeam channel requires things to be Send and Sync. Send and Sync are marker traits, and they're auto traits — that means that if all the variables that belong to Task are Sync, then Task will be Sync; if they're all Send, then Task will be Send; and if one of them isn't Sync, the whole thing's not Sync. So this Mutex is needed to make it Sync. As for pinning — pinning is something I'm still learning too, but in effect it's because Rust futures need the capability of being self-referential: they can refer to their own variables, the ones they're made of. If they've moved — imagine something sitting at location x, and the future is moved in memory while it's pointing to itself — then the part where it's pointing to itself gets invalidated, because it's moved in memory. So you want to pin them so they don't get moved. So the Pin removes the move semantics and keeps it in the same place in memory? Yeah — basically it disallows certain operations from happening on something that's pinned.
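The auto-trait propagation described here — a struct is `Sync` only if all its fields are, and `Mutex` is what confers `Sync` — can be checked at compile time with a couple of generic probe functions. This is my own illustration, not code from the stream; the `Wrapper` type is a stand-in for the Task struct being discussed.

```rust
use std::cell::Cell;
use std::sync::Mutex;

// Compile-time probes: these only compile if T has the auto trait.
fn assert_send<T: Send>() {}
fn assert_sync<T: Sync>() {}

// A stand-in for the Task struct: one field that is only Sync
// because of the Mutex around it.
struct Wrapper {
    inner: Mutex<Cell<i32>>,
}

fn main() {
    // Cell<i32> is Send but NOT Sync -- assert_sync::<Cell<i32>>() would
    // fail to compile...
    assert_send::<Cell<i32>>();
    // ...and wrapping it in a Mutex is exactly what makes it Sync, which is
    // why the boxed future lives inside a Mutex in the Task struct.
    assert_sync::<Mutex<Cell<i32>>>();
    // Auto traits propagate: Wrapper is Sync because its only field is.
    assert_sync::<Wrapper>();
    let _ = Wrapper { inner: Mutex::new(Cell::new(0)) };
    println!("auto-trait checks passed at compile time");
}
```

If any field of `Wrapper` were not `Sync`, the `assert_sync::<Wrapper>()` line would be a compile error rather than a runtime failure, which is the "marker trait" behavior described above.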
It doesn't actually stop anything from technically happening; it just disallows you from using certain functions on those objects. There you go — that's the most important thing I think I've learned today. So now we know: we've got this dynamically sized trait object in memory, so we wrap it in a Box; we use a Pin to keep it at the same point in memory so it can refer to itself; and a Mutex so that we can control who speaks to it and in what order. Yeah — pretty much, that's pretty much it. Cool, so then we have this Task, and we're going to implement some of its functionality. First — actually, what's more interesting is this — did I import this? Okay, and I forgot this — let's see. Sweet, so we have this thing called ArcWake, which basically allows us to implement the wake and wake_by_ref functionality. That's also a trait — no surprise — and it's needed for us to be able to wake a task and reschedule it back onto the executor. In our case we're just going to implement wake_by_ref. Can I ask a question just now? Yeah, sure. What's the difference between std::future::Future and the futures crate's Future? Oh, cool — well, at least from what I know: when they stabilized the Future trait and everything, that's obviously what's in the standard library, but a lot of the design and, I guess, the development of all that happened externally — similar to how Tokio and async-std are kind of core to how you'd use async Rust but aren't provided by the standard library. That was all initially in this futures crate, and then some of it got stabilized and put into std, but there's still a bunch of functionality that's only available in the actual futures crate,
and you basically use that to add the additional functionality, in essence. Pretty much. So we have this schedule function, and the premise here is that when we call wake_by_ref or wake, we're just going to schedule the task back onto the executor using this function, which we'll implement now. So that's schedule. Oh — and a weird, gory implementation detail: setting up these wake functions requires some low-level unsafe code, and an easy way to avoid that is to use this ArcWake. I don't fully understand how it solves that problem for you, in effect, but it's basically a shortcut that saves you from writing that unsafe code yourself. So when we want to schedule a task, we call schedule — which gets called by these wakers — and this sends it through what I called the send-to-executor-queue. "Executa"? I was wondering where "executed" was coming from. [Laughter] I just threw away a bunch of words. So yeah: we've got the send-to-executor-queue, and we send ourselves — this task — onto that queue. I'm actually just going to copy and paste this part. Then we also have this poll functionality here — and, because I've only got 20 minutes: when we poll this task, we basically create a waker and a context, and that context is used when we poll the future internally. We have a future attached to the task; this lock will never fail, purely because no other thread is actually trying to take it, so you can just unwrap here. Then we pass that context into the future itself, and that's what will get used by the future when it wants to wake
itself up. And then the last part — I'm just going to copy and paste this in too — is the spawning functionality. This is actually quite simple: when you want to spawn a task, we create a new Task object and then we send it to — oh, this is not sender — oh yeah — and then we send it to the task queue. That's what gets passed in here, this send-to-executor-queue. So every time we spawn, the task basically sends itself to the executor to get run. That's basically that, and those are the two big components. Then all that's left to do is create our own future. For the sake of time I'll also just copy and paste that in and talk through it, which will be a lot faster. But I also need to solve this issue first — you're probably right, let's see. I think you still need the use, but you also need the mod. Yeah — I don't see what's going on here. That is super weird — okay, we'll get back to that. That's probably why. Let's see — oh, I've got a lot of errors coming in. Unresolved — that should be fine. Cool, just give me a sec. What's the error on your use future statement? Oh, it just says unresolved imports — I think it's just not picking up that it's there. Let me see if I can reload my window and see if that solves things, which I doubt. rust-analyzer does seem to flicker after a while, and I usually restart VS Code. Yeah, it does happen every now and again. We'll see — I think it's still building. Yeah, it's still loading up here. Oh — no, it's changed. Cool, so that's up here.
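The two components just described — an executor that drains a channel of tasks, and a Task that wraps a `Mutex<Pin<Box<dyn Future>>>` plus a sender it uses to reschedule itself — can be sketched end to end with only the standard library. This is not the Tokio tutorial's code: I've swapped crossbeam for `std::sync::mpsc` and the futures crate's `ArcWake` for `std::task::Wake`, and the `Executor`, `Task`, and `demo` names are mine.

```rust
use std::future::Future;
use std::pin::Pin;
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::mpsc::{channel, Receiver, Sender};
use std::sync::{Arc, Mutex};
use std::task::{Context, Poll, Wake, Waker};

// The future is boxed (it's dynamically sized), pinned (it must not move once
// polled), and behind a Mutex (so Arc<Task> is Sync, as the channel requires).
// The sender is also Mutex-wrapped, since mpsc::Sender was historically !Sync.
struct Task {
    future: Mutex<Pin<Box<dyn Future<Output = ()> + Send>>>,
    sender: Mutex<Sender<Arc<Task>>>,
}

impl Task {
    // Rescheduling is just sending ourselves back onto the task queue.
    fn schedule(self: &Arc<Self>) {
        self.sender.lock().unwrap().send(self.clone()).unwrap();
    }

    fn poll(self: &Arc<Self>) -> Poll<()> {
        // std::task::Wake plays the role of futures::task::ArcWake: it builds
        // a Waker from an Arc without any unsafe RawWaker plumbing.
        let waker: Waker = self.clone().into();
        let mut cx = Context::from_waker(&waker);
        // No other thread holds this lock, so the unwrap never fails.
        self.future.lock().unwrap().as_mut().poll(&mut cx)
    }
}

impl Wake for Task {
    fn wake(self: Arc<Self>) {
        self.schedule();
    }
}

struct Executor {
    task_queue: Receiver<Arc<Task>>,
    sender: Sender<Arc<Task>>,
}

impl Executor {
    fn new() -> Self {
        let (sender, task_queue) = channel(); // unbounded, no set capacity
        Executor { task_queue, sender }
    }

    fn spawn(&self, future: impl Future<Output = ()> + Send + 'static) {
        let task = Arc::new(Task {
            future: Mutex::new(Box::pin(future)),
            sender: Mutex::new(self.sender.clone()),
        });
        task.schedule();
    }

    // Drain the queue; a server-style executor would block on recv() forever
    // instead, so it could pick up tasks woken later from other threads.
    fn run(&self) {
        while let Ok(task) = self.task_queue.try_recv() {
            let _ = task.poll();
        }
    }
}

fn demo() -> usize {
    let counter = Arc::new(AtomicUsize::new(0));
    let executor = Executor::new();
    for _ in 0..3 {
        let c = counter.clone();
        executor.spawn(async move {
            c.fetch_add(1, Ordering::SeqCst);
        });
    }
    executor.run();
    counter.load(Ordering::SeqCst)
}

fn main() {
    println!("ran {} tasks to completion", demo());
}
```

The flow matches the description above: `spawn` wraps the future in a Task and sends it onto the queue, `run` polls each received task, and any waker fired by a pending future just calls `schedule`, pushing the task back onto the same channel.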
Yeah, that's definitely happier — let's see if this works now. Ah, there you go — back in business. Cool, sweet. All right, we can actually get rid of all of this, and then there's just this last piece where we have this Delay future, which is almost like a sleep, in effect. I probably should have written this one out, but basically what's happening is: we want to delay for a certain amount of time. We have this thing called Instant — an Instant just gives us an instant in time, right now; it's basically used for comparisons. We want to say our delay should be, say, now plus two seconds, and only after two seconds will it be finished. So what we have here is: if the current Instant is larger than the one we specified — instant-plus-two-seconds from wherever we started — then return Ready. Otherwise, we set things up so that we get told when to be scheduled. Here's what happens: we've gotten the waker from the context, and then — in a semi-hackish way, I guess — we spawn a new thread and sleep that thread for the period of time that's left. So if we wanted two seconds ahead and we're actually one second in, we sleep for one more second, and once that's done we call waker.wake(). This is important: if you don't call waker.wake() at some point, your executor will basically hang — or this task will never run to completion, because it never gets woken and never gets rescheduled onto the executor. Then we have this Poll::Pending in this case: it returns Poll::Pending, and once waker.wake() runs, the task gets rescheduled and then executed again.
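The Delay future described here can be written in a few lines of std-only Rust. This is my sketch of the same idea, not the tutorial's exact code: the helper thread trick (spawn, sleep for the remaining time, then `waker.wake()`) matches what's described, while the `ThreadWaker`/`block_on` pair is my own minimal stand-in for the executor, using thread park/unpark so it sleeps instead of spinning.

```rust
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};
use std::thread;
use std::time::{Duration, Instant};

// Completes once `when` has passed. If the deadline hasn't been reached, we
// spawn a helper thread (admittedly hackish -- real runtimes use a shared
// timer instead of a thread per poll) that sleeps and then wakes the task.
struct Delay {
    when: Instant,
}

impl Future for Delay {
    type Output = ();
    fn poll(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<()> {
        if Instant::now() >= self.when {
            Poll::Ready(())
        } else {
            let waker = cx.waker().clone();
            let when = self.when;
            thread::spawn(move || {
                let now = Instant::now();
                if now < when {
                    thread::sleep(when - now); // sleep only the time left
                }
                // Without this wake, the executor never polls us again
                // and the task hangs forever.
                waker.wake();
            });
            Poll::Pending
        }
    }
}

// A waker that unparks the blocked thread, so block_on can sleep rather than spin.
struct ThreadWaker(thread::Thread);
impl Wake for ThreadWaker {
    fn wake(self: Arc<Self>) {
        self.0.unpark();
    }
}

fn block_on<F: Future + Unpin>(mut fut: F) -> F::Output {
    let waker: Waker = Arc::new(ThreadWaker(thread::current())).into();
    let mut cx = Context::from_waker(&waker);
    loop {
        if let Poll::Ready(out) = Pin::new(&mut fut).poll(&mut cx) {
            return out;
        }
        thread::park(); // sleep until the helper thread wakes us
    }
}

fn main() {
    let start = Instant::now();
    block_on(Delay { when: Instant::now() + Duration::from_millis(50) });
    println!("done after {:?}", start.elapsed());
}
```

The first poll returns `Pending` and arms the sleeper thread; the wake re-polls the future, which now sees the deadline has passed and returns `Ready` — the exact pending-then-done sequence the demo prints.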
So that's basically it — you can ignore those. This waker.wake() is basically what gets called here — the self.schedule() — which again sends the task to the executor queue, and then I think we're pretty much good to go. So let's actually do this. Cool: we create a new executor, then we create a new future — this Delay — and we want an Instant; I think we can do this plus two seconds, and that should be fine. Then we want to spawn that task — so we spawn the future — and then we run, and I think that should do the trick. So when you do a cargo run: create your new executor, create the future we just walked through — the Delay, which just compares the timestamps — then spawn the future and hopefully wait. And what we should see is — yeah, I was going to say, let's get some prints in. Let's print "pending", cool, and then we'll say "we are done" — don't forget the emoji. Let me go grab one quickly. Okay, cool — and then here, this would be cool, because then we can see the task waking itself back up. Let me go get this emoji — Emojipedia. There's a shortcut on the Mac for emojis and I can't remember it — I've got mine set up as the function key, but I think you can do Command-Shift-something. Okay, cool. So we should expect it to print "pending", then complete — "waking the task", then "we are ready" — hopefully. Let's see how it goes — give me a drumroll. What happened to my — I don't know where I am on my computer. Okay, I think we're in the right place now. It's Control-Command-Spacebar, for future reference. There we go. Ah, cool — oh, not quite, too late, the fastest compiler ever. I do like that they make speed improvements. It's
easy to make fun of the Rust compiler, and I do it often, but I also appreciate that it's a complex piece of software and people are constantly trying to make it better. A hundred percent — and it gives you a lot of guarantees; it's basically a trade-off between what it offers you versus how fast it compiles. Yeah, definitely. All right, let's see what happens. Okay — "pending", "waking the task", "we are ready". Really cool. But then it's still — oh, no, that makes sense, because this is blocking: this thing is still running, and this blocks, and this blocks. So it worked, basically. I guess — we don't need to do it — but if we checked that the task queue is empty, we could just exit, right? Yeah, something like that; we could do that. But in theory, say you're running a web server: you wouldn't want to do that, because you can get traffic at any point in time. Yeah — it wouldn't be a very good web server otherwise. Cool, so that's basically that — that's everything. All right, that was enjoyable. That was great — going over all the primitives and then seeing how they're actually implemented under the hood. It's not wildly complicated, even though it can feel like it, and I think you did a great job of breaking it down and showing us the primitives we have to work with. For me personally it cemented a lot of the things I had big question marks on — those now make a lot more sense to me. That was a really valuable demo; I thought it was great, and now I know how the executors work. It used to be something I would shy away from — "wow, that's this function that has a poll on it" and stuff like that. So yeah, very useful. Awesome, thanks. All right, well, we're approaching time. We got a lovely comment there from Russell saying that was an awesome
demo — thank you very much, David. I mean, I'll take some of the credit, for sure, but it was definitely a hundred percent Daniel. So thank you for taking time out of your day to join us, and for doing live coding, which I know is no easy feat. Especially my first time — so it wasn't too — god damn it. Well, you did great, thank you very much. All right, I'll let you get back to your day — any final words before we say goodbye? No, I'm good. All right, awesome — I'll speak to you again soon. Have a wonderful day. Thank you all for watching. [Music] [Applause] Thank you for watching. [Music]
Info
Channel: Rawkode Academy
Views: 601
Id: qy6CiixMahw
Length: 84min 23sec (5063 seconds)
Published: Tue Aug 17 2021