OTP24 and Elixir 1.12 Release bonanza! - Robert Ellen

Captions
Hi everyone — most of you probably know me, but for those who don't: I'm Robert, another developer at Alembic, and I do a lot of Elixir work with them. We like to live close to the bleeding edge, and this talk is right on it. As Paul alluded to, I consider this a glorified news segment rather than a complete talk, but nonetheless there's a fair amount of stuff in it. Coincidentally, Erlang very recently had a big release, OTP 24, and at the same time Elixir 1.12 is very close to coming out — not quite there yet — and a couple of other things in the Elixir ecosystem are all happening at once at the moment too, so May is quite exciting.

So, back on May 12, OTP 24 came out. It's the first release that has the new JIT compiler, BeamAsm, enabled by default, plus a whole bunch of other fairly low-level changes. As Elixir developers, depending on what you're doing — if you're doing a lot of Phoenix-style work at that higher level — a lot of them are hidden away. If you do happen to call down to the Erlang libraries, ETS for instance, there are some better error messages, which is nice, because some of them could be quite terse. At the same time, Elixir 1.12 is so close: release candidate 1 is out, which is the second release candidate, and apart from being compatible with and integrating with OTP 24 there's a whole bunch of other things in there. What I want to go through are some of the improvements to scripting and some other little niceties; then, outside the core library, there's Livebook, which I'll talk about very soon, and also
Axon, which is part of Nx — in fact, Livebook is as well. I'm just going to move that out of the way, because it's a bit annoying even though you can't see it. This presentation I'm in is actually one of these Livebooks, and all the files will be up on GitHub.

So what is Livebook? Livebook is a sort of official interactive code notebook for Elixir. It's built with LiveView and it's associated with the Nx team, Nx being the Numerical Elixir project, which I talked about a couple of months ago. Back then I was using an independent live notebook called Nix; I don't know what's going to happen with that, because Livebook has come out and is already quite a way ahead of where Nix was. As you can see, it's part of the elixir-nx project — go check it out.

All right, so OTP 24. As I said, the environment is a code notebook: you can mix Markdown with Elixir fragments and then come along and evaluate them. So here I'm just proving that I am indeed running on OTP 24. I wish I could pipe that into something that would generate a logo for the popular TV series, or give us random Kiefer Sutherland images, but I ran out of time.

The big change in OTP 24 is BeamAsm, the new just-in-time compiler, on by default. What BeamAsm does is convert the loaded BEAM files into native x86-64 code. Rather than dispatching each BEAM instruction to a handler function that is handwritten and comparatively slow, it can dispatch to natively compiled code generated just in time. Some of the advantages JIT compilers in general — and BeamAsm specifically — allow are specialization on types and things like that. There's some really good documentation about how it works on the erlang.org website if you're interested, but just a little detail about what it is under the covers: there's another project called asmjit, which is used by a few runtimes, though I'd say Erlang is probably the biggest project using it. asmjit is a C++ just-in-time compiler — a code generator, I should say. It's just the back end: much simpler and much faster than compiler infrastructures like LLVM. It only does code generation, so you have to do all the optimization by hand, but it generates code a lot faster, and it itself is much smaller — LLVM is massive.

BeamAsm was a long time coming. Different people in the Erlang community have, over many years, tried to implement a just-in-time compiler to replace HiPE, an earlier native-compilation effort, and after many returns to the drawing board they finally arrived at something that seems good. So just-in-time compilation is enabled by default in OTP 24 for the first time. If we pull that bit of information out of the runtime we can see it says "jit", and "jit" will also come up when you open an IEx terminal, so once you roll over your versions you'll eventually see it in your IEx sessions.

The main reason to do just-in-time compilation is performance. I haven't looked into the JIT in a great amount of detail, or read much of the documentation linked above, so most of the information I've seen is anecdotal — from meetups or conference talks. One of the goals, performance-wise, was to not blow out the code memory size: with some just-in-time compilers you end up having a lot more memory allocated for many, many native instructions, compared to a few interpreter instructions.
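As mentioned above, you can ask the running VM directly whether the JIT is active. A small sketch, assuming OTP 24 or later (the `:emu_flavor` key doesn't exist on older releases and will raise an error there):

```elixir
# Ask the BEAM which emulator flavor it was built with.
# On OTP 24+ this returns :jit when BeamAsm is active, or :emu
# for the interpreted emulator.
flavor = :erlang.system_info(:emu_flavor)
IO.puts("Emulator flavor: #{flavor}")

# The OTP release itself, for good measure:
IO.puts("OTP release: #{:erlang.system_info(:otp_release)}")
```

The same `:erlang.system_info/1` call works from IEx, a script, or a Livebook cell.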
You lose some of the benefits if you start getting cache misses and so on, so they've tried to keep that to a minimum. The other day I was on an Elixir meetup — in Birmingham, Alabama, I think — that just happened to line up time-zone-wise, and Robert Virding was there; that was a while ago now, actually, maybe a month or two. He was saying the code size increase is on the order of 10%, which is not too bad: your code memory increases by about 10% to get the just-in-time performance benefit.

On the CPU side, again, I haven't seen any really scientific benchmarks, and properly benchmarking a compiler across real workloads is more than a talk in itself. So I took a really basic example from Benchee, which is a benchmarking library in Elixir, put a bit more maths in it so the expression is more complex, and added some list manipulation, just to see. I ran this benchmark on Elixir 1.11 with OTP 23, and then again with Elixir 1.12 on OTP 24, and Benchee says the newer version is seven percent faster on this little micro-benchmark — and as we all know, micro-benchmarks hardly mean anything. A lot more work would need to be done by someone to see what the real benefits are; or maybe that has been done and I just didn't find it, so if you're interested you may have to go searching. As I said up here, I've had widely differing speed-ups quoted — 30, 50, 100, hundreds of percent for some things — which I find hard to believe, but we will see what happens. Still, seven percent just for upgrading a version is not bad.
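A sketch in the spirit of the micro-benchmark described above — not the actual benchmark from the talk; the arithmetic and list work here are made up, and it assumes Benchee is available (for example via `Mix.install` on Elixir 1.12+):

```elixir
# benchmark.exs — a toy micro-benchmark: some floating-point maths
# plus list manipulation, so the JIT has something to chew on.
Mix.install([{:benchee, "~> 1.0"}])

work = fn n ->
  # A slightly more complex expression than a bare addition.
  x = :math.sin(n) * :math.cos(n) + n * n / 3.0
  1..100 |> Enum.map(&(&1 * x)) |> Enum.sum()
end

Benchee.run(%{
  "maths and lists" => fn -> Enum.each(1..1_000, work) end
})
```

Run the same script under each OTP/Elixir pair and compare the reported averages; the numbers will vary by machine and workload.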
So that's the just-in-time compiler. Another interesting and welcome change in OTP 24 is EEP 54 — Erlang Enhancement Proposal 54, a bit like a JSR (Java Specification Request) or whatever they were in the Java world; Erlang has the same kind of thing. This particular one extends the error information returned by a lot of the built-in functions, and it's been implemented and is available in OTP 24. The obligatory example of how this will improve our lives is good old ArgumentError: from many function calls into Erlang — in this case :ets.insert — if you do something wrong, you simply get an ArgumentError, but it doesn't tell you which argument is wrong or why. If I run that now on OTP 24, however, we get something slightly better: in this case it says the first argument is the problem, and that the table identifier reference doesn't exist — we haven't created the ETS table first. Then, if there's something wrong with the second argument: the second argument to an ETS insert has to be a tuple with a key, because ETS is effectively a key-value store, and it tells us the second argument is not a tuple. That's much more useful than just "you've got an ArgumentError, go away and fix it, and I'm not going to tell you what's wrong" — and apparently that's been implemented all across the code base, which is nice. There are lots of other lower-level changes, bug fixes and so on, covered in a high-level way in a news item on the erlang.org website. For OTP 24, that's about as much as I got through and thought was useful, so we'll move on to Elixir 1.12.
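The improved ETS error messages described above can be reproduced with a small sketch; on OTP 24+ the ArgumentError message explains which argument is wrong (the exact wording below is the runtime's, not guaranteed by me):

```elixir
# Inserting into a table that doesn't exist: on OTP 24 the error
# points at the first argument (an invalid table identifier).
try do
  :ets.insert(:no_such_table, {:key, :value})
rescue
  e in ArgumentError -> IO.puts(Exception.message(e))
end

# Inserting a non-tuple into a table that does exist: the error
# points at the second argument instead.
table = :ets.new(:demo, [:set])

try do
  :ets.insert(table, :not_a_tuple)
rescue
  e in ArgumentError -> IO.puts(Exception.message(e))
end
```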
So at the moment RC1 of 1.12 is available, and that's the second — and, I assume, final — release candidate. One of the big changes is the introduction of a Mix function called Mix.install. You call it with a list of dependencies, like you would in your mix file, and you call it in your scripts; that means you don't have to set up a Mix project for those dependencies to be downloaded and made available to your script. I've got some code examples that are a bit opaque, because you can call Mix.install directly in Livebook, but it behaves differently there from what I wanted to demonstrate. So what I'm actually doing is calling out to a separate elixir process with some code, to show what Mix.install does. If we don't have — in this case — the Poison JSON library installed and we run this code, our script comes back and says Poison.encode is undefined, because the module Poison is not available: we haven't installed it. However, if we run the same code but prepend a Mix.install of the library — pretty much the same format as you'd need in your mix file, except that because we're not worried about versions we just put the atom there — it thinks for a while, potentially downloads Poison and compiles it all up, and then we actually are able to encode "hello world" into JSON and print it out. As I say in the comments here, this whole thing is a bit of a hack, but if you need to write Elixir scripts, or install libraries into a Livebook, Mix.install is the way to go.

[Audience] Sorry — if you re-evaluate that right now, does it re-download? — It will in my case, but I think that's because I have force: true in there; if not, I don't think it does.
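A minimal standalone version of the script described above might look like this (a sketch assuming Elixir 1.12+; the first run fetches and compiles Poison, later runs reuse the cache):

```elixir
# json_demo.exs — run with: elixir json_demo.exs
# Mix.install (new in Elixir 1.12) fetches and compiles the listed
# dependencies into a global cache the first time the script runs.
# Pass force: true as a second argument to rebuild from scratch.
Mix.install([:poison])

"hello world"
|> Poison.encode!()
|> IO.puts()
```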
In fact, at some point it builds up a cache: Mix has various paths, and libraries get downloaded into that global cache, but I believe force blows that away each time. So you run your script once and it takes a long time, but subsequent runs should be fast. Cool.

Then in Elixir there's a whole bunch of new functions and fixes, so I thought I'd run through some very quickly. We can trap signals now. I've basically got something here that's going to send SIGUSR2 — again, a bit of hacky code to get all this to work in the context of a Livebook: I'm sending the kill command to the OS pid (not the Erlang pid) of this Livebook, then getting the message and printing it out. Hopefully that will run — yes, "got :sigusr2". So we're trapping that signal, getting it back to ourselves and doing something with it; again, all this gymnastics with processes is purely to get it working in this demo. In a previous life I did have to worry about trapping signals, because of where my stuff was getting deployed and how it was getting executed, and it was a bit of a pain in the butt. Being able to trap signals and shut down gracefully with OTP things like GenServers would make life a lot nicer for certain people, especially in the ops world.
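A simplified sketch of the signal trapping described above, assuming Elixir 1.12+ on a Unix system with `kill` available (it prints instead of doing the Livebook message-passing gymnastics):

```elixir
# Elixir 1.12 adds System.trap_signal!/2,3. The handler runs whenever
# the OS signal arrives and must return :ok.
System.trap_signal!(:sigusr2, fn ->
  IO.puts("got SIGUSR2 — shutting down gracefully…")
  :ok
end)

# Send ourselves the signal from the OS side, i.e. `kill -USR2 <os pid>`.
# System.pid/0 returns the OS pid of this VM as a string.
os_pid = System.pid()
System.cmd("kill", ["-USR2", os_pid])

# Give the handler a moment to run before the script exits.
Process.sleep(100)
```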
In the Kernel library there are a couple of new functions, then and tap, both of which help make pipelines less messy. Say we have a pipeline where we want to convert a map into a struct. There is a struct function in Kernel, but the thing you typically want to weave through a pipeline — this map — is actually its second argument. So this first example is the old way, without then: I'm writing an anonymous function that effectively swaps the arguments. (Of course, if we had flip we wouldn't need to worry about it.) There's a little bit of anonymous-function gymnastics happening there. But if we try that again using then, it works and it's a lot more compact: we can pass struct inside an anonymous function and put the argument we're threading through anywhere we want in the call to struct. So that's kind of nice.

tap is a bit more intuitive. What tap does is let you, in a pipeline, tap off the value to some other function — typically a side-effecting one — where the function itself doesn't return the value it's operating on. So if we have this side-effect function here that takes an argument and does some side effect with it, but doesn't return the argument to continue on in the pipeline, we can use tap to call out to the side effect and keep going. If we run this, we've passed "hello" through and it's been reversed and printed out, but at the same time we've run the side effect, which sent a message to ourselves; we receive that down here and print it out. I've got the timeout there just in case things go wrong — and yes, it does work. In fact, even if we don't have the after clause at all — a meta note about Livebook — it actually kills things if it finds some kind of runaway process.
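The then and tap examples described above might look like this (a sketch with a made-up User struct, printing instead of message-passing):

```elixir
defmodule User do
  defstruct [:name, :age]
end

# then/2 lets us place the piped value anywhere in the next call.
# struct/2 takes the map as its *second* argument, so without then/2
# we'd need the anonymous-function argument shuffle by hand.
user =
  %{name: "Robert", age: 42}
  |> then(&struct(User, &1))

IO.inspect(user)

# tap/2 passes the value to a side-effecting function and returns
# the original value, so the pipeline keeps flowing.
result =
  "hello"
  |> tap(&IO.puts("about to reverse #{&1}"))
  |> String.reverse()

IO.inspect(result)  # "olleh"
```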
So it's kind of safe to do low-level things like receive inside this live environment.

Another feature: step ranges. This is saying "give me the range one to ten, stepping by two", and this one is zero to ten by two, and we're doubling the numbers. The first one does start at one — so that's 1, 3, 5, 7, 9 doubled — and this one is 0, 2, 4, 6, 8, 10 doubled. I haven't looked at the full reasoning behind the push for this, but pretty much this is going to be the new way of doing ranges: at some point a range without a step is, I think, going to be deprecated, if I've understood what I've read and heard correctly. This avoids having to do a reduce and skip steps and so on.

Then there are the long-awaited zip_with functions in Enum and Stream: we can zip two or more enumerables together with some function — here just adding the respective elements together — and Stream has it as well.

And again, like OTP 24, there are massive numbers of bug fixes and other little improvements, pretty well documented in the last two release-candidate releases on GitHub. I'm not sure the releases page shows pre-releases by default, but if you go to the elixir-lang GitHub releases page and click through to the pre-releases, the last two — RC1 and RC0, mainly RC0 actually — have a lot of detail about all the little fixes; there are lots of tiny tweaks here and there to the standard library and associated things.

Last but not least, this month or last month — I can't remember exactly when — a library called Axon was released by the Elixir Nx team. Axon is a neural-net library, and it's the one everyone's been asking for and waiting for.
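Going back a step, the step ranges and zip_with examples described above look like this in Elixir 1.12 syntax:

```elixir
# Step ranges (new in Elixir 1.12): first..last//step.
# 1..10//2 yields 1, 3, 5, 7, 9; doubling gives:
IO.inspect(Enum.map(1..10//2, &(&1 * 2)))   # [2, 6, 10, 14, 18]

# 0..10//2 yields 0, 2, 4, 6, 8, 10; doubled:
IO.inspect(Enum.map(0..10//2, &(&1 * 2)))   # [0, 4, 8, 12, 16, 20]

# Enum.zip_with/3 zips enumerables together through a function —
# here, adding the respective elements:
IO.inspect(Enum.zip_with([1, 2, 3], [10, 20, 30], fn a, b -> a + b end))
# [11, 22, 33]
```

Stream.zip_with/3 works the same way for lazy enumerables.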
So Sean Moriarty, who did a lot of the work on the base Elixir Nx, has released Axon. It's in the same umbrella repo as the rest of Nx, and it basically abstracts away all the low-level work you'd need to do with Nx to build neural nets. I only have part of an example — I ran out of time — but here's its Mix.install and how you'd actually use it in your own code notebooks or your own scripts; this could be an .exs file or what have you. We're installing Axon and Nx, and you can see it's pretty much the same format as what you need in your mix.exs files for dependencies. So if I evaluate that — I've already downloaded Axon and Nx, so they're sitting in my cache somewhere — but the first time this runs it would output all the information about downloading and compiling them, plus any associated warnings.

And then we've built a model. Basically, this model has an input layer, a couple of dense layers and an activation function, and through this sort of fluent pipeline interface it spits out a little description of what the model is — the shape, parameters and so on. Then there are other function calls to actually train models, test them, and so on; I ran out of time to put all that in. I also don't have a link — maybe it's in here — but José did a talk where he redid his original Nx announcement talk with Axon, to show the benefits of operating at this higher level.

[Audience] Is this going to run on a GPU, when Nx supports GPUs? — Yes, it will. Nx does support GPUs, though I still haven't played with that myself.
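As a rough sketch of the kind of model definition described above — the Axon API has evolved since this talk, so the version numbers, input shape and layer sizes here are all illustrative, in the style of the early announcement examples:

```elixir
# axon_demo.exs — illustrative only; versions and layer sizes are
# made up, and the Axon API has changed across releases.
Mix.install([{:axon, "~> 0.1"}, {:nx, "~> 0.1"}])

model =
  Axon.input({nil, 784})
  |> Axon.dense(128, activation: :relu)
  |> Axon.dense(10, activation: :softmax)

# Inspecting the model prints a description of its layers,
# shapes and parameter counts.
IO.inspect(model)
```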
There's CUDA via Google's XLA implementation — EXLA, I think it's called — and there's also a backend for Torch. In the elixir-nx repo there are a couple of other libraries that do the GPU work, and it's just a matter of making sure you can compile them on your system and then setting the configuration. There are a few different points where you can say "I want this particular part of my tensor calculations done on the GPU", and you can also choose to keep the results — the tensors — on the GPU, to avoid the thrashing of memory between the CPU and the GPU. That's what produces the thousand-times — or 4,000-times, or whatever it is — speed-up over plain Elixir; I think it's even 4,000 times over using SIMD instructions. So it's super fast, if you can get all that working. That stuff is sort of working now, I just haven't had a chance to play with it and get the libraries compiled on my system — but it's exciting times.

So yeah, that's all I had — a bit more than a glorified news item, I guess — but hopefully there's some interesting stuff in there. Check out the release notes and so on to see what other interesting things might help you in those releases. Thank you very much for your attention, and I'll open the floor to any more questions.

[Host] Thanks very much, Rob. If anyone's got any questions, please feel free to unmute yourself — and if you possibly can, turn your video on as well — and fire away.

[Audience] Hey Rob, I was wondering: do you run into problems with libraries that require compile-time app config when you're using them from a script? Can you define those configurations prior to compiling
and running them? — That is a good question. I haven't tried that. It has been a while since I've written any Elixir scripts, but I seem to recall having to start OTP applications manually and that sort of thing — a bit like with ExUnit, where you have to call ensure_all_started or whatever the flag is. I presume it would be similar to that, but that's a good question; I don't know if that would trip you up — it probably would. I think right at the very start of my time at Telstra I did a couple of things with scripts, but since then I've purged most of that from my working memory.

[Audience] Well, if you didn't run into any problems with it to get this far, it can't be too bad.
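The ensure_all_started approach mentioned in the answer can be sketched as follows; :logger is just a stand-in for whichever application your script actually depends on:

```elixir
# In a standalone script, an application's supervision tree and app
# environment aren't necessarily started for you.
# Application.ensure_all_started/1 starts the named app plus
# everything it depends on, returning the list of newly started apps.
{:ok, started} = Application.ensure_all_started(:logger)
IO.inspect(started, label: "newly started apps")
```

Runtime configuration can be set beforehand with `Application.put_env/3`, though whether that helps for genuinely compile-time config is, as in the talk, an open question.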
Info
Channel: Elixir Australia
Views: 304
Id: ucgYT3YUVS8
Length: 32min 43sec (1963 seconds)
Published: Wed May 19 2021