Batching for the Modern Enterprise

Captions
All right, welcome back, everyone. We're excited to have our next presentation lined up for you. We're here at SpringOne on the beginner-friendly track; I'm Bob Brinley, with you all day. I did a little wardrobe change for you, just to get my inner Cher or, you know, Barbra Streisand out, still representing Spring of course, because that's what we're here for: all things Spring. We want to thank you again for joining us, and let's go on to our next presentation. Coming to us with Batching for the Modern Enterprise is the one and only Michael Minella. Take it away, Mike.

Thanks, Bob. Let me go ahead and share my screen, and let's get started. I want to start off with a thank you. I realize that right now the world is crazy, with the global pandemic and all kinds of other things going on, and I just want to take a minute and say thank you: thank you for taking a few minutes out of your, quite frankly, crazy life that we're considering this new normal and spending it with us. I hope over the next hour or so you get something useful out of this; if not, there are other tracks, but please stay here for mine. Hopefully you're enjoying the conference, and it will continue.

A little bit about myself: my name is Michael Minella. I'm the co-lead of Spring Batch and the founder and lead of Spring Cloud Task, and I do a whole bunch of other stuff, none of which you probably care about. These are the ways of getting a hold of me. I'm available via three different Twitter accounts: the first is my personal account, the second is obviously the account for the Spring Batch project, and the third is for a podcast that I'm on called OffHeap. If you want to get a hold of me, these are the best ways. I do have a book out called The Definitive Guide to Spring Batch, with a foreword by the good doctor, Dr. Dave Syer, so if any of this sounds interesting to you, please check it out; it's available wherever good tech books are sold. I'm also on that podcast, Java OffHeap (javaoffheap.com). It's a pundit podcast, a handful of guys hanging out, normally at a bar, but because of COVID we're doing it remote, and we talk about the latest and greatest happenings in the JVM and the IT industry in general. If you want to hear a group of people just having casual chats about what's going on in the ecosystem, I encourage you to check that out.

I'm really excited you've chosen to spend your next hour listening to the most exciting talk in all of SpringOne: we're going to spend the next hour talking about batch processing. We've only been hearing for the past thirty-ish years that batch is dead or batch is dying, that you should replace your batch systems with new stuff, and yet even the Cloud Native Computing Foundation, when they wrote their serverless white paper, called out batch jobs and scheduled tasks as a targeted workload for that kind of runtime, because as workloads move from traditional infrastructure to the cloud, batch jobs are following. So what are we going to talk about today? We'll cover two things. First, what is batch processing: let's understand the workload and the problems we're describing before we dig into bits and bytes. Then we'll get into Spring Batch itself, the framework we'll be talking about today. So what is batch processing? I define it as the following:
Batch processing is the processing of a finite amount of data without interaction or interruption. Let's break that down a little. First, a finite amount of data: in order to do batch processing, you need a batch of something. Batch processes are finite; they need to have an expected end. So in order to have a batch process, you need a finite amount of data. You're not going to use Spring Batch, or any other batch framework, to process, say, never-ending streams of data.

Without interaction or interruption: this is the one I feel is actually the more important piece, the bigger distinguisher between batch processing and other workloads. Think of other workloads, whether they be web applications, functions running on FaaS, or message-based microservices: they all have one thing in common, which is that they all depend on external stimulus. Take a web application, for example. You build your web app, you deploy it onto your servers or the cloud or whatever, you get it running, and it sits. If nobody sends a request, it sits there waiting for an incoming request; then it does its processing, does its work, returns a response, and then it sits again. So if your application isn't very busy, you're essentially wasting infrastructure; hopefully it is busy and doing great work. Batch is different, though. With batch, you get all the data together at the start and then you say go; the batch process does all its work and then shuts down, leaving all that infrastructure open for you. One of the main drivers for this processing model is its efficiency: you use only the processing you need, when you need it, freeing up those capabilities for other things at different times. Batch processing goes back to the dawn of computing (the ENIAC was essentially a big batch processing machine), and that hasn't changed to today. Even now we see batch processing moving to the cloud, whether it be Cloud Foundry, Kubernetes, and so on.

So with a general understanding of what batch processing is in mind, let's dig into Spring Batch. Spring Batch is obviously the batch processing framework within the Spring portfolio. It's extremely popular; in fact, you could argue it's the de facto standard for batch processing on the JVM. It started out of a partnership between Accenture and SpringSource. Accenture was doing a lot of mainframe transformations and had a lot of mainframe experience, and the good doctor, Dr. Syer, was the founder of the project; he and other colleagues at Accenture built the initial versions of Spring Batch in collaboration with SpringSource. Spring Batch went 1.0 in March of 2008, twelve years ago, and since then it has been used in every vertical you can think of: entertainment, government, finance, retail, IoT. Literally any vertical you can think of, batch processing with Spring Batch has been used there. Spring Batch even served as the inspiration for what is considered the Java standard for batch, JSR-352.
On the left of this slide is the specification document for JSR-352; on the right is Spring Batch's documentation. When it comes to the domain of batch, the main components of batch processing and how they interact, they're exactly the same.

I'm going to be using a lot of terms that may be new to you if you're not familiar with batch processing, so let me spend a few minutes defining what they mean. Let's start with a job. I mentioned a minute ago that with batch processing you get all your data together and then you say go to some kind of process: the job is that process. It's the thing you're going to kick off, and it runs from start to finish. A job can be made up of multiple steps, and each step is an independent processing unit that can be run in sequence, in parallel, or orchestrated really however you want within a job. There are multiple kinds of steps; the two typical ones you'll see with Spring Batch are a tasklet-based step and a chunk-based step. A tasklet-based step is one where there's an interface called Tasklet and you implement it; Spring Batch will then run it for you within the scope of a transaction, either once or in a loop until you say otherwise. It's kind of your roll-your-own step, if you will. A chunk-based step is really about processing a number of items: we're looping over something, whether that's records in a database, records in a file, something along those lines.

When we talk about chunk-based steps, one of the important things to consider is the item. An item is that individual thing you're looping over; like I said, it could be records in a database or records in a file. There's a bit of an art to defining what the item for your batch process is. Think of a banking example: you could have the transaction record as an item, you could have the account as an item, or you could have a customer as an item that has multiple accounts and multiple transactions. It really depends on how you're processing and what that processing needs.

Now, Spring Batch groups these items together into chunks. Say I've got a million items, a million records, to process; doing those all in a single transaction probably isn't the best architectural decision, so we break them up into chunks. With a million records I might process 1 through 1,000 in one chunk, 1,001 to 2,000 in another chunk, 2,001 to 3,000 in another. For each chunk we start a transaction, process the chunk, and commit the transaction; then we start another transaction, process another chunk, and commit the transaction, and that repeats over and over until the data is exhausted.

This is what a Spring Batch job might look like, the kind of flow we'd expect to see. We've got three steps: step one, step two, and step three. Each of these steps has up to three components, one of which is optional: the reader, the processor, and the writer. The item reader provides the input for the step; you've got a flat file item reader, a JDBC one, and so on. The item processor is where you apply your business logic: you're going to make transformations, do enrichments, et cetera. The item writer is responsible for persisting the data; it generates the output, writing to a database, writing to a file, and so on. The item processor is an optional component of a chunk-based step.
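(For reference, a chunk-based step along the lines just described might be configured roughly as follows. This is a sketch using the Spring Batch 4 builder API demoed later in the talk, not the speaker's actual code; the Item type and the reader, processor, and writer beans are placeholders.)

```java
@Bean
public Step chunkStep(StepBuilderFactory stepBuilderFactory,
                      ItemReader<Item> reader,
                      ItemProcessor<Item, Item> processor,
                      ItemWriter<Item> writer) {
    // Each chunk of 1,000 items is read item by item, optionally processed item by item,
    // then handed to the writer in a single call, all inside one transaction per chunk.
    return stepBuilderFactory.get("chunkStep")
            .<Item, Item>chunk(1000)
            .reader(reader)
            .processor(processor)   // optional: omit if no transformation is needed
            .writer(writer)
            .build();
}
```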
In this flow, the three steps are sequential: step one starts, runs, and finishes; then step two starts, runs, and finishes; and then step three starts, runs, and finishes. Within a chunk-based step, the interaction between those three components looks something like this. The step calls the item reader, which has a single method called read, until the chunk is complete, typically a fixed number. Going back to what I was saying before, say it calls it a thousand times, so it chunks up a thousand items. It then loops over that list of items, passing each one to the process method on the item processor, if there is one. Once that's done, it takes the results of all the processed items and passes them all in a single call to the item writer. So the item reader and item processor are item based; the item writer is chunk based. The reason we do that is that there are optimizations we can make on the item writer side that we can't make on the other two: think of things like batch inserts on JDBC calls, or bulk flushing to a file system.

In order to run a Spring Batch job you need some infrastructure: something to manage the statefulness of a batch job, to handle the orchestration of steps, and so on, and this is how that lays out. It all starts with the job launcher. The job launcher is responsible for basically figuring out the job parameters to pass to the job and then launching the job. When the job launcher launches the job, it reports to the job repository: "hey, I've launched a job." The job repository is essentially a data store that stores the state of your job; in our case it's a relational database where you can keep track of what jobs have run, what steps have run, skips, reads, all that kind of stuff. We'll get into that in more detail in a bit. So the job launcher tells the job repository, "I'm launching this job with these job parameters." The job then reports to the job repository, "I'm running, and I'm going to launch this step," step A, step B, and so forth. The step reports to the job repository, "I've read this many items, I've written this many items, this exception happened," those kinds of things. All of that is available to you after the job has run, via the job repository.

Within the job repository we have the concepts of job instances and job executions. A job instance is a logical representation of your run. Say you've got a Spring Batch application that runs once a night, every day of the week: you've got a run for Monday, a run for Tuesday, a run for Wednesday, and so forth, and the job parameter you pass in is the date for that run. The job instance is identified by the job name and the parameters passed in, so you'll have one job instance for each day, for each logical run of the job. Every time you launch the job, you will also get a job execution. The job execution is the physical run of the job. A job instance can only be run once to completion; the job execution is how we keep track of all those runs until it has been completed. You may have only one job execution: if the job was started and ran great, you have one execution. If you have to restart your job, you'll get multiple job executions linked together by that job instance. Each job execution will have one or more step executions, based on how many steps that job execution runs.
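(As a rough illustration of how those pieces relate in code, a hypothetical sketch rather than part of the demo: launching the nightly job described above with a date parameter might look like this. A new date yields a new JobInstance; re-launching with the same date after a failure adds another JobExecution to the existing instance.)

```java
// jobLauncher and nightlyJob would be beans supplied by the @EnableBatchProcessing
// infrastructure and your own job configuration; the names here are placeholders.
public JobExecution runForDate(JobLauncher jobLauncher, Job nightlyJob, Date runDate)
        throws Exception {
    JobParameters parameters = new JobParametersBuilder()
            .addDate("runDate", runDate)   // identifying parameter: one JobInstance per date
            .toJobParameters();

    // Every launch produces a JobExecution whose progress and outcome are recorded
    // in the JobRepository (reads, writes, skips, exit status, and so on).
    return jobLauncher.run(nightlyJob, parameters);
}
```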
All right, that's enough of me blabbing for a minute; let's go ahead and actually take a look at some code. It is basically the law of computer science that we start off with a Hello World job, so let's go ahead and do that. Here I am in my IDE with a basic Spring Boot application; if you're familiar with Spring Boot, this should all look familiar to you. You've got your main method that bootstraps the application, I've got my @SpringBootApplication annotation, and then I've got this @EnableBatchProcessing annotation. This handles all that infrastructure I mentioned before with regard to what Spring Batch needs to run; just dropping that annotation on your application bootstraps that infrastructure for you.

Our Hello World application looks something like this. I've got a configuration class. Spring Batch provides a number of builders for building various things in the framework; in this case we've got a builder factory for building jobs and one for building steps. I'm going to create a job bean that takes one parameter, a step. I call this.jobBuilderFactory.get with the name of the job, and I've got this run ID incrementer here. Remember how I mentioned a job instance can only be run once to completion? I add this so that every time I run my job, Spring Batch will automatically give me a unique set of parameters; that way I don't have to worry about changing a parameter in order to run it more than once. This will be more important when we look at error handling later. Then I have a single step, so I say start, pass the step I want to run, and call build. Down here I've got a bean of type Step. I call this.stepBuilderFactory.get("step1"), and then we're into a tasklet step. This could be a lambda if you want; I'm just using the interface so you can see it better. It's got a single method called execute that we implement; here I've got a simple System.out.println, and then it returns a RepeatStatus. A tasklet can do one of two things: basically, it will run until I return RepeatStatus.FINISHED. The alternative is CONTINUABLE, and if I return CONTINUABLE, Spring Batch says "okay, you want me to run that again" and keeps doing that until you return FINISHED.

Let's go ahead and run this application. And there: we can see Spring Boot, by default, without me having to do anything, ran my job on startup with run.id=1 (that was the extra unique parameter the run ID incrementer added for me). It executed my step, step1, it said hello to all of you amazing people, and then it ended, and you can see my job ended with those parameters with the status COMPLETED.

Now, while that's cute, most of your batch jobs are not going to be a single step, so let's take a look at one with multiple steps. There are a couple of different options here, so first we'll look at the one with the syntactic sugar for running them in sequence. Here I've got three steps instead of one. They're all configured the same; they all have a System.out.println that says which step was executed. Here I do start(step1), and then I can do .next(step2), .next(step3), and what Spring Batch will do, assuming step one completed successfully, is run step two; once step two has completed successfully, it runs step three. So if we go and run this one, we can see my multi-step job ran: it executed step one, step two, and step three. Again, nice.
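(For reference, a configuration along the lines of what's being described on screen might look roughly like this. It's a sketch of the Spring Batch 4 / Spring Boot 2 builder APIs, not the exact demo source; the bean names and the printed message are placeholders.)

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class HelloWorldJobConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Step step1() {
        return this.stepBuilderFactory.get("step1")
                .tasklet((contribution, chunkContext) -> {
                    System.out.println("Hello, you amazing people!");
                    return RepeatStatus.FINISHED;   // CONTINUABLE would run the tasklet again
                })
                .build();
    }

    @Bean
    public Job job(Step step1) {
        return this.jobBuilderFactory.get("helloWorldJob")
                .incrementer(new RunIdIncrementer())  // adds a unique run.id parameter per launch
                .start(step1)
                .build();
        // Multi-step variant: .start(step1).next(step2).next(step3).build()
    }
}
```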
But what if I want something more complicated with regard to how those steps are navigated? Spring Batch has you covered. Here I've got three steps again, each with the same configuration as before, and this time I've got a check on what the exit status of the Spring Batch step was; based on that exit status, it indicates which step to run next. So I've got: start with step one; from step one, on "FAILED", go to step two; from step one, on "*", so anything else, go to step three. Right now this step will complete successfully, so let's go ahead and run it, and we should see step one and step three run. And we can see step one ran and step three ran; step two did not in this case. Now, if we really quickly make step one fail and run it again, we'd expect to see step one run and then step two run. You can see step one ran, my exception was thrown, Spring Batch handles that, step one ends, and then step two is run this time instead of step three. Obviously you can get a lot more complex with this stuff, but that gives you an introduction to what Spring Batch can do.

Now, we talked about chunk-based steps earlier, and there are four main interfaces that go into that kind of step: the item reader, the item processor, the item writer, and one more we'll get to in a moment. This is the interface for the item reader. It's a simple interface with a single method called read. It returns one item to be processed, and it will continue returning one item at a time until the input is exhausted, at which point it returns null. Out of the box we have 19 different implementations of item readers, with just about every type of I/O you can think of available: AMQP, Kafka, flat files, XML files, JDBC with a number of different options, Hibernate, JPA. We've got you covered.

The next interface is the item processor. Here again it has a single method, called process; it takes one item in and returns an item. An important thing to remember when you're implementing your own item processors is that they're expected to be idempotent. There are error conditions within Spring Batch, which we'll get to in a bit, where your item processor may be run more than once, so if you need to do some type of processing in there, make sure it's okay to run it more than once. Out of the box, Spring Batch has 10 item processors, and most of them fall into two buckets: validation and adapting of some kind. We provide validation mechanisms, whether bean validation or other validating mechanisms, or we help you adapt other things to the item processor interface. Most of the time, though, you'll be writing your own item processors.

The last of the three main interfaces we mentioned before is the item writer. Here you'll notice that instead of taking a single item, it takes a list of items. Again, out of the box Spring Batch has 18 different item writers: flat file, GemFire, Hibernate, Kafka, even email. So again, chances are, if you need to write something, we've got you covered.
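(Since most item processors end up being hand-written, here is a minimal sketch of one in the spirit of the upper-casing processor used in the next demo. It assumes the talk's Item type with first, last, and phone properties; returning a new Item instead of mutating the input keeps the processor safe to run more than once, per the idempotency expectation just mentioned.)

```java
public class UpperCaseLastNameProcessor implements ItemProcessor<Item, Item> {

    @Override
    public Item process(Item item) {
        // Build a new Item rather than mutating the input, so re-processing the same
        // item (for example during error handling) has no side effects.
        // Returning null instead would filter the item out of the chunk entirely.
        return new Item(item.getFirst(),
                        item.getLast().toUpperCase(),
                        item.getPhone());
    }
}
```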
Now, there's an important additional interface that you typically won't interact with very much, but it's important to know about: the ItemStream interface. The ItemStream interface provides a lifecycle to each of these stateful components; it's what makes them stateful. When a step starts, the open method is called with what's called the execution context; that allows the component to reset its state from a previous execution. After each chunk is completed, we call the update method; that allows the component to persist whatever data it needs for restart to the job repository. And finally there's the close method, which allows you to clean up any resources you used that are no longer needed. The execution context is that piece we pass in; it's essentially a map of key-value pairs. The reason I copied the whole interface onto the slide is that there's a collection of adapter or helper methods on it for things like type conversion (getBoolean, getLong, getString, et cetera), so I wanted to paste the whole thing. Think of it as a map that stores the state for your job.

Let's take a look at a chunk-based step. A little cleanup first: I'm actually going to go ahead and blow away my database as well and recreate it. So here, again, I've got my basic Spring Batch application, and I'm tweaking things slightly. If I were to do this via Spring Boot, build the uber jar, and run it from the command line, I would add batch.input=/input.csv as my way of passing in what my input file is. I'm not doing that because I'm running all of these from my IDE, so I'm cheating a little bit, but this is how you pass in a job parameter.

For my file configuration, I've got to configure my reader, my processor, and my writer. Here I've got a flat file item reader, and it is step scoped. What does step scope mean? Step scope essentially means to lazily initialize this component until the step it belongs to is running. The idea here is specifically for job parameters: I can't build that component on startup because I haven't run my job yet, so I haven't parsed the parameters yet; we have to delay initialization of these components. So any time you're dealing directly with job parameters, you'll be setting the scope of the component to step scope. In this case I'm building my flat file item reader: I've got a resource that I'm going to read, which is the input file, I have a name that I need to give it, and since this is a delimited file, a comma-delimited CSV, I provide a name for each of the columns in there, and then I provide the target type, Item. By using these names and the target type, Spring Batch will call setFirst, setLast, and setPhone on my item type. If we take a quick look at the input, there are a thousand records in here, each with a first name, last name, and phone number. This job is going to uppercase the last name and then insert the record into a database.

So the flat file item reader handles the reading of the file, and the item processor handles the uppercasing. Here I've got an implementation of my item processor; in the process method you'll notice I'm returning a new instance of Item, so that my item processor is idempotent. I'm passing in getFirst, getLast().toUpperCase(), and then the phone number, so it uppercases that last name. Then I've got my item writer. Here I'm using the JDBC batch item writer: I provide a data source and the SQL for the insert that I need to do, and I specify that this is bean mapped, so Spring Batch is going to call getFirst to populate that field, getLast to populate that field, and getPhone to populate that field. Then I call build. My job is exactly the same as before. In my step, instead of saying .tasklet, I say .chunk with my chunk size: I've got a thousand records in my file and I set the chunk size to 100, which means it will process 100 records per transaction. Then I specify my reader, processor, and writer, and I call build.
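(Pulling those pieces together, the reader, writer, and step might be declared roughly like this. It's a sketch using the standard Spring Batch 4 builders rather than the literal demo source, assumed to live in the same configuration class as the earlier sketches and to reuse the processor sketched above; the table and column names are assumptions.)

```java
@Bean
@StepScope   // built lazily, once the job parameters are available
public FlatFileItemReader<Item> reader(
        @Value("#{jobParameters['batch.input']}") Resource inputFile) {
    return new FlatFileItemReaderBuilder<Item>()
            .name("itemReader")
            .resource(inputFile)
            .delimited()
            .names("first", "last", "phone")   // one name per CSV column
            .targetType(Item.class)            // drives setFirst/setLast/setPhone
            .build();
}

@Bean
public JdbcBatchItemWriter<Item> writer(DataSource dataSource) {
    return new JdbcBatchItemWriterBuilder<Item>()
            .dataSource(dataSource)
            .sql("INSERT INTO item (first, last, phone) VALUES (:first, :last, :phone)")
            .beanMapped()                      // :first maps to getFirst(), and so on
            .build();
}

@Bean
public Step loadStep(FlatFileItemReader<Item> reader, JdbcBatchItemWriter<Item> writer) {
    return this.stepBuilderFactory.get("loadStep")
            .<Item, Item>chunk(100)            // 100 records per transaction
            .reader(reader)
            .processor(new UpperCaseLastNameProcessor())
            .writer(writer)
            .build();
}
```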
This is the item that we're actually passing around: it's got three strings (first, last, and phone), the getters and setters, and two constructors. So let's go ahead and run this. You saw before that I deleted and created my database, so nothing up my sleeve. And you can see my job ran, it completed, and if we go over here now, you can see I've got a thousand rows in the table, with the first name, the last name capitalized, and the phone number. The demo gods have smiled upon me.

So, I don't know about you, and I'm sure you write perfect code every time, but I don't, and because of that, error handling within a framework is really important to me. Within Spring Batch there's a whole collection of different ways of handling errors, but today we're going to focus on three of them. The first one is restart. If your job is processing and an exception is thrown, the first option is that it just stops; it fails. The nice thing about Spring Batch is that when you restart it, assuming you're using stateful components and following their rules, it will restart where it left off, no additional code needed. The second option is to skip. Let's say you've got a file, and you know that occasionally you'll get a record that is bad to parse, something you can handle in the morning when you get in. You can just skip those exceptions; we allow you to do that. You can configure which exceptions to skip as well as how many to skip. The reason we make you configure the number of skips is that there's a difference between a handful of bad records in a file and having a corrupt file (I'm using a file as an example, but this applies to all inputs), and we want you to be able to determine where that difference lies, so we ask you to configure a maximum skip count as well. The last option is retry. Let's say you get an exception when you call a REST API that's flaky; sometimes it's down, and if you try again it usually comes right back, so the next attempt isn't a problem. You can configure those exceptions to be retryable, and Spring Batch will retry up to a configured max, so it gives your batch application the opportunity to recover intelligently on its own.

Let's go take a look at these different examples. I'll go ahead and blow away my database again, nothing up my sleeve, and start with my same Spring Boot application. We'll start off with restart. This is the same batch application we just ran for the chunk-based example; the only difference is I've got some invalid input. On line 536 of this file there's an entry that is too big to fit into the database column: the column is 50 characters wide and this value is beyond that, so it will throw an exception. We actually don't need to do anything to our job; it's configured exactly the same way it was before, and that's the nice thing about Spring Batch. So when we go ahead and run it: data truncation, data too long for column 'first' at row 1. That's a problem; our job failed. We can see the restart job executed with these parameters with the status FAILED. Now let's take a look at our job repository (select * from BATCH_JOB_INSTANCE), and you can see we've got one job instance, and we have one job execution.
Now, the job execution record, and I apologize for the view on this, but you can see it's got start time, end time, all those kinds of things, the batch status, the exit status, and it actually stores the exception that caused the problem. So when you come in the following morning, you can take a look and say "oh, okay, let me go look at the file and figure things out." Before we run this again, let me actually check one other thing, which is our actual data; we want to see where it left off. I've got 500 rows out of the thousand that actually made it into the database. What happened was Spring Batch processed up to the chunk containing the bad record; that chunk failed, obviously, so it rolled back to the last known good state. So now, when we restart, it should pick up where it left off once we fix the file. Now, I mentioned before that the run ID incrementer gives you a new run ID every time you run the job. We don't want that to happen this time; we want to restart the old run, so all we need to do is pass in the old parameters and Spring Batch will take care of the rest for you. That's all the logic we need to restart. The only other thing we need to do is actually fix the file, so we'll change that value to "Josh", for example, and go ahead and run it again. And now, if we run it, our job completed with the status COMPLETED. We now have a thousand records in our database, as we expect, and we can see we still have one job instance, so we had one logical run, but we now have two job executions, because we had two physical runs: the first ended with the FAILED status we looked at before, and the second is marked as COMPLETED. So that's restartability.

Now let's take a look at skipping. We go over here, remove the restart example from the context, and come over to the skipping one. Here we do need a little bit of configuration. The reader, processor, and writer are all the same, and the job is the same; nothing to change there. However, on the step we need to add a couple of extra lines. First we need to call .faultTolerant(); this is what adds in those extra error-handling capabilities, so skip, retry, all of these are handled once you specify that your step is fault tolerant. In this case I'm just going to skip exceptions in general. You can get more specific if you want: you can list multiple exception types, you can list individual ones, it's all up to you. In this case, demo-ware, so do as I say, not as I do, I'm just going to skip all exceptions. And I set the skip limit to five: our file only has one bad record in it, so if I get more than five, I know there's something wrong with my file. Then I call build, and that's really all I need to do.
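(The fault-tolerant step being described might look roughly like this. Again a sketch, not the demo source, reusing the reader and writer from the earlier sketch; the retry lines illustrate the retry option mentioned a moment ago, and FlakyApiException is a hypothetical exception type standing in for a transient failure.)

```java
@Bean
public Step faultTolerantStep(FlatFileItemReader<Item> reader, JdbcBatchItemWriter<Item> writer) {
    return this.stepBuilderFactory.get("faultTolerantStep")
            .<Item, Item>chunk(100)
            .reader(reader)
            .writer(writer)
            .faultTolerant()                 // enables skip/retry handling on this step
            .skip(Exception.class)           // demo-ware: prefer specific exception types
            .skipLimit(5)                    // more than 5 skips means the input is suspect
            .retry(FlakyApiException.class)  // hypothetical transient failure to retry
            .retryLimit(3)
            .build();
}
```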
So now, if I go and run my job, what I'm expecting to happen is that the job will run successfully, but I will only have 999 records in my database, because it will have skipped one. So if we select from the table: oops, I forgot to delete the old rows, so now I've got 1,999, but you get the idea. You can see that it skipped that one record and processed all the others. One other thing we can take a look at is the step execution table in the job repository. Here I've got three rows, because three step executions have run: the first from the run that failed, the second from the restart, and this third one from the skip run, which is the more interesting one. You can see it records the status, both exit status and batch status, the number of commits, the number of items read, the number of items written, and the number of items that were skipped during the write. We have one item that was skipped, and two rollbacks. Why two rollbacks when we only skipped one item? The reason is the way skipping works within Spring Batch. We had an error on the write side, and I pass the entire chunk into my item writer, so how do I know which item caused the problem? I don't. So what happens is: first, the entire chunk is passed in; it's tried, it fails, and I roll back the transaction. That's rollback number one. Then the commit count is set to one, and each item is tried one at a time until we hit the bad record; that one is rolled back and skipped, and then processing continues, with the transaction going back up to 100 items. So that's where those two rollbacks come from: the first one is the big one, and the second is for the individual item that was the actual problem. So that's error handling.

There's a whole laundry list of other features within Spring Batch that I would love to go through, but to be honest, for some of these I literally have entire talks, so we don't have time. One of them is listeners. Listeners are the ability to inject logic into different spots of your application. If you've got a batch job, there's the job execution listener, which allows you to run logic at the start of a job or the end of a job. There's the step execution listener, which lets you inject logic at the start or end of a step. There's the chunk listener, the skip listener, the read listener, the write listener, the processor listener; the list goes on and on. You can inject logic into just about any point of a Spring Batch application you can think of.

We also have a number of different ways of scaling batch applications. One of the reasons you run batch applications, as opposed to other workloads, is efficiency: you're typically running a lot more data through a batch application than through other applications. Your web app handles one request at a time; yes, it probably handles a bunch in parallel, but the number of requests per day or per second is probably nothing compared to what a batch application is going to process. So to be able to handle the volume a batch application typically needs, we have a variety of scaling capabilities based on your use case. There are five options. First, multi-threaded steps: we talked about chunk-based steps, and normally you process your chunks in sequence, but if you add in a task executor we can launch chunks in parallel, so you can be processing multiple chunks at the same time. We also have the ability to run parallel steps: that's where I've got two independent steps that can run without any interaction between them, so I run them in parallel via threads. There's the async item processor and async item writer: here, where the item processor is the bottleneck, I can run that call in a different thread via a future, and then the whole chunk of futures is passed to the async item writer, which unwraps each future and passes the results to the underlying item writer. We've got partitioning: in Spring Batch you can partition workloads and have a manager-worker relationship work them off. You can do partitioning in two ways, either within the JVM using threads or externally via messaging middleware, and Spring Cloud Task allows you to launch those workers dynamically on a platform.
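(As one concrete example of those scaling options, here is a sketch under the same assumptions as before, not demo code: a multi-threaded step just adds a TaskExecutor to a chunk-based step so chunks are processed in parallel. The reader then needs to be thread-safe, and the throttle limit caps how many chunks are in flight at once.)

```java
@Bean
public Step multiThreadedStep(ItemReader<Item> reader, ItemWriter<Item> writer) {
    return this.stepBuilderFactory.get("multiThreadedStep")
            .<Item, Item>chunk(100)
            .reader(reader)                    // must be thread-safe (or wrapped/synchronized)
            .writer(writer)
            .taskExecutor(new SimpleAsyncTaskExecutor("batch-"))
            .throttleLimit(4)                  // at most 4 concurrent chunks
            .build();
}
```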
Finally, we've got remote chunking. With partitioning, the manager sends metadata to each worker and says "you handle that for me." With remote chunking, the manager reads the data and sends it out over messaging middleware to each of the workers to process and persist locally. Remote chunking is a lot more I/O intensive than partitioning, so it's typically only used when processing is the bottleneck, whereas with partitioning you're typically looking to optimize from the I/O perspective. I do a whole talk with Mahmoud, the other lead of Spring Batch, on these five capabilities; you can find it on YouTube.

The last consideration around batch processing is orchestration. I've got a batch job, or multiple batch jobs, one that depends on another, with dependencies between them: how do I manage that? There are a couple of different tools within the Spring portfolio to manage it. First, we've got Spring Cloud Task. Spring Cloud Task enables you to run these jobs on a cloud platform natively, so it handles things like persisting information native to the platform, for example which platform-level run ID was involved in running that batch job. It also gives you batch capabilities like running those partition workers as tasks or jobs on your platform, so you get dynamic scalability of your batch jobs: you can launch workers, they run, and they shut back down when you no longer need them. Coming up in the next version of Spring Cloud Task, which will be out later this year, we also have a single-step batch job starter. Earlier we did a file-to-JDBC batch job; we read in a file and sent it to JDBC. With this new starter, you won't have to write that code; you'll be able to configure that entire job just via properties. Spring Cloud Data Flow is the other piece with regard to orchestration, and that's what Data Flow does: it orchestrates microservices, whether they're streaming applications dealing with messaging middleware or tasks, which are ephemeral microservices. If you want to hear about Data Flow, I highly recommend checking out, tomorrow, the Walking Through Spring Cloud Data Flow talk given by Glenn and Ilaya; they're going to walk through the entire Data Flow system, streaming and tasks, and give you an idea of what its capabilities are. I also recommend, if this talk has been interesting to you, checking out, literally in the next slot over on the intermediate/advanced Spring track, Mahmoud's talk on what's new in Spring Batch; he'll be covering the new and cool things we've got coming out in Spring Batch 4.3. And with that I want to say thank you: thank you for your time, and I'll see you in Slack. Hopefully Bob's tweeting about the talk.

Hey, we're live; that happened! Michael, thank you so much. I was actually, literally, on my Twitter feed trying to upload my fur baby: Lars, over in the Netherlands, posted "hey, put your fur baby up on Twitter," so I was going through my phone trying to get my picture of my dog sleeping while we're here at SpringOne. So thank you all again for watching, and Michael, thank you so much for that wonderful presentation. If you have any questions for him, head on over to the Spring Slack and check out the beginner Spring sub-channel; there's lots more information there, and you can talk to Michael if you have any more questions for him. We have our next presenter getting lined up; it'll be about 15 minutes until that happens. I think it's five minutes past the hour when DaShaun Carter comes on, and I'm excited.
He's warming up; I see him doing his stretches, getting limber, and I can't wait for this to happen. He's one of my favorites, everybody. So we'll see you in about 15 minutes. Thank you.
Info
Channel: SpringDeveloper
Views: 3,659
Rating: 4.9523811 out of 5
Keywords: Core Framework, Data/ Databases, Modernization/Refactoring
Id: dIx81HYdpq4
Length: 44min 14sec (2654 seconds)
Published: Fri Sep 25 2020