6. Laravel Job Batching | Upload million records | Queue job to upload big file

Video Statistics and Information

Captions
Now, one more thing: even after all the data has been uploaded, the temporary files are still there. We can delete them too, so once a file has been stored we simply call unlink() on the path plus the file name. But we want to do all of this behind the scenes, which means in a queue worker. If you don't know about queues in Laravel, open the documentation and search for "queue"; under "Digging Deeper" you'll find the queues introduction that explains how to get started.

First we need the jobs table, so copy the `php artisan queue:table` command and run it. It creates a migration file: inside database/migrations we now have a migration that will hold the information for every job we push onto the queue. We need this one because we are using the database driver; if we were using Redis we wouldn't need the migration at all. Run `php artisan migrate`, and the jobs table now exists in our database. Amazing.

After that we have to create a new job: `php artisan make:job SalesCsvProcess`. Hit enter and it's created; open app/Jobs/SalesCsvProcess.php. Inside it we have a constructor and a handle() method, and whatever we want the job to do goes inside handle(). In SalesController we have the store() method, where we read all the files from the temporary directory and store their rows, so I can cut all of that and paste it into handle(). We need to import the Sale model, and that's done.

Now we need to queue this job, in other words dispatch it. In the documentation, under "Dispatching Jobs", you can see you just call dispatch() on the job class. So in SalesController's store() method I say SalesCsvProcess::dispatch(), and since we don't need to pass anything for now, we leave it empty. One more thing: we need to set the database queue driver. In your .env file QUEUE_CONNECTION is set to "sync"; change it to "database", which is the driver we are using. Since we are on Laravel 8 you can see the server restarts automatically.

Now we are dispatching, so let's try this one more time. We already have all 50 CSV chunks inside resources/temp, so hit the store-data route again. You can see it returns instantly, nothing gets blocked, we can still open the home page, and all the work is now sitting in the jobs table. Because the sales table is still filled with the 50k rows from the previous run, I'm going to empty it first. Now the sales table has nothing, we have one job waiting in the jobs table, and the next step is to process it.
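For reference, here is a minimal sketch of what this first version of the job might look like, with the whole temp directory processed inside a single queued job. The `resource_path('pending-files/')` location, the CSV layout, and the mass-assignable columns on the `Sale` model are assumptions for illustration, not the exact code from the video.

```php
<?php

// app/Jobs/SalesCsvProcess.php -- first attempt: one job processes every chunk file.
namespace App\Jobs;

use App\Models\Sale;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class SalesCsvProcess implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle(): void
    {
        $path = resource_path('pending-files/');   // assumed temp directory for the chunks

        foreach (glob($path . '*.csv') as $file) {
            $rows   = array_map('str_getcsv', file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
            $header = array_shift($rows);          // first row of each chunk holds the column names

            foreach ($rows as $row) {
                // Assumes these columns are mass-assignable on the Sale model.
                Sale::create(array_combine($header, $row));
            }

            unlink($file);                         // delete the temporary chunk once stored
        }
    }
}
```

Dispatching is then just `SalesCsvProcess::dispatch();` from the controller, with `QUEUE_CONNECTION=database` in `.env` so the job is written to the jobs table instead of running synchronously.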
To process the job we run a queue worker, which is very easy: `php artisan queue:work`. I'll use my `pa` alias for `php artisan`, and I'm not going to pass any --tries option. It is now processing, and because it runs in the background we can still access our Laravel application; everything keeps working absolutely fine. The job is on attempt one, and if you open the sales table you can see the data increasing. This is a really amazing thing: the queue worker is doing the heavy lifting, and our user is happy because the file is being uploaded behind the scenes. We just have to wait for the process to complete.

After some time, though, the job gets killed. Let's see why. If I reload, roughly 20,000 records made it in and then it stopped; the jobs table is empty, but there is an entry in failed_jobs, and the error is the familiar one, the maximum execution time, the timeout. The queue worker is still just a PHP process running in the background, and if a single job takes too long it gets killed.

So instead of doing everything in a single job, let's split the work. One more time I cut everything out of handle() and undo the changes in the controller. Look at what we have: we are reading each file, and each file holds a thousand records, so whatever we do per file can move into its own job. I cut the per-file storing code, paste it back into handle(), and import the model again. This time, when I call SalesCsvProcess::dispatch() in the controller, we need to pass something along. The file name? The key? The header? Actually, we can keep the header part and the file-reading part in the controller; the job should only do the storing. So in the controller we read the file, take the header, pass the data and the header to the job, and unlink the file there too: we don't want to unlink inside the job anymore.

In the job we accept those two things, the data and the header: declare public $data and public $header, take $data and $header in the constructor, and assign $this->data = $data and $this->header = $header. Then in handle(), instead of $data we use $this->data and $this->header, and that's done.

One more time, let's review what we have. Everything is back in the store() method: we get the temporary directory, collect all the files inside it, loop through each file, grab its rows, set the header (from the first file only), dispatch a queue job with the data plus the header, and then unlink the file, so this time we are also removing the file.

Let's see what happens. First I'm going to delete the temp directory, because the next upload will create a new one, and we have to run the queue worker. Then I'll go to the upload page, because I'm going to upload the 50k records again.
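Here is a hedged sketch of the refactored job after this change, under the same assumed column layout as before: the rows and the header come in through the constructor, and handle() only stores them, so every queued job stays small enough to finish well inside the worker's timeout.

```php
<?php

namespace App\Jobs;

use App\Models\Sale;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class SalesCsvProcess implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $data;    // rows from a single chunk file
    public $header;  // column names shared by every chunk

    public function __construct(array $data, array $header)
    {
        $this->data   = $data;
        $this->header = $header;
    }

    public function handle(): void
    {
        // Only the storing happens here; reading the file, extracting the
        // header, and unlinking are done in the controller before dispatch.
        foreach ($this->data as $row) {
            Sale::create(array_combine($this->header, $row));
        }
    }
}
```

The controller side then reads each temp file, calls `SalesCsvProcess::dispatch($data, $header)` per file, and unlinks the file right away, because the job only needs the in-memory rows.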
With the 50k-record file ready, click upload. It chunks the whole file; it isn't actually storing anything yet, just chunking, and we have all 50 chunk files again. Then I go to the store-data route, which pushes everything onto the queue, and the worker starts processing. You can see we have over 52 queued jobs, that's great (one of them is the old one), and each job is inserting its data into the database. If I refresh, the record count keeps increasing, and this time there is no timeout, because each job takes very little time. You can watch the worker finish a job, pick up a new one, finish that one, and so on; that is how all the data gets uploaded, and it's amazing. Finally, after some time, all 52 jobs have been processed: the jobs table is empty, failed_jobs only holds the one previously failed job, and the sales table is now filled with almost 70,000 rows, 50,000 from this run plus the 20,000 from before. And that's how we complete the upload.

Since we are doing all of this in the store() method, we can actually do the same thing at the moment we chunk, that is, upload, the file. So I cut that code and paste it into the upload handling. The path is already defined there, so we can move the path variable outside and avoid re-initialising it. With that in place we no longer need the store() method, and we don't need the store-data route anymore, because everything else stays the same. I'll clear out the sales table one more time; that's done, and there is no new failed job, absolutely fine. Now go to the upload page. As soon as we upload, it chunks the file into our temp directory, which you can see is empty at the moment because we unlinked all the previous files, and then it says "stored". That means everything is now running in the background: once again we have rows in the jobs table and the sales table is filling up. Amazing.

So this is how you can store a super big file using queue jobs in Laravel. But we are not done yet: we also want to show the progress details, and we'll see how to do that next.
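As a rough sketch of this final arrangement (not the video's exact code), the read-and-dispatch loop can sit right where the upload is handled, so the separate store-data route disappears. The method name, the `csv_file` request field, and the 1000-row chunk size are assumptions for illustration; the video keeps its existing temp-file chunking and simply pastes the dispatch loop into it, whereas this sketch simplifies by dispatching one job per in-memory chunk directly.

```php
<?php

namespace App\Http\Controllers;

use App\Jobs\SalesCsvProcess;
use Illuminate\Http\Request;

class SalesController extends Controller
{
    public function upload(Request $request)
    {
        // Read the uploaded CSV and split off the header row.
        $rows = array_map('str_getcsv', file(
            $request->file('csv_file')->getRealPath(),
            FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES
        ));
        $header = array_shift($rows);

        // Queue one small job per 1000-row chunk instead of one huge job,
        // so no single job runs long enough to be killed by the worker.
        foreach (array_chunk($rows, 1000) as $chunk) {
            SalesCsvProcess::dispatch($chunk, $header);
        }

        return response()->json(['message' => 'stored']);
    }
}
```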
Info
Channel: Bitfumes
Views: 33,980
Keywords: laravel 8, laravel queue jobs, laravel jobs and queues, laravel jobs tutorial, laravel job batching, laravel queue, laravel tutorial, laravel queue work, laravel file upload to database, laravel file upload and download, laravel file upload s3, upload csv file in laravel, laravel upload csv to database, laravel upload csv
Id: r_zAB5wJCVw
Length: 13min 55sec (835 seconds)
Published: Fri Dec 11 2020