End-to-End Object Detection Project Implementation Using YOLOv5 with Deployment | AWS | Azure 🚀

Captions
Hello everyone, and welcome to my YouTube channel. In this video we are going to implement one amazing project: an end-to-end object detection project using YOLOv5. I will show you each and every component of an end-to-end project, because an object detection pipeline is slightly different from a typical machine learning or deep learning pipeline, and you will get the full picture of how to implement this kind of project with YOLOv5. I will follow one particular problem statement, but you can use any problem statement and any data; the pipeline remains the same. So please stay with the video, watch each component, implement along with me, and make sure you watch until the end. By the end we will have learned a lot, and this video will even help you in any computer vision interview you attempt in the future. That is the introduction, so instead of talking too much, let's start the implementation.

You might ask what the prerequisites for this project are. I expect you are familiar with Python programming, at least with object-oriented concepts, because I am going to write modular code and everything will be written as Python classes, so object-oriented programming is very important here. I also expect you are familiar with computer vision concepts, at least the basics of object detection, such as what object detection is about; that should be sufficient for this video. Finally, I expect you have AWS and Azure accounts, because I am going to show you the deployment, and I am
expecting you to have those accounts ready. Last of all, what you need is dedication: put in as much as you can, and after implementing this project, pick up any problem statement, use the same template, and implement your own project.

Now let me tell you what waste detection is. Waste means unwanted or unusable materials: any substance that is discarded after primary use, or that is worthless, defective, and of no use. Think of the things you no longer use, the unusable materials; we call them waste. In this project I have 13 classes of waste, as you can see: banana, chilli, drink can, drink pack, food can, paper bag, plastic bag, and so on. These are the images I collected. If you want to integrate more waste materials, collect those images and include them in your data as well. In this project I will only use 13 classes of waste materials, but you can add more in the future if you like.

So that is our waste detection problem statement. You could use this kind of application, for example, in a restaurant or in your kitchen: if there is any waste, it will detect it and raise some kind of alarm so you can clean it up. You can use this application in many places; think about where you would need it. But my intention is to show you the object detection pipeline, because once you learn the pipeline you can
integrate any problem statement with the same pipeline.

Before starting the project implementation, I want to show you a demo of the project I have already implemented, so you can see how it will look. This is the web application I developed. You need to upload one image, so let me upload a banana image, and if I click on Predict you will see it run the prediction and show the result here: it is detecting the banana. Let's upload some more waste. Now let me upload a can image and predict it: it is detecting that too. You can give it any kind of waste material and it will be able to predict it; let me give all of them together as well. Keep in mind I trained this model in a very lazy way: it is not a great model, I only trained it for 50 epochs, and still it is able to detect all the materials. When you train your own model, increase the number of epochs and you will get a good model; 50 epochs was just for demonstration purposes, and I am running predictions on that model. As you can see, it predicts all of the waste, including the plastic bag.

Now let me also show you how to do live prediction, and the routes the app exposes. If you want to train the model, that is, start the training pipeline, append /train to the URL and press Enter; it will automatically start the training pipeline: first it picks up the data, then it validates the data, then it prepares the data, then it starts the training,
and after the training it gives you the model, and after that you can run predictions. If you want live prediction, just append /live to the URL. If I give /live here, after a moment you will see it launch the camera. The camera is now open, and if I show it some waste, say this can... this particular can is a little different from my training images, which is why it is not being recognized, but if you show any waste that is similar to your training data, it will be detected. I could have used a Coca-Cola can, but I do not have one on my table right now, so I am sorry I cannot show a proper detection; at least I am showing you how the camera opens, and if you show waste similar to your training data you will see it detected.

So that was the demo of our project. Do not worry, I will show you each and every step: how to prepare the data, how to annotate the data, everything. We will be using the YOLOv5 framework, so I need to annotate my data in YOLOv5 format, which is txt annotation; I will also show you a free tool you can use to annotate your own data. Now, the first thing I am going to do is create a GitHub repository for this project, and after that I will create my project template. I am expecting you have a GitHub account, so log in with your GitHub account.
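The /train and /live routes demoed above can be sketched as a minimal Flask app. This is a hedged sketch, not the actual app.py we will write later: the route names come from the demo, but the handler bodies here are placeholders.

```python
from flask import Flask

app = Flask(__name__)

@app.route("/train")
def train_route():
    # Placeholder: the real handler kicks off the training pipeline
    # (data ingestion -> data validation -> model training).
    return "Training started"

@app.route("/live")
def live_route():
    # Placeholder: the real handler opens the webcam and runs
    # live YOLOv5 detection on each frame.
    return "Live detection started"

# Run with: flask --app <this_file> run
```

Appending /train or /live to the server URL simply hits these routes; all the heavy lifting happens inside the real handlers.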
In GitHub I will click on Repositories and create a new repository. I will name it end-to-end-waste-detection-using-yolov5; that is the name I will use, but you can give any name you like. I will keep it public; if it is a very private project you can make it private. I will add a README file, add a .gitignore for Python, and for the license let's take the MIT license; you can take any license. Then I create the repository. Now I need to clone this repository into a local folder, so let me open the folder and open Git Bash.

One thing I want to tell you: since this is an object detection problem statement, we will be using a framework, and to train the model with that framework I need to run some bash scripts. To run those bash scripts you need something called Git Bash installed on your system, so make sure you have Git Bash installed, and also configure Anaconda with your Git Bash. If you look for any tutorial or blog on the internet, you will find how to set up Anaconda with Git Bash; get that ready. For me, I already have Git Bash and I have already configured Anaconda with it, so I can run any conda command. For example, if I run conda env list, it runs fine and, as you can see,
it lists my environments. So basically you need conda configured with your Git Bash, and since it is a bash terminal, you can execute any bash command from it; for example, ls is a Linux command and I can execute it from Git Bash. That is why the Git Bash terminal is needed instead of Command Prompt or whatever other prompt you are using, at least for object detection projects. If you are doing a classification project or, say, an NLP project, it is not required; I already showed that in my previous project implementations, which you can check.

The first thing I will do is clone the repository. I write git clone, copy the link from GitHub, and paste it here; let me clone the repository. It has been cloned, as you can see: end-to-end-waste-detection-using-yolov5. Now I go inside the folder with cd, and I will open VS Code here by writing code . in the terminal. Alternatively, you can right-click, click Show more options, and open VS Code from there. If you are using some other code editor, that is completely fine, but I will be using VS Code because I personally like it a lot. So I write code ., VS Code opens, and I close the welcome screen.

The first thing I need to do is create the template of our project. Before that, let me tell you why I am creating this template file separately. Can you create the full
structure by hand? Yes, you can, but it would take a lot of time and it is a manual approach: you click to create a file, then create a folder, then create another folder inside it, and so on. It is very time-consuming and everything would be manual. Instead, I will use a Python script to create the entire project folder structure. It is just a one-time effort: if you write this template.py once, you can use it as-is in every project, only changing the project name, and there is no further effort; you just run template.py and it automatically generates all the files and folders for you. You need to write the code first, of course, but that is the one-time effort I mentioned.

So I will create one file here called template.py. First I need to import some libraries: I need os, the operating system module; I need Path from pathlib (I will tell you shortly why I am using Path); and I also need logging, because I want to see logs in the terminal on every run. To set up logging I define one logging configuration with logging.basicConfig, passing the configuration I want. I usually follow one particular configuration, so let me show you. Basically, I first define the log level, INFO, so these are information-related logs, and I also define the
format of the logs: first it logs the asctime, that is, the time and date at which you executed the code, and after that it shows the log message. This is going to be a terminal log only; I am not creating any log file inside the project here. I will create a log file later, when I write my custom logger. For template.py I only need this log, because I want to log some information and see it whenever I execute template.py; that's it.

Now I need to define my project name: project_name = "waste_detection". If you are working on some other project, give it a name matching that project. What will happen is, I will create one folder called waste_detection, and inside it I will keep all of my components and all of the configuration of the project. It is going to be the root project folder, containing everything related to your project; that is how we organize the code. Next I need a list of files, so I write list_of_files; it is a list, and inside it I define all the files and folders I need. First of all I need a .github folder. Why do I need .github? Because it is needed for CI/CD deployment: inside it we will create one main.yaml file, and whenever you push changes to GitHub, it will automatically get deployed to your cloud through GitHub Actions. Inside .github I will create another folder called workflows, and inside workflows I will just keep one placeholder file, .gitkeep.
Why am I keeping this .gitkeep file? Because I cannot push an empty folder to GitHub; a folder must have at least one file in it. That is why I am creating this .gitkeep file, and later, when I create the main.yaml file, I will remove it. Next I need another folder called data, and again I will keep a .gitkeep inside it, because it will be empty at first. Why the data folder? Remember the application demo, where I was uploading an input image: that image is passed to my backend and saved inside the data folder. Whatever image the user uploads as input, I will take it and save it inside the data folder; that is why it is needed.

Next I will create my project working directory. I define an f-string with the project name, and inside that folder I create one file, __init__.py, the constructor file. Why do you need this constructor file? Because this folder is going to be my local package: I am going to install this folder as a local package, so that whatever components are inside it can be imported in my other files. To import them I need to install the folder as a local package, and to install it you need to define this constructor file inside it. That is what I am doing here.

Now, inside the project folder I will create another folder called components. First let me create all the folders and files; then I will discuss why I am creating these components and why I am creating each __init__.py.
Everything will be discussed, don't worry. So I create one folder called components, and inside components I again create a constructor file, because components will also be a local package for me; that is why the constructor file is needed. Then inside it I create a file called data_ingestion.py; data ingestion will be my first component. Then I create another file called data_validation.py, my second component, and another called model_trainer.py. I only need these three components in this object detection project, because a data transformation component is not required: YOLOv5 handles that automatically. I will be using the YOLOv5 framework and it handles the transformation, so I do not need to transform my data separately. These three components are all I need for this, or for any object detection pipeline we might implement in the future.

Next I create another folder called constant, with another constructor file; let's keep the name as constant. What a constant is, I will discuss, no issue; first let me create everything. Inside constant I create another folder called training_pipeline, which will contain all the constants of my training pipeline, again with one constructor file, and inside constant I also create a file called application.py. Then I create another folder here called entity. The first entity I create is our config entity, config_entity.py, and the second entity I need is the artifacts entity, artifact_entity.py. What a config entity and an artifact entity are will all be
discussed shortly, don't worry. Then I create another folder called exception, and inside exception another constructor file; it will contain our custom exception. I also need the logger, which will contain our custom logging, and with that I also need to create the pipeline folder, which will contain our training pipeline: inside it I create a file called training_pipeline.py. I also need a utils folder, again with one constructor file (notice that every package folder gets a constructor file), and inside utils I create one file called main_utils.py; it will contain all the utility-related code, the utility functions.

Here I am going to use Flask for the web application, so I need one folder called templates (make sure it is templates, plural), and inside it I will create one HTML file called index.html, because as you already saw, the web app I launched has some HTML behind it, and that HTML code goes there. Then I need my endpoint, which is app.py. I already told you I will dockerize my code, so I need a Dockerfile; I also need a requirements.txt file, and with that I also need setup.py. These are the only files and folders I need in this project; if I need anything else, I will create it later.

Now let me show you how we create these folders and files the Pythonic way. I define a for loop, for filepath in list_of_files, which gives me the entries one by one. First I take the file path and pass it into the Path class I imported at the top. Why am I passing it to the Path class from pathlib? Because, as you can see, I have defined every path with forward slashes, but
I am using the Windows operating system, and as you know, on Windows we define paths with backslashes, while forward slashes are what we usually use on Linux or macOS. If you pass the forward-slash string as it is, it might throw an error saying the path is not defined correctly. To handle this, use the Path class from pathlib: pass the path string into Path and it will convert it automatically. It first detects your operating system and then gives you the right flavor, effectively telling you: this is a Windows path, it has been converted, now you can use it.

Let me show a quick demo. I clear the terminal and start Python. First I import Path: from pathlib import Path. Now let me define one path here: suppose I have a folder called xyz with one file inside it called test.py, so that string is my path. I pass this string into the Path class, and if I execute it, you will see it automatically detects that it is a Windows path and converts it. I do not need to write backslashes myself. If you run this project on a Linux machine, it will automatically convert it to a Linux path instead; it handles everything in the background. That is why I am using Path here.

After converting the path, I need to separate the folder part from the file part, because as you can see, each entry is a folder path with a file inside it. First I will create the folders, then the files.
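The pathlib behaviour from that terminal demo can be sketched like this (the "xyz/test.py" path is just the illustrative example; Path returns a WindowsPath on Windows and a PosixPath on Linux or macOS):

```python
from pathlib import Path

# A forward-slash string becomes the OS-native path flavour.
p = Path("xyz/test.py")

print(type(p).__name__)  # WindowsPath on Windows, PosixPath on Linux/macOS
print(p.name)            # test.py
```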
For this I will use a method from the os library called split. os.path.split returns two things: the file directory and the file name. I write os.path.split and pass my file path into it. Let me show what happens: I open the terminal again, with the path already defined, import os, and call os.path.split on the path. See, it returns two things as a tuple: the folder, meaning the file directory, and the file name. I unpack those into two variables, filedir and filename.

After that, I first create the file directory. I write one condition: if the file directory is not empty (filedir != ""), create the folder with os.makedirs, passing the file directory along with the parameter exist_ok=True. Why that parameter? Because if the folder already exists, it will not be created again, and if it is not present, it will be created; that parameter handles both cases. Then I log the information with logging.info, using an f-string (note the small f): creating directory {filedir} for the file {filename}. That creates the directory for me. After creating the directory, I also need to create the files inside the folders.
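The split-and-create step just described can be sketched as follows (the example path is illustrative):

```python
import os
import logging

logging.basicConfig(level=logging.INFO, format="[%(asctime)s]: %(message)s")

# os.path.split returns a (directory, filename) tuple.
filedir, filename = os.path.split("waste_detection/components/data_ingestion.py")
print(filedir)   # waste_detection/components
print(filename)  # data_ingestion.py

# exist_ok=True makes this a no-op when the directory already exists.
if filedir != "":
    os.makedirs(filedir, exist_ok=True)
    logging.info(f"Creating directory {filedir} for the file {filename}")
```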
So let me create the files. I define another if condition: if not os.path.exists(filepath), or os.path.getsize(filepath) == 0. What am I saying here? It first checks whether the file is already present in that folder; if it is not present, or the file is empty, the file should be created. If you pass any file to the getsize method, it gives you the size of that file: for example, my README.md has some text in it, so if I pass it to getsize it returns some number of bytes. If the size is not zero, the file already exists and has some code in it, in which case I ignore that file and only create whatever files are missing in the folder. That is the logic I am writing here. To create the file I write with open(filepath, "w") as f, and then simply pass, because I only want to create the file. Then I log the information with logging.info, mentioning "creating empty file" with the file name. In the else branch, the file already exists, so I give the message, again as an f-string: file name already exists.

So that is the simple logic we have written; it will automatically create our files and folders. Now let me test it. I open my terminal and first exit Python. As you can see on the left-hand side, nothing exists yet; none of these folders and files has been created. Now if I run python template.py, you will see the magic. I execute it, and on the left-hand side you can see it has automatically created the folders and files for me, but I can see one issue:
everywhere it has put a literal f in the names, because I defined the f-strings in the wrong way. I need to move the f outside the quotation marks, so let me fix that: I select all of them and put the f before the opening quote. Now it should work. First let me delete everything: I open the folder manually and delete what was created. Now if I execute template.py again, everything is created successfully. As you can see, everything I listed is there: components with data_ingestion, data_validation, model_trainer, and the __init__.py; constant with training_pipeline; config_entity, artifact_entity, exception, logger, pipeline, utils; and the Dockerfile, requirements.txt, and setup.py have all been created automatically.

If in the future you need some other file, just add it to the list. Say I need something called test.py: I add it to the list and simply execute template.py again, and you will see the magic: it automatically creates test.py for you without touching the existing files, because of the logic we wrote. It checks the file size first; if the size is 0 the file is replaced, otherwise it is not. As you can see, my README had some text and it is still present; it was not replaced. So I hope you understand how to create this entire project structure in a Pythonic way using template.py. This is going to be our project structure, and I will discuss what a component is, what a constant is, what an entity is, and what an exception is, each in detail; don't worry.
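Putting all of the pieces together, the finished template.py described in this section looks roughly like this. The file list is abbreviated here; in practice it holds the full structure listed above:

```python
import os
import logging
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="[%(asctime)s]: %(message)s")

project_name = "waste_detection"

# Abbreviated list -- extend with the full structure described above.
list_of_files = [
    ".github/workflows/.gitkeep",
    "data/.gitkeep",
    f"{project_name}/__init__.py",
    f"{project_name}/components/__init__.py",
    f"{project_name}/components/data_ingestion.py",
    "app.py",
    "requirements.txt",
    "setup.py",
]

for filepath in list_of_files:
    filepath = Path(filepath)  # normalise to the OS-native path flavour
    filedir, filename = os.path.split(filepath)

    if filedir != "":
        os.makedirs(filedir, exist_ok=True)
        logging.info(f"Creating directory {filedir} for the file {filename}")

    # Create the file only if it is missing or empty, so existing
    # files with content (e.g. the README) are never overwritten.
    if (not os.path.exists(filepath)) or (os.path.getsize(filepath) == 0):
        with open(filepath, "w"):
            pass
        logging.info(f"Creating empty file: {filepath}")
    else:
        logging.info(f"{filename} already exists")
```

Running python template.py a second time is safe: the exists/getsize check skips every file that already has content.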
In this step I just wanted to show you how we create the project folder structure; that was my intention. Now let me quickly push these changes to GitHub. I clear the terminal and write git add ., then git commit -m "folder structure added", providing the commit message, and then simply git push origin main, since main is the branch we created. Let me push it. In Source Control everything disappears because we have committed, and if I go to my GitHub and refresh the page, you can see everything reflected there. So everything is fine; we have successfully created our project folder structure.

Next we will do the project setup and requirements installation. I open requirements.txt, where I need to add the requirements for this project. I already prepared all of them, so let me show you what they are. These are the basic requirements I need for my project, and these are the requirements for YOLOv5, because I am going to use the YOLOv5 framework, and to use it correctly I need them: as you can see it needs PyYAML among others, and it also needs the PyTorch libraries, torch and torchvision, each with a specific pinned version. I will tell you where I got these requirements with these specific versions: when we clone the YOLOv5 repository, I will show you how I got them. For now,
these are the requirements you need to install in your environment. You also need tensorboard, pandas and seaborn — I've added those extra libraries near the end — and as the very last line I've added `-e .`. What is `-e .`? It's the pip option that triggers setup.py, and setup.py contains the code to install the project as a local package; I'll show you how that works in a moment. Before that, let me delete the test file, since we only created it unnecessarily as a demo — I'll remove it from the template list as well. One thing I forgot to mention: the research folder. Inside research I'll keep my notebooks, so I'll add research/trials.ipynb to the template list. Let me open the terminal and execute python template.py again — it complains at first because I missed a comma in the list, so let me fix that, delete the bad output, and run python template.py once more. Now it creates the research folder with trials.ipynb inside, and that's where I'll keep all the notebook experiments. Back to requirements.txt: as you can see I'm using flask, because I'll use it to create a web application, and gdown, because I'll keep my data in Google Drive. This is an alternative I want to show you — many people have asked me what source to use when the data is too big for GitHub, instead of a database.
Say you don't have a paid database, or you don't have an AWS account so you can't keep your data in an S3 bucket. Instead of S3, I'll show you today how to keep your data in Google Drive and download it from there. I kept this alternative because not everyone has a paid account, and this way everyone can follow along — and that's why the gdown package is needed. So these are all the packages we need. Now, before installing anything, let me write setup.py. setup.py contains the local-package installation code, and this is what you mention in it: first the project name — wasteDetection — then the version, your name as the author, the author's email address, and then the find_packages call. What find_packages does is look for the constructor file, __init__.py, in every folder; wherever that file exists, it installs the folder as a local package. That lets us import across the project — say I want to import data_ingestion from components, I just write `from wasteDetection.components import data_ingestion`. If you don't install the project as a local package, you can't do that — which is why this setup.py is needed. Let me save it and install it: to install setup.py you add the `-e .` line to requirements.txt, which looks for setup.py and installs the project in editable mode.
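The find_packages behaviour described above — "a folder with an __init__.py becomes an installable package" — can be demonstrated directly. This sketch builds a tiny package tree in a temp dir (folder names illustrative); in the real setup.py you'd pass the result to `setup(..., packages=find_packages())`:

```python
# Sketch of what find_packages() does for setup.py: it treats any folder
# containing an __init__.py ("constructor file") as an installable package.
import os
import tempfile
from setuptools import find_packages

root = tempfile.mkdtemp()
for sub in ("wasteDetection", os.path.join("wasteDetection", "components")):
    os.makedirs(os.path.join(root, sub))
    # the (empty) __init__.py is what marks a folder as a package
    open(os.path.join(root, sub, "__init__.py"), "w").close()

packages = find_packages(where=root)
print(sorted(packages))
```

Both `wasteDetection` and `wasteDetection.components` are discovered, which is what makes `from wasteDetection.components import data_ingestion` possible after the `-e .` install.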
Now I'll open my terminal and clear it. First you need to create a virtual environment; I'll use conda: conda create -n waste python=3.7 -y. I'm naming it waste because this is the waste-detection project, specifying Python 3.7 since that's what I'll use, and passing -y to confirm — you can give the environment any name, it's completely fine. Once it's created, activate it with conda activate waste, and you'll see the prompt change to waste. Let me clear the terminal and install the requirements: pip install -r requirements.txt. It picks up the requirements file and installs everything, which may take some time, so let's wait — and once it's done I'll come back. As you can see, the requirements installed successfully with no errors, which means everything is fine. Next we'll write the logging, utils and exception modules one by one, because before implementing the actual components I need those three in place; after that I'll start with the project implementation. But before that, commit the changes, since we've created new files: I'll open the terminal, clear it, and run git add ., then git commit -m "requirements added", then git push origin main. It's done — now if I click on Source Control,
you'll see everything disappear because it's already committed. If I open GitHub and refresh the page, all the changes are there — requirements added — and if you want to verify, just click through and you'll see the full requirements list. So everything is fine. Now, before starting the notebook experiments, I want to show you how to prepare the dataset: if you've collected images like these, how do you annotate them? Annotation is required in object detection, so first let me show you which tool to use. Say you've collected these kinds of waste images — I kept only three in this folder for demonstration purposes, but in your case you'd collect many more. Create a folder (I called it "new folder") and keep all the images inside. The first thing I'll do is split the images into training and validation sets. For that I'll create a folder named train, then another called valid, and inside train I'll create two folders, images and labels — make sure you create the folder structure exactly like that — and then copy the same two folders into valid as well. Now I need to distribute my training images. Say you have lots of banana images, lots of drink-can images, lots of wrapper images: whatever classes you put inside your training folder, the same classes also need to appear inside your
validation folder as well — otherwise, say you copied all the bananas into training and validation has no banana at all, that kind of train/test split won't work. So verify that all the classes are present in both train and validation. What I'll do is copy the banana and drink-can images into train/images — or rather, I'll copy all three and keep them inside images — and copy the same three images into validation. Whenever you do this for real, make sure you're actually splitting the data; I only have a few images here, which is why I'm duplicating them. Now for annotation, first I need to install the tool. I'll go to Google and search for "labelImg GitHub" and visit the first link. labelImg is an open-source data-annotation tool — you can annotate your data with bounding boxes, and you can see its UI on the page. There's an installation guide: if you're on macOS or Ubuntu, all the steps are listed for you to execute, but I'm using a Windows machine here, so I'll use pip to install it on my system. I'll copy the first command and execute it here — just make sure you drop the 3 from the python3/pip3 commands, since by default we already have Python 3.
x anyway. Now I'll install it — make sure you've activated the same environment we created. Done. After installing, you just execute the labelImg command to launch it, and after a moment a window appears — this is the labelImg UI. First I'll open the directory with the images I want to annotate: click Open Dir and select the images folder inside train (it's inside "new folder"); I'll annotate the training images first. Now the three images appear. One thing to do right away: click View and enable Auto Save mode. You also need to change the save directory — where the annotation files will be written after annotating — so click Change Save Dir and point it to the labels folder inside train. That's done. On the left-hand side you can see the annotation format is currently PascalVOC, but I'm using YOLOv5, so I need to annotate my data in YOLO format: click the format button and it cycles — click again and you'll see CreateML, then PascalVOC again — and I select YOLO. After selecting YOLO, click Create RectBox and annotate the data like this: draw a box, making sure you select the object, then give it a label — it's a banana, so I label it banana — and click Save. If you now open the labels folder inside train, you'll see the annotation file: a .txt file it saves, containing all the annotation coordinates. These are four coordinates — if I open it you can see the four values it stores — and
these coordinates will drive the training — they tell the model where the object is — and the leading number is the class. This is the banana class, and because banana was the very first label we annotated, its class index is 0; YOLO takes care of this numbering by default, so you don't have to worry about it. If you want to see the label names, there's also a classes.txt file that was created, listing them. Next I'll go back to the tool, click Next Image, and annotate again: Create RectBox, select the region you want — this one is a drink can, I think — click OK, then Save. If I open the folder again, the drink can has been annotated too, its class is 1, and it's been appended to classes.txt. So that's the ordering — banana is 0, drink can is 1, and it follows annotation order from there. You annotate every image in the folder like that; this one looks like a food pack, so I annotate it as food pack and save. Once you've finished your training images, you do the same for your validation images: you open the directory for the validation set, and it's better to restart the tool for the new directory, so I'll stop it with Ctrl+C and launch labelImg again. Then Open Dir to the valid images folder inside "new folder", Change Save Dir to valid/labels, select YOLO, create a rectangle box — this one comes out as class 1 — and save it; the annotation lands in valid/labels.
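The label files labelImg writes in YOLO mode follow a fixed one-line-per-object format: `class_id x_center y_center width height`, with the four coordinates normalized to [0, 1] relative to the image. A small sketch of reading one back (the image size and line values here are made-up examples):

```python
# Parsing one line of a YOLO-format label file, as written by labelImg:
# "<class_id> <x_center> <y_center> <width> <height>", normalized to [0, 1].
def parse_yolo_line(line: str, img_w: int, img_h: int):
    parts = line.split()
    cls = int(parts[0])
    xc, yc, w, h = (float(v) for v in parts[1:])
    # convert back to pixel corner coordinates for inspection
    x1 = (xc - w / 2) * img_w
    y1 = (yc - h / 2) * img_h
    x2 = (xc + w / 2) * img_w
    y2 = (yc + h / 2) * img_h
    return cls, (x1, y1, x2, y2)

cls, box = parse_yolo_line("0 0.5 0.5 0.25 0.5", img_w=640, img_h=480)
print(cls, box)  # class 0 -- banana in our annotation order
```

This is also why the class index in the file matters: it maps back through the annotation order (banana 0, drink can 1, and so on).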
If I open the valid labels folder, you'll see the label files there. That's how you complete the annotation for all the images. Then delete classes.txt — from the training folder as well — because that file isn't needed; once you remove it, your data is ready. You also need to create one more file here, data.yaml. Let me show you: if I open this data.yaml with Notepad, it contains, first, the training image path — train/images, the folder we made — then the validation image path, then the number of classes: I have 13 classes in this project, so I've set nc to 13, and finally the names of all the classes — banana, chilli, drink can, drink pack, and so on, everything is listed. This is my actual data.yaml for the project, which is why I'm showing you this one; if you have different labels, you write them in manually. You can create data.yaml in VS Code — let me show you the easiest way: in VS Code, right-click, New File, and name it test.yaml; see, it's created. That's how you create a YAML file (I'll remove this test one). So that's how you prepare the data. Let me close the tool. Next, zip the files: select everything, right-click — I'm using 7-Zip, which you can install too — choose Add to archive, and give the archive a name; I'll call it waste_data.zip.
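The data.yaml just described can be written and read back with PyYAML (already in our requirements). The paths and the first four class names below are the ones from this project; the remaining names are placeholders — substitute your own 13 labels:

```python
# Writing and reading back a data.yaml like the one described above.
import os
import tempfile
import yaml

names = ["banana", "chilli", "drinkcan", "drinkpack"]
names += [f"class_{i}" for i in range(len(names), 13)]  # placeholder labels

config = {
    "train": "../train/images",   # training images folder
    "val": "../valid/images",     # validation images folder
    "nc": 13,                     # number of classes
    "names": names,
}

path = os.path.join(tempfile.mkdtemp(), "data.yaml")
with open(path, "w") as f:
    yaml.safe_dump(config, f)

# read it back the same way the notebook later extracts nc
with open(path) as f:
    loaded = yaml.safe_load(f)
num_classes = loaded["nc"]
print(num_classes)  # 13
```

The `nc` value must match the length of `names` — a mismatch is one of the most common YOLOv5 training errors.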
Once the archive is ready, upload it to your Google Drive: simply drag and drop it in. As you can see, I've created a data folder there and uploaded the zip file inside it. After that, copy the link address and set the permission to "Anyone with the link" — go to the sharing settings and make sure anyone with the link can access the data — then copy the link and keep it handy, because we'll use it to download the data from Google Drive. So that's the entire process for getting the dataset ready: how to prepare it, how to annotate it — I've shown you everything. If you're generating your own data you can follow the same technique; otherwise I'll share my data with you — I'll share this Google Drive link so you can download it from there. Now that the dataset is ready, we can start the notebook experiment. First I'll show the entire training in a Colab notebook, and afterwards I'll convert it into modular code. As you can see, I've already uploaded my data to Google Drive, and I've prepared a Colab notebook named "waste detection using YOLOv5.ipynb". You open this file with Google Colab — if you've already uploaded it to your Drive, just double-click it and it opens
in Google Colab. The first thing I'll do is select Runtime → Change runtime type and set it to GPU — if you're on free Colab that's completely fine, it will work. Then I connect the notebook; once it's connected and initialized, this command checks which GPU you got: run nvidia-smi, and here I got a Tesla T4. Next I'll clone the YOLOv5 repository. If you search Google for "yolov5 github", the first link is the repository from Ultralytics. They published the YOLOv5 algorithm — a state-of-the-art detection architecture — and they also built a framework around it called YOLOv5: the architecture is already implemented in this repository, so we don't need to write the code from scratch; we'll take help from the repository to do our job. On the page you can see the YOLOv5 results, all the benchmark graphs, how to install, everything — they've documented each step. They've also published the models: this is the model zoo of pretrained checkpoints. YOLOv5s is the smallest model, m is medium-sized, l is large, and x is bigger still — there are different variants, and you select the architecture based on your task. Say you want faster inference
— then go with the smaller model. As you can see, this model is tiny: the parameter count is only about 7.2 million, with its mAP score and speed listed alongside — that's the speed you get. On the other hand, say you want really good accuracy — say you're working in the medical domain, where accuracy matters most — then go with a bigger model: its mAP score is very high, but the inference time is also very high, because it's a much bigger model with about 46.5 million parameters, which is why its speed is low. So that's how you select a model: if you need faster speed, go with a smaller model; if you need very high accuracy, go with a bigger one. In this project I'll use the YOLOv5s model, the smaller variant, because training takes time — and this model will work fine for the detection we're going to perform. So that's the YOLOv5 repository: YOLOv5 is a state-of-the-art architecture, and Ultralytics have also implemented it as a framework. To clone it, click Code, copy the link address, and paste it into the notebook — I've already done that. Let me execute the clone and tell you what happens: it creates a folder called yolov5 and clones the whole repository inside — as you can see, it has cloned everything — and then I'm
going to cd into that folder — let me change the directory. Inside it there's a requirements.txt, and if you open it, I think you're already familiar with these requirements: they're the YOLOv5 requirements I showed you earlier in VS Code. I simply copied these requirements from here and pasted them into my own requirements.txt — you can do the same. I'll close that. Now I also need to install the requirements in Colab, so let me execute the pip command: it looks for requirements.txt and installs everything. Done. If I run pwd, you can see I'm inside the yolov5 folder, so I'll go back up to /content. From this location, the first thing is to download the data from Google Drive, where we already uploaded it. So how do we download a file like this from Google Drive? Let me show you a demo so it's completely clear. I'll do the experiment locally: inside research I created the trials.ipynb file, so let me open it — that's where I'll experiment. First I select the kernel: I created the waste environment, as you can see, so I'll select that one. Then I import gdown — as I already told you, that's the package we need to access Google Drive, and we've already added it to requirements.txt. Now, what do I need first?
I need the URL. So where do you get it? In Drive, right-click on top of the data file you want to download and you'll see Copy link address — copy it, and make sure sharing is set to anyone with the link (I've already set that up). Back in the notebook, I paste the link — this is my data link inside Google Drive. To prove it's publicly visible, if I open it in an incognito tab I can still access the data. Let me close that and execute the cell. Now, as you can see, inside this URL there's an ID — the file ID — and if I can somehow pull that ID out of the URL and append it to a download URL, I can download the file easily from Google Drive. How do we do that? I'll take a variable called file_id and do a split operation: the URL is a string, so I write url.split("/"), splitting on the forward slash, since that's the separator everywhere in the URL. What does it return? Let me print it first: you get "https:" first, then an empty piece where the double slash was, then "drive.google.com", then "file", then "d", then the ID itself, and at the last position the "view?usp=sharing" part. So how do I get the ID out of this list? Counting from the end, first there's the
view/sharing element, and just before it the ID. So if I index from the end with -2 — url.split("/")[-2] — it gives me exactly the ID; see, it prints only the ID. (If you index with -1 instead, you get the last element — that's how negative indexing works.) I only need the ID, because this URL structure stays the same for every Google Drive file — copy a few links and experiment, and you'll see these pieces remain common in every URL you copy, which is why this logic works here. Now that I have the ID, I'll take a prefix: the direct-download URL string. To see where it comes from, copy the share link, open it in the browser, click Download, and look at the URL that appears at the top — let me copy that and paste it into the notebook so you can see the difference. Comparing it with the share link: it's still drive.google.com, but the file/d segments are removed, replaced by uc, followed by the export=download query and then the id= sign. So the prefix is everything up to and including id=, and after that I append my file ID — and that URL will download the file automatically. Let me show you: I'll remove the pasted example, keep the prefix we've taken, and use gdown. There's a download method, gdown.download, and inside it you provide, first, the prefix concatenated with the file ID (both are strings, so I can just add them), and then
the output path — where you want to store the data. I'll use the data folder (make sure the folder exists), so I write "data/" followed by the file name to save as; any name works, it's just what the download will be written to. In this case I'll call it waste_data.zip, since it's a zip file. That's it — let me execute it and see whether it works. It says it cannot find the specified path — why? Because I'm inside the research folder: if I run pwd, you can see that. So I'll import os and run os.chdir("../"); execute that and it moves one directory up, and pwd now shows I'm inside the project folder. If I execute the download cell again, it runs — see, it's downloading the data, you can see the progress percentage, and the data size is about 40 MB. That means we're able to successfully download the dataset from Google Drive, and this is exactly the technique I'll use to download my data. Let's wait for it to finish — there, the data has been downloaded; you can see the file. Now I'll delete it from the data folder and use the same technique in the Colab notebook: as you can see, I've written the same code there, and when I execute it, it downloads the data again.
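The share-link-to-download-URL logic above can be wrapped in one small helper. The example link below is made up (not the real dataset link), and the prefix follows the pattern derived in the notebook; the actual fetch would then be `gdown.download(url, "data/waste_data.zip")`:

```python
# Sketch of the Drive-download logic built in the notebook: pull the file
# ID out of a share link and glue it onto the direct-download prefix.
def drive_download_url(share_url: str) -> str:
    # share links look like .../file/d/<FILE_ID>/view?usp=sharing,
    # so splitting on "/" puts the file ID at index -2
    file_id = share_url.split("/")[-2]
    prefix = "https://drive.google.com/uc?/export=download&id="
    return prefix + file_id

url = "https://drive.google.com/file/d/1AbCdEfGhIjKl/view?usp=sharing"
print(drive_download_url(url))
# then: gdown.download(drive_download_url(url), "data/waste_data.zip")
```

Keeping the ID extraction separate from the download call makes it easy to unit-test later, when this moves into the data-ingestion component.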
Once it has downloaded, if I refresh the Colab file browser you can see the zip file is there. First I need to unzip it, and after unzipping I'll remove the zip — that's what these two commands do. The first uses the unzip command — this is a Linux environment; Colab runs on a Linux kernel by default, which is why you can execute all kinds of Linux commands here — and the second uses rm to remove the zip file, passing the force flag and the file's location, because once I've unzipped it I don't need the archive any more. Let me execute and show you: it unzips, and if I refresh you can see the extracted contents — the train folder, the valid folder, and data.yaml, exactly the way we uploaded our data to Google Drive. Everything is present and working perfectly, and the zip file has also been removed from the machine. Now the first thing is to read data.yaml: using the cat command you can print the file, and this is its content. Next I'll extract the number of classes from data.yaml and store it in a variable: as you can see, I open data.yaml in reading mode, load it with yaml, and then extract the nc key — if you call that key it returns its value — so it stores the 13 in
num_classes. Let me execute it — if I evaluate num_classes it shows 13, so we've extracted the class count successfully. Next, to train a YOLOv5 model you first need to prepare a configuration file, selected according to the model you're using — and as I already told you, I'll be using the YOLOv5s model. We've already cloned the repository: inside yolov5 there's a models section, and if you go inside it you'll find the configuration for every model. Since I'm using YOLOv5s, I open the yolov5s.yaml file. In this YAML file you'll see nc is set to 80 by default, because YOLOv5 is already trained on the COCO dataset, and COCO has 80 classes. We're doing fine-tuning here — using the pretrained model and training it further on top with our custom data — so the only thing I need to change is this number of classes; by default, the rest of the architecture stays the same. First you can read the file with the cat command: I give the yolov5s.yaml location — just copy the path and paste it — and it prints all the content. Then comes the magic code: a cell magic that lets you change this number of classes and write out the file from within the notebook itself, since I'm not using a separate Python script here — I'm doing everything in notebook cells, which is exactly what this magic enables. So I execute it, and what it does is: first it copies the
same file, yolov5s.yaml, and creating another file named custom_yolov5s. So I'm just making a copy of the same file under a different name, as you can see: custom_yolov5s. After that I take the class count we extracted and replace the number of classes in the copy, keeping all the architecture the same. Now if I execute the code and refresh the models folder, you'll see it creates another file called custom_yolov5s, and if I open it, where the number of classes was initially 80, it is now 13, because we changed it. Notice I haven't made any changes to my original file; I made a copy and changed the copy. That's good practice: don't play with the original configuration file; make a copy first and then change that. Once this is done, you are ready to train your model. These are the parameters you can set, as you can see: image size, batch size, number of epochs, the data yaml path, the model configuration, and the weights you want to use. First I'll go inside the yolov5 folder, and there I have a file called train.py. This is the training script; all the code and all the implementation are already written, you don't need to do anything except run this file and provide some arguments. That's why we prefer YOLOv5 so much: this repository maintains everything for you.
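The copy-and-edit step above can also be sketched in plain Python instead of a notebook magic. This is a stdlib-only sketch; the function name and the sample config text are mine, not from the video, and in practice you would run it on a `shutil.copy` of yolov5s.yaml so the original stays untouched:

```python
import re

def patch_num_classes(cfg_text: str, num_classes: int) -> str:
    """Rewrite only the `nc:` line of a YOLOv5 model config, leaving the rest intact."""
    return re.sub(r"^nc:\s*\d+", f"nc: {num_classes}", cfg_text, flags=re.M)

# A tiny excerpt shaped like yolov5s.yaml (the real file also contains the full architecture).
sample = "# parameters\nnc: 80  # number of classes\ndepth_multiple: 0.33\nwidth_multiple: 0.50\n"
patched = patch_num_classes(sample, 13)
print(patched)
```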
The community is very active too: if you look, the last commit was about 20 hours ago, and if there are any bugs they get fixed as soon as possible. If you are facing any issue, just create an issue in the Issues section, as you can see, and it will be addressed quickly; they will definitely help you. So you don't need to write tons of code; you just need to bring your own problem statement, and everything can be done with YOLOv5. Now, as you can see, I'm running train.py with the python command. I'm setting my image size to 416; by default just take this size, because if you check our data, the image size is also close to 416. If you're working with HD images, increase the size a bit; decreasing the size means you're reducing the computational cost. So start with this number; if it works fine, keep it, otherwise change it. Next I'm defining my batch size; for now I'll take 16, because again I'm using Google Colab, and most of you won't have Colab premium, so just keep it at 16,
otherwise memory will get full. For the number of epochs I'm keeping 50, because I only want to train for 50 epochs here; if you're doing actual training, increase the epoch count and train for at least 500 to 1000 epochs, and you will get a good model. After that I'm providing my data.yaml path, as you can see, and then I'm also passing my configuration file; it's inside the models folder, so if I expand it, inside models I have this custom_yolov5s, and that's the path I'm defining. After that you need to provide the weights name, i.e. which pretrained weights you want to use; as I already told you, I'll be using yolov5s. You just give the name as a string and it will automatically download the model from the internet. Finally, this is the name of the output folder: it will create a runs folder, create this folder inside it, and save every log and every artifact there. So this is the command; let me run it and show you. See, first it downloads the model, then it prints the model architecture, then it looks for your training and validation images, and once it has the images the training starts, as you can see. The logging is pretty good: you can see every loss and accuracy score here, the objectness loss, the box loss, the classification loss, and it also gives you the mAP score at the end of every epoch, as you can see. Our training will take some time, but one thing I want to show you now: if you refresh the left-hand side, you can see it has created a runs folder where it will save all the artifacts.
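Pieced together from the flags described above, the training command looks roughly like this. Treat it as a sketch, not the video's exact cell: it must be run from inside the cloned yolov5 folder, and the data.yaml path and the `--name` value are placeholders you adjust to your own layout:

```shell
# yolov5s.pt is downloaded automatically on first use.
python train.py \
  --img 416 \
  --batch 16 \
  --epochs 50 \
  --data ../data.yaml \
  --cfg ./models/custom_yolov5s.yaml \
  --weights yolov5s.pt \
  --name yolov5s_results
```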
As you can see, there is a train folder and inside it yolov5s. It's saving the losses, your batch images, everything. If you want to see the losses, just open the CSV file; every loss is there, everything gets logged, you don't need to do anything manually; it all gets saved here and you just need to check it. After training, it keeps the model here as best.pt. So let's wait; I'll come back when the training is done. So guys, as you can see my training is done, and here are the AP score, the mAP score, precision, recall, everything, for all the classes. Your results are saved in this same folder: if I refresh, inside the runs folder you can now see your weights; this is the model we'll be using for inference. These are your F1 curves, as you can see; you can open them and have a look, and you also have the confusion matrix for the multi-class problem. See, everything is generated and placed inside the folder for you, and if you want to see the losses you can see them from here: precision, recall, every metric is there. It even generates some batch prediction images; see, it has done some detections, which means everything is working fine. The detection scores are low only because I trained for just 50 epochs. You can also launch TensorBoard to see the training logs: this line of code will launch TensorBoard and give you the graphs. Just a minute. See guys, this is the F1 curve, and this is the precision curve; then you have the confusion matrix here as well, then the loss curves, everything. Do explore this tool; it is a
very good tool; you can even change the scale here, and you can change the mode as well. It shows everything. And if you also want to visualize the results, there is a PNG file in that runs folder, results.png, which I'm rendering here. As you can see, this is your training box loss: as the epochs increase, your loss decreases; along with that, the mAP score increases as the epochs increase, which means your model is training well; as you can see, there's no zig-zag problem. Now if you want to check your ground-truth data, you can check it here, and you can also check the augmented version. Basically, at runtime it applies some augmentation techniques: you can see the brightness is a little different, and it has even combined these four images together, which we call mosaic augmentation. So it applies some runtime augmentations to increase the data size on the fly, and it gives you some of those images; that's how it works. Now if you check inside the weights folder, these are the weights you have. Next, let's do some experiments; let's do some testing on my test dataset. Again, inside yolov5 you'll see a file called detect.py; all the code is already written, you just need to pass some parameters. To detect.py you pass your trained weights: they're inside the runs folder; inside train I have the yolov5s results, and inside weights I have this best.pt. That's the model path I'm giving, so just copy the path and paste it here. For the image size I'll give the same value I used in training, and the confidence threshold I
am keeping at 50 percent, meaning I'm telling my model: if you are at least 50 percent sure it's this class, show the prediction, otherwise don't. You can increase or decrease this threshold, but by default I usually use 0.5. For the source images, I have my validation images inside the valid folder, inside images; that's the path I'm giving. Now if I execute it, it takes all the images inside the validation folder one by one, as you can see, runs the predictions, and again saves the prediction results inside the runs folder. Let me show you: it has run, and if I open my runs folder, it has created another folder called detect, and inside that you'll see an exp (experiment) folder with all the predictions. If I open the first prediction, see, it has detected a banana with a confidence score of 0.56; it's low because I trained for only 50 epochs. And why is it called experiment? If I execute the same code again, it will create another folder, exp2, inside detect, and save that run's predictions there: instead of replacing the files, it creates a new experiment folder each time. That's how YOLOv5 handles all of this. Now, if you want to download the model, just mount your Google Drive, and with this command you're copying this best.pt to your Google Drive; just give the path accordingly. You can also download it directly: let me show you, go to the results, click the three-dot button here, and click Download; it will start downloading. So that's how we can do custom training on our own data using YOLOv5.
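The inference step described above boils down to one detect.py call. Again a sketch: the weights path follows the runs/ layout shown in the video, and the source path is a placeholder for wherever your validation images live:

```shell
# Run from inside the yolov5 folder, using the weights produced by training.
python detect.py \
  --weights runs/train/yolov5s_results/weights/best.pt \
  --img 416 \
  --conf 0.5 \
  --source ../valid/images
```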
And this is the entire notebook; I've added everything here, the training process, the evaluation process, all of it. Now what I'll do is convert this notebook into modular code and implement it as components; we'll be creating the training pipeline separately. So I'll download this notebook and keep it inside the research folder, so I can give it to you as a reference; you can also try it. Now I'll close the runtime, since it's still running. All right, let me open my project folder again; inside research I'll keep it, I'll simply paste it there. So guys, our notebook experiment is done, and we can start writing our components. But before starting with the components implementation, as I already told you, I need some helper tools. What are the helper tools we need? We need a logger, an exception, and our utils; these are the modules we need before implementing the actual components. So first I'll write my custom logger, then my custom exception, and last my custom utils; once these three modules are ready, I'll start with the components implementation. But first, since we have completed the notebook experiment, let me commit these changes to my GitHub. I'll open my terminal and write git add ., then git commit -m "notebook experiment", and now let's push the changes: git push origin main. So I have successfully pushed the changes; if I click here, you can see them. Now let's start with our logger, but before that let me close
these files. Here I have created a logger folder inside wasteDetection, with a constructor (__init__.py) file; let me open it. Here I'm going to write the logger. First I'll import some libraries: I'm importing logging, which is a built-in module in Python; using logging I will create my custom logs. Then I also need the operating system module, os, which I'll use to create folders. I also need the datetime module. Why do I need datetime? Because I'll first create a logs folder here, inside which I'll create multiple timestamped folders, and inside each timestamped folder I'll create my log file. That way, when you run your code at a specific time, that time becomes the folder name and the log file is saved inside that folder; if you run the code again at a different timestamp, it tracks that timestamp too, generates another folder, and saves the log file there. So you can actually trace which date and which time you executed the code and what the issue was at that time. I think this is the best way to create logs, and you can follow this strategy too. To build it, first I'll define a timestamp string: as you can see, I take a variable LOG_FILE, and in it I call datetime.now() to get the current timestamp; from that I extract the month, date, and year, then the hours, minutes, and seconds, and after that I add the .log extension. So basically it creates this name for me, and the log will be saved under it. I'll show you how it works, but before that let me
define the log path. This is the log path: first I take the root directory, then I join the logs folder and the timestamped folder name. Then I create the directory: using os.makedirs you can create the directories, and exist_ok=True means that if it's already present, it won't be created again. Then I define my log file path: as you can see, I'm just joining these two things together using os.path.join, so first it takes the log path and then the log file name. After that I define my logging configuration: using the logging module I create my custom log with basicConfig. First I pass my file name, then I define the log format: it records the timestamp of the execution (asctime), then the name, i.e. which module produced the log, then the log level name, whether it's an info log or, say, a debug log, and after that the log message itself. Then I set the level here. I'll execute it and show you how it generates the log; let me save it. Now, to test this, I'll open my app.py and use that file. First, make sure you have selected your environment; my environment name is waste, so let me refresh and select it. Now let me import the logger: I'll just write from wasteDetection.logger import logging. Now let me log one basic example: logging.info, and I'll just print "Welcome to my
custom log". And guys, if you noticed, I'm getting a suggestion here because I have installed Tabnine. Tabnine is an AI-powered extension, as you can see; if you just search for Tabnine in the extensions marketplace, you'll find this AI auto-completion tool. If you install it, you'll also get these auto-completion suggestions. Now let me save this file, execute my app.py, and show you how the log is reported. I'll clear the terminal and just run python app.py. Once I execute it, you'll see it creates a folder called logs, inside which it has created another folder with the current timestamp, and inside that a log file with the same timestamp, as you can see. If I open this log file, first you'll see the asctime, because we put the timestamp first in the format; then it saves the name, i.e. from where you ran it (I ran this from the root folder, so it says root); then the level name, which here is an INFO log; and then the message, which is the one we gave: "Welcome to my custom log", as you can see. That's the log format. Now if I execute it a second time, what happens? If I go to my logs folder, you can see it has created another folder with the new current timestamp, and again a log file inside it; if I open it, it's the same format. So that's what I'm doing: I'm separating each execution into its own log file, and that way I'm tracking my code. This is good practice; instead of using one log file, you can create one per timestamp. That's what I've set up in this project, and you can follow it too.
If you prefer a different approach, that's completely fine too. So that's all about our custom logger; now let me prepare our custom exception as well. I'll open the exception folder, where I've again created a constructor (__init__.py) file; let me open it, and before that let me close these other files. To prepare the custom exception you need one function: this is the function I've defined, called error_message_detail. It takes the error details and uses the sys module to find out why the error occurred and on which line; it returns all that information, as you can see here. Then this is the final class, called AppException. Inside it, as you can see, I'm calling that method, and I'm inheriting from Python's built-in Exception class; using super I reuse the base Exception behavior, and on top of that I prepare my custom exception. It's a simple exception class. If you also want to learn about custom exceptions, just go to Google and search for "exception python documentation", then visit the Python documentation; you'll see you can define custom exceptions in many ways. I referred to that documentation and prepared my custom exception here. You can use this exception as-is in every one of your projects; it will work fine everywhere. So yes guys, this is the custom exception; now let me also show you how it works. I'll open my app.py again and delete the previous code. First I'll define a try/except block, and inside it I'll do some operation: say a = 3 divided by a string. That will definitely give me
an error, and you'll see it captures all the relevant information. Now I also need to define and import my exception, so let me write from wasteDetection.exception import AppException, and I also need to import the sys module. Now let me execute my code and show you what it does. See guys, first of all it says: wasteDetection exception, error occurred in python script app.py, line number 7, as you can see, and "unsupported operand type(s)" for this divide operation, because you're trying to divide an integer by a string. That's the message it gives. So what are you getting? The file name, the line where you got the error, and the exact error message, through your custom exception. Instead of using a plain exception, if you use this kind of custom exception it gives you all the information related to the error. Say you've written a very large codebase across lots of files: if you don't get a proper exception message, you won't be able to fix the issue; but if you get a proper message, it becomes very easy to locate the bug and fix it quickly. That's why we prepared this custom exception.
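The exception module described above can be sketched like this. The names AppException and error_message_detail come from the video; the exact message wording is my reconstruction:

```python
import sys

def error_message_detail(error, error_detail):
    """Pull the script name and line number out of the active traceback via sys."""
    _, _, exc_tb = error_detail.exc_info()
    file_name = exc_tb.tb_frame.f_code.co_filename
    return (
        f"Error occurred in python script [{file_name}] "
        f"at line number [{exc_tb.tb_lineno}]: {error}"
    )

class AppException(Exception):
    """Custom exception that carries file name, line number, and the original message."""

    def __init__(self, error_message, error_detail):
        super().__init__(error_message)
        self.error_message = error_message_detail(error_message, error_detail)

    def __str__(self):
        return self.error_message
```

Typical usage is `except Exception as e: raise AppException(e, sys)` inside any try block.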
All right, now I'll also prepare my utils. What are utils, guys? Utils are simply the functionality we use frequently in our code; instead of writing it separately in every component, I write it once in one utils file (main_utils) and import it from there whenever I need it. First I import some libraries; these are the ones we need, and I'm also importing my exception and logger, because I'll be using them here too. The first method I need is read_yaml, because I also want to read yaml files; you provide the yaml file path and it returns the content. Once the yaml file is read, I also log the information, say "read yaml file successfully", and I'm also using my custom exception here, as you can see. The second one is write_yaml: if you want to write a yaml file, you can use this function; you just need to provide the file path, the content, and the replace flag, which is False by default, so just keep it as False. Give it any content and it will save it as a yaml file. I won't be using this one here, but if you want it in another project you can; I've kept it as a reference. I also need decode_image and encode_image. Why do I need these two functions? As I showed you earlier, when I was passing the image, it wasn't passed directly to my backend: first I decoded that image and saved it as a base64 string, and then, when returning from the backend, I saved that base64 back as my JPG file. That's what these encode/decode image methods do, and that's why they're needed. I'll explain this in detail when I create the web application; for now, just consider these the utility functions we need for this project.
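The base64 round-trip described above is small enough to sketch with the standard library alone. The function names mirror the ones mentioned in the video; the signatures are my assumption:

```python
import base64

def encode_image_into_base64(image_path: str) -> bytes:
    """Read an image (or any binary file) and return its base64-encoded bytes."""
    with open(image_path, "rb") as f:
        return base64.b64encode(f.read())

def decode_image(imgstring, file_name: str) -> None:
    """Decode a base64 string/bytes payload and write it back out as a binary file."""
    with open(file_name, "wb") as f:
        f.write(base64.b64decode(imgstring))
```

So the web app can ship an image to the backend as a base64 string and reconstruct the JPG on the other side.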
So yes guys, we have successfully completed all three helper modules: the logger, the exception, and the utils. Now we're ready to write our components, but before that let me push these changes to my GitHub. I'll clear the terminal and simply write git add ., then git commit -m "logger exception and utils added", and now let's push the changes with git push origin main. If you want to see it, just press Ctrl and click on this link; it will open the browser for you, and you can see the changes have been added. Now let's start writing our first component, which is data ingestion. But before starting with data ingestion, let me tell you the project workflow we'll be following, because there are lots of files and folders, as you can see: we have constants, we have entity, we have the pipeline, and so on; we have different sections here. So to create our actual components, which file do we need to change first? We need to follow one workflow, so let me write down the workflow I'll be following to implement this entire project. The first file we'll be updating is the constants, the second is the entity, the third is the components, fourth we'll update our pipeline, and fifth we'll update our app.py. As you can see on the left-hand side, first I'll update my constants, then my entity; as you can see, I have both an artifacts entity and a config entity, and I'll tell you what these entities are. Then I'll update my components, meaning data ingestion, data validation, and model trainer, and then the pipeline: as you can see, I'll integrate these components in the pipeline and create one training pipeline. After that I will update my
endpoint, which is app.py, because the user will run app.py and get the results. So yes, this is the project workflow I'm going to follow. Now let me close these files, since we're done with them, and start with our first component, data ingestion. Before that, as I already told you, I need to update the constants, so let me open the constants folder. Here I have another folder called training_pipeline; I'll open it, and inside it there is a constructor (__init__.py) file, so let me open that. I've already prepared some flowcharts, and I'll use them to implement all of my components and stages. First let me show you the data ingestion stage. So guys, as you can see, this is my data ingestion flowchart. First I'll create my data ingestion configuration: basically I need some configuration values, say the data ingestion directory, i.e. where I want to ingest the data. It will create one artifacts folder, inside that a data_ingestion directory where it ingests the data, and then another folder called feature_store; after unzipping the data, it moves it into the feature store. And there's the download URL: if you want to download the data, you also need the download URL. So whatever paths and variables I need to run my data ingestion stage, I'll keep them all inside the DataIngestionConfig, and this config I'm going to write inside my entity; as you can see, I have something called a config entity. Inside the config entity I'll define these configurations, and the hard-coded values I'll take from my constants. So inside the
constants file I'll write all the paths, and from there I'll bring them into my config entity. After that I'll initiate my data ingestion: it gets the data from your Google Drive. Here I've written GitHub because this flowchart is from my previous project, where I was using GitHub; for this project just consider it Google Drive instead. It gets the data from Google Drive, unzips it, and moves it to the feature store folder, as you can see; inside the feature store these should be available: train, test, valid, and data.yaml. After that I'll return my zip file path, i.e. the data directory path, and along with it the feature store path; these I'll return as my data ingestion artifact. This artifact I'll define inside my artifacts entity, and the artifacts entity basically defines the return type of each function we'll be creating. Then I'll pass this data ingestion artifact to my next stage, which is data validation: data validation takes these paths, gets the data from there, and does the next job. That's the pattern: I'll define a return type for every method, each method returns some things, and I can pick those up in my next stage, get the files I need from there, and continue with the following stage. That's how I'm going to keep chaining the stages. This should all be very
clear after we implement all the components, so let's start the implementation. First I need to define my constants. The first thing I'll define is my artifacts directory: it will create a folder here called artifacts, and inside artifacts it saves all the generated files from every stage. As you can see, this is the folder name and this is the variable I'm defining, and it's a string-type variable; you can mention the type annotation as well. Next I'll define my data ingestion related constants. So guys, these are my data ingestion related constants, as you can see; I've already added comments for them. What do they hold? First the data ingestion directory name, as you can see; then the data ingestion feature store directory, and if I open the PNG file again, the feature store path; then the data download URL, this URL, and if I show you the flowchart, this is the data download URL. So these are the three variables, and you can see I'm assigning all the values to them; this is the same Google Drive link I've given here. Instead of hard-coding it inside my components, I'm keeping it here: say in the future you want to use some other URL, you just open your constants file and put the link here, and it will be reflected throughout your code; you don't need to open the component code and change it manually. That is why we keep the constants separate. This is my feature store path and this is my data ingestion directory; now let me save it.
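A constants module of the kind described above might look like this. The exact names and values are reconstructed from the description; the real repo's constants (and the actual Google Drive URL, which I don't have) may differ:

```python
import os

# Top-level folder that collects every stage's generated files.
ARTIFACTS_DIR: str = "artifacts"

# Data ingestion related constants (hypothetical names).
DATA_INGESTION_DIR_NAME: str = "data_ingestion"
DATA_INGESTION_FEATURE_STORE_DIR: str = "feature_store"
DATA_DOWNLOAD_URL: str = "https://drive.google.com/<your-file-id>"  # placeholder, not the real link
```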
my config entity inside my entity folder. Here I have the entity package, so I open config_entity. As I already told you, I need to define these configurations: the data ingestion directory, the feature store file path, and the data download URL. The variables are already defined in the constants file; now I build the configuration on top of them. First I import a few things. As you can see, from my constants' training pipeline module I import everything, so whatever is defined there is available here; I also import a datetime helper, and I import the dataclass decorator. I'll tell you why dataclass is needed: I'll be creating data classes here. The first thing I do is define my artifacts directory. I create one data class called TrainingPipelineConfig and, as you can see, inside it I define the artifacts directory attribute. Where does this value come from? If you press Ctrl and click on it, it redirects to the constants file; it's accessible here because I imported everything from my constants, which is how I can access every variable defined there. Then I initialize one object called training_pipeline_config, and as you can see I also annotate its type. Once the class is initialized, I can access all the variables I defined inside it. After that, I finally create my data
ingestion config. Let me show you how I prepared it. Again I've created one data class. A data class, briefly, lets you declare plain class-level variables without writing self everywhere: you apply the @dataclass decorator, which I've already imported, define a simple class with class variables instead of an __init__, and you can then access those variables from other files. That's why we use data classes. Here I've created a class called DataIngestionConfig. The first thing I define is the data ingestion directory; if I open the PNG flowchart again, you can see the data ingestion directory, and I'm defining it with the same name. How is it built? As you can see, I use os.path.join: I take training_pipeline_config and read its artifacts directory, which returns the artifacts folder, and inside artifacts I join the data ingestion directory name. Where does that come from? Ctrl-click on it and it redirects to the constants file. So the resulting path is artifacts/data_ingestion, as you can see. Then, using os.makedirs, I create this path, so it creates artifacts, data_ingestion inside it, and the data will be ingested there. I hope you got it: it will download the zip file there. Previously,
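To make the two dataclasses concrete, here is a minimal sketch; the constants are inlined rather than imported so the snippet is self-contained, and the attribute names are my best reconstruction of the video's naming.

```python
import os
from dataclasses import dataclass

# Inlined for self-containment; normally imported from the constants module.
ARTIFACTS_DIR = "artifacts"
DATA_INGESTION_DIR_NAME = "data_ingestion"
DATA_INGESTION_FEATURE_STORE_DIR = "feature_store"

@dataclass
class TrainingPipelineConfig:
    artifacts_dir: str = ARTIFACTS_DIR

training_pipeline_config = TrainingPipelineConfig()

@dataclass
class DataIngestionConfig:
    # artifacts/data_ingestion
    data_ingestion_dir: str = os.path.join(
        training_pipeline_config.artifacts_dir, DATA_INGESTION_DIR_NAME
    )
    # artifacts/data_ingestion/feature_store
    feature_store_file_path: str = os.path.join(
        data_ingestion_dir, DATA_INGESTION_FEATURE_STORE_DIR
    )
```

Note that `feature_store_file_path` can reference `data_ingestion_dir` directly because both are evaluated in the class body, top to bottom.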
for the demonstration I downloaded my data into a plain data folder, but now I'll download it into the data ingestion folder. Next I prepare another variable, as you can see: the feature store file path, another configuration. Again I take the data ingestion directory, which returns the directory we just built, and inside it I join the feature store directory name; if I go back to the flowchart, this is the feature store folder. So it creates another folder inside data ingestion, and the data will be unzipped there. When we execute it you'll see exactly what happens. Along with that I also define the data download URL, the same configuration shown in the flowchart, and again its value comes from the constants file. So that's our config entity; these are the configurations we've prepared. Now I also need to prepare my artifacts entity, because after ingesting the data I have to return these paths as the return type of the component's main method. So let me define it. What do I need? A data zip file path and a feature store path. I open my artifacts entity and again define one data class, named DataIngestionArtifact, and inside it, as you can see, I declare data_zip_file_path, a string, and feature_store_path, also a string. That's the return type of my component. Now I'll start on the component itself: inside components I open my data ingestion file, and first I import the libraries I need, along with the artifacts entity
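The artifact dataclass the video describes is tiny; a sketch, with the field names as I understood them from the narration:

```python
from dataclasses import dataclass

@dataclass
class DataIngestionArtifact:
    data_zip_file_path: str   # where data.zip was downloaded
    feature_store_path: str   # where it was unzipped
```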
— see, here I'm importing my logging, and I'm importing my AppException, because I'll use both for logging and error handling. Along with that, from my config entity I'm importing DataIngestionConfig — if I go to the config, this is the configuration class — and from the artifacts entity I'm importing DataIngestionArtifact, as you can see. Now the first thing I do is define a class called DataIngestion. This class takes the data ingestion configuration; as you can see, I pass the config in and store it as a class attribute, data_ingestion_config, and I wrap everything in a try/except block. Inside the class I define the same function we used in the notebook — sorry, in the notebook I was writing it as a plain script, but here I've wrapped it in a method; you'll see it follows the same logic. I've named the method download_data. First I take the dataset URL: from my data ingestion configuration I read the data download URL, and if you Ctrl-click it, it redirects to the config; that's how I take it from my configuration. Then I prepare the zip path: I create the folder, basically artifacts with data_ingestion inside, with this line of code, and then I name the file I'm going to download — I've kept it as data.zip. Then I'm
preparing the full download path, that is, the directory plus the file name, and I log the information, as you can see: downloading data from this URL into this folder. After that I use the same code as in the notebook: first I take the file ID from the URL by splitting the dataset URL, then I define the prefix, and then with gdown I download the dataset, passing the prefix plus the file ID along with the location where I want the data saved. Once it's downloaded I log again: downloaded the data from this URL to this folder. Then I return the zip file path. After downloading the data I also need to extract it into the feature store folder, and this next method does that. As you can see, I've prepared another method called extract_zip_file. It takes the zip file — the one returned by download_data — as the zip file path, and it prepares the feature store file path; basically it creates another folder, feature_store, and unzips the data inside it. As you can see, I'm unzipping data.zip using the zipfile module. After that I log: extracting zip file from this path to this folder. Then I return the feature store file path. That's it; again I handle exceptions in the except block. Finally, I prepare one last method called initiate_data_ingestion. Inside it I'll call the two methods I defined above, but, as I already told you, the final method should have a return
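Here is a hedged sketch of the two methods, written as plain functions. `gdrive_direct_url` is a helper name I've invented for the split-and-prefix step, the exact prefix string is an assumption, and `gdown` is the third-party downloader the video uses (`pip install gdown`):

```python
import os
import zipfile

def gdrive_direct_url(share_url: str) -> str:
    # ".../file/d/<id>/view" -> direct-download URL; prefix string assumed
    file_id = share_url.split("/")[-2]
    return "https://drive.google.com/uc?/export=download&id=" + file_id

def download_data(dataset_url: str, data_ingestion_dir: str) -> str:
    """Download the dataset zip into the ingestion folder; return its path."""
    import gdown  # third-party; the download itself needs network access
    os.makedirs(data_ingestion_dir, exist_ok=True)
    zip_file_path = os.path.join(data_ingestion_dir, "data.zip")
    gdown.download(gdrive_direct_url(dataset_url), zip_file_path)
    return zip_file_path

def extract_zip_file(zip_file_path: str, feature_store_path: str) -> str:
    """Unzip data.zip into the feature store folder; return that folder."""
    os.makedirs(feature_store_path, exist_ok=True)
    with zipfile.ZipFile(zip_file_path, "r") as zf:
        zf.extractall(feature_store_path)
    return feature_store_path
```

The real component wraps these in a class with logging and a custom exception; this sketch keeps only the mechanics.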
type, which in this case is the DataIngestionArtifact we already prepared — basically it carries those two variables for me. Inside initiate_data_ingestion, as you can see, first I call download_data, which returns the zip file path; then I call extract_zip_file, passing that zip file path, and it extracts the archive. After that I return my two values, the data zip file path and the feature store file path, wrapped in the DataIngestionArtifact; we already defined that return type, and this is how you use it. Then I log the information and return the data ingestion artifact object, so I can pick it up in my other components and read those two variables from it. After that I handle the exception. That's it — that's the component. This is a slightly different technique from my previous project; it's also a good one, and you can follow it as-is. Now, to test it, I need to wire it into my pipeline. As you can see, I now need to prepare the pipeline, so let's open the pipeline package, where I have training_pipeline. I open it and first import a few things: my logging, and along with that my exception, and then I also need to import my data ingestion component. So I write: from the project's components package, from the data_ingestion module — if I open it, this DataIngestion class — import DataIngestion. Along with that I also need to import my data
ingestion config and data ingestion artifact, so let me copy them, open the training pipeline, and paste them here. One thing I'll do differently: instead of importing them on separate lines, I'll import inside parentheses, so I can pull in all the configurations here — I just add a comma and the next configuration as we go. Now I define my TrainPipeline class. Inside it I create the data ingestion configuration, and then I define my start_data_ingestion method — basically it starts the data ingestion. Again, the return type of this method is DataIngestionArtifact; as I already told you, if you look at the flowchart, that's the return type, and I declare it here. Then I log that we've entered the method and that data ingestion has started; then I initialize the DataIngestion class, and from that object I call initiate_data_ingestion. If you press Ctrl and click it, you'll see it redirects to the component: this is the main method, and calling it executes the other methods, which is why I call it here. Once it's called, you can see it logs more information — data ingestion completed, exited the method — and after that I return the data ingestion artifact. That's it. Then, to drive this method, I define another method called run_pipeline; inside run_pipeline I call my data ingestion, then close with my exception handling. So essentially, run_pipeline calls the start_data_ingestion method, as you can see. Now let's test whether it works. To do that I need to call it from my entry point, which is app.py, so
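Structurally, the pipeline class looks roughly like this. As an assumption on my part, I inject the ingestion component through the constructor so the wiring can be shown without a real Google Drive download; the video instead constructs `DataIngestion` with its config inside the method.

```python
import logging

class TrainPipeline:
    def __init__(self, data_ingestion):
        # Any object exposing initiate_data_ingestion(); the real
        # DataIngestion component from earlier would be passed in here.
        self.data_ingestion = data_ingestion

    def start_data_ingestion(self):
        try:
            logging.info("Entered start_data_ingestion of TrainPipeline")
            artifact = self.data_ingestion.initiate_data_ingestion()
            logging.info("Exited start_data_ingestion of TrainPipeline")
            return artifact
        except Exception as e:
            raise e  # the video wraps this in a custom AppException

    def run_pipeline(self):
        # Stage 1 only for now; later stages are chained on the artifact.
        return self.start_data_ingestion()
```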
I'll import the training pipeline inside app.py. Let me open app.py. First I write: from the project's pipeline package, from the training_pipeline module, import the TrainPipeline class. Then I create one object of TrainPipeline, and from that object I call run_pipeline. Now it should work: this is the main function we've defined, and app.py is ready, so let's see whether it runs. I open my terminal, clear it, and run python app.py, expecting it to work. See — it has created the artifacts folder and started downloading the data. If I open artifacts, you can see it has created a data_ingestion folder, and inside that it's downloading the zip file. Let's wait; after the download it will unzip and create another folder called feature_store, and inside that it will save the three items: train, valid, and my data.yaml. And there — the component has run successfully. Inside artifacts/data_ingestion, see: this is the downloaded data.zip, and it has created the feature_store folder with all the data saved inside — the train folder with its images and labels, the valid folder, and, if I open it, my data.yaml. That means everything is working fine and the component is able to ingest the data into my artifacts folder successfully; this stage works. So now I'll start with my next stage, which is data validation. Let me show you what I'll do there: basically I'll check my data format. If I open artifacts and look inside the feature store, see, this is the data format — it must have train, valid, and the data
yaml file; these three are required to start training, and that verification is what I'll do inside my data validation. But before that, let me commit these changes to GitHub. If I click on Source Control, you can see it's also trying to push the downloaded data, because we haven't added the artifacts folder to .gitignore. So I open .gitignore and add my artifacts folder there — whatever is inside artifacts, I don't want to push it, so I ignore it as artifacts/*, with the asterisk because I want to ignore everything under it. Now, as you can see, it has ignored that data and will only push the code changes. Up to now I was opening my terminal and typing lots of commands — git add, then git commit, then git push. Instead of typing the commands, if you're using VS Code you can also push the changes from the UI. Just click on Source Control and give a commit message; I'll write "data ingestion added". Click Commit — the commit is done. Click Sync Changes and it will push; it says push one commit to origin/main. Click it, and it pushes everything to my GitHub. The first time, it may ask you to authenticate with your GitHub; just grant the access it requests and link your account, and you'll be able to push too. If that doesn't work, type the commands manually in the terminal as I showed earlier. But this way is easy and takes very little time, which is why I prefer it and why I showed
it. Now if I go to my GitHub and refresh the page, I can see the change: data ingestion added. That means we've completed data ingestion successfully, so let's start the next component, data validation. First let me close these files, since the data ingestion part is done. Let me show you the flowchart I'm going to follow. This is the data validation flowchart, guys, and I'll use the same technique as in data ingestion. First I define some configuration for data validation: as you can see, the data validation directory, the data validation status file directory, and the required file list; I'll show you what each does as we go. This configuration is then passed into initiate_data_validation, which validates that all the files exist — basically it checks whether all three items are available. If you open the feature store, it checks whether train, valid, and data.yaml are present. If any of the three files is missing from the feature store path, my data is not in the correct format and I can't start training; any missing file would make training fail with an error. So at that point, as the flowchart shows, if the files don't exist, I write a validation status of False — I generate a txt file that records False, meaning your data is not in the correct format — and if everything exists, it returns True. After that I generate the artifact, and the artifact will carry just
the status, the validation status only. I'll check this status before training: if it's True, I start the training; otherwise I don't. That's the logic I'm going to implement here, so let's start, following the same workflow. First I need to update my constants, so let me open the constants file under training pipeline. Here I'll define my data-validation-related constants; let me show you which ones you need. These are the data validation constants, as you can see. First I define the data validation directory: inside the artifacts folder it will create another folder called data_validation, and inside that it will create one file called status.txt that records whether the status is False or True. After that I'm also defining the required file list, basically declaring which files must be present: as you can see, train, valid, and data.yaml, returned as one list. I'll iterate through this list and check one by one whether each item is available; if everything is available I return the status as True, otherwise I return it as False. That's it. Now I write the same thing into my configuration, so let me open the config entity — it's open, and I've already prepared a DataValidationConfig. After the data ingestion config, I initialize my data validation config. Basically I'm exposing the same three things — the data validation directory, the validation status file path, and the required file list — exactly the three configurations I'm
returning, as shown in the data validation PNG file. Using os.path.join I join the paths together for the validation directory, join them again for the status file, and this is the required file list, which I take as-is from the constants. So that's done. Now I also need to define the return type of this component — the data validation artifact — and it only returns the validation status. For that I open the artifacts entity and define it: again one data class, DataValidationArtifact, with validation_status as a Boolean variable; as you've seen, you can also annotate the variable type this way. With that done, let me update my component. I open the data_validation component and again import some libraries: the same ones — AppException and logging — along with my DataValidationConfig from the config entity, and from the artifacts entity, as you can see, both DataIngestionArtifact and DataValidationArtifact. Why am I importing the data ingestion artifact? I think you remember what data ingestion returned: if I open it, it returned two variables, the data zip file path and the feature store path. From those I need the feature store path: with that path I can easily check the three files in that folder. So I take that path from the data ingestion artifact, because the artifact carries exactly those values; that's why I defined return types for my methods. Now I
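A sketch of the validation constants, config, and artifact, with the constants inlined; note that a mutable list default in a dataclass needs `field(default_factory=...)`, a detail the narration doesn't spell out:

```python
import os
from dataclasses import dataclass, field

DATA_VALIDATION_DIR_NAME = "data_validation"
DATA_VALIDATION_STATUS_FILE = "status.txt"
DATA_VALIDATION_ALL_REQUIRED_FILES = ["train", "valid", "data.yaml"]

@dataclass
class DataValidationConfig:
    data_validation_dir: str = os.path.join("artifacts", DATA_VALIDATION_DIR_NAME)
    valid_status_file_dir: str = os.path.join(
        data_validation_dir, DATA_VALIDATION_STATUS_FILE
    )
    required_file_list: list = field(
        default_factory=lambda: list(DATA_VALIDATION_ALL_REQUIRED_FILES)
    )

@dataclass
class DataValidationArtifact:
    validation_status: bool
```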
will initialize a DataValidation class here. Inside it I store the data ingestion artifact and, along with that, the data validation configuration; as you can see, I'm keeping them as class variables. Then I write one function called validate_all_files_exist — do all the required files exist or not? It's simple Python code, as you can see. First I initialize one variable called validation_status, initially None. Then I take the feature store file path — which, again, comes from the data ingestion artifact — and pass it to os.listdir: I'm listing the directory because I want to see which files and folders are present there, and it should return three things: train, valid, and data.yaml. I loop through the listing with a for loop and check each entry against the required file list we already defined — if I open it again, you'll remember it. If any required file or folder is missing, I set validation_status to False, create the data validation directory, create one file inside it called status.txt, and write the status there. And if all three files and folders are present, I set validation_status to True, again create the validation directory and the status.txt file inside it, and write True. Then I return the validation status. That's it. Now, if you want to check
it, I again prepare one final method called initiate_data_validation, because every component class has a main method. Inside it I call validate_all_files_exist and return the validation status, because the return type of this function is the DataValidationArtifact I already defined — if I open the validation PNG, this is the return type, which is why I return it here, as you can see. Then I log the information. After that I do one more thing: using shutil — a package with a copy method — I copy the data.zip file into my root folder, i.e., my project folder. If I open my project folder manually, you can see the zip file initially sits inside the artifacts; I'm copying it into the root. Why? In the notebook experiment on Colab, I was downloading my data under /content, which is the root folder there. To keep my analysis easy, instead of reading my data out of the artifacts folder, copying it to the project root lets me work the same way without having to take care of the paths again and again. That's why I copy data.zip here. After all the experiments and all the training, I'll simply delete that data.zip from this folder using a shell command; as you remember, I was running rm -rf with the zip file path and it removed the data for me. With that same kind of shell command I will remove all the generated
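The check itself can be sketched as one function. Note this version treats the required list as "all entries must be present", which matches the intent described above even if the video's loop is phrased slightly differently:

```python
import os

REQUIRED_FILES = ["train", "valid", "data.yaml"]

def validate_all_files_exist(feature_store_path: str, status_file_path: str) -> bool:
    """Write True/False to status.txt depending on whether every required
    entry is present in the feature store directory; return the status."""
    present = set(os.listdir(feature_store_path))
    validation_status = all(name in present for name in REQUIRED_FILES)
    status_dir = os.path.dirname(status_file_path)
    if status_dir:
        os.makedirs(status_dir, exist_ok=True)
    with open(status_file_path, "w") as f:
        f.write(f"Validation status: {validation_status}")
    return validation_status
```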
files I create here. That's the logic I've implemented, as you can see: shutil.copy, where I pass the zip file path — this file — and tell it to copy into os.getcwd(), the current working directory, which is my project folder. After that I return my validation artifact, and then comes the exception block. Now let's add it to my pipeline. I open the pipeline package and then training_pipeline. First I import my data validation: from the project's components — instead of data_ingestion, I write data_validation — and import the DataValidation class. Again I also need to import DataValidationConfig, so let me import it, and along with that I'll also import the DataValidationArtifact. Then I initialize my data validation configuration here. Why do I define it here? As you already saw, the DataValidation class — if I Ctrl-click into it — takes the data ingestion artifact and the data validation config as inputs when you initialize it; that's why I define everything in the constructor and pass it into the class, and I've already defined my data validation config. Now I prepare one method, start_data_validation; the pipeline should run in order, otherwise it might create issues. Inside it, as you can see, I declare the return type — it should be the artifact — and I instantiate the DataValidation class. What will this DataValidation take? If I come here, it will
take two things: your data ingestion artifact and the data validation config — if I go here, I'm passing the data ingestion artifact and the data validation config. After that I call initiate_data_validation — that is, the final method of the component — and it performs the job; then I log the information and return the data validation artifact. That's it. Now, to execute this data validation, I need to call it inside run_pipeline, so let me do that. I call it there, and as you can see I'm passing my data ingestion artifact. Why? Because data validation takes your data ingestion artifact: from the artifact alone it gets to know the feature store path. That's why I hand the ingestion artifact to data validation. Now let me execute it and see whether my code runs fine. I open my terminal again. You can also remove the artifacts folder; if you don't, that's completely fine — it will just replace those files. See, initially I don't have any data_validation folder here, only data_ingestion, and we'll see that after ingesting the data it also generates data_validation for me. Let me run python app.py. See, it has started the data ingestion again; it will replace the files, don't worry, and guys, if you open the logs, you'll see a fresh timestamped log each time as well. It's downloading the data right now — and see, guys, it has executed successfully. Now if I open the artifacts: it has ingested our data, the feature store has the data, and it has also created data_validation. If I open it, there's a status.txt file, and if I open that, it says True, because everything in the required file list I mentioned is present inside this feature store
folder; that's why the status is True. Let's say you removed some file from your zip — any file from the defined data format — then you would see the status come out False. So that's how it works: if it's True, everything is fine and I can start training; if it's False, I stop the training and tell you your data is not in the correct format — check the data, then start training. Because training is costly: if you start a training run on bad data, you end up with a lot of wasted cost. So before starting your training, first make sure your data is in the correct format. I've added only a simple logic layer here to check my data; you can add more layers. Let's say inside train you could also check whether the images folder is there, and whether all the images are actually present inside it — that kind of layer you can also add to this function. I'll give this as an assignment: add more layers to verify whether the data is in the correct format. You can also check the file sizes — what is the size of this file, and this one? If it has some size, the content is there; if the size is zero, there's nothing inside. That's how you can add some more layers to verify your data. So yes, guys, our data validation is also running. Now let's push the changes to GitHub. I click on Source Control — but before that, let me delete the zip file, because the data validation stage copied data.zip from the data ingestion folder into my root folder. As of now I haven't set up that deletion, because I'll be deleting it after the training; whenever I'll
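As one concrete example of the extra layer suggested above, here is a hypothetical helper (not from the video) that rejects empty files and empty folders:

```python
import os

def non_empty(path: str) -> bool:
    """Extra validation layer: a directory must contain at least one entry,
    a file must have nonzero size."""
    if os.path.isdir(path):
        return len(os.listdir(path)) > 0
    return os.path.isfile(path) and os.path.getsize(path) > 0
```

You could run this over every entry in the required file list before declaring the status True.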
For now, to commit the changes, I will delete the zip manually. Then I'll click on Source Control, give the message "data validation added", commit, and sync the changes. Let's go back to GitHub and refresh: data validation is added.

Now let's start with our next and final training component, the model trainer. We'll follow the same process as in the notebook: I'll just replicate the notebook steps, but write everything inside functions. Before that, let me close these files; I'll keep the training pipeline open because it's needed.

First, let me show you the flowchart I prepared for the model trainer. It is simple: first I prepare the model trainer configuration, things like the model directory, the weight name, the number of epochs, and the batch size; these are what I need to train my model. After that I initiate the model trainer, it trains the model, and after training it generates a best.pt file. As I already showed you in the notebook experiment, we will take this best.pt and use it for our inferencing. The component also returns the model trainer artifact, which is nothing but the trained model file path, i.e. the path of this best.pt. So this is the simple model trainer flowchart.
Now, first I will prepare the constants for my model trainer, so let me open the constants file. I already prepared the model trainer constants, following the same technique as before. I need a model trainer directory: inside artifacts it will create a folder called model_trainer, and the trained model will be saved there. Then there is the weight name: I will be using the yolov5s model, as I showed in the notebook experiment; it will download this model and I will train on top of it with my custom data. For the number of epochs I'm taking just 1 initially, because training takes time on my local system; this is only for demonstration. If you want a good model, increase the epoch count, at least 500 to 1000 epochs. For the batch size I'm taking 16; you can increase or decrease it, it's up to you, but I will keep this value.

Next I'll prepare my config entity. Let me open it: from the model trainer config I'm returning the model trainer directory, the weight name, the number of epochs, and the batch size. If you look at the flowchart, these are exactly the variables I listed as configuration. Now let's also prepare the artifact entity: it only returns the trained model file path, which should be one string-type variable.

Now I'll prepare the component itself. First I'll open the model trainer folder and its model trainer file, and here I will import some libraries.
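As a sketch, the config and artifact entities could look like the dataclasses below; the field names and defaults are assumptions mirroring the constants just described:

```python
from dataclasses import dataclass

@dataclass
class ModelTrainerConfig:
    # Defaults mirror the constants: demo-sized epoch count, yolov5s weights.
    model_trainer_dir: str = "artifacts/model_trainer"
    weight_name: str = "yolov5s.pt"
    no_epochs: int = 1
    batch_size: int = 16

@dataclass
class ModelTrainerArtifact:
    # The single string the trainer hands back: where best.pt ended up.
    trained_model_file_path: str
```

Keeping these as dataclasses means the pipeline code can pass one typed object around instead of loose values.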
These are the libraries needed: logging, the exception class, and read_yaml. Why do I need read_yaml? As I showed you, I was reading the data.yaml file and taking the number of classes from it automatically; that's why I already prepared read_yaml inside my utils, and I'm importing it from there. Then I need to import the ModelTrainerConfig and the ModelTrainerArtifact as well.

First I will define one class called ModelTrainer. To explain this better, let me execute my code again, because it copies a data.zip file into the root folder; so let me run python app.py once more to regenerate that zip. Done: you can see data.zip has been copied, and now I can explain what I'm doing more clearly.

I already defined the ModelTrainer class, and inside it I have written all the necessary commands, so let me show you how I prepared this scripting. The final method is called initiate_model_trainer. Refer back to the notebook: after downloading the data, which here is the data.zip file, I was unzipping it and then removing the zip. I'm doing the same thing here. First I log that I'm unzipping the data, then I run the unzip command. If you want to execute a shell command from Python, you can use os.system and pass the command as a string. You don't need the leading exclamation mark, that's only notebook syntax; just provide the command itself and the operating system module will run it on your machine, whether Windows, Mac, or Linux. After that I remove the zip file, as you can see.

Then, just like in the notebook, I read the data.yaml file and extract the number of classes; you can see I'm pulling that value out here. Along with that, I take my weight name from the config, and I split it so that I keep only the base name. Why? Because the weight file has a .pt extension, but to change the model configuration I need the matching .yaml config, and for that you need to clone the YOLOv5 repository here.

I haven't cloned it yet, so let me do that and show you. Open the notebook and scroll up; there is the YOLOv5 git clone command. Copy it, open your git bash or terminal, make sure you are inside your project folder, and paste the git clone command. It will clone the repository; it may take some time, so let's wait. Cloning is done. Now one thing you need to do: open your project folder manually, go inside the cloned yolov5 folder, and remove its .git directory. You need to remove it because we have already initialized our own git repository here.
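Two small helpers capture what this step does: pulling the class count out of the parsed data.yaml, and deriving the config name from the weight name. This is a sketch; the actual component reads the yaml from disk with read_yaml:

```python
def num_classes(data_config: dict) -> int:
    # The dataset's data.yaml carries the class count under the "nc" key.
    return int(data_config["nc"])

def model_config_name(weight_name: str) -> str:
    # "yolov5s.pt" -> "yolov5s": the matching architecture config inside
    # yolov5/models/ is yolov5s.yaml, so we strip the .pt extension.
    return weight_name.split(".")[0]
```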
Now let me open VS Code again; you can see the yolov5 folder is there. Back in the component: I'm building the model config file name. If you look inside yolov5/models, there is a yolov5s.yaml, the same name as the weight file but with a different extension; that is why I take only the base name instead of the .pt name.

After that I create a custom yaml file: I copy the configuration of yolov5s.yaml and create another file called custom_yolov5s.yaml inside the models folder. This is the same thing we did in the notebook, but here I'm doing it in a Pythonic way instead of running everything inside a cell. While copying, I change the number of classes, and then I dump the configuration again using yaml.dump: it writes the new custom yaml into the models folder with the updated class count. The stock yolov5s config is written for its default class count, and here it gets changed to 13 classes, because I had 13 classes in my dataset. Then I dump it.
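At the dict level, once the yaml is parsed, the custom-config step amounts to copying the stock config and overriding nc before dumping it back out. A sketch with the yaml file I/O left out:

```python
def make_custom_config(base_config: dict, nc: int) -> dict:
    # Copy the stock yolov5s config and override only the class count;
    # the result is what would be written to models/custom_yolov5s.yaml
    # with yaml.dump in the real component.
    custom = dict(base_config)
    custom["nc"] = nc
    return custom
```

Copying first keeps the original yolov5s config untouched, which matters because the stock file stays in the repo.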
Then I run the final training command. First I go inside the yolov5 folder with cd, then I execute python train.py, providing the number of epochs and the batch size, which I am reading from my model trainer configuration. I've built this as one f-string, passing all the arguments the same way as in the notebook, except there I had hard-coded them; this is the same command, and it trains the model.

After training, the model will be present inside the runs weights folder, and I copy that best.pt into my yolov5 folder. Then I create the model_trainer directory inside artifacts and copy best.pt there too, so the model is present in the artifacts as well.

After that I clean up. Training generates a runs folder, so I remove it; I also remove the train folder, the valid folder, and the data.yaml that the unzip produced, using the rm -rf command. Finally I return my model trainer artifact containing the trained model file path, and I log that the model trainer is completed. So it is a simple conversion: everything I did in the notebook, turned into this script.
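The shell string handed to os.system could be assembled like this. The exact flags (the --img size in particular) are assumptions based on a typical notebook run, not necessarily the video's literal values:

```python
def build_train_command(no_epochs: int, batch_size: int) -> str:
    # One f-string mirroring the notebook invocation of yolov5's train.py
    # with the custom config; "cd ... &&" chains the two commands.
    return (
        f"cd yolov5/ && python train.py --img 416 --batch {batch_size} "
        f"--epochs {no_epochs} --data ../data.yaml "
        f"--cfg ./models/custom_yolov5s.yaml --weights yolov5s.pt"
    )
```

The component would then call `os.system(build_train_command(cfg.no_epochs, cfg.batch_size))`.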
You can see I'm running several bash commands here; these are the commands you can execute through os.system. That's all the changes in the component.

Now, to test it, I'll add it to the training pipeline, so let me open the training pipeline file. First I import the ModelTrainer component, then the ModelTrainerConfig and the ModelTrainerArtifact. I also initialize the model trainer config here. Then I create a method called start_model_trainer, defined after start_data_validation: inside it I create the ModelTrainer object, pass it the model trainer configuration, and call initiate_model_trainer on it.

Now let's add it to run_pipeline as well. But before adding it, as I told you earlier, I first check whether the data validation ran successfully: if the status is False, there is some issue with the data and I won't start the model trainer; if it is True, I will start it. So I'll write one if/else condition here, and it's simple: I take the validation status from the data validation artifact, and if it is True I start the model trainer; if it is False I raise an error saying "Your data is not in correct format".
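The gate in run_pipeline boils down to a few lines. A sketch; the real pipeline would raise its own custom exception type:

```python
def run_pipeline_gate(validation_status: bool) -> str:
    # Training is costly, so it only starts once validation passed;
    # otherwise stop early with a clear message.
    if validation_status:
        return "starting model trainer"
    raise ValueError("Your data is not in correct format")
```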
That is the simple logic I have written. Now let me test whether it works. I'll collapse these files, remove the artifacts folder from the project, open my terminal, run python app.py, and watch the left-hand side to see what happens. The data ingestion is happening; now it is unzipping the data; it has copied the zip, unzipped it, and removed it, and after unzipping you have the train folder, the valid folder, and the data.yaml. Everything is automated; you don't need to do anything manually. Now the model trainer should start.

But it throws one error: "ultralytics module not found". The repository we cloned has added a new dependency, the ultralytics package. So what I will do is copy the requirements from the cloned yolov5 folder and put them into my requirements.txt. Make sure you copy from the same repository you cloned, because my original requirements.txt was taken from a previous project; that's why I'm replacing it here and saving. Now let me install the requirements once again. Before that, let me check whether a runs folder was generated: there is no runs folder, and everything looks fine so far. So I'll run pip install -r requirements.txt. Done. Now let me delete my artifacts again, confirm there are no leftover generated files, and rerun.

I'll open my terminal and run python app.py; now it should work. It has started the data ingestion, it's downloading the data, now it is unzipping; after unzipping you get the files and the zip is removed. Now the model trainer starts; you can open the component and watch everything run step by step. It is downloading the model, because I specified the yolov5s model; if you expand the yolov5 folder you can see it arriving. The yolov5s model has been downloaded; now it is looking for my data, which is present. It may take some time, so let's wait.

Training has started: you can see the box loss, the class loss, and everything else. I trained for only one epoch here; if you want a good model, just increase the epoch count in the constants file. Let's wait. Training is done. After getting the model it removes the runs folder, as you can see. If you want to see the model, expand yolov5: there is best.pt, our trained model, and it is also present inside artifacts; the model_trainer folder was created and best.pt is inside it. That means everything is working and we are able to train our model.

So our training pipeline is completely done. Now I need to prepare the prediction pipeline as well: the prediction pipeline simply executes detect.py and does the inferencing using my trained model.
But this particular model won't be very good, because I trained it for only one epoch. So instead I will keep one of my own models, which I trained for around 100 epochs; I'll use that one so I can at least show you some predictions. For your own case, if you are training your own model, just increase the epoch count and get a good model.

For now, let me quickly push the changes to GitHub: I'll click on Source Control, review the files it will push, write the message "model trainer added", commit, and sync. Done; go back to GitHub and refresh.

Now let's start with the testing part, and along with that I will add the user app as well: using Flask I will create one basic user app, as I already showed you in the demonstration. Through it we will upload images and it will give us the prediction. After that we'll write our Dockerfile, and then we'll deploy the project.

So we have completed all the component implementations. Now I will write my app.py: here I will implement one user app with a prediction route, and I will also provide one live prediction route, in case you want to launch your camera. First I'll close these files, then open app.py. I need to import some libraries here.
These are the libraries needed. First I'm importing os and sys, and then the training pipeline: we created the TrainPipeline class, and from it we'll call run_pipeline. I'm also importing the decode and encode image helpers from my main utils; if you go inside the utils, you can see the two methods we created for decoding and encoding an image. I need them because the image the user uploads has to be decoded, and the predicted image encoded back. Then I'm importing Flask, because I'll build the API with Flask, along with cross_origin from flask_cors. I'm also importing APP_HOST and APP_PORT; these two I need to define inside my constants, where I created another file, application.py. Let's open it and define these two values: the host is localhost and this is the port I have taken; you can take any port, it's up to you. Now back in app.py, both should be accessible.

The first thing I do is initialize Flask, and along with that I create one class called ClientApp: whatever image the user uploads will be saved as inputImage.jpg, which is why I set that filename here.

The first route I'll define is the training route: if you hit /train, it starts the training. If you know the basics of Flask you'll follow this: each route has a function under it, so hitting /train runs this function, which creates the TrainPipeline object and calls run_pipeline on it; that starts the pipeline, and once it finishes, it returns "Training Successful".
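The two utils imported here do little more than base64 round-tripping. A sketch, with the function names assumed from the description rather than copied from the repo:

```python
import base64

def decode_image(imgstring: str, filename: str) -> None:
    # The browser sends the upload as base64 text; write it back
    # to a real image file (inputImage.jpg) on disk.
    with open(filename, "wb") as f:
        f.write(base64.b64decode(imgstring))

def encode_image_into_base64(image_path: str) -> bytes:
    # Read the predicted image and base64-encode it for the JSON response.
    with open(image_path, "rb") as f:
        return base64.b64encode(f.read())
```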
I also need to define one default route: if you don't give any path, it renders one HTML page using render_template. We haven't created this HTML page yet, so let me do that. Inside the project we created a templates folder, and inside it index.html. You'll remember that when I showed the demo of the project, you saw a basic user app; that app has some HTML behind it, and this is the HTML you need here. You don't need to write it yourself; I copied the template from the internet. If you search for the Bootstrap website you will find lots of templates, and you can take the HTML from there. I already prepared a basic UI, so just use it as-is in your code and it will work fine. If you want your own custom design, visit Bootstrap: it's the first search result, and in the Examples section you will see lots of examples you can pick from.

And if you want to learn more about YOLOv5, or YOLOv6 and YOLOv7, I have a YouTube channel; in the playlists section you'll find playlists on YOLOv8, YOLOv7, YOLOv6, and YOLOv5 as well, plus lots of project implementation and deployment videos. If you like this content, you can also subscribe to the channel.

Now back to the project. The HTML page is prepared, so let me return to app.py. The default route is done; next I create the prediction route. It is simple: on /predict, first it gets the image from the request; the user passes the image, and initially it is base64, so I decode it back into image format, saving it under the filename defined on the ClientApp; I'll initialize that ClientApp later on.

Then, as you'll remember, in the notebook I was running one command to do the prediction: YOLOv5 has a file called detect.py. I'm doing the same thing here: I cd into the yolov5 folder and execute detect.py, chained with &&. The double ampersand lets you run multiple commands in one call: put && between them and the second command runs after the first, which is why I write the cd command, then &&, then python detect.py.

I pass the weights argument. For your case you would pass best.pt, the model you trained. But why have I given my_model.pt instead? As I told you, my freshly trained model is not good, since I trained it for just one epoch; I have another model trained for around 100 to 200 epochs, which is a bit better, so I provide that one here so you can also test with my model. If you are training your own model, just replace it with best.pt here,
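The detect call can be sketched the same way as the training one; the image size and confidence values here are plausible defaults, so treat them as assumptions:

```python
def build_detect_command(weights: str, source: str) -> str:
    # cd into yolov5, then run detect.py; "&&" runs the second command
    # only after the directory change succeeds.
    return (
        f"cd yolov5/ && python detect.py --weights {weights} "
        f"--img 416 --conf 0.5 --source {source}"
    )
```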
or whatever name you give your trained model; that is the file it will pick up. I will keep my own model. You can't see the model in the yolov5 folder yet; I will move it there in a moment. Then I pass the image size, the confidence threshold, and the source, i.e. where the input lives: whenever a user uploads an input image it gets saved inside the data folder, so that is the location I give. After the prediction I encode the output image again, converting it to base64, wrap it as JSON, and pass it back to the front end. After the prediction I also remove the runs folder, and any exception is raised through the exception handler. That is the whole logic.

Now let me place the model inside the yolov5 folder: I have already copied it, so I'll paste my_model.pt there; now you can see the model, and the route will pick it up.

The last route is the live route, for doing live prediction with your web camera: hit /live to run it. It executes the same command, but instead of giving an image source I give source=0. Zero is the default ID of your camera; if you are using multiple cameras, you pass 0, 1, and so on. Say I want to access my second camera: I give 1 here. By default, on a laptop it should be 0, and if you are using a single external webcam, you also give 0.
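For the live route, the only change is the source: a camera index instead of a file path. A sketch with the same assumed flags:

```python
def build_live_command(camera_index: int = 0, weights: str = "my_model.pt") -> str:
    # --source 0 streams from the default webcam; 1, 2, ... select
    # additional attached cameras.
    return (
        f"cd yolov5/ && python detect.py --weights {weights} "
        f"--img 416 --conf 0.5 --source {camera_index}"
    )
```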
After the prediction it again removes the runs folder, returns "Camera starting", and raises any exception through the handler. Now I initialize my ClientApp and run the app.

I think the app is ready, so let's test it. I'll collapse the files, open my terminal, clear it, and execute python app.py. If you get a firewall prompt, just allow access. It's running; let me bring up my browser and go to localhost on the configured port. This is the application. Let me upload an image: I'll upload this ring pack image, and it detects it with a confidence score of 0.93, so it is able to predict successfully. If you want to train your model, just give /train here and press Enter and it will start the pipeline; I'm not doing it now because it takes time, but you can test it. And if you want to do live prediction, just give /live; I'm not doing that either, because launching my camera takes time. So I've kept these routes: one for training, one for live prediction, and one for image prediction.

That is how we have completed the entire project implementation. You can add some extra layers if you want, it's up to you, but I think this is enough for this project. The project is ready, so let me stop the execution with Ctrl+C. After stopping, make sure you have removed the runs folder; the runs folder should not
remain inside the project folder, so let me remove it. Done; now it's fine.

Now we are ready to deploy this project. But before that I need to prepare the Dockerfile, so let me open it; you need to start from one base image for Docker. First, though, let me commit the changes with the message "prediction".

Now let me show you the Docker commands you need here. You don't need to execute them manually, because when you do the CI/CD deployment on AWS they run automatically. First I take a python:3.7-slim-buster base image, because I'm using Python 3.7; if you know a little Docker you'll be familiar with what an image is. Then I create one working directory called /app and copy all of my code into it. There I update the apt package manager and install the AWS CLI; after that I update apt again and install the utility packages I need, then I install my requirements.txt. After installing everything, the container runs python3 app.py, which launches the application. That is the simple Dockerfile you need to write.

Along with that, you also need to write your workflow using GitHub Actions. I can remove this leftover .git folder now; it is not required here. I already prepared the main.yaml; you just need to copy the same yaml content as-is and it will work for every project. So this is the workflow I have prepared: first I name the workflow, and whenever
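The Dockerfile described above would look roughly like this; the exact utility packages are assumptions (libgl1 and libglib2.0-0 are common OpenCV prerequisites), so match them to your own error messages:

```dockerfile
# Base image matching the Python version used in the project
FROM python:3.7-slim-buster
WORKDIR /app
COPY . /app

# AWS CLI for the CI/CD steps, plus runtime libs OpenCV typically needs
RUN apt update -y && apt install -y awscli
RUN apt-get update && apt-get install -y libgl1 libglib2.0-0 unzip

RUN pip install -r requirements.txt
CMD ["python3", "app.py"]
```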
you push your changes to the main branch, the workflow starts. I set the permissions, and then the jobs run. This is a CI/CD deployment, so it runs in three stages: continuous integration, continuous delivery, and continuous deployment; let me tell you what happens in each.

In continuous integration I'm not doing much, just echoing commands like "running unit tests". If you have test cases in your code, say integration and unit tests, you would execute those files here; this project has no testing files, which is why I'm only echoing.

In continuous delivery, first I authenticate with my AWS account, reading my credentials from the repository secrets; I'll show you how to add the secrets later. Then I log in to ECR, the Elastic Container Registry. I'll explain ECR properly during the deployment, but briefly, it is like Docker Hub: once you have built a Docker image, you can keep it inside an ECR repository. After logging in, I build the image of the entire source code and push that image to ECR.

In continuous deployment I again log in with my AWS credentials and to ECR, pull the image from ECR onto my EC2 machine, and run the Docker container there. I specify the port, and I give the container an app name; that name is the only thing you need to provide, otherwise you don't need to change anything. So this is
the YAML file that will drive the CI/CD deployment. Now my project is ready, so first I will show you the AWS deployment. I have already added this README file; as you can see, I have documented all the commands you need to execute to run this project, and I have also added the deployment steps I'll be following. This is the AWS CI/CD deployment with GitHub Actions. The deployment strategy: first we build one Docker image of our source code, then we push that image to the ECR repository, the Elastic Container Registry, which is a service from Amazon Web Services. After that we launch our EC2 machine, pull the image from ECR onto the EC2 machine, and launch the Docker container inside the EC2 machine. Those are the deployment steps. These are the IAM policies I need: I'm going to use an EC2 machine, so I give EC2 full access, plus Elastic Container Registry full access. I don't need every service in AWS, only these two. Then I need to run some commands; I have given everything here, so I'll follow along. At the end I have also given the Azure CI/CD deployment steps, so after completing this part I will start with Azure. First, as mentioned, you need to log in to your AWS console; I have already logged in. Second, you need to create one IAM user for the deployment. Let me create mine: I search for IAM and click on it. IAM is known
as Identity and Access Management: instead of giving out all the permissions of your root account, you grant only some specific permissions, and that's why we use IAM. Here I click on Users; I need to create one IAM user, so click on Add user, give your username (I'll give "waste-user") and click Next. Now you need to attach the policies. I have already listed them in the README, so I copy the first one, search for it here, and select it; then I copy the second one and add it too, and click Next. You can see the two service permissions I have given. Now I click Create user, and the waste-user has been created. I go inside the user; I need to get the security credentials, so I click on Security credentials, where you will see something called Access keys. Click on Create access key, select Command Line Interface (CLI), tick "I understand the above recommendation", click Next, then Create access key. Now you can download a CSV file; keep this CSV file so we can use it later, because it is needed. After that I need to create an ECR repository to store my Docker image. I go back to the AWS services, search for ECR, Elastic Container Registry, and click on the first result. Here I click Create repository, keep it private, and give it the name "waste". I click Create repository, and there is my waste repository. I'll copy this URI and save it somewhere, because I need it later on; I'll save it inside my README file itself. I mentioned it somewhere here, so I'll replace
my previous one. This is the URI you need later on; I'm saving it here just for my own reference. Now I go back; the ECR repository is created, so next I'll create the EC2 machine, and I search for EC2. EC2 gives you virtual servers, in other words a virtual machine. Here I launch one: click Launch instance and give it a name; I'll give "waste-machine". If you're doing a deep learning project, that's when you would take a GPU instance; you have lots of AMI choices here, like the Deep Learning AMI GPU PyTorch one, or TensorFlow ones. YOLOv5 uses PyTorch in the backend, so you could select any PyTorch GPU machine if you want, but I won't select a GPU machine because it's costly and will charge a lot, so I'll keep the default image. Those of you on free-tier access can also use that, and if you're doing a real production deployment, go with a GPU instance. Then I select the instance type; I want at least 16 GB of memory, so I select t2.xlarge, this one. You can generate a key pair here: click, give any name, I'll give "waste". It's only strictly necessary if you want to access your machine from a third-party tool like MobaXterm or PuTTY; that's when you need the .pem file, but you need to create it anyway to continue with the machine creation. I tick these two checkboxes, and then you select the storage size; I'll take at least 32 GB of storage. I think that's fine, so I launch the machine and click View all instances. As you can see,
my machine is running; the status shows "running". I click on the instance ID, then Connect, and simply click Connect again; it launches a terminal for me in the browser. Let me clear it. First you need to execute some commands, starting with the package manager update, so let me copy and run them one by one. Done; now the second one, and I answer "y", meaning yes. It may take some time, so let's wait. If you get a window like this, just press Enter on your keyboard and it will disappear. Done; let me clear the screen. Now you need to install Docker on this machine, because by default there won't be any Docker. These are the commands to execute, so let me run them one by one. Done; now paste this command, and then the last one. To check that Docker is running, just type `docker --version`; if you see the version, it's running. Next I need to configure my EC2 machine as a self-hosted runner. To do it, let me make a copy of this same project; I'll use it as documentation while I work here. First click on Settings, and make sure you have committed your code to GitHub, otherwise you won't get the option. Under Actions you have something called Runners; click "New self-hosted runner", select Linux as the machine, and these are the commands you need to execute, so let's copy them one by one. Basically, you are connecting your EC2 machine with your GitHub, so that whenever you commit your code to GitHub it automatically gets deployed here. That is the connection we
need to make by executing these commands. Now I execute these two commands as well; see, it's connecting with GitHub Actions. It's connected to GitHub now, and it asks me to enter the name of the runner group; I keep the default and press Enter. Then it asks for the name of the runner, so I give "self-hosted"; make sure you give this same name. It asks for labels; I skip by pressing Enter, and press Enter again. It's connected now; to check, just run this command. "Connected to GitHub" and "Listening for Jobs" means it connected successfully. Next I need to set up my GitHub secrets. I open my project, go to "Secrets and variables", click on Actions, then "New repository secret". The first secret I add is the AWS access key ID; I copy the name and paste it here. Where do you get the access key? Remember, we downloaded the access-keys CSV file; open it up, and this is the access key. Don't share these keys or secrets with anyone, otherwise they will also be able to access your account; I'm showing mine only because I'll delete everything after the deployment. I add it, then click "New repository secret" again and add the AWS secret access key; that's the last value, so copy everything after the comma and paste it here. Then I add the AWS region. Make sure you know which region you are in; I thought us-east-1, but checking, I'm actually in Mumbai, so I should give ap-south-1. Let me double-check how to write it; yes, ap-south-1 is right. Now I click Add secret.
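All of these secrets feed the workflow YAML committed to the repository. As a rough sketch (the job names, action versions, and container options here are my assumptions to illustrate the shape described earlier, not a copy of the actual file), the workflow looks something like this:

```yaml
name: CI/CD
on:
  push:
    branches: [main]

jobs:
  continuous-integration:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: echo "Running unit tests (placeholder)"

  continuous-delivery:
    needs: continuous-integration
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # Authenticate using the repository secrets added above
      - uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}
      - id: ecr
        uses: aws-actions/amazon-ecr-login@v1
      # Build the source-code image and push it to the ECR repository
      - run: |
          docker build -t ${{ steps.ecr.outputs.registry }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest .
          docker push ${{ steps.ecr.outputs.registry }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest

  continuous-deployment:
    needs: continuous-delivery
    runs-on: self-hosted          # the EC2 machine configured as a runner
    steps:
      - uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}
      - uses: aws-actions/amazon-ecr-login@v1
      # Pull the freshly pushed image and start the app container
      - run: |
          docker pull ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
          docker run -d -p 8080:8080 --name=waste ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
```

The `needs:` keys chain the three jobs so each stage waits for the previous one to succeed, which is exactly the integration, then delivery, then deployment order you see in the Actions tab.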
Now the last two things required: the AWS ECR login URI. Remember, we copied one URI into my README; let me open it. This is the URI; copy it up to ".com" and add it here. Then I click "New repository secret" again and add the ECR repository name; the name of my repository is "waste", so copy that name and give it here. I think everything is done, so now we are ready to deploy the project. I open my project in VS Code and push the changes: I open my terminal, clear it, and write `git add .`, then `git commit -m "CI/CD"`, then `git push origin main`. Before that I need to re-comment this part, because I had already uncommented it: initially you keep these three lines commented, because at first there won't be any container running; from the second deployment onwards you uncomment them, because by then there will be a container to stop. So I open my terminal and push again. Done; now go back to GitHub and refresh the page, and you will see an Action running; you'll see this yellow mark. Click on Actions, and the CI/CD workflow is running. First it runs the continuous integration, so let's wait; after that it starts the continuous delivery. Inside continuous delivery it logs in with my AWS account, builds the image, and pushes it to ECR; see, it is building the image now. It may take some time, so I'll come back when it's done. As you can see, continuous delivery is also done, and now it's running the continuous deployment: it logs in to ECR again, pulls the image from ECR, and runs it on my EC2 machine. Let's wait, and as you can see, all the
steps ran successfully. Now I go back to my instances, find the waste-machine, and click on its ID. Here you have a public IP address; copy it and paste it into the browser. Initially it won't launch, because we haven't done the port mapping yet. To do the port mapping, I go to Security, then Security groups; I click on the security group, click "Edit inbound rules", and add a rule: Custom TCP, port 8080. You need to open the port number you have given in app.py. I select 0.0.0.0/0 as the source and save the rules. Now if I come back here and add port 8080 to the IP address, it should work; see, it has launched my application. Now you can do the prediction, which means it's running successfully; you can train it, and you can do the prediction, everything works. Notice that even if I remove this trailing part of the URL, it still works, so it's running perfectly. Don't just leave it running, though, otherwise it will keep charging you, so you also need to learn how to terminate the instance. I click on Instances, select the instance to terminate, click Instance state, then Terminate instance; see, it is shutting down, and after some time it will terminate. Then I also remove my ECR repository: I search for ECR, find the repository called "waste", click Delete, and type "delete" to confirm. Done. I also need to remove my IAM user, so I search for IAM, click on Users, and simply delete the waste-user. I think that's everything; so yes, we have successfully completed the deployment on AWS.
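For reference, the console clean-up above can also be scripted with the AWS CLI. This is a hedged sketch: the instance ID and access-key ID are placeholders, and the user, policy, and repository names follow this walkthrough:

```shell
# Terminate the EC2 machine (the instance ID is a placeholder)
aws ec2 terminate-instances --instance-ids i-0123456789abcdef0

# Delete the ECR repository together with any images it still holds
aws ecr delete-repository --repository-name waste --force

# An IAM user can only be deleted once its keys and policies are removed
aws iam delete-access-key  --user-name waste-user --access-key-id <ACCESS_KEY_ID>
aws iam detach-user-policy --user-name waste-user \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2FullAccess
aws iam detach-user-policy --user-name waste-user \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryFullAccess
aws iam delete-user --user-name waste-user
```

Scripting the teardown this way makes it harder to forget one of the three billable resources after a demo.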
Okay, now I will show you how to deploy this project on the Azure cloud as well, with Azure CI/CD. For Azure I already showed the deployment process in a previous project; it's quite easy. You follow the same structure and you don't need to change anything major, only some slight modifications, which I showed there. I will link that video so you can also see how to deploy on the Azure side. You can follow it as-is: just use the instructions and change only your project name; everything else stays the same. For Azure you don't need to create the workflow YAML file yourself, because we'll be using something called the Web App service, and it creates that YAML file for you automatically. So I suggest watching that deployment video; it was from my previous project, and instead of that project you simply use your own. So, let's start with our Azure deployment. I have already updated the GitHub README; as you can see, all the Azure CI/CD deployment steps are added here, and I will explain them one by one. A few changes are needed in your project, so let me show you. Everything stays common, the Dockerfile as well; this is the same Dockerfile I'll be using. You only need to make one change inside app.py: instead of taking port 8080, you need to take port 80. This line is for AWS, this one is for localhost, and this one is for Azure; activate (uncomment) whichever one you need and comment out the rest. So yes, everything else will be common, and
here I don't need to create any workflow file, because I'm going to use the Web App service from Azure, and it will automatically create that CI/CD YAML file for me when I connect the repository. So after making the change, you just need to commit the code and push it to your GitHub. I have already pushed mine, as you can see if I open my GitHub; let me show you app.py, and yes, it is already pushed. In AWS I pushed the code at the end, but here you need to push it beforehand, because I will be selecting the repository from the Azure portal; so push the code and have the repository ready. First, let me tell you the process I'm going to follow. The deployment steps, as you can see: first we build a Docker image of our source code, then we push that image to the Container Registry. In AWS we had the Elastic Container Registry, and in Azure we have the Container Registry. So I will push my Docker image into the Container Registry, launch a web app server in Azure, pull the Docker image from the Container Registry, and run it in my Web App server. Those are the steps we'll follow. First you need to log in to your Azure portal; I have already logged in. Then, the first thing to do is search for "container registry"; you will see "Container registries" there, and I click on it. You need to create one container registry: I click Create, and the first thing it asks for is a resource
group. So let's create one: I click "Create new" and give the name; I'll give "chicken", but it should be a unique name, so something like "chicken-app". I'll copy the name, use this same name everywhere, and click OK. Next you give the registry name, so I'll give "chickenapp" there as well. Then I select the Central US location, keep everything else as default, click "Review + create", and create it. It may take some time; after that you will see a button, so click "Go to resource". Here you will see something called Access keys; click on it. First copy this login server; I'll copy it and save it in my project. Let me open the README file and save the login server here. This was my previous login server, so I'll just change it, and I'll change it here as well, because these are the commands I need to execute later on; that's why it's better to update them now. I'll put the "chicken" name here too. After that, you need to enable the Admin user toggle, and from there copy the password; I copy the first password and save it inside my project. Don't share this password with anyone; I'm keeping it here just for reference, and I will delete the whole resource group later anyway. So this is the password I'll keep here, and with that everything is ready. Now I search for "Web App for Containers"; this is the service I'm going to use. First you select your resource group, which was "chicken-app". Then you give the name of the web app; I'll give the same name. It should be a Docker container on a Linux
machine, which is fine. The region should be Central US, and everything else stays default. I click Next, and Next again. Here you select "Single container" and "Azure Container Registry". It asks you to select the registry name, which was "chickenapp". Now it needs the image, but I don't have the image, because I haven't built it yet. For this you need to install Docker on your system. Let me open my Docker Desktop; I already have Docker installed. As you can see, I've launched Docker Desktop; by default it looks like this, and it should be running, meaning the engine should be running. If you don't have Docker installed, install it first, otherwise you won't be able to continue; it's very easy, just follow the Docker documentation and install Docker Desktop on your system. I'm assuming you have already done that. I'll minimize it; just make sure it's running. Now I open my terminal and clear it. Remember, I saved some of the commands, so let me open my code. First I need to build the image, so I copy the `docker build -t ...` command, open my terminal again, and execute it. It will build the image. It will take time, because my project is huge, so let's wait; I'll pause the video and come back once it's done. As you can see, the image build is done. Now if you open your Docker Desktop, you will see the image here; the image name was
chickenapp; see, the chickenapp image is there. Now let me check the command: I need to log in to my Azure registry, and this command will do it. I open my terminal and give the command; when I press Enter, it asks for the username, which should be the registry name you gave, so I give "chickenapp". Then it asks for the password; remember, I saved one password here. Copy it as it is, go to the terminal, right-click to paste (it won't display the password), and press Enter. You will see a success message; see, "Login Succeeded". Now you push this image to your registry: `docker push`, then the image name, and press Enter. See, it's pushing the image; it may take some time, so let's wait. As you can see, it has pushed successfully; it was around 2.5 GB, so yes, it's done. Now I go back to my Azure portal, and if I show you where your image is: click on the container registry, and see, this is chickenapp, the image that was pushed. Let me close it. Now I refresh the Web App creation page; it asks for the same information again, so I give "chickenapp" as the web app name, the same name, it should be Central US, and everything else default. I click Next; the source should be Azure Container Registry, and it automatically selects the image; see, it has selected the image with the "latest" tag. I click Next, Next again, and simply click Create. It may take some time, so let's wait. Then just click "Go to resource", and on the left-hand side you will see something called Deployment Center.
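The build, login, and push steps just walked through boil down to commands like these; the login server and image name follow the "chickenapp" registry created above, so substitute your own:

```shell
REGISTRY=chickenapp.azurecr.io

# Build the project image, tagged with the ACR login server
docker build -t $REGISTRY/chickenapp:latest .

# Log in: the username is the registry name, the password comes from Access keys
docker login $REGISTRY --username chickenapp

# Push the image up to the Azure Container Registry
docker push $REGISTRY/chickenapp:latest
```

Tagging the image with the full login server prefix is what tells `docker push` which registry to send it to.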
I click there, and you can see these options. First you need to turn Continuous deployment on; I click "On". Then I select the second option, GitHub Actions, because I'm going to use GitHub Actions as my CI/CD tool. Now you need to sign in and connect your GitHub account; I have already connected mine. I select the organization, which is "entbappy", then the repository I want to deploy, which was the Chicken Disease Classification project repository. The branch is the main branch, and the registry is the Azure Container Registry; everything else stays default. Now click "Preview file"; this is the YAML workflow file it will create inside the .github folder. Let me close it and click Save; you can see the progress here. The deployment has started. To check it, I come back to my project, and if I refresh, see, a workflow is running. It has automatically created and committed that workflow YAML file; you can see the "main_chickenapp" file it has written, so everything is automated. I click on Actions; this action is running, and I click into it. First it builds the image, so let's wait; it may take some time, so I'll pause the recording and come back once it's completed, and it may take a while for you too. As you can see, the build is completed, and now it is running the deployment; I click on it, and the deployment is also completed. Now I come back, click on Overview, and here you will see the link, the Default domain. I copy it and paste it into the browser. It may take some time to launch your web app; when I first tried, it was taking around four to five minutes
to launch, because in the backend it has to start that Docker container, which needs some time; just wait a bit and it will automatically launch the web application for you. And now you can see it has launched my application; if I refresh the page, see. Now you can hit /train to start the training, and after training you can do the prediction. So yes, it's running perfectly. Now I will show you how to delete the resources you have created. Go to Home first; here you will see the resource group, "chicken-app". I click on it, and at the top you can see "Delete resource group"; click it, because if you just leave it, it will charge a lot. Copy the name, paste it to confirm, and delete; after some time it will delete everything. So yes, we have successfully completed the entire project implementation along with the deployment. I hope you liked this video; if you did, please share it. I hope you enjoyed this whole series and learned a lot. That's all from this video; thank you so much for watching, and I will see you next time.
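One closing reference: the same create-and-delete lifecycle can be driven from the Azure CLI instead of the portal. This is a sketch under the names used in the video, not the exact on-screen flow:

```shell
# Create the resource group and a registry with the admin user enabled
az group create --name chicken-app --location centralus
az acr create --resource-group chicken-app --name chickenapp \
  --sku Basic --admin-enabled true

# Prints the admin credentials shown on the registry's Access keys page
az acr credential show --name chickenapp

# When finished, deleting the resource group removes everything inside it
az group delete --name chicken-app --yes
```

Because every resource in this walkthrough lives in the one resource group, the final `az group delete` is the whole teardown.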
Info
Channel: DSwithBappy
Views: 3,788
Keywords: Deep Learning Tutorial, Computer Vision Project, YOLOv5 Model Training, Real-time Object Detection, Image Processing, AI Object Detection, Machine Learning Deployment, Custom Object Detection, YOLOv5 Optimization, Data Annotation, Object Detection Pipeline, Object Detection in Images, YOLOv5 Fine-Tuning, Deploying ML Models, Object Detection Applications, Computer Vision Development, YOLOv5 Accuracy, Object Detection Techniques, Object Detection Challenges
Id: cr17R0fyVXc
Length: 220min 53sec (13253 seconds)
Published: Thu Sep 21 2023