CICD Pipeline | DevOps Tutorial with Project | CI CD Pipeline using AWS | DevOps CI CD Project | K8S

Video Statistics and Information

Captions
Hello friends and welcome back to my channel, I'm Ashfak. Today we will create a complete DevSecOps project using Jenkins, SonarQube, Trivy, Docker and Kubernetes. This CI/CD pipeline will work in a completely automated manner: any change on the GitHub repository will trigger the pipeline, and once the pipeline execution is completed you will see the changes appearing on the application, which we will deploy on Kubernetes.

In short, let me explain the flow of today's project. First of all we will use Terraform to create the instance for Jenkins, and all the packages will be installed in that instance by Terraform itself. In our CI/CD pipeline, if a user makes any change on the GitHub repository it triggers the pipeline, and the pipeline starts executing its stages: first it performs the SonarQube analysis of our code, then it installs the npm dependencies, then it runs a Trivy filesystem scan of the code, then it builds the Docker image and pushes that image to Docker Hub, then it scans that Docker image using Trivy, and finally the Jenkins script deploys the pods on Kubernetes using the Docker image. We will also have monitoring configured using Prometheus and Grafana, which will monitor the EKS cluster as well as Jenkins, and after every build completes we will get an email notification on our Gmail ID with the Trivy scan results and the complete log of the finished job.

So let's start building our project. This is my Windows 10 system, which has Visual Studio Code and the AWS CLI installed; you can download Visual Studio Code and the AWS CLI installer from their respective download pages. I will create a folder for my project and, inside it, one more folder called Jenkins-SonarQube-VM. In Visual Studio Code I will open the folder I created, and inside Jenkins-SonarQube-VM I will create one file, main.tf, and paste its content (all of these file contents are available in the description section). This is the AMI, which you need to change as per your region; the instance type I am going to create is t2.large; this is my key name for the Linux server; and this is the user data file, install.sh, which will run inside the created EC2 instance. It gives the EC2 instance the name Jenkins-SonarQube, sets the volume size to 40 GB, and also creates the security group. Note one thing here: we don't need to create a VPC, because every account has a default VPC (if I go to my account and open VPC, this is the default VPC in my account), so we don't need to create a custom VPC; the script just associates the security group with the default VPC of the account. It also opens the inbound ports 22, 80, 443, 8080 for Jenkins, 9000 for SonarQube and 3000 for our application, and this would be the name of the security group. I will save this file.
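The exact main.tf is provided in the description; as a rough sketch of what such a file contains (the AMI ID, key name, security group name and CIDR ranges below are placeholders and assumptions, not the exact values from the video), it might look like this:

```hcl
# Sketch of the Terraform config for the Jenkins/SonarQube VM (values are placeholders)
resource "aws_instance" "jenkins_sonarqube_vm" {
  ami                    = "ami-0123456789abcdef0"   # change this per your region
  instance_type          = "t2.large"
  key_name               = "my-key"                  # an existing key pair in your account
  vpc_security_group_ids = [aws_security_group.jenkins_vm_sg.id]
  user_data              = file("install.sh")        # runs inside the instance on first boot

  root_block_device {
    volume_size = 40
  }

  tags = {
    Name = "Jenkins-SonarQube"
  }
}

resource "aws_security_group" "jenkins_vm_sg" {
  name        = "Jenkins-VM-SG"
  description = "Inbound access for SSH, HTTP/S, Jenkins, SonarQube and the app"

  # 22 SSH, 80/443 HTTP(S), 8080 Jenkins, 9000 SonarQube, 3000 application
  dynamic "ingress" {
    for_each = [22, 80, 443, 8080, 9000, 3000]
    content {
      from_port   = ingress.value
      to_port     = ingress.value
      protocol    = "tcp"
      cidr_blocks = ["0.0.0.0/0"]
    }
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```

Because no VPC is declared, the security group attaches to the account's default VPC, which matches the note above.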
I will now create one more file, provider.tf, and provide its content: here we are actually registering the provider, which is AWS, and this is my region, ap-south-1. I will save this file. Finally I will create the install.sh file, which is going to run inside the created EC2 instance. This install.sh will first update the packages, then install the JDK, then install Jenkins on the EC2 instance, then install Docker, then run SonarQube as a container on the instance (this is the standard SonarQube image name), and after that it will install Trivy. I will save this file as well; the path to this install.sh is the one we defined in main.tf.

I will now go to the AWS console, IAM, Users. We are going to create a user with which we will log in to AWS from the command line: Create user, give the name cloud-user, Next, Attach policies directly, give full access to this user, Next, Create user. I will open the user's Security credentials and click Create access key, choose Command Line Interface, Next, give the description cloud-user and create the access key. I will copy the access key, go to VS Code, right-click the folder and select Open in Integrated Terminal. In the terminal I will run aws configure, paste the access key and hit Enter, then copy the secret key from the console and provide it in the terminal. We have successfully logged in from the terminal; on the AWS console I will download the CSV file and click Done.

In VS Code, inside this folder, I will run terraform init; Terraform has been initialized in this folder. I will then run terraform plan, which shows the resources it is going to create in our AWS account, and finally terraform apply -auto-approve. Instance created. If I go to the AWS console instances page it is not showing the instance, because we used the region ap-south-1 and we are currently in Ohio, so I will switch my region to ap-south-1. This is the instance that has been created and this is the security group that has been created; if I open the security group, these are the inbound rules applied as per the Terraform script.

I will copy the public IP of the instance and, on my local system, open a terminal session: SSH, paste the public IP, provide the username ubuntu, go to the advanced settings, choose to use a private key and select my key. If I run jenkins --version, Jenkins has been installed on the system; docker --version shows Docker is also installed; trivy --version shows the version of Trivy installed; and docker ps -a shows the SonarQube container, created about a minute ago. We are actually running SonarQube as a container on this EC2 instance.

You might be doing this project in multiple sittings, so when you shut down this EC2 instance you should first stop the Docker container: run docker ps -a to get the container ID, then docker stop <container ID>, and after that you can shut down the instance. Once you are back, run docker start <container ID> to start SonarQube again.
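The full install.sh is also in the description; a minimal sketch of the steps it performs (package names, repository URLs and the SonarQube image tag here are assumptions about a typical Ubuntu setup, not a copy of the original script) could look like this:

```bash
#!/bin/bash
# Update packages and install Java, which Jenkins requires
sudo apt-get update -y
sudo apt-get install -y openjdk-17-jre

# Install Jenkins from the official apt repository (listens on port 8080)
curl -fsSL https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key | \
  sudo tee /usr/share/keyrings/jenkins-keyring.asc > /dev/null
echo "deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian-stable binary/" | \
  sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null
sudo apt-get update -y
sudo apt-get install -y jenkins

# Install Docker and run SonarQube as a container on port 9000
sudo apt-get install -y docker.io
sudo usermod -aG docker ubuntu
sudo chmod 666 /var/run/docker.sock
docker run -d --name sonar -p 9000:9000 sonarqube:lts-community

# Install Trivy from the Aqua Security apt repository
sudo apt-get install -y wget apt-transport-https gnupg lsb-release
wget -qO - https://aquasecurity.github.io/trivy-repo/deb/public.key | sudo apt-key add -
echo "deb https://aquasecurity.github.io/trivy-repo/deb $(lsb_release -sc) main" | \
  sudo tee /etc/apt/sources.list.d/trivy.list
sudo apt-get update -y
sudo apt-get install -y trivy
```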
We have installed the EC2 instance for Jenkins and SonarQube; now we need to configure Jenkins on that EC2 instance. I will browse the public IP of the instance on port 8080, go to the path shown on screen in the terminal and copy the initial admin password, install the suggested plugins, create the user cloud-admin and set the password. We are now inside the Jenkins dashboard.

Next we need to install a few plugins required for our project. I will go to Manage Jenkins, Plugins, Available plugins, and search for and select: Eclipse Temurin installer, SonarQube Scanner, Sonar Quality Gates, NodeJS, and the Docker plugins (Docker, Docker Commons, Docker Pipeline, Docker API and docker-build-step), then click Install. Plugins installed.

I will again go to Manage Jenkins; now we need to configure a few tools required for the project, so I will go to Tools. Under NodeJS installations I will click Add NodeJS, give the name node16, tick Install automatically, install from nodejs.org and select version NodeJS 16.2.0. The next tool is the JDK: under JDK installations I will click Add JDK, give the name jdk17, add an installer that installs from adoptium.net and select version 17.0.8.1+1. Next is Docker: under Docker installations I will click Add Docker, give the name docker, tick Install automatically and add an installer that downloads from docker.com. The last tool is the SonarQube scanner: under SonarQube Scanner installations I will click Add SonarQube Scanner, give the name sonar-scanner, keep Install automatically ticked and install from Maven Central; the default version is fine. I will click Apply and then Save.

We have configured Jenkins; now it's time to configure SonarQube. I will copy the public IP of the instance and browse it on port 9000, provide the default credentials, which are admin/admin, and set a new password. I will go to Administration, Security, Users, click Update Tokens, give the name "token for jenkins", click Generate, and copy this token somewhere on your local system. Then in Jenkins I will go to Manage Jenkins, Credentials, Add Credentials, select the kind Secret text, give the ID SonarQube-token, paste the copied secret which we just created, and click Create.

Now I will go to Manage Jenkins, System, because we need to add the SonarQube server there. Under SonarQube servers I will click Add SonarQube, provide the name SonarQube-server, and for the URL copy the private IP of the instance and enter http://<private IP>:9000; for the token I will select the credential we just created, then Apply and Save. I will now go back to the SonarQube dashboard and open Quality Gates, because we need to create a new quality gate here: I will click Create, give the name SonarQube-quality-gate and save. Finally we need to create the webhook between SonarQube and Jenkins: I will go to Administration and, under Configuration, open Webhooks, click Create, provide the name Jenkins, and for the URL copy the private IP of the instance and enter http://<private IP of the instance>:8080/sonarqube-webhook/, then click Create.
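Before the pipeline job is created in the next step, here is a rough sketch of how these tool, server and credential names are typically wired together in a declarative pipeline script (the repository URL and SonarQube project key are placeholders, the IDs must match whatever names you configured above, and cleanWs() additionally assumes the Workspace Cleanup plugin is present):

```groovy
pipeline {
    agent any
    tools {
        jdk 'jdk17'        // JDK tool name from Manage Jenkins > Tools
        nodejs 'node16'    // NodeJS tool name from Manage Jenkins > Tools
    }
    environment {
        SCANNER_HOME = tool 'sonar-scanner'   // SonarQube scanner tool name
    }
    stages {
        stage('Clean Workspace') {
            steps { cleanWs() }
        }
        stage('Checkout from Git') {
            steps { git branch: 'main', url: 'https://github.com/<your-user>/<your-repo>.git' }
        }
        stage('SonarQube Analysis') {
            steps {
                withSonarQubeEnv('SonarQube-server') {   // server name from Manage Jenkins > System
                    sh '''$SCANNER_HOME/bin/sonar-scanner \
                          -Dsonar.projectName=<your-project> \
                          -Dsonar.projectKey=<your-project>'''
                }
            }
        }
        stage('Quality Gate') {
            steps {
                script {
                    waitForQualityGate abortPipeline: false, credentialsId: 'SonarQube-token'
                }
            }
        }
        stage('Install Dependencies') {
            steps { sh 'npm install' }
        }
        stage('Trivy FS Scan') {
            steps { sh 'trivy fs . > trivyfs.txt' }
        }
    }
}
```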
We have configured Jenkins, we have configured SonarQube, and we have integrated SonarQube with Jenkins as well; now it's time to create the pipeline. First of all I will create the token on SonarQube for our project: on the SonarQube dashboard I will go to Manually, give the project display name and the project key, which is the same as the display name, keep the main branch and click Set Up. I will click Locally and then Generate; this is the token name that will be used in our project. I will click Generate, Continue, Other, Linux, and these are the commands that will be used in the script.

Now I will go to the Jenkins dashboard, New Item, and give the job name Youtube-CICD-Pipeline, OK. I will tick Discard old builds and set the maximum number of builds to keep to two. I will not set any trigger for now and will come directly to the script section, where I will paste the pipeline script. In the script, first of all we are calling our tools, jdk17 and node16, the two we installed in Jenkins, and this is the name of our SonarQube scanner added in Jenkins, sonar-scanner. In the first stage it cleans up the workspace, then it does the checkout from Git (this is my GitHub repository), and then it comes directly to the SonarQube analysis stage: this is the server name, SonarQube-server, which we added in Jenkins, this is the scanner, sonar-scanner, and this is the project name and the key which we generated in SonarQube. Under the quality gate stage, this is the token which was created in Jenkins, SonarQube-token. Then it installs the dependencies, and then it performs the Trivy filesystem scan. I will click Apply, Save, and then Build Now. Job completed. If I go to the SonarQube dashboard, Projects, this is the analysis of our project, and under Issues these are the issues found in our project during the SonarQube scan.

Now we need to configure our pipeline to build the Docker image and push that image to Docker Hub, so I will go to Docker Hub first. We need to create a personal access token and add that token into Jenkins: you can go to your user menu, My Account, Security, and create the access token there; I have already created mine. Then in Jenkins: Manage Jenkins, Credentials, Add Credentials, kind Username with password; for the username I will provide my Docker Hub username, for the password I will paste my personal access token, the ID I will give is dockerhub and the description Docker Hub; click Create.

Now I will add the two stages to our Jenkins pipeline: I will go to my job, Configure, and add them. The first stage is Docker Build & Push: this is the credential we added and this is the tool name we gave to Docker. Using the first command it builds the Docker image, and this is the name of the image, youtube-clone; then it tags the local image for Docker Hub (this is my Docker Hub username and the tag is latest); then, using the next command, it pushes that tagged image to Docker Hub; and once it is pushed, the next stage runs the Trivy scan of the Docker Hub image (this is the image on Docker Hub). I will click Apply, Save, and again click Build Now. Job completed, and if I go to Docker Hub and refresh, you can see the image has been pushed here just now.
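A sketch of those two added stages (the Docker Hub username is a placeholder, and the credentialsId and toolName must match the dockerhub credential and docker tool configured above):

```groovy
stage('Docker Build & Push') {
    steps {
        script {
            withDockerRegistry(credentialsId: 'dockerhub', toolName: 'docker') {
                sh 'docker build -t youtube-clone .'                                  // build the local image
                sh 'docker tag youtube-clone <dockerhub-user>/youtube-clone:latest'   // tag it for Docker Hub
                sh 'docker push <dockerhub-user>/youtube-clone:latest'                // push it to Docker Hub
            }
        }
    }
}
stage('Trivy Image Scan') {
    steps {
        sh 'trivy image <dockerhub-user>/youtube-clone:latest > trivyimage.txt'       // scan the pushed image
    }
}
```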
In this section of the video we will set up monitoring with the help of Prometheus and Grafana, and for Prometheus and Grafana we will create a separate instance. I will create a new folder under my project folder called Monitoring-Server, and in VS Code I will go to that folder; we will create this instance with Terraform as well. First I will create the main.tf file under this folder and paste the code: this is my AMI (you need to change the AMI as per your region), the instance type I am going to create is t2.medium, this is my key, it will run the commands inside the created instance using the install.sh file, the name of the instance will be Monitoring-Server, the volume size will be 20 GB, and it is going to create a separate security group, Monitoring-Server-SG, with inbound rules allowing ports 22, 80, 443, 9090 for Prometheus, 9100 for Node Exporter and 3000 for Grafana; it creates the outbound rule as well. I will save this file and close the previous files.

Then I will create the provider.tf file, in which we register the AWS provider: I will create the separate file, name it provider.tf and give it the content, which registers the AWS provider in the region ap-south-1, and save this file too. Finally we need to create the install.sh file, through which we are going to run multiple commands in our newly created EC2 instance; these commands will be run via Terraform itself. So I will go ahead and create the file install.sh and paste its commands. In short, this is what we are going to install using install.sh: first it does a system update, then it downloads the Prometheus packages, creates the systemd service for Prometheus and starts the Prometheus service; then it downloads the Node Exporter packages, creates the systemd service for Node Exporter and starts the Node Exporter service; and finally it installs Grafana and starts the Grafana service. I will save this file.
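For reference, the systemd unit that such an install.sh creates for Prometheus usually looks something like the sketch below (the user, binary path, data directory and flags are assumptions about a standard setup, not the exact script from the video); Node Exporter gets a very similar unit listening on port 9100:

```ini
# /etc/systemd/system/prometheus.service (sketch)
[Unit]
Description=Prometheus
Wants=network-online.target
After=network-online.target

[Service]
User=prometheus
Group=prometheus
Type=simple
ExecStart=/usr/local/bin/prometheus \
  --config.file=/etc/prometheus/prometheus.yml \
  --storage.tsdb.path=/data \
  --web.listen-address=0.0.0.0:9090 \
  --web.enable-lifecycle
Restart=always

[Install]
WantedBy=multi-user.target
```

The service is then typically enabled and started with sudo systemctl enable --now prometheus, which is what the status checks later in this section verify.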
I will open the integrated terminal and run terraform init; Terraform initialization completed under this folder. I will then run terraform plan, which shows the resources it is going to create, and finally terraform apply -auto-approve. Creation completed. If I go to the AWS console and refresh, this is the instance which has been created; under Security, this is the security group which has been created and these are the inbound rules added to it. I will copy the public IP and access the instance over the terminal. If I run sudo systemctl status prometheus, Prometheus is running; sudo systemctl status node_exporter shows that Node Exporter is also running; and sudo systemctl status grafana-server shows that the Grafana server is running as well. If I go to the AWS console, copy the public IP and browse it on port 9090, I can see the Prometheus dashboard; but under Status, Targets, only the local target is showing, so we need to add the target for Node Exporter.

On the terminal I will go to the directory /etc/prometheus and open the file prometheus.yml. Here we have the job for Prometheus itself, and below it we need to add the job for Node Exporter. I will add the job below this line: the job name will be node_exporter; on the next line I will give static_configs, and in the targets I will provide the public IP of the monitoring server followed by :9100, the port number for Node Exporter, then close the quotes and the bracket. I will save this file (Ctrl+O and Enter to save, Ctrl+X to exit). Now we need to check the indentation of the YAML file, which we can do with the check command; SUCCESS means the indentation is correct. Then we need to reload the service, which we can do with the reload command. If I go to the Prometheus dashboard and refresh, the node_exporter job is showing and it is UP, and we can see the metrics; the data is coming in.

With these metrics we now need to create a dashboard on Grafana, so I will browse the public IP on port 3000 to access Grafana, log in with admin/admin and set a new password. We are inside the Grafana dashboard, and we first need to add the data source: I will go to Data sources, select Prometheus, give it the name Prometheus, and for the URL provide http://<public IP of the instance>:9090, which is the URL for Prometheus, keeping the remaining settings as they are. I will click Save & Test and it reports success. I will go to Home and search for Import dashboard, where we need to give the dashboard ID. For Node Exporter the dashboard ID is 1860; you can search Google for "Node Exporter Grafana dashboard", open the link and copy the ID to the clipboard, then paste it here. So it is 1860, Load, select the data source we created, Prometheus, and Import. Here we can see the dashboard for our monitoring server based on Node Exporter; multiple dashboards are available.

Now we need to integrate Jenkins with Prometheus so that we can also import a dashboard for Jenkins into Grafana. I will go to Jenkins, Manage Jenkins, Plugins, Available plugins, search for "prometheus", select the Prometheus metrics plugin and click Install. It asks to restart, so I will click that option; Jenkins restarts and I log in again. Then I will go to Manage Jenkins, System: under the Prometheus section the path is prometheus, I will tick these two options as well, and the job name is jenkins_job; I will click Apply and Save.
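The Node Exporter scrape job added to /etc/prometheus/prometheus.yml above looks roughly like this (the IP address is a placeholder):

```yaml
# appended under scrape_configs: in /etc/prometheus/prometheus.yml
  - job_name: "node_exporter"
    static_configs:
      - targets: ["<monitoring-server-public-ip>:9100"]
```

The check command referred to above is typically promtool check config /etc/prometheus/prometheus.yml, and the reload is typically either curl -X POST http://localhost:9090/-/reload (when the lifecycle API is enabled) or sudo systemctl restart prometheus.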
I will go to the terminal of the monitoring server, again cd into /etc/prometheus, and open the prometheus.yml file. Here I will add the job for Jenkins: the job name is jenkins_job, and we also have to provide the metrics path, which is /prometheus; then, in the static_configs targets, I will open the bracket and give the public IP address of the Jenkins server followed by :8080, then close the bracket. I will save this file (Ctrl+O and Enter to save, Ctrl+X to exit) and run the command to check the indentation of the YAML file. It is showing an error at line 33, so I will open the file again and go there: there is an extra space we need to remove on both lines. I will save the file, run the same command again, and now it is successful, so I will reload the service. If I go to the Prometheus dashboard and refresh, the jenkins_job target has been added; data fetching may take some time, and now the target is up. If I open the link, the metrics are coming in.
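The Jenkins scrape job added here looks roughly like this (again, the IP address is a placeholder; the job name and metrics path must match what was configured in the Jenkins Prometheus plugin):

```yaml
# appended under scrape_configs: in /etc/prometheus/prometheus.yml
  - job_name: "jenkins_job"
    metrics_path: "/prometheus"
    static_configs:
      - targets: ["<jenkins-server-public-ip>:8080"]
```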
Now we need to add the dashboard for Jenkins on Grafana. I will go to the Grafana dashboard, search for Import dashboard, and for the dashboard ID give 9964; you can also find it by searching Google for a Prometheus Jenkins dashboard for Grafana. I have given the dashboard ID and I will load it; the name of the dashboard is filled in automatically. I will select the data source Prometheus and Import. This is the dashboard for Jenkins, showing panels such as free executors. I will go to Jenkins, open the job and click Build Now; the job completed, and if I go to the Grafana dashboard and refresh, the successful job count is showing as one. So after integrating Jenkins with Grafana, we have the successful job count of one.

In this section of the video we need to create the email alert for our job in Jenkins. I have logged into my Google account and I am on myaccount.google.com; your account must have two-factor authentication enabled. I will go to Security, search for App passwords, give a name and click Create, then copy the generated password to your system and close. Now we need to enter these credentials into Jenkins: I will go to Manage Jenkins, Credentials, Add Credentials, kind Username with password; for the username I will provide my Gmail address and for the password the app password we just created; the ID I will give is gmail and the description Gmail; click Create. Now I will go to Manage Jenkins, System, and open the E-mail Notification settings: for the SMTP server I will provide smtp.gmail.com and for the default user my email address; under Advanced I will tick Use SSL and set the SMTP port to 465. I will click Test, provide my email address for the test configuration, tick Use SMTP Authentication, provide the username and the password we created, and click Test configuration again: the email was sent successfully, and if I go to my Gmail I have received the test email from Jenkins. I will now go to Extended E-mail Notification: the SMTP server is smtp.gmail.com and the port 465; under Advanced I will select the credentials we created, gmail, and tick Use SSL; for the default content type I will select HTML; and under the default triggers I will select Always and Success, then Apply and Save.

Now we need to modify the pipeline script and add the post section: I will go to my pipeline, Configure, go to the script, and here, after the stages, I will provide the post block, then click Apply and Save. I will click Build Now; the job completed successfully, and if I go to my Gmail account I have received the email from the pipeline, which has the complete build log along with the Trivy filesystem scan result and the Trivy image scan result.
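The post block added after the stages is not spelled out in the captions; a minimal sketch of what it can look like with the Email Extension plugin (the recipient address and attachment file names are assumptions; the attachments correspond to the Trivy reports written by the earlier stages) is:

```groovy
post {
    always {
        emailext(
            attachLog: true,                                   // attach the full console log
            subject: "${env.JOB_NAME} - Build ${env.BUILD_NUMBER} - ${currentBuild.result}",
            body: "Project: ${env.JOB_NAME}<br/>" +
                  "Build number: ${env.BUILD_NUMBER}<br/>" +
                  "URL: ${env.BUILD_URL}<br/>",
            to: 'you@example.com',                             // replace with your Gmail address
            attachmentsPattern: 'trivyfs.txt,trivyimage.txt'   // Trivy scan reports from earlier stages
        )
    }
}
```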
In this section of the video we are going to configure AWS EKS. I have taken a remote session of the Jenkins/SonarQube server. First of all I need to install kubectl on this server, so I will run the commands to download the kubectl package and install it; kubectl version shows the version of kubectl that has been installed. Now I will go ahead and install the AWS CLI on this server: first I will download the AWS CLI package, install unzip on the server, unzip the downloaded package and then run the installer; aws --version shows the version of the AWS CLI installed on my system. Next I need to install eksctl, so first I will download the eksctl package using this command; the package is downloaded into the /tmp folder, so I will go to the /tmp directory and move the executable file to the bin folder, because all the executable files live under bin. eksctl version confirms that eksctl has been installed on my system.

Now we need to create an IAM role and attach it to the EC2 instance. In the AWS console I will go to IAM, Roles, Create role, choose AWS service, select EC2 in the drop-down, Next, select AdministratorAccess, Next, give the name eksctl_role and create it. The role has been created, so I will go to my EC2 instance, Actions, Security, Modify IAM role, select the created role in the drop-down and click Update IAM role. The role has been assigned to the EC2 instance.

Now I will go ahead and create the EKS cluster using eksctl. I will give the eksctl create cluster command, in which this is the name of my EKS cluster, then my region, ap-south-1, then the node type, t2.small, and then the node count, three, and hit Enter. The Kubernetes cluster creation will take some time, so I will pause the recording here and resume once it is completed. Kubernetes cluster creation completed: kubectl get nodes shows the three nodes created in my cluster, and kubectl get svc shows the default service of my Kubernetes cluster.

Now we need to configure the monitoring for our Kubernetes cluster, so we will install Prometheus on the EKS cluster and then add that Prometheus as a source to the Grafana dashboard. To install Prometheus we first need to install Helm on the server, so I will install Helm; helm version confirms that Helm has been installed on my system. Next we need to add the Helm stable charts for our local client, so I will give that command, then add the prometheus-community Helm repo, then go ahead and create a separate namespace for Prometheus, and finally install Prometheus using Helm. It has been installed. I will now check whether Prometheus has been installed: kubectl get pods --namespace prometheus shows the pods created for Prometheus, and then I will check the services created for Prometheus. These are the services created for Prometheus, but they are not LoadBalancer services, so they are not exposed to the external world. To expose the Prometheus service to the external world I will open it for editing: at the end of the file the type is given as ClusterIP, so I will press I to insert, change it to LoadBalancer and set the port to 9090, then save the file by pressing Escape, typing :wq and hitting Enter. If I check the services again, we now have a LoadBalancer service instead of ClusterIP, and this is the external DNS name for the service. I will copy this URL and browse it on port 9090: Prometheus is ready and it has been installed on the Kubernetes cluster, and I can check the targets for our Kubernetes cluster here.
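For reference, the commands used in this EKS and Helm sequence generally follow the pattern below (the cluster name is a placeholder, and the exact Prometheus service name depends on the Helm release name and chart version you install):

```bash
# Create the EKS cluster with three worker nodes (run from the Jenkins/SonarQube server)
eksctl create cluster --name <my-eks-cluster> --region ap-south-1 \
  --node-type t2.small --nodes 3

kubectl get nodes                      # verify the three worker nodes are Ready

# Install Prometheus on the cluster with Helm
helm repo add stable https://charts.helm.sh/stable
helm repo add prometheus-community https://prometheus-community.github.io/helm-charts
kubectl create namespace prometheus
helm install stable prometheus-community/kube-prometheus-stack -n prometheus

kubectl get pods --namespace prometheus   # Prometheus, Alertmanager and operator pods
kubectl get svc  --namespace prometheus   # services are ClusterIP by default

# Expose the Prometheus service externally by switching its type to LoadBalancer (port 9090)
kubectl edit svc stable-kube-prometheus-sta-prometheus -n prometheus
```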
Okay, now we need to add this Prometheus as a data source in Grafana. I will go to our Grafana server and, on the left side, go to Connections, Data sources. We have one data source already, which is the Prometheus server installed on the EC2 instance; I will add one more Prometheus data source, the one which has been installed on the Kubernetes cluster. I will click Add new data source, choose Prometheus, give the name Prometheus-EKS, and for the URL provide http://<DNS name of this service>:9090, then click Save & Test. Connection successful, so the data source has been added.

Now we need to import dashboards for the Kubernetes cluster. I will show you by adding two dashboards; you can search for dashboard IDs and import multiple dashboards for Kubernetes. In the search I will go to Import dashboard and give the ID of one of the dashboards, 15760, and click Load; this is the Kubernetes Pods dashboard. For the data source I will select Prometheus-EKS and click Import. This is the dashboard for our EKS cluster; data fetching may take some time, and in the data source drop-down we need to select our source, Prometheus-EKS. CPU usage by container, memory usage by container and all the other graphs are available here, and if I select the namespace prometheus we can monitor the pods created under the prometheus namespace. I will go to the dashboards home: these are the dashboards which have been imported into our Grafana. I will show you by importing one more dashboard, which is also for Kubernetes: in the search bar I will go to Import dashboard, give the ID 17119 and click Load; the name of the dashboard is Kubernetes EKS Cluster. I will select the data source Prometheus-EKS and Import, and here we can monitor the complete cluster, for example the prometheus namespace. You can search for more dashboard IDs and import additional Kubernetes dashboards into Grafana. So on our Grafana we have a total of four dashboards as of now: one for Jenkins, one for Node Exporter (the local host), and two for EKS.

Now, finally, we need to configure our Jenkins pipeline to deploy the resources on Kubernetes, and for that we first need to install some plugins in Jenkins. I will go to Manage Jenkins, Plugins, Available plugins, and search for "kubernetes"; I will select this plugin, this one, this one and this one, a total of four Kubernetes plugins, and click Install, then go back to the home page. I will go to the terminal of the server: inside /home/ubuntu you can see one directory, .kube. I will go into this directory, and this is the config file for Kubernetes; I will right-click and download it and save it in my project folder. I will go to that folder, open the config file in Notepad, do Save a copy as, and give it the name secret.txt. Now we need to add this secret to Jenkins: in Jenkins I will go to Manage Jenkins, Credentials, Add Credentials, select the kind Secret file, give the ID kubernetes, and choose the file which we have just downloaded; this ID, kubernetes, is the one we will reference in the pipeline script. Click Create.

I will now go to my pipeline, Configure, go to the script, and here we need to add one more stage to deploy the resources on Kubernetes, so I will add it with the name Deploy to Kubernetes. I will delete this placeholder command, because we actually need to generate it, and note the directory Kubernetes: this is the Kubernetes directory inside our GitHub repository, because that directory contains the manifest files. To generate the command I will go to Pipeline Syntax, which opens a new tab; in the drop-down I will select the withKubeConfig step provided by the plugin we installed, select the kubernetes credential, leave everything else unchanged, and click Generate Pipeline Script. I will copy this command and paste it into the pipeline script, then click Apply and Save.
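A sketch of that added deployment stage, wrapping the snippet produced by the Pipeline Syntax generator (the manifest file names are assumptions; the credentialsId must be the kubernetes secret file added above, and the directory name must match the one in your repository):

```groovy
stage('Deploy to Kubernetes') {
    steps {
        script {
            dir('Kubernetes') {   // directory in the repository that holds the manifest files
                withKubeConfig(caCertificate: '', clusterName: '', contextName: '',
                               credentialsId: 'kubernetes', namespace: '',
                               restrictKubeConfigAccess: false, serverUrl: '') {
                    sh 'kubectl apply -f deployment.yml'   // create or update the Deployment
                    sh 'kubectl apply -f service.yml'      // expose it through a Service
                }
            }
        }
    }
}
```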
Before we verify our CI/CD pipeline I will make one change in the pipeline script: I will rename this stage to Trivy Image Scan, then Apply and Save. While pushing changes to the remote repository from your Git Bash you may need to provide a GitHub personal access token: on GitHub you can go to Settings, Developer settings, Personal access tokens, Tokens (classic), and create your token there; I have already created a token for myself. I will clone the repository in my Git Bash and go into the repository, so now I am inside it. Before we verify our DevSecOps pipeline we also need to enable the webhook for our pipeline: I will go to Configure, tick GitHub project and paste the URL of my repository, and under the build triggers I will select the GitHub hook trigger option. Then I will go to my repository on GitHub, Settings, Webhooks, Add webhook, and give the URL http://<public IP of the Jenkins instance>:8080/github-webhook/, then Add webhook. If I refresh, it is showing green, which means the connection is successful. On my project I will click Apply and Save.

Okay, everything has been set up, so now it's time to verify our DevSecOps pipeline. If I go to Docker Hub you can see there is no image for the YouTube clone app, because I removed the previous image. On my Git Bash, if I do an ls, I will modify the README file and make it "test 50" (Ctrl+O and Enter to save, Ctrl+X to exit), then git add, git commit -m, and git push origin main. Here I need to provide my token; I have already created the GitHub token, so I will go to the token option and paste my token here. The changes have been pushed to the remote repository. If I go to my Jenkins job, it has triggered automatically upon the change on the GitHub repository. Job completed successfully; if I go to Docker Hub and refresh, the image has been pushed just now, and if I check my email, I have also received the email with the log from the pipeline. If I go to my Grafana dashboard and open the pods dashboard, I will select my source and the namespace default, and data is being fetched for the newly created pod. If I go to the server and run kubectl get service, this is the URL of my pod's service; I will browse it in a new tab, and this is my application running, the YouTube clone app. You can play the videos on this app as well.

We will verify it once again. On the application the color of these icons is red; we will change it to some other color and verify whether it changes or not. In Git Bash I will go to the src directory, then to components, and open the Sidebar file in the nano editor; I will change this red color to blue (Ctrl+O and Enter to save, Ctrl+X to exit), then git add, git commit, and send the changes to the remote repository with git push origin main. If I go to my pipeline, the job has been triggered upon the change on the GitHub repository. Job completed successfully. If I go to the terminal and run kubectl get pods, these are the newly created pods, and we can check them on Grafana: on the Grafana dashboard the namespace is default, in the drop-down I will select this pod, and if I refresh, data is being fetched for the newly created pod. And if I go to my application and refresh, you can see the color of these icons has changed from red to blue.

In this way you can create a completely automated DevSecOps CI/CD pipeline which triggers automatically upon any change to the GitHub repository, and once the job is completed you will see the changes appearing on the application. Follow me on LinkedIn; I post a lot of useful things there as well. You can also fork my repository, create your own branch and practice along with the complete video, and if you face any kind of difficulty during practice, ask me in the comment section and I will definitely reply to your query. I hope you find this video helpful. A lot of effort went into this video, so do subscribe to my channel to encourage my work, like the video, share it with your friends and spread the knowledge. I look forward to seeing you in the next video. Thank you.
Info
Channel: Virtual TechBox
Views: 12,206
Keywords: cicd pipeline, devops tutorial with project, ci cd pipeline using aws, devops ci cd project, complete devops project from scratch, devops projects for practice, devops tutorial for beginners with projects, devops real time projects, devops jenkins with docker integration, learn devops with projects, jenkins ci cd pipeline explained, cicd pipeline project, ci cd pipeline using docker aws, cicd devops project, devops with projects, devops cicd project, devsecops, prometheus, grafana
Id: TY6hW7fecuI
Length: 116min 27sec (6987 seconds)
Published: Mon Dec 11 2023