Ethical Hacking websites using burp-suit Full Course

Video Statistics and Information

Video
Captions Word Cloud
Reddit Comments
Captions
[Music] hello and welcome to course burp suite web application penetration testing let's talk about synopsis so what we are going to cover in this course so i've divided this course into five different sections first of all in section one we will learn about the burp suite setup in cali linux that how you can set web suite in section 2 we are going to learn about fast spidering process with the help of burp suite in section 3 we will learn how you can scan your web application for different vulnerabilities in section 4 we will learn how you can exploit those one liberties with the help of webstreet and in section 5 we will learn how you can generate a report by webstrate now if we talk about prerequisite so you should have basic id skills and you should also have kali linux operating system installed in your machine so this is all about course overview thank you so much for your time and i'll see you inside the course [Music] hello and welcome to section 1 of the course setting up your burp suite environment now let's start the first video of section 1 burp suite introduction and proxy configuration now in this video what we are going to learn so in this video we will learn the basic soft burp suite tool and we will also configure proxy so let's open kali linux machine and let's open burp suite inside it so guys i'm inside kali linux machine and inside kali linux machine we are learning about burp suite tool which we use for web application analysis so let's see where it is so you need to click on applications and if you select web application analysis so you can see that burp sweet tool is available here now this is uh not a professional version this is a community edition so you won't get all the features but yeah you can easily learn burp suite with the help of website community edition okay so let's click it and open it so there's also another paid version which is web suite professional version you can also purchase it if you need so this is worksheet community edition all right click next and click start burp and here you can see that it is a community edition so here we go so this is the graphical user interface of burp suite and you can see on the top that there are various options available which we will use later in this course and on the right hand side you can see that it is showing you a list of vulnerabilities which you can detect with the help of burp suite and upgrade to burp suite professional to automatically find vulnerabilities so for automatic detection you need to use purple professional but for practice purpose uh you can use purpose with community edition all right so you can see here first of all this is the dashboard area then here we have the target area where you will get all the information about target proxy where you need to set the proxy we will do it intruder is also available with the help of intruder you can perform various kind of attacks on web application repeater will we will also use repeater and sequencer decoder to decode compare to compare to results extender to add various kind of extensions in burp suite we also have app store which we can use to add other features project options are available here and user options are available all right so these all options are available here and we will use them in other sections all right so now we are learning about proxy so how we can configure proxies so you can click on proxy here so in proxy you can see here that there are options available here so click on options and we already have a proxy available here here it is so the ip address is 127.0.0.1 and you can see the port number is 8080. so this is by default so 127.0.0.1 is obviously localhost and this is a port number you can also add another one if you click on add so you need to provide the port number here you need to specify the iep address and then you can easily add another one but we don't need it right now so you can see here that you can also edit it so for example if i select this click edit so you can see here i can easily change the port number for example if this port is not available so you can easily change the port number to another port and you can use the services of burp rate got it so it is you can you can basically you can edit the port number like for example 8081 and then you can click okay so you can easily change it to eight zero eight one here you can see now it is eight zero eight one but if it is available then you don't need to change it right otherwise if you don't want to use this proxy then you can also remove it so you can simply select it and you can click on remove and you can easily remove this proxy setting we will also set the same proxy settings inside our browser where we will run web application so when we will use or when we will run the website in browser so the proxy will be same and burp suite will be able to get the request from browser inside grab suite so that's why we set proxy inside burp suite as well as in our browser so this is called proxy listeners so you can see here burp suite uses listeners to receive incoming http requests from your browser you need to set your browser to use one of the listeners as its proxy server so first you need to check inside burp suite if proxy is available or not and then you need to set the same proxy inside your browser then both will be able to communicate with each other that's why we set proxy inside burpee as well as in web application so that's why we set proxy inside burp suite as well as in the browser where web application is running in the next video we will learn how we can set proxy inside a browser [Music] hello and welcome to the next video web application proxy service now in this video what we are going to learn so first of all we will learn about our target and then we will learn how we can set a proxy settings inside the browser so to perform this task let's open kali linux machine now guys we are inside kali linux machine and first of all we will learn about our target means target website and second how we can set up proxy settings so let's open the browser so click on applications here and then you can click on web browser which is firefox so you can click it and open it inside kali linux machine so this is mozilla firefox all right so first of all let's talk about our target so we are taking a sample website provided by a company company name is acunetix which also provides web scanners right so we are using their website so generally the website name is oneweb.com so this is a sample website wellweb.com and these all are subdomains all right so we can use this website for practice purpose okay so we can try this one also testphp.onenewweb.com so you can open it here it is so we can use this website as target if we go back so you can see here the technologies are apache php and mysql right so we can and we can find various kind of vulnerabilities inside this website right so we will use this as a target in burp suite all right so how we can set up proxy settings inside browser and when we should do it so for example if you are testing a website inside burp suite so the first step is to open the website inside the browser for example the website is open now for example if you want to check the login page so on the left hand side you can click on sign up and this is the login page we have right and you for example you are testing various kind of entries inside it so admin admin and you want to send this request inside burp suite for other kind of tests so first of all you need to open the website inside browser all right then you need to change the proxy settings of your browser so to change proxy settings you can click on the right hand side on top open menu all right then you can click on preferences you need to search for network settings or network proxy settings so you can come down and you can see here network settings are available here configure how firefox connects to the internet alright so click on settings here you can see here that it is using system proxy settings but we need to change it to manual proxy configuration select last one which is bandwidth proxy configuration now if you remember in the previous video we have seen the proxy settings of burp suite so burp suite was using ip address and port number ip address was 127.0.0.1 and port number was 8080. so we need to use the same proxy inside the browser so that browser and website can communicate with each other all right so what you need to do here so in http proxy here you need to set the ip address which is the local host ib 127.0.0.1 right the same proxy settings are available in burp sheet then port number same port number you need to provide which is available in burpee 8080 correct and then you need to select this option let's use this proxy server for all kind of protocols i mean if it is using any kind of protocol it will use this as a proxy all right if there is any entry available here you just need to remove it and you can see here that there is no entry available here no proxy for so so proxy is available for all kind of protocols and proxy ipn port number is this 127.0.0.1 and port is 8080 all right you don't need to change anything else that's it select manual proxy configuration set your ip address and port number and then you just need to click ok and you are done you can again re-verify if you click on settings so you can see here our settings are saved it means that the proxy settings you have entered in browser as similar to the proxy settings available in burp sheet so if you run a website inside your browser so it will generate a request the browser will generate a request and burp suite will be able to capture it inside it easily all right we will also check it later so that's it first open the website or open the particular page you want to test in burp suite enter your credentials if you want change the proxy settings and then run it basically you just need to click on login right and basically if you have burp suite the burp suite will easily capture it all right so that's how you can set the settings inside your browser to connect with broadband in the next video we are going to learn how you can burp sweet for the particular target [Music] hello and welcome to the next video launch burp suite for target now in this video what we are going to learn so we will learn that how you can launch burp suite and then how we can intercept a request so let's open kali linux machine now guys i'm inside kali linux machine and we are learning about how you can intercept a request inside burp suite we already have a target which i mentioned in the previous video so first of all let me open the browser so click here click on web browser and open it here it is so we were using a testing website the website name was walnutweb.com provided by akunetix company all right and let's try this one testphp.oneweb.com so click here all right so first of all let me check my proxy settings so first of all we need to remove proxy so come down and check settings all right no proxy then click ok then run it again that's php.oneweb.com all right click sign up and here we are now we are on the target page next is we need to set a proxy setting inside the browser again so click here click preferences here you can click on settings then select manual proxy configuration the ip address is 127.0.0.1 the port number is 8080 select this use this proxy server for all protocols no proxy is blank and then you can click ok so we are set all right next is we need to start burp suite inside kali linux machine so you can click here in applications web application analysis open burp suite we will also check the proxy settings inside worksheet and then i'm going to show you how you can intercept a request click next start properly as you already know we are using burp suite community edition so we have limited features inside it all right here we are now if you select proxy here and inside proxy we can check options so you can see that proxy is available here the same proxy is available in browser we are all set next is you can also check another option inside proxy which is intercept right so what is intercept so enters with the help of intercept we can intercept request coming from the browser for example if a website is running in the browser and if it is generating a request so burp suite can intercept that request with the help of intercept here you can see that intercept is on for example if it is off then you need to turn on so you just need to click here and now you can see intercept is on it means now it is ready to capture the request coming from the browser all right and obviously proxy is already set now just minimize it what we need to do here we need to generate a request from our browser where this website is running so for example let me enter the credentials i don't know about credentials so for example admin admin all right now you can click login and here you can see that the request has generated by browser and it is intercepted by burp sheet you can see here the color has changed from black to orange right and you can see here that we have the request inside intercept right so i have already told you that if intercept is on then if browser is generating request into broad suit will capture it easily so you can see here that this is using post method the page was user info dot php right the host is the website name so the website name is testphp.oneweb.com we already know that we are working on this website testphp.oneweb.com right user agent is mozilla slash 5.0 all right firefox we are using okay encoding accepting and this is the actual page link which is testphp.onenewweb.com login.php let me open the browser you can see here that this is the link we have all right so that's and you can also check the credentials which we have passed in that website or application so i have typed user name admin and password admin and here you can see that now we can see the request here so it means that if you want to generate a request you just need to set the proxy settings and then you just need to open the particular page of the website which you want to test generate a request capture that request inside burp street and then you can perform various kind of tests and if you right click here so you can see here that you can perform various kind of tests like you can send this request to intruder or repeater or sequencer or computer or decoder so we will learn about these options later in the videos but yeah now you know how you can capture a request inside brepsuite in the next video we are going to learn about burp ca certificate [Music] hello and welcome to the next video trusting burp ca certificate now in this video what we are going to learn so in this video we will learn the basics of perhaps c certificate and why we need it so to learn more about burp ca certificate let's open cali linux machine now guys i'm inside kali linux machine and we are learning about bob ca certificate so to learn more about it let's open the browser and the website is running here so first of all let me check my proxy so let me remove it first in preferences and settings no proxy and then you can click ok all right now we are learning about biopsy certificate here so you can type here burp ca certificate hit enter and you can see here that uh this is the official website portswigger.net here you can see installing burp ca certificate so you can click it and open the link in your browser here we go all right so installing burp ca certificate now these steps are all necessary if you want to use an external browser for manual testing with bob basically bob also comes with its embedded browser but as you know that we are using external browser means mozilla firefox so and we are performing manual testing so that's why it is necessary if you prefer you can just use burp's embedded browser which is pre-configured to work with proxy already all right to access the browser you can check proxy intercept tab and click open browser but we are using external browser which is mozilla so that's where it is necessary the process for installing burp c certificate where is depending on which browser is so basically it depends on the type of browser you are using for testing so please select the appropriate link below for detailed information about so we will check it now let me come down the main important thing is why we need burp ca certificate why why should i install it so let's learn about it so one of the key functions of tls which is transport layer security is to authenticate the identity of web servers that your browser communicates with so if your browser is communicating with a web server or for example if you are opening a web website so it needs authentication this authentication process helps to prevent a fraudulent or malicious website from from masquerading as a legitimate one so for example if you are opening a fake website or fraudulent website so it helps to prevent it for example it also encrypts the transmitted data and implements integrity checks to protect against men in the middle attacks so we know about men in the middle text it is also called sniffing so to prevent sniffing it can encrypt the data which is transmitting in the network in order to intercept the traffic between your browser and destination web server bob needs to break the tls connection as a result if you try and access the https url with file burp is running your browser will detect that it is not communicating directly with the authentic web server and will show security money now for example uh you know that we are not using https website in testing we are using http website now for example if you are testing a website like instagram or facebook right so when you will try to intercept the request inside burp suite so obviously burp needs to break the tls connection so what will happen the the browser will detect that it is not able to communicate with the server because it is communicating with the burp suite bulb so it is in middle so it will show you a security warning so to prevent the issue burp generates its own tls certificate for each host signed by its own certificate authority ca and the ceo certificate is generated the first time you launch burp and stored locally so basically if you want to test https websites and without any security warning you need to install this burp ca certificate inside your browser all right otherwise it can show you a security warning if you try to test a https website obviously with http website there will be no problem because in the previous video we have captured or intercepted a request but yeah in testing you have to use https websites then surely you need bob's ca certificate so that's why we need bobca certificate inside browser so if you have bobca certificate installed in your browser there will be no security warnings when you will try to test any kind of website all right so and you can see here that there are various kind of burp certificates for different different kind of browsers so you know that we are using the firefox so we will try to install this one the first link so after installation of the certificate you won't get any security warnings and you will be able to test any kind of website so this is all about the burp ca certificate i hope now you know why we need prop propc certificate inside browser in the next video we will learn how to install ca certificate in browser [Music] hello and welcome to the next video installation of ca certificate now in this video what we are going to learn so we are going to learn how you can install burp ca certificate and then i'm going to explain the whole process so let me open the lab setup so guys i'm inside kali linux machine and we are learning that how you can download and install ca certificate in your browser so we have website available inside kali linux so let me open the page here here it is the support page installing burp ca certificate in firefox so how you can install burp c certificate in firefox so to install burp cs certified in firefox proceed as follows first of all you should run burp suite in your kali linux machine then visit this url in firefox as you can see that we are using firefox you should be taken to a page that says welcome to burp suite professional if not please refer to proxy troubleshooting page depending on what went wrong and you may be taken there automatically so let's visit this http colon slash workflow in firefox you can see here that website is running inside kali linux so let's open a new tab and you can type your http colon slash and then rip suite and hit enter so it should open a page like this welcome to buffalo professional then you can click on see a certificate right to download it and guys you can see here that we are on the page so the first step is basically you need to change your proxy settings to 127.0.0.1 and port number is 8080 in your browser so for you have changed it then you can visit http colon slash uh burp suite and then you will be on this page website community edition welcome to bucket community edition so we have the ca certificate available here so you can click on see a certificate and you can see here the name is cassert.der which is around 9 30 39 bytes so you can click on save file and then you can click ok so it will download inside your download folder all right so we have successfully downloaded the certificate inside kali linux so let me check where it is so let me check the file manager inside file manager go inside downloads and inside downloads we have the site fixed available csr dot der great let me open the page again so we have successfully downloaded the ca certificate next is in firefox open the the menu and click preferences or options so we can open the menu here then we need to click on preferences options so so let's click here then click preferences all right next is from the navigation bar on the left of the screen open the privacy and security settings and scroll down to certificate section click on view certificates button so let me click on preferences here in preferences we have privacy and security click here then you can come down right and here it is so we can click on view certificates option here and here we go we are on certificate manager so we have clicked on view certificate and we have certified manager so in the dialog that opens go to authorities tab and click import and select the burp csv that you have downloaded earlier and click open when prompted to edit the trust settings make sure the check box this certificate can identify website is selected and click ok close and restart firefox with burp still running try and browse to an https url so that's it so you you know we have downloaded the certificate so we can simply import it so you can click import and you can see here in downloads we have sir dot dot so we can select it and we can open it here so here you can see we have options do you want to trust post figure ca for following purposes trust cs ca to identify websites trust ca to identify email users so you select the first one just to see it to identify websites and then you can click ok and you can click ok here all right so we have selected the first option the certificate can identify websites and then next option is basically we need to close and restart firefox that's it so let me close this and let's restart firefox again and we are done web browser here we go so in this video we have learned that how you can download burp ca certificate and then how you can install it in your browser so [Music] so the setup is completed so website community edition is available inside uh calallen's machine which we are going to use and we have also installed burp ca certificate inside browser in the next section we are going to learn about spidering process [Music] hello and welcome to section 2 of the course fast and hybrid spidering your web application now let's start with the first video of section 2 about spidering process now in this video what we are going to learn so we are going to learn about uh the spy drawing process and i am going to explain you so guys uh here we are learning about spidering process or we can also call it crawling so in this section we will only learn about spidering process provided by websuite but first of all let's learn the basics so let's talk about crawling which is also called spriting so you can see here the crawl phase of a scan involves navigating around the application following links submitting forms and logging in wherever necessary so it means why we need crawling for example if you have a web application you need to test it so first of all we need to perform spidering or crawling process so that we can get more information about applications like various kind of links available in the website the various kind of urls available and we can also test forms inside the web application and logging process so to categorize the content of the application and navigational paths within it the the seemingly simple tasks present a variety of challenges that burps crawler is able to meet to create an accurate map of application so crawling is like you are creating a map of application that what is inside this application right so what are the various links available forms login pages etc so you will get an overall map in burp suite after the spattering process so that's why we need spider to learn more about web application which we are testing right now uh if you are using burp suite community edition which is uh available in kali linux so crawling or spidering process or spreading option is not available in community edition so it is available in the paid version which is professional version it was available in in burp suite but now in current version it is not available so so spidering process and scanning process is not available in current version of burp street now you can see here the for burps with professional users if your machine appears to support it bub scan will automatically use the embedded browser for all navigation all right let's talk about core approach so what is core approach here here you can see it is written by default bulbs crawler or spider navigates around a target application using an embedded browser clicking links submitting input wherever possible it constructs a map of the application so the basic idea is burp suite uses an embedded browser first it visits the website or the target check all the links available in the website it submits the inputs for example if there is a form available inside the web application so it can ask you the login id and password so that it can submit application wherever possible and after that it constructs a map of the application's content and functionality in the form of directed graph representing the different locations in the application and links between these locations like you can see in the diagram here so you can see here these are the various uh pages of living available in a website like blog feedback about us login right now when it reaches login page so you need to submit form so it will give you a pop-up and it will ask you to provide username and password so that it can login inside the website so when you will provide username and password it will login and it will also check the internal pages so you can see the crawler makes no assumptions about the url structure used by the application locations are identified based on their contents not the url that was used to reach them so it enables the crawler to reliably handle modern applications such as csrf tokens etc right so you can see here the approach also you allows the crawler to handle applications that use the same url to reach different locations based on the state of application or users interaction with it so the basic idea here is it checks each and every page of the website for example if it cannot reach a particular page so it will ask you or it will ask the user to provide input and after that it will login inside that page and check internal pages as well so crawler checks all the links in the website create a graph for you after completion of this process so it also handle the sessions right detecting changes in application state applic obviously application login you need to provide login so that crawler can get inside the web application you can see here that so start then login right submit users credential submit admin credentials then it checks the internal pages like my account and then it can also check internal pages like change password my messages etc so that's how it works crawling volatile content so you know that many modern web applications also contain various kind of volatile content and crawling with embedded browser so by default if your machine appears to support it burp will use its embedded chromium browser to all navigation of your target website or application so you just need to provide a url link and then you need to start crawling process and then it will check all the links available in the website and provide you a proper graph at the end so this is all about the spider crawling process in the next video we are going to perform spidering process in burp suite community edition [Music] hello and welcome to the next video spidering in community edition previous version now in this video what we are going to learn so first of all we are going to perform spidering process with the help of burp sheet community edition previous version and i'm going to explain the whole process so let me open the lab setup now guys i've started my kali linux inside vmware workstation so you can see here that i have two versions of kali linux one is uh 2020 and second is 2019 so this is previous version and this is basically latest version so i want to tell you that in latest version of burp suite committee edition spydering is not available inside the tool they have removed it and now it is available only in professional version basically the paid version so you are not able to perform spidering process directly with the help of burp suite community edition so i am showing you in the previous version how to perform spidering process then later in the next video i will show you how you can perform spidering in the latest version because in previous version of burst read spattering is available so let's check the previous version so let me open the old version which is cali 2019.2 all right all right here it is so let me close it let me start again so this is uh the previous version of kali linux where web suite is available this is 2019.2 so just to show you how you can perform spidering in the burp sheet just to provide you information that's why i'm showing so you cannot find the spidering in latest version of burp street so just to get information so here you can select web application analysis then you can click on burp suite i will also compare both burp sheet and both versions okay so this is the old version of kali linux where webstreet is running here it is so let me first of all start this one you can see the version is burp suite community edition version 1.7.36 so let's start it click next and let me start up all right so this is free version which is community edition version here it is now this is the old version and here you can see that spider is available here inside burp suit right so this is available here so we can use it in previous version but if we check the latest version so let me open the latest version of kali linux and let me open burp suite inside it i'm just providing you information about the difference so that you can learn more all right so let me close this first of all all right uh yeah here you can see that spidering is not available and you can see this is the latest version of kali linux which is 2020 and spydering is not available they have removed it and in the previous version spidering is available but here you can't find spiriting it means you are not able to perform spidering in the latest community edition so let's perform spidering in the pre-old version then i will show you a technique to perform sweating in latest version all right so let me minimize it and let me start a browser here it is all right so this is the testing website which we generally use testphp.onenewweb.com provided by acunetix company yeah it is running so for example this is my target and i'm going to perform spidering on this website so the first step is change the proxy settings so click here then check preferences come down and let me click on settings here and select manual proxy settings the ip address is set here 127.0.0.1 number is 8080. select it use this proxy server for all protocols remove everything from here which is no proxy for and then you can click ok and you are done all right now next is to check if interception is on in broadbread or not so let me check it click on proxy and you can see here that intercept is on great minimize it and you just need to refresh the page in browser refresh it and that's it so minimize it open websheet again and here we go so i got a request from browser if i check https tree so i can see here that it is showing you a request coming from browser where user has visited a website testphp.onenewweb.com great all right now next is uh you can right click here you can add this website to scope so you can add two click or add to scope click yes that's great all right in scope you can see that we have successfully added this website to scope okay next go into proxy now you can click on the website right click and you can click on spider from here right you have this option in the old version spider from here okay so let me click on spider from here you can see the color has changed here inspire and in spider you can see here that now it is sending data it means spider is working so burp spider submit form so you can see here that it is asking you uh to provide username and password for example let me provide so username is test password is test and let me submit this form all right form here we go so if you provide right username and password then the spider will be able to check internal pages if you don't provide any username password spider is going to ignore the internal pages all right so you can see here around uh so you can see here that data is transferring here and here you can see another form so for example if you don't want to test this one so you can simply click on ignore form there we go and you're done all right now in the options of spider you can see the crawler settings so you can check robots.txt page of the website you can detect custom not found responses ignore links to non-text content and various settings available here great so in proxy we have if we click on target in site map so here you can see that this is site map which is created by uh crawler or spider here you can see so this is the website testphp.onenevweb.com right so let me go into proxy [Music] here it is you can also check intercept you can now forward it if you want you can click on target site map here we go so you can see here that spider has provided us information about the urls in the website you can see these all urls are available in the website great so that's why we need crawler because we get information about uh the website and this is the site map basically we have so we know everything about the website what pages it contains what kind of pages right so you can see here that these are the pages available here if you want to see the content just click on that and you can see here the content is available response you can you can check here right so you can also check that this is the login page and this is log out page and also check the content so here it is right now in the request you can see here that it is showing you the page name the hostname and the user agent etc these are the headers hex value and in response so this is a response now you can check the server version here so the website is using server engines 1.19.0 so now you can also check the server version with the help of this and obviously you will get more information and you can check the php version php 5.6.40 right so this is a very useful information if you are testing a website that what version the server is using or what is the version of php right for example if it is using outdated version of php or server so you can detect it easily so the whole information is available in header so you can easily check here this is the hex value html format here and render if render is available all right so this is all about the spidering process and you can see here in render that it is showing you the website here so because this is the login page so it is showing you the login page of the website so if you click render you can check the website inside render window that's great so this is all about the spiding process uh available in the previous version of burp suite the goal was just to show you the whole process because this option is not available in the latest version of burbridge so that's why i showed you in the previous version so this is all about the spidering process in burp suite community edition the next video i'm going to show you the spreading process in community edition latest version [Music] hello and welcome to the next video spidering in community edition latest version now in this video what we are going to learn so we are going to perform spriting process but in latest prep sweet person and i'm going to explain the whole process so let me open the lab setup so guys i'm inside kali linux machine and this is the latest version right in the previous video we have learned that how you can use uh spidering in the old version of burp suite but yeah in the latest version of kali linux burp suite is also updated so you won't be able to find spidering option in latest version of burpseed so if i open brep sheet here here it is website you can see here that no spidering option is available here then what you can do in this case obviously you are not able to perform spreading process and community edition of bird feed yes obviously if you purchase the professional version of burp sheet all options are available there but for practice purpose how you are going to perform spidering right so it means that we are not able to perform spidering directly from burpseed we need to take help of some other tool so that we can create a proxy to perform spidering process and latest what's on purpose now fortunately we have another tool which we can use and we can take help of this tool to perform spidering in burp suite the tool name is zap provided by osp which is z attack proxy and it is also open source tool provided by os so we can take help of zap and then we can make it a proxy and then we can transfer or basically we can uh set proxy inside ceratec proxy so that request will be redirected to brexit we can do this all right so let's check in proxy in options so in options we have proxy settings proxy listeners available here 127.0.0.1 and port number is 8080 all right intercept should be on yeah it is on here now let me minimize this so we are not able to perform spidering directly from burp street let's open z attack proxy in kali linux so you can click here select web application analysis and here you can see zap is available here this is which is called zetac proxy so let's open this click and open it inside kali linux it also has various other options like spidering and active scanning etc so guys uh you can see here that this is os zap now let me start it here we go so what is zap why we use it so you can read here zap is an easy to use integrated penetration testing tool for finding vulnerabilities in web applications so we can use zero tech proxy to find various kind of vulnerabilities in web application for example injection vulnerability cross-site scripting vulnerability cross-site request forgery etc so it's a penetration testing tool and the important thing is it is open source so you don't need to pay for it it is open source provided by osp so you can use it easily in kali linux as well as in windows machine so we are using in kali linux all right now what we are doing here so what we have a target test php.polynomweb.com now we cannot perform spidering directly in bird suite because we don't have that option in community edition so what we are going to do we are making zap as a proxy or you can also say it as a mediator so we will take request inside zap first and then zap will redirect that request to web suite right so we are taking this tool as a proxy so zap is going to perform all kind of scanning and the result will be redirected to preparate all right so that's how you can use zap and web suite to perform spidering process in the latest version of kali linux or you can say latest version of burp suite okay so now next step is to set proxy settings so what you can do here you can click on settings here you can see this button so click here it will show you all kind of settings available for that attack proxy here we go so you can click on local proxies first of all so you can see here port number is eight zero eight one so if it is showing eight zero eight zero you can change it to eight zero eight one all right so what you can do here if it is showing like this eight zero eight zero so change it to eight zero eight one localhost is fine now port number should be eight zero eight one or any other do not use eight zero eight zero here okay next is we need to check the proxy settings here so let me check where it is so you can click on i think connection yeah right and come down so you can see use proxy chain right so what you can do here you first of all select it use an outgoing proxy server then provide address so address should be localhost 127.0.0.1 and port number here you can provide eight zero eight zero all right so it will redirect to this ipn port and we know that burp fluid uses this ip address and port number as a proxy listener so what is going to happen we will take request inside zap zap is going to perform scanning and that will be redirected to burp feed on address localhost port number 8080 all right so these are the two settings in connection use it as a proxy server change the ip address and port number set it and uh if you come down and in local proxies you just need to change the uh port number to eight zero eight one and we are done so you can click on ok and we have set all the proxies inside zdeduct proxy and we also have burp suite available here here it is which is using proxy 127.0.0.1 and 88 is the port number so we are all set intercept should be on which is on and here we have the jtag proxy great now next thing is we need to visit a browser and we need to change the proxy settings of browser because we need to set the port number to eight zero eight one because zap is using eight zero eight one so now or let's do one thing it also contains its browser which is pre-configured so let's open this browser you can see here mozilla firefox so let me click here and let me open this mozilla firefox so guys here we go mozilla firefox is open explore your application with zap so let me remove the url here and let's test it http s column slash test php dot one web dot com hit enter here we go let me open first of all zero proxy here now guys uh you can see that inside zadatec proxy we have the website here right so we have visited the browser and inside browser we have used this website and zelda tech proxy have intercepted all right now next is we need to perform spidering process so you can right click here and you can see there are various options available here so inside those you can select attack in attack you can see spidering is available here so we can perform spidering in z attack proxy and then we can redirect this to burp suite so in spidering we have active scan we have a jack spider fuzzing why is other scanning if i open burp suite so in burp suite we only have one request that's fine because the request has been accepted by zelda proxy and xerodec proxy has redirected it to web suite but we have not started the uh spreading process so you can see here in http history there is only one data available all right now let me start the spidering process inside attack proxy so it should get more data here so right click here let me start a basic spidering process and you can click on this so this is the website all right now let me start the swiping process click start scan all right so it is providing you information about these urls like robots.txt sitemap.xml all right now if i go inside burp feed you will also see the data same data here we go so you can see here that uh here you go it is still coming because the zap is redirecting the request in websheet that data so jab is transferring data to burp suite so you can see in the https tree we have robots.txt information here sitemap.xml information of testphp.onelevweb.com website great so we are able to perform spidering process with the help of z detect proxy tool so you can see let me perform some other tasks for example let me try ajax spider so let me click on ajax spider click start scan all right now let me go back to burp suite and uh first of all what we can do we can add this website to scope so add to scope here yes so in target i have added this website to scope right now in proxy we have this let me go in intercept let me forward this request forward everything right okay now let me go back to zadek proxy so it is still running you can see here it is taking some time so it will surely update the data inside webstreet so basically this is the way you can perform check this that this is the same url available in jtec for block list dot settings here we go here it is right i mean it means that zydeck proxy is redirecting data to webstreet and we can check the data inside worksheet here we go great so guys uh that's how you can use zerotech proxy as a proxy to perform spidering process inside burpeed so you just need to set proxy settings inside zap and then you will be able to redirect the request or basically the data from zlatec proxy to prep suite and you can check all the data in prepaid all right so this is all about the sparing process and latest version of brexit in the next video we are going to learn about scanning process [Music] hello and welcome to section 3 of the course scanning your web application now let's start the first video of section 3 about scanning process now in this video what we are going to learn so we are going to learn about scanning process with the help of burp suite and i'm going to explain so let me open the lab setup now guys i'm inside kali linux machine and we have two versions of kali linux one is old version and one is a test version so in latest version let me open burp suite here we go so as you can see here that uh spidering and scanning is not available in community version of preparate if you want to perform scanning then you need to use burp suite professional which is a paid version so as you can see here that new scan is disabled in burp suite so we are not able to perform automated scan inside webstreet if i talk about the old version let me check the old version and here spydering was available but if i click on scanner so scanner was not available it means scanner is only available in paid version or in professional version of root all right no problem so let's learn about scanner what it does and how it works so you can see here burps can resist state of the art vulnerability scanner for web application so if you have a web application and if you want to scan that web application for various kind of vulnerabilities then you can take help of burp suite scanner it is designed with security test in mind to integrate closely with your existing techniques and methodologies for manual or automatic basically you can perform automatic testing by webstreet and you can also use manual testing so you can see what you can do you scan for many types of liberties without making any additional requests use your browser to control exactly what gets scanned and manually select items anywhere within burp and send it into scanning all right so let me go back to the latest version here so here you can see if you click on dashboard so on the right hand side you can see it is showing various kind of vulnerabilities listed here and some are high vulnerabilities some are medium vulnerabilities some are lower vulnerabilities so it has categorized vulnerabilities into various parts high medium low and informational right so you can see in red color these are high level vulnerabilities and you can see in orange color it is medium volatility and in low there is no no vulnerability available informational vulnerabilities all right so it is saying upgrade to burp suite professional to automatically find vulnerabilities it means if you want to use website scanner then you need to purchase website professional for automated testing but it's it's very easy you just need to click on new scan then surely you can start the scanning process easily right but yeah new scan is disabled in community version of web suite all right and you can see here that uh issue definitions so you can see it handles various kind of vulnerabilities which is listed in burp street old version check this main vulnerabilities are like uh sql injection vulnerabilities crosstalk scripting vulnerabilities cross-site request forgiving vulnerabilities and file path manipulation vulnerabilities any kind of injection you can perform with the help of webfleet and uh the information is also available here if i click on any one liberty the information is available here about that validity that what is a sql injection how you can remediate it and various kind of references all right so you already know that we cannot perform scanning in burp suite community edition which is available in kali linux so it's all right so in previous section we have learned that how you can use z attack proxy as a proxy to perform spidering process right so let me show you directly without making it as a proxy so let me minimize this just to provide you information about scanning how it works by an automatic tool in the next section we are going to perform exploitation and we are going to use manual techniques inside it we will learn that how you can perform injection attack manually then cross-site scripting attack etc so this is z attack proxy correct now you can see here automatic automated scan is also available here so url to attack you just need to paste the url here for example the url is http columns slash testphp.wellnetweb.com all right then simply click on attack it will start the attack process and it will show you a list of one liberties inside the tool purp suit also does same so you can see i have already done the active scan with it and in alerts you can see there are various kind of alerts available uh inside this website basically this website is vulnerable and what kind of vulnerabilities it contain like cross-site scripting vulnerabilities it also shows the page which is affected if you click on any link you will get more information about the link that risk is high right description what xss does and affected items etc it also contains sql injection attack right so it shows you everything that what vulnerabilities are available in the website all right so bob suit scanner also the same it will show you a list of vulnerabilities that these vulnerabilities are available inside this website all right so it is pretty easy to use you can also perform spidering we have already done it in the previous section successfully and here we have learned that how we can perform active scanning now if you want to check again how you can perform automated scan in this tool or similarly in burp suite so you just need to provide the url here in burpseed you need to intercept a request so it's pretty easy to perform automated scan inside these automated tools like here just need to provide the url here and then you can click on attack so you can see here that first of all it is performing spidering process because i have selected it then it will start the scanning process spidering is done and here you can see now it has started scanning so now it is performing attack here you can see right if you want to stop it then you can stop from here similarly and you will get a list of alerts here so you need to complete the scanning process and then you will be able to see them in burp suite it is pretty easy you just need to click on new scan and then you can start the scanning process that's pretty easy and you will get the list of vulnerabilities right here in the dashboard like for example if the website contains sql injection vulnerabilities you will get a list here and you will be able to find sql injection vulnerabilities in that list all right so that's how you can perform uh scanning inside burp suite and you will be able to find all kind of vulnerabilities with the help of prep street in the next video we are going to learn about os top 10 web vulnerabilities [Music] hello and welcome to the next video os top 10 web vulnerabilities now in this video what we are going to learn so we are going to learn about uh the top 1 liberties for web applications and i'm going to explain so let me open the setup now guys i'm inside kali linux machine and we are learning about os top 10 web vulnerabilities so in the previous video we have learned that bub suit can scan various kind of vulnerabilities inside web applications now we need to focus on top 10 web vulnerabilities because they are critical or high vulnerabilities so let me show you let me open google here and you can type here osp top 10 vulnerabilities for web all right here we go so this is the official website os.org and this is the link so let me click and open it here we go so this is a top 10 vulnerabilities for web applications provided by osp so you can see that the os top 10 is a standard awareness document for developers and web application security it represents a broad consensus about the most critical security risk to web applications so obviously first of all we need to protect web application from critical vulnerabilities so you need to be aware about it that what are the top 10 vulnerabilities available for websites or web applications provided by osp so here you can see companies should adopt this document and start the process of ensuring that their web applications minimize these risks so first of all you need to focus on these top 10 risks using the os top 10 is a perhaps the most effective first step towards changing the software development culture within your organization into one that produce more secure code so let's see what are these ten top ten one liberties let me come down here we go so top ten web application security risks so first is injection second broken authentication sensitive data exposure xml external entities broken access control secured misconfiguration xss insecure dc realization using component with known vulnerabilities and insufficient logging and monitoring so these are the top 10 risks available in web applications how it is so let's talk about it one by one to learn more about it so first is injection you can see here injection flaws like sql injection or ldap injections are available in applications so when untrusted data is sent into an interpreter as a part of commander kodi so attacker can insert these sql injection queries into your web application to gather information from your database where all the data is stored attacker hostile data can trick the interpreter into executing unintended commands all right so this is injection broken authentication is applications functions related to authentication and session management are often implemented incorrectly how allowing attackers to compromise passwords keys session tokens and to exploit other implementation flaws to assume their users identities temporarily or permanently so basically this is about authentication in the web application so for example if i authenticate inside an application so i need to provide username and password so if attackers are able to get those unit password they will be able to access the application sensitive data exposure many web applications and apis do not properly protect sensitive data so if uh attacker can also take help of weak apis to gather information xml external entities which is also called xxe so it happens because of poorly configured xml processors evaluate external entity references within xml documents so attacker can also insert xml code inside application and if possible attacker can gather other information like attacker can perform port scanning remote code execution dos attack means flooding attack in web application broken access control restrictions on what authenticated users are allowed to do are often not properly enforced attackers can exploit these flaws to access unauthorized functionality or data such as access other users account by a view sensitivity for example if you have a url and in url we have a parameter name equal to abc so if broken access control one libid is available in application and if i'm able to see the result of abc so if i change abc into bcd i can i can see the result of bcd without any authentication [Music] so it is because uh broken access control existed inside application all right next is security misconfiguration and it there's a very common type of issue in application because uh uh you can see here in secure default configuration incomplete or ad hoc configuration open cloud storage misconfigured http headers whereabouts are messages containing so it can happen using various techniques like for example if you are using uh default accounts default username and passwords error messages provided by applications are showing uh sensitive data so it can happen cross-site scripting yeah so xss or cross-site scripting flaws occur whenever an application included untrusted data in a web page without proper validation or escaping so we can insert a malicious javascript inside application to gather more information and we can also insert html code so xss allows attackers to execute scripts in the victims browser which can hijack user session so it can attack on victim's browser with the help of html or javascript queries insecure deserialization often leads to remote code execution yeah so remote code execution is possible with the help of insecure deserialization and they can be used to perform attacks including replay injection privilege escalation etc next is using components with known vulnerabilities so components application components basically the libraries frameworks software's run with the same privilege as the application if a vulnerable component is exploited for example you have used the external library into your application and that library is outdated so attacker can take help of that library and can access your application can attack on it so that's why all if you are including some external libraries or frameworks inside your application make sure they are updated and finally insufficient logging and monitoring so insufficient logging and monitoring coupled with missing or ineffective integration with incident response allow attackers to further attack systems so basically you need to you need to monitor your systems regularly yeah you need to check logs regularly right if you want then it is possible that attackers can use the same technique to attack again and again inside your systems so obviously logging and monitoring is necessary you can see here that allows wreckers to further attack systems maintain persistence pivot and tamper extract and destroy data most of breach studies show time to detect a breach is over 200 days typically detected by external parties rather than internal process or monitoring so that's why logging and monitoring is very necessary so these are the top 10 uh os one liberties so first if you are a web penetration tester or web tester first you need to focus on these top 10 web application security risks if you are testing a website yes you can use various kind of automated scanning tools available in zaratec proxy or website etc but you need to focus on these top 10 security risks if you want to protect your web application from external attacks so we will perform some manual attack techniques in the next section but i will show you an example of brute force attack in next video [Music] hello and welcome to the next video example of brute force attack now in this video what we are going to learn so we are going to learn the basics of brute force attack and i'm going to explain the whole process so let's open the lab setup so guys i'm inside kali linux machine and we are learning about an attack example and we are taking here brute force attack so what is brute force attack so for example if you have a website and you have login pages inside it so what we can do here we can create a username list we can create a password list and we can basically create combinations and we can attack on the login page to get access inside it we don't have username and password but we can create a list and then we can attack right so if there is no restriction then we will be able to perform brute force attack if there are restrictions then obviously we are not able to perform what kind of restrictions for example uh if user has provided three incorrect username and passwords so application can lock the account but if this facility is not not available it means if this protection is not available then obviously we can perform brute force attack with thousands of words or username and passwords we can create combinations and then attack so let me show you an example of brute force attack using burpsweet so first of all let me open breakfast here here we go all right now we are taking help of intruder in this example so we can perform attack with the help of intruder we can insert payload with the help of intruder so i'm taking help of intruder before that let's check the proxy settings so inside proxy intercept should be on in options proxy should be set here 127.0.0.1 and 88 is the port number all right bulb fluid is all set now let me open uh the browser so let me minimize this let me open a new browser target is obviously test php.wonderweb.com login page so let me open first of all the testing page to testphp.wondermap.com is our target here we have the sign up page and here here we go so this is basically the login and password page but we have no idea about login and password but you can see here that please use the username test and password test because it's a sample website but uh i'm assuming that we have no idea about the credentials or username and password so how we are going to perform brute force attack on this web page interesting now we need to generate a request so that we can intercept that request inside webspeed this is the first step for that we need to change the proxy settings of the browser so let me change the proxy tag we are on the page then change the proxy settings all right so click preferences down and here you can click on settings then you can select manual proxy configuration provide the ip address 127.0.0.1 provide the port number 8080 select it use this proxy server for all protocols remove everything from here and you're done click ok all right next is we need to provide some parameters here because uh username and password is available so we need to provide username for example i'm typing admin admin obviously it is not right username and password right username password is testtest but we are just generating a request with parameters in websheet so when i click login it will generate a request that request will be intercepted by web suite and you will be able to see it inside website okay let's let's click on login here we go let me go back to webshoot so it has not intercepted let's try again interceptors on all right let me try again let me go back in browser try again all right in proxy in http history we have a request now that's to bhp.onenotweb.com if i click here i can see the request so you can see here the page was user info.php and the exact url was testphp.onenewweb.com.php now you can see here that we have provided the username and password so username of admin and password was admin obviously it is not right username and password so we will not be able to login inside application okay so we have successfully intercepted our request inside webstreet great next is as i already told you we are going to use intruder function inside webspeed because we have not used it yet so let me right click right click and you can send this request into intruder so send to intruder you can see here the color has changed in intruders so click intruder inside that you can click on positions and you can see here that we have the same request inside intruder now it is showing you two parameters here one is username and one is password automatically great if you can see any other parameter you need to clear it so you need to select it and then click clear all right now in attack type as we are performing brute force attack so we need to select here cluster bomb all right everything is fine if i select here payloads so we need to provide payloads here payload means username list and password list so i'm not generating any list here you can generate a list here easily and you can provide here otherwise you can also paste passwords one by one so as we are just testing here so let me insert three or four username and three or four passwords to check so first of all number one is user name and number two is for password so let's select number one here so i'm inserting three incorrect passwords three correct usernames and one correct username which is test so let me type for example sunil add then admin add it and let's try three so i'm typing here final one which is right test okay so two incorrect one correct again select number two for password and then provide any other like guest add it root that's the right username and password is test test so we are just checking here we go so this is for username this is for password let's see if intruder can provide us the right combination of username and password or not everything is fine in options you don't need to change anything in payload we have set two payloads here and positions we have just two positions one for username and one for password that's correct and in target we have the right url everything is set here now it's time to start attack so on the right hand side you can see a button start attack so what it is going to do it is it is going to take combinations of username and password and we'll check so click on start attack it will generate a new window here we go so you can see here that it is testing and it is not done yet you can check below all right so you can see here it is taking username and password one by one all right completed you can see finished here so how we will know which username and password combination is correct to log in inside the website first of all you can check the status here in status we can see zero two three zero two in everywhere but if you can check the last one last one is two hundred it means looks like this is the username and password test and test for example if you have thousands of username and password in your list you can test it with the help of burp suite and the bub suite will take some time but yeah surely if right username and password is available inside the list shortly you will get it and also check length so you can see here length is 253 but if you check the last one which is 6287 it means this is the right combination of username and password and we have successfully got the right credentials so username is test and password is test other combinations are incorrect that's how you can perform brute force attack inside an application right and we have used intruder option inside birth rate great so you can close it let me close everything let me change the proxy settings again no to no proxy settings no proxy and let's try to log in so refresh it let's try the right username and password test test click login and here we go we are in you can see the logout button here so i just wanted to show you a simple attack with the help of web suite and we have also used a new feature in worksheet which is intruder so this is all about the scanning process inside burp suite in the next section we are going to learn the manual techniques to perform attack so in the next section we are going to exploit vulnerabilities in web application [Music] hello and welcome to section 4 of the course exploiting vulnerabilities in your web application let's start the first video of section 4 sql injection attack now in this video what we are going to learn so first of all we will learn the basics of injection attack and then i'm going to show show you in an example so let me open the lab setup now guys we are inside kla linux machine and uh we are learning about sql injection attack so in the previous sections we have learned that you can perform automated scan with the help of burp suite to get information about vulnerabilities but yeah you can also use the manual techniques to find out the vulnerabilities so here in this section we are learning about some manual techniques so how you can check if a website is vulnerable to injection attack or not so we can find out it's pretty easy so we need to insert some malicious content and then we can check if it is malicious or not correct we already have a target so first of all let me open the browser here here we go so we have this target testbhp.wonderweb.com and i'm using this website as target so click on sign up and this is the login page we have correct so we can insert few injection queries inside this login pages to get more information but first of all we need to create a request and we need to intercept a request inside web suite so for that let me open websheet again all right i will try to insert few true queries inside username and password let's see if we can detect injection attack or not so it next start burp suite community edition all right and next thing is we need to set up the proxy so let's check inside proxy intercept interest intercept should be on proxy settings are fine all right and as i already told you that you can also perform automated scan with the help of burp suit professional where scanning facility is available so click on news can you can find out easily but yeah manual technique is also very important so you should also learn it all right if you click on you can learn more about sql injection attack if you have no idea all right generally we insert malicious data into into website to get more information okay so now let me minimize it and uh let me open my browser where the target is running let's change the proxy settings again click on preferences then in settings select manual proxy configuration and then click ok all right and that's okay now let me type and username and password because we have no idea so for example admin admin click login request has been intercepted by web suite and where it is so if you check proxy intercept so you can see here the request is available right here now we have another option on the facility available inside website which which we are going to use so we have used intruder now we are going to use repeater right so i will tell you the use of repeater but first of all we need to send this request inside repeater okay so you can right click on the request and then you can click send to repeater or you can also press ctrl r send to repeater all right so if i click repeater here so you can see here that request is now inside repeater great now let's check the response so let me send this request so click send and you can see here the messages you must login it means the username and password which we have provided are incorrect that's why it is showing us the message you must login all right so you can see three zero two message also available here it means we are not logged in you can check the header value available here x value and render it if so render is not possible here all right now how we can perform injection attacks so what we can do here we can edit you name parameter and pass parameter here and we can insert few injection queries here so one injection queries like this let me show you in mouse pad what we are going to insert so we can insert query like this for example user equal to postgraphy or one equal to one and pass equal to same thing apostrophe or one equal to 1 correct let me change the font all right here it is so now we have user equal to admin and pass equal to admin so i'm going to remove admin and i'm going to replace it with this query here and i'm going to replace password with this because 1 equal to 1 is always true so it's a true query and it is possible that it can provide us the access of application so let's try this i generally use this query so that's why i'm trying this one otherwise you can find various other queries on internet which you can hit and try okay so let me copy this all right so let me copy this and let me open web suite and let me change it so i'm removing admin and i'm posting this query so let me try this one if you name equal to this and pass equal to admin so let's try let me send it so it says you must log in it means this is incorrect what if i blank the password let me try this you must log in this is not working so let me set the parameter into password also so i'm using the same parameter here apostrophe space or space one equal to apostrophe one you you can find various other queries on internet so let me click send now and here we go so you can see here that we have successfully logged in inside the website so you can see here that this is not actual username and password the actual username and password is test comma test but i have used sql injection query true queries and we have successfully bypassed the authentication great you can see here that the message is 200 means we have successfully logged in inside the website all right now if i open the login page we are still not logged in so let me forward this request let me send it again you can see here right okay so you can right click here and you can also send it into another features or you can check response in the browser so let me copy this and you can paste here and here we go we have successfully logged in inside the website you can see here great you just need to right click here and you can open the request inside browser or you can also check show responses so i clicked on show responses got the link open the browser paste it and you can check the result here so we have successfully logged in inside the website without using actual username and password so that's how you can use injection attack injection queries to perform attack on website that's how you can test the website manually because you can see here that we have not used automated scan of uh burp suit we have uh tested it manually we and you can see here that we have successfully performed injection attack inside the website it means injection vulnerability is available inside this website in the next video we are going to learn about cross-site scripting or xss attack [Music] hello and welcome to the next video cross site scripting attack now in this video what we are going to learn so you are going to learn the basics of crosstalk scripting attack and then we will perform this attack on target using burp suite so lets open the lab setup now guys i'm inside kali linux machine and in the in the previous video we have learned that how you can perform injection attack in a website using manual methods now here in this video we are learning about cross-site scripting attack which is also called xss attack on the target on the same target with the help of burp sheet using manual methods so first of all we are going to perform attack on the website without authentication and then we will access the website and then we will perform again so we have our target here you can see that i've been logged out from the website all right and web suite is also open here intercept should be on check the proxy again all right so i've removed my proxy from my browser then i have visited the website again so i need to change the proxy again so let me change the proxy so click preferences here and then you can click settings then you can select manual proxy configuration set the ip address and port number etc click ok all right so you can see here there's a search box so i will try to perform cross-site scripting a tag using the search box because i i can insert anything inside the search box that's why so for example let me type my name here and let me click on go so burp suite has intercepted the request inside the proxy intercept section here we go great now you can check inside https tree you can also use this one the same request is available here so search for equal to sunil and go button equal to go it means i need to change the parameter of search for now the value is sunil i need to change it so let me provide you more information about cross-site scripting attack that which injection combat we should use here so let me remove this one so first of all i'm going to use this one uh script open then alert inside alert you can type a string here we will type it and then script clause so this is a basic javascript if it works it means that cross-site scripting validity is available inside the website so let's see let me type my name here so so script open then alert then inside that provided the string here then script close so alert means basically it will generate a pop-up box and inside pop-up box it will show you the value and the value is sunil if cross-site scripting vulnerability is available inside the website so you will see a pop-up box and inside pop-up box you will see sunil right so i will try to change the value so let me copy this all right let me go inside webstreet here and let me send first of all this request into repeater so send it to repeater here great and uh if i click send here all right now let me change this and let me set this query script open alert string and script close all right if i send this obviously it should work and rendering is not available here so let me right click here and let me open it inside the browser so show response in browser copy this and let me paste inside my browser and hit enter and here we go so you can see here that it has generated a pop-up box and inside pop-up box you can see my name sunil great let me change the settings again for example if i change here for example alert one all right we can also try this one so all right right click here show response in browser copy this link open it let's try again and here we go so you can see here that we can see a pop-up box and inside pop-up box we can see the value it means that cross-site scripting vulnerability is available inside this website and this website is highly vulnerable so an attacker can perform cross-site scripting attack can gather more information great now what i'm doing here now i'm logging inside the website so let me change the proxy settings again and i will try to steal cookies so that's why i'm logging inside so no proxy okay let me open the website again testphp.wonderweb.com all right then click sign up here and [Music] all right let me type the right username and password test and test click login all right now let me change the proxy settings again so that we can intercept request in web suite so the difference is now we are logged in manual proxy configuration click ok all right now let me refresh this page so refresh it we will intercept a request in burp suite so open websheet now let me go inside proxy here we go so this is the request the file last request all right now for example if i send this into repeater and basically just a second what we need to do here we need to generate a request from search box basically we have logged in but we need to create a request from search box so that we can steal cookie by editing the value so uh we have logged in but now i need to enter something in search box for example um abc all right now click go it should generate a request go inside web suite click prox check proxy this is the request we need actually so search for abc but the difference is we have logged in inside websites so we should have a cookie you can see here we have a cookie in the previous request we didn't have now send this to repeater great and now we can steal this cookie so how attacker can insert command like this like javascript script then alert and then i this is a function document.cookie which can steal cookie so we can try this one script open alert then document.cookie without any double quotes so alert document.cookie and scriptclose let's try this so let me copy this and let's change the value there in burp suite and here it is so let me change the value which is abc and paste this value script open alert document.cookie script close that's it so i can send it first of all all right render facility is not available so i need to right click here then request or show response in browser copy this and open a new link inside browser so now we can paste here and hit enter and here we go so you can see here we can see the cookie here which is test and test this is the cookie so we have successfully stolen cookie you can see in bird street that the cookie was this here's a cookie so what attacker can do attacker can perform cross-site scripting attack and can steal cookie and with the help of rookie attacker can hijack users session and we can see here that with the help of crosstalk scripting attack we can also steal cookie off a user great so that's how you can perform cross-set scripting attack on a website and that's these are the manual techniques or methods you can use to gather information about this vulnerability in the next video we are going to learn about cookie management issues [Music] hello and welcome to the next video cookie management issues now in this video what you are going to learn so in this video first of all we will learn about cookie management issue and then i am going to show you an example so let me open the lab setup now guys uh first of all what we are going to do here we are going to capture a request inside burp suite so first of all we need to login then we will capture a request inside brub street right then we will copy that request and then we will log out from this website so we will try to log in inside website using the cookie which we had so let me show you the process step by step so first of all let's login in the website here we are trying to steal cookie and we are hijacking session so first of all you can click on sign up all right so let me type your test and test click login here we go now let's change the proxy settings here and let's capture a request in bloopsweet so you can click here you can click on preferences inside preferences we have network settings and then you can click on settings here select manual proxy configuration and then you can click ok correct all right and then you can refresh the page all right in proxy we have the request available you can see here your name is test passes test cookies login equal to test test all right so we have the request available here all right so for example if i send this request into repeater so if i click send so you can see here that we are getting response and we are logged in inside website great so do one thing uh copy it and save it for example let me save this here all right so let's remove this and paste here the request here right all right save it so we have saved this request now next thing is we are going to log out from the website we are going to clear all the cache from browser and we will go back to the login page so let me go back to the browser here and let's change the proxy settings again all right click settings no proxy okay then first of all let's log out from this website all right here we go and one more thing you need to clear the cache also so click here click preferences click privacy and security then clear data you can clear it or you can also clear a clear history so you can also clear cache all right so last two hours okay so clear now so we have cleared the cache let me refresh this page again all right now i'm what i'm going to do i'm going to capture a request when we are logout and then i will try to login using previous request so we have captured the request now we have the cookie so we will try to hijack a session or we can say that we can use the previous cookie to authenticate a new request so i'm going to let me again change the data settings manual proxy configuration and let me type here admin admin click login and let me open verb suite here here we go so this is the request we have right so first of all let me drop this request and this is the request we have current request where you can see that we are log out from the website we don't have any cookie right so can we use the previous uh request here right so let's try this one so copy it and open verb suite and let's paste here let's see if we can login inside website automatically or not you can see here that we are not logged in so let me forward this request to the browser you can see and here we go so you can see here that we have successfully logged in inside website it means that we have used the previous cookie to login with the help of the new request so this is there is a cookie management issue because that particular cookie is static it means it is same but to secure a website you need to change the cookies automatically after each login but it is not happening in this case cookies static cookie is same in all requests that's why we are able to perform this task so there are certainly cookie management issues available in the website that's why we are able to use previous requests and now you can see here that we have successfully logged in because i have also cleared the cache from my browser but still we are able to login inside the website it means there is certainly a cookie management issue so this is all about cooking management issues in a website in the next section we are going to learn about report analysis [Music] hello and welcome to section 5 of the course deep dive analysis of report now let's start the first video of section 5 web suite report from it now in this video what we are going to learn so we are going to learn the report format of web suite and i am going to explain the whole process so let me open the setup so guys we are inside kali linux machine and inside calendaring's machine we have web suite so let me open it here it is so now you know that we can use blurb free to to perform web application analysis and inside burp suite we can test various kind of vulnerabilities for example os top 10 and also you can see the vulnerabilities on the right hand side for example suspicious input transformation smtp header injection sql injection webcast repointing etc so it can scan various kind of vulnerabilities in a web application now these vulnerabilities are categorized here so they have categorized one liberties into four parts first is high volatility second is medium volatility third is global liberty and fourth is informational vulnerability so these four types are available inside berkeley they have categorized into these four categories informational vulnerability means when application is providing information to the attackers in the form of that web error page etc low vulnerabilities are basically lower liberty means that uh it is not affecting much medium volatility and high vulnerabilities means uh these vulnerabilities are affecting web application and attacker can gather sensitive information from a web application so these are the four categories available here right now you know that uh scanner and uh spidering is not available inside burp suite community edition they are available in website professional version so if you want to use them you need to purchase purple professional all right it's a paid version this is a free version right now you can also create reports from burpseed it is also possible so let me provide you more information about report there it is so we are learning about reporting scan results how you can report these results so you can export a report of some of some or all of the issues generated by burp scanner so obviously the use of burp scanner is to scan a website right and you will get a list of vulnerabilities in burp suite so how you are going to export this so the option is available inside worksheet to do this select the desired issues in the issue view of the site map so let me open number sheet so if you click on target if you click on site map you will get a list of issues here on the right hand side and on the issue activity log and choose report selected issues from the context menu the reporting wizard lets you choose various options for your report as described below so basically when we have a list of one liberties for a particular ip address or for a particular machine we can also create a report for that particular ip option is available here now if i talk about report format so you can choose one of the following reports to perform it and you can see here we can create report in html format we can also create report in xml format so let's talk about html format so this produces an html formatted report for printing or viewing in a browser so when we create this html report from webspeed we can simply view this report in the browser second is xml report so this produces an xml formatted report suitable for importing in other tools or reporting frameworks so it creates xml report now you can use that xml report in other softwares or in other frameworks so you can optionally base64 encode http request and response within the xml so it is using the encryption algorithm which is base64 since http messages may contain non-printing characters which are not strictly permitted within xml documents using this option is preferable because it ensures compatibility with xml parsers okay note the xml format uses an internal dtd and authors of interoperability code are recommended to view a sample report to obtain the current dd the following example elements so these are the elements of xml so first is the serial number so the serial number elements contains a long integer that is unique to the individual issue instance so a particular issue contains a serial number if you export issues several times from the same instance of bob you can use a serial number to identify incrementally new issues [Music] so if you export issues again and again from worksheets so they will increment the serial number the type element contains an integer that uniquely identifies the issue type sql injection or xss etc this value is stable across different instances of verb see the list of scan issue types for a list of all numeric so if i open this page so basically there is a unique identification for each issue type for example if a website contains sql injection so there will be an integer for that if it contains accesses there will be another integer so in issue definitions you can see here that this is the issue definitions for sql injection which is obviously high severity the type index is available here for cross-site scripting it will be different so you will get a list of issue types here with the numeric values [Music] all right the name element contains the corresponding descriptive name for the issue type see the list of scan issue types so let me open this one corresponding descriptive name of issue type so again basically for example if if a website contains injection attack or a sql injection attack so it will look like this sql injection if it contains second order order of in escrow injection attack it will look like this in issue types similarly for example if contains cross-site scripting or accesses so if i search here [Music] cross-site scripting [Music] so let me check again here it is so you can see here that there obviously there are three types of cross-site scripting attacks are available so in issue types you will see these names [Music] all right it means name element contains the corresponding descriptive name for the issue type the path element contains the url for the issue excluding query string so path will provide you the url the location element includes both the url and the description of the entry point for the attack where relevant a specific url parameter request header etc so the location will provide you that which url is affected it will provide you a proper information about url and the description and finally the request and response elements have a base64 attribute which contains a boolean value indicating whether the messages have been b64 encoded or not all right now if we check about issue details so in issue details you can choose the types of details to include in the report what kind of detail you want to include issue background this is the standard description of the issue and is the same for all issuer so basic issue description obviously we will include it remediation background so this is a standard remediation advice that how you how you can protect your device from this kind of attacks and that is basically remediation background issue detail for some types of issues this contains some further custom information about the particular issue drawing attention to specific details so yeah you will get a issue description but you can also choose issue detail if you want remediation detail obviously about uh remediation process and vulnerability classification so this contains this relevant mappings to the common weakness enumeration list so we have with cwe list you can also check it on internet so if you want to include it inside your report obviously you can that is one liberty classification so you can add issue background remediation background issue detail remediation detail and vulnerability classifications now http messages you can choose how http messages should appear in the report the following options are available for request and response do not include the report will not contain any messages of the relevant time so you won't be able to get http messages if you select do not include include relevant extract the report will include the parts of the message that are highlighted in the in tool results and enough of the surrounding message to give context so it will include the part of the message include in full obviously it will include the full messages including parts that are not directly relevant to understanding or reproducing the issue so optionally you can limit each message to a specific maximum length fine selecting issue types so the wizard lists the different types of issues that were included in your selection and the count of number of instances you can deselect any type of issues that you do not want to include this is useful if you have selected a large number of issues and want to remove certain less interesting type of now for example for example you only want a particular type of issues in your report so you can also select a particular type of issues and you can deselect the other ones so they will be included inside your final report created by website report details you can specify the file where the report will be saved for html reports so you can specify the following details report title now for example if you select html type format so you can also specify these details report title how issues should be organized within the report by type severity or url right so for example you want to categorize it using severity so you will get high severity then medium severity then low severity level issues the number of levels of details to include in the table of contents and the severities of issue to include in the summary table and bar chart so you can add these detail inside your report so verb suite provides various kind of options to generate a report right you can use these custom options to generate report but you can create format in html and xml format and then you can change the options inside it what kind of issue you want and what comes what kind of severity you want inside your report http messages the url location etc right so that's how you can select and create a final report for you with the help of web suite scanner in the next video we are going to learn how to generate reports [Music] hello and welcome to the next video generating reports now in this video what we are going to learn so in this video we will learn that how you can generate reports from burp suite and i'm going to explain the whole process so let me open the setup so guys i'm inside kali linux machine and we are learning about uh reporting with the help of websheet so in the previous video we have learned that what kind of report you can generate from rip sheet and what are the various options available inside workfleet while generating reports now in this video we will learn that how you can generate a report from web suite so as i already told you that in burpee community version we don't have scanner and spider so we so we are not able to generate those reports but yeah you should know how you can generate from professional version so let me open this websites support page so you can see here that it is providing information that how you can generate report so how to generate report about issues so here you can see that in target you can go inside site map then go inside ip address and you can go inside issues and then you can try right click on the issue or selected issue or bring up the context menu then using the report issues function to generate a report so in target site map ip address and issues so inside burp sheet you need to click on target then inside map and then you will get ip addresses on the right hand side and then you can right click and generate report it's pretty easy now for example you get these one liberties or these issues inside a web application and it is listed in target site map right so you can select a single vulnerability and then you can right click on and then you can click on report issue second is for example you have selected this one so if you want to select more so let me click control and then you can select these kind of issues so then you can right click here then you can see here then you then you can click on report issues and then you will be able to generate it it is pretty easy you can select a single one or the whole whole list by using this option but this option is not available inside you can see here that it is not enabled inside burp suite community edition but yeah surely if you use it professional edition then you will get all these options so to generate a report target site map then machine select machine then issues and then right click on the issue and then using report issue or function so that's how you can generate a report from burp suite from scanner right let me show you now this is a screenshot so for example uh this is a burp suite professional version so you can see here that let me show you so you can see here that so this is burp 3 professional edition and in professional edition you can generate uh reports because it contains spider and scanner option so you can see here that inside target we inside site map you can see we have a web application so we have tested a web application we got information about it and about the url or page is available inside it on the left hand side now for example if you have tested five web applications you will so you will get the content of web applications here like this so this is the ip address the ip address is 192.168.56.28 and they're using get method this is the url of the web application parameters and status for example you have five different web applications here so it means that you have five different uh ip addresses so you you what you need to do here for example if you want to check the issues related to application one so you can select the first ip address and then on the right hand side you will get a list of issues available in the web application after scanning process so if otherwise if you select number two option so you will get the other issues related to the application number two so that's how you can uh check issues related to a web application correct so for example we have selected the number one application so we have a list of issues available here right like crosstalk scripting is available here or local file inclusion etc so you can select a single issue or you can select all of them so you can select all of them and then you can right click on the issue and then you can click on report selected issues right when you will do this you can easily generate a final report and the issues will be available inside them right as we have already seen that in web street community edition this option is disabled right because we don't have scanner and spider but in professional version all options are available and you just need to select all issues and then you can create a final report in html or an xml format right so the professional version looks like this so you will find it in target site map and then in ip address so you need to click on target then inside target we have sitemap click on that then inside the contents you have you have the ip addresses so select the ip address and you will get issues related to the ipad is on the right hand side and in the below part you can find explanation about a particular issue now for example you have no idea about what is the cross-site scripting so you can click on that and you will get information about cross-site scripting vulnerability in the below part here and here in the below part you have the request response about that particular url request on the left hand side you have more information about the application so that's how you can generate a final report from website in the next video we are going to analyze report [Music] hello and welcome to the next video analyze report now in this video what we are going to learn so we are going to analyze a report generated by burp suite i am going to explain so let me open the setup so now guys i inside kali linux machine and we are analyzing a report created by burp suite so we have broke through it inside kali linux machine and in the previous video we have learned that how you can generate a final report so you can easily generate if you click on target then if you click on site map and then you can click on ip address and right click on the issue and then you can click on report issue so you can easily generate a final report from webstreet now for example you have created a final report so how we can analyze it what it can contain so let's check so let me open a sample report from burp read so here it is so this is a report created by websheet scanner and obviously created by webstreet professional version so you can see here it is created by berkshire professional version and it is burp scanner sample report so first of all let's check the summary of this report so a report will look like this so the table below shows the number of issues identified in different categories so issues are classified according to severity as high medium low i or information as i already told you that burp has categorized the severity level into four parts so high medium low and informational vulnerabilities this reflects the likely impact of each issue for a typical organization issues are also classified according to confidence as certain form or tentative this reflects the inherent reliability of the technique that was used so you can see here that the severity levels are four and you can it is also categorized in three confidence categories like certain or firm or tentative so you can see here the severity and confidence next is the chart below shows the aggregated number of issues identified in each category solid colors bars represent issues with a confidence level of certain and the bar fade as confidence level fails so you can see here that these are for severity level and these are the number of issues so for example number of issues are 12. so now it is certain and after 12 to 16 that is not certain so bars have faded all right so it means confidence level false so till 12 number of issues the confidence level was 13 and then it faded right and you can also check the coloring scheme here so high volatility is in red color medium vulnerability is an orange color low volumity is in yellow color and information vulnerabilities of label lane and gray color right so you can also check the sculpting scheme provided by burp street so that we can easily identify the most impacted or affected issues all right next is content so in content you will get information about the vulnerabilities the application contains for example if for example if application contains a sql injection vulnerability so in list you will get it sql injection vulnerability information so you can see in number two we have a sql injection vulnerability then the web website also contains file path traversal vulnerability then x x even liberty and lab injection cross site scripting ssl cookie without secure flex set password field with auto complete enabled and various kind of vulnerabilities are available inside this application right so in content you will get a list of vulnerabilities like this now for example let's take a single liberty so first of all let me come down and you can see here then you will get information about the details of a particular issue so if issue is os command injection then you will get issue details inside your report so in summary you can see that severity is high confidence level is firm host is with host means that this website or application path where in application which by which particular page is affected which is available inside path so in issue detail you will get information about that issue what is os command injection attack then issue background the issue remediation and then the request basically a question response from burp sweet let me take a sql injection attack so we know that sql injection attack is basically a kind of attack where attacker injects various malicious sql queries into web application to gather sensitive information on data from the database so if i check the issue background there so you can see here sql injection vulnerabilities arise when user controllable data is incorporated into database sql queries in unsafe manner an attacker can supply crafted input to break out the data context in which their input appears it's very simple that we need to insert malicious queries and we need to gather data from database that is basically a sql injection attack so various attacks can be delivered by a sql injection including reading or modifying critical application delta interfering with application logic escalating privilege within the database and executing operation system commands so because because the sensitive data is stored inside a database so sql injection attack perform attack on database and basically gather sensitive information like customer data for example phone numbers fax numbers customer information customer address etc from the database so this is issue background remediation how you can remediate the most effective way to prevent a skill injection attack is to use parameterized queries or prepared statements so if you use prepared statements or parameters queries so you can prevent sql injection attack so this method uses two step to incorporate potentially retained data into sql queries right so there are other steps are also available here so you can see here you should be aware of some commonly employed or recommended mitigations to escalation memory so you will get a list of mitigation steps inside the report so what you get so you get information about the vulnerability then you get the page which is affected for example if i click on this so you will get information about that particular page you can see here that this is the url mdsec.net slash address book slash 32 slashdefault.aspx this is affected and uh you can see inside the question response here so you can see here we have used a single apostrophe and then response we got the error message right it means that's condition only it is available inside but in the in the particular website now let me check another one for example let's check the access attack which is also common so let's try cross-site scripting or access attack so each first of all issue background that what is xss so accesses one liberties arise when data is copied from a request and e code into the applications immediate response in unsafe way an attacker can use the only constructor request which if basically the simple thing is attacker uses malicious javascript queries and they supply the javascript quiz queries to execute within the user's browser so they can take over the user session if user is logged in inside a website so they generally use html or javascript code you can see here that it will cause javascript code supplied by the attacker to execute within the user's browser in context of the user session so it means that you you are getting information about cross-site scripting vulnerability here and you are also getting examples here and then you will get issue remediation that how you can protect your application from cross-site scripting vulnerabilities so that's great and also that which pages are affected with cross-site scripting mobility so these informations are available inside web suite report which is great because we are getting information properly so that anyone can read this website report easily so you can see here this report is generated by bob scanner obviously by professional version of burp suite and you're getting enough information in a report to identify threats or vulnerabilities inside your application so this is all about the analysis of report in the next video we are going to learn about bug bounty [Music] hello and welcome to the next video reviewing bug bounty now in this video what we are going to learn so we are going to learn about bug bounty process and i am going to explain you so let me open the setup now guys we are inside kali linux machine and in the previous video we have learned that how you can generate report and then how you can analyze it in this video we are learning about bug bounty process so first of all what is bug boundary process so bounty process means for example if you are a tester right so you can test a website or test an application and then you can report it to the company and then company can reward you so this process is called bug bounty now many researchers perform bug bounty so let me show you more about it so first of all you can see here that this is burp suite community edition where you can see here that it has listed various kind of vulnerabilities now for example if in a particular website for example oneweb.com is our testing website so in that website you found sql injection vulnerability now suppose it is a real website so you can simply report this website to the company acunetix and then kinetics can give you a reward for this so you need to find some new one liberties inside a website right and then a company can provide you various kind of rewards for this right so let me provide information about uh bounty so finding your first bug bounty hunting tips from the berkshire community so if company so if a company wants researchers to find one liberties in their website or application then you can try and search it if you are not authorized then do do not try to scan a website for any kind of vulnerability first you should be authorized so if company says that we have an application and and you can test it then you can test a website and provide them a report so we have various kind of platforms for this for example you can see here hackers 1 to 20 reports showed that the hacker community nearly doubled last year to more than 600 000 with so many people involved we wanted to get some tips and tricks from seasonal professionals so you can see here that what is the process here are some advice on how to find your first paid bug bounty according to our community so obviously you can use burp sheet to test a website and if you find a one liberty and if you are authorized then you can simply submit the bug to the company and company can provide you a reward right so first is understand the process so new bug bounty hunters should narrow their focus to allow them to get familiar with the specific volatility type and really get to grisp in it so for example i have showed you in the course the os top 10 so first you need to learn more about them our community advised newbies to start small go for simple bugs and really understand the end-to-end process before trying to hit those bigger targets so first you first try to find small bugs right and then learn how you can report it to the company and then you can go for the big liberties focus on specific type of vulnerability for example sql injection attack read write-ups on this vulnerability search for this one liberty on the program you are targeting so you can search it or you can take help of burp free tool to find it when you find a bug change the type of liberty and repeat step one so for example if i if you formed sql injection vulnerability in a website so you can change the type of vulnerability now you are checking access small liberty in the website and then you can repeat the step one which means focus on the specific type of vulnerability you can also say don't over complicate things go for something easy that you understand even if you first bounty small it will feel much better than a bigger one later on so you have to start from a small issue for example if you found a simple injection attack right so you should not over complicate things uh select a particular issue and try it in a website if you can't find it then change the issue type and then find it all right so do it one by one step by step find the uncharted territories number two looking out for and explore exploitable areas of the web is what people lot of people would say hacking is all about those dark and dusty corners according to some of the other followers are great place to make a start of finding for example look for the dusty old corners of applications that everybody has forgotten we ran into all the time when i walked on at google if you see the old google logo or time sermon font somewhere it's a good place to look right so you can also choose an old private program that pays small bounties there are various platforms i will provide you more information never stop learning the most popular piece of a community advice that to always keep learning so always keep learning about vulnerabilities always always always learn more about one liberties this advice will get you far in the world and it's certainly something we encourage ourselves that's why we build uh so they have built sick web security academy as far as we see the only way you can be sure of never achieving achieve something so you can check web security academy i will check it later all right so let's check your first bug boundary what's when it come to the first successful bounties for a community that was a definite focus on content discovery it looks like once again knowledge has been proven to demonstrate power right so recognizance or footprinting is the main step to gather content about the website so first you need to be mastering it so discovered the page by using google docs from my phone on the web form for work obviously that's how you can gather information from internet about a particular website you can see here my first bug on very first bug crowd bounty one was the stored accesses in a messaging back that i found using the buster leicester via intruder so doug buster is the tool available in kali linux uh to visit directories of a website storing ecoing out user controllable information first buck i ever ever found as analyst was a massive sql injection turned into my first miniature pen test and which led to me discovering my love for penetration testing so you should you need to focus on a single vulnerability focus on a small issue and then go for big later so if the final word is for some advice we say when learning ensure you get practical experience via labs like web security academy on lags exotic for example so let me check web security academy so let me open this one so obviously it is provided by port swigger so you can see here web security academy is free online web security training from the creators of burp suite so i mean you can learn about vulnerabilities quickly with the help of these labs right so labs are about sql injection cross-site scripting csrf etc so you can try this one web security academy or you can also check exor.net obviously it will contain few tasks so you can see here these are the missions and you need to complete the missions here right when you are ready to hunt for real pick a website with complex functionality and don't move on until you have learned how it works inside out so obviously bug bounty hunting whether undertaken as a hobby or a full-time professional can be foot in the door for all manner of cyber security careers right ethical lacking is fast becoming an integral component of security testing and according to hacker1 hacker powered security officially became a widespread term in around 2016. so you need if you want to learn about bug bounty hunting then you need to start from a small one from a small issue first of all perform uh practice on websites like these and then you can come in real world from his beginnings as a bug bounty hunter james experience with pen testing and moved his focus to becoming a security researcher when he's not scouring the web for forgotten hacking methods and vulnerabilities obviously you can use the the functionality of verb scanner right and obviously we know that with the help of buff scanner we can gather information about various kind of vulnerabilities in the website that's why pen testers or security researchers use burp suite tool to gather information from application or website so this is all about bug bounty hunting of a website or application with the help of brepstreet so this is the end of the course and i hope you have enjoyed it thank you so much for your time
Info
Channel: Science Course
Views: 15,788
Rating: undefined out of 5
Keywords: wifi hacking, wifi hacking course, wifi hacking full course, wifi hacking 2022, wifi hacking tool for android, wifi hacking in mobile phone, wifi hacking with kali linux in hindi, best wifi hacking tool for kali linux, best wifi adapter for wifi hacking, wifi hacking class, wifi hacking course in english, defcon wifi hacking, linux wifi hacking distro, wifi hacking explained, web hacking, web hacking for beginners, web hacking 101, web hacking with burp suite
Id: 3HW4cK-L4io
Channel Id: undefined
Length: 166min 51sec (10011 seconds)
Published: Tue Jun 14 2022
Related Videos
Note
Please note that this website is currently a work in progress! Lots of interesting data and statistics to come.