Holistic SEO Tutorial Step by Step and Case Study: From 10,000 to 500,000 Organic Clicks a Month

Captions
Hello, my name is Koray Tuğberk GÜBÜR and I am the owner and founder of Holistic SEO & Digital. In this video I will be demonstrating a holistic SEO case study and tutorial step by step, so we will see many practical things. I won't be diving that much into the patents or the theoretical sections, but you will see why the average pixel, every letter, every millisecond, and every piece of user feedback, user data, or user behavior actually matters for SEO. If you want to be a holistic SEO, or if you want to actually implement holistic SEO, you should be a little paranoid about it. It is like SEO for SEO's sake, before the money, and of course eventually it makes money too.

The real-world SEO case study that I will be using to explain the holistic SEO mindset step by step is from, let's say, 15 months ago, because for the last 15 or 16 months I have been managing this project, and I can tell that it goes really, really well. Maybe even more than 15 months, maybe 18 or 19; I don't remember the exact date, but I believe I have been managing the project for a really long time. When I first started this project it was taking only around 200 clicks a day, and right now it is taking more than 15,000 clicks a day on average, because we also always leverage trending SEO events and trending user behaviors with trend-worthy entities and news-authority topics. You can see the actual updated results here: it is coming up to half a million according to SEMrush, but the actual number, as I said, is higher than this.

Another point is that this SEO case study will be used especially for explaining how to expand a topical map, because when I started, the website was really new, actually brand new. When it comes to expanding the business: after we dominated the energy industry and electricity prices, or let's say electricity bills and electricity supplier changes, once we dominated that niche, the company decided to go for credit and insurance. You all know that these industries are highly competitive, and to be able to become another main competitor in those areas there are some important steps that I am implementing. I will explain some of them in this video, but the most important parts will be explained in the "how to expand a topical map over an existing website" section. The key point there is protecting the brand identity: it is not just about opening a new subfolder, you have to connect the new topic to the old topic via your brand. SEOs usually miss the source attribute, because on the web we have billions of websites and every day around 252,000 new websites are opened, so low-quality content is already flooding the web. That's why, even if you have really high-quality content, and I can't say everyone has that, the main attribute of a piece of content comes from its source. If the source is authoritative, if the source is quality, the words from that source will be more meaningful. So to expand the topical map, first you should connect your source attribute to the topic, then the new topic should be connected to all the other topics.

With that said, these are the overall results. You can see the query rankings in this area; this word actually means "insurance" in Turkish, and these are the initial results for the insurance industry, and these are the initial credit industry results. Most of these topical maps are not completed yet; I can tell only 20 percent of the topical maps are actually completed, so topical coverage is really low, and it will get much higher. Backlinks were not used for this project, by the way; at the beginning, for the electricity niche, the main niche, we used digital PR, but for these two it is just passive for now.
You can see the Authority Score and the backlinks in this area for this main page. If I go to Ahrefs in this section, you will see that the website has even lost some broad core algorithm updates, and we will come to that too while I explain the past of this project, because there were site migrations here, anything you can imagine happened, and we lost all of the improvements, etc. But basically, as I always tell you, if we create a proper semantic content network, the website will eventually continue to thrive; even if you lose a couple of broad core algorithm updates, it will come back. In fact, on my YouTube channel I have published a couple of videos about this, so that SEO consultants can explain these kinds of situations to their own clients and gain some further time. I don't remember which one exactly it was; I guess it was this one. The website in this video is from the topical authority SEO case study, and I published that case around this time. When you were shocked by the rankings without backlinks, it was in this area, from 0 to 15,000 clicks a day; then I nearly doubled it, then I lost it, then we even tripled it. My point here is: do not be scared. As long as you keep improving your website, it will eventually show its results. If you create a proper topical map and a proper semantic content network, the search engine will eventually pick up these connections from you and start ranking you further and further. So I believe this is a good example, and you can watch this video too; I guess it's just six months old, though I feel like I published it much earlier. Anyway, managing an agency is really intense, every minute, every day, every month. You can check this, and I believe it will be helpful for you.

If I come back to this area, we have regained all of the lost traffic, as I expected. If I come to this section, these are the initial insurance results, and these are the initial results for the credit industry right now. As you see, there are not that many referring domains here; I didn't create these, by the way, you can check if you want. And this is the traffic value according to Ahrefs. Since I have explained these areas: I said before that all these videos I am publishing are actually preparing you for my semantic SEO course. When I publish the course, I won't be giving the website name from my socials; I might not even announce it. I like this kind of marketing strategy. I might even accept only the first 100 people; I am serious about these kinds of things. To be honest, if you want to see it directly, just subscribe to this newsletter; I will be putting the link in the description or the comment area.

With that said, this case study was published with Authoritas before. This is the website name; you can check it, everything is open, everything is nakedly published here. The main thing is, I have updated the case study three times, maybe even four. Laurence, the owner of Authoritas, told me that he will update the article later, because he is already shocked: whenever I update the case study, the traffic has doubled again. Since I also collect all the case studies I published before on my own website, I can tell that we will be able to follow these things from this area. Probably I will put the updated results at the top so it is more relevant to the reader, because, as you know, most people won't go through all of these sections, but we will be following these things from the updated version.
Before going further, I would suggest you check these articles too. I said that I use trend-worthy events a lot: in internal links, in new page creations, and in topical map creations as well. If you want to understand why trending events are important, watch this video, and the video just here; it is the second version of the same case study, and it explains the search engine trust and historical data concepts. The second jump in this area is nearly three times bigger than the first, and if you watch these two, you can understand the importance of trending events and the contribution of historical data to search engine trust. This is the latest video that I published; again, as you see, we lost traffic. I started the project here, I won many core algorithm updates, I lost some, and then I used the topical consolidation and topical gap concepts to fix the website again, and here it is. As I say, most of these case study publications represent a switch for me: most probably I won't be updating these case studies again. I will publish some books, then I will move on to the next step, which is owning equity and creating my own projects. Yes, I will continue to manage my agency, but I want to change my focus; after the course as well, I will be focusing on some other areas as much as possible. That's why I wanted to publish these things as a memoir for me and a help for you. This one here is also important, because in it I explain how to exceed the quality thresholds to take traffic from even the highest-ranking websites, like WebMD, Healthline, or Mayo Clinic, and on that website I used only 27 articles. This is three weeks later, and the website right now is nearly the same, so this is the updated graphic. I said in that video that the website would lose traffic, but it didn't, because it won the core algorithm update in between. But since the website is passive, because it didn't get the investment, it will be losing traffic, and then probably I will reveal the website name and show the articles. I even have recordings about my revision methodologies; maybe I will put them into the course, I don't know, but I will definitely put the complete topical map design into the course. Probably after the initial bite-size modules there will be advanced modules, and these will probably be in there too.

With all these steps, I believe I have made a beginning here in ten minutes, sorry. Let's go to this area and start from the beginning. So this website here is Encazip.com. My first step for this project was making it an entity. I don't rank websites, I rank brands, and if a website doesn't represent a brand, I create a brand. Being an SEO consultant is not just about rankings; you should actually behave like an SEO vice president. In other words, you have to redesign the company, you need to give it a structure, you need to coach the people, train the people, convince the people. If you are able to do these things, believe me, you are a good SEO consultant and you won't fail. Sometimes I tell people that clients are like little babies and you need to be their babysitter, a digital babysitter; sometimes I call myself that. But you have to do it, and if you have enough information it won't be that hard, though you might feel tired, since you will need to talk a lot. Once I started the project, these are some of the results, and this is my motto; I have many SEO mottos, like "think like an SEO, code like a developer" and "change your culture".
Another one is "every pixel, millisecond, byte, letter, and user matters for SEO", because at the end of the day a website consists of these components: it has pixels, it has milliseconds while you are using it, it has bytes, and it has letters, so we have to optimize all of these. The "user" part represents historical data, or user behaviors together with historical data. If I come down to this bottom area, you can see some other initial results, and you can realize how small the traffic was at the beginning, when I published the case study. These are some of the increase graphs again, and in this area I explain the background of the case. Here I have a testimonial from the owner of the company, a person I highly respect; I can even say he is a kind of mentor to me right now. This is one of the reasons I like SEO consultancy: I get to know many really good, talented people, and there are many other names in the project here that you can check later too. These are some of the results.

So let's start with the steps; I will be giving practical suggestions directly, and I won't be reading that many patents in this video. First of all, you have to understand this concept. It comes from the Google developers: they use the RAIL abbreviation to explain how the browser works. When you interact with any kind of web page component, the first thing is giving a Response, then Animation, which means changing the pixels, then there is an Idle time, then Loading. They call it RAIL, and this RAIL concept explains all of the optimization steps. According to Addy Osmani, we only have around 13 milliseconds to move a pixel: from the moment you click or do something, until the website gives a response, animates things, or there is a load event or idle time, the biggest time span the browser can find will be around 13 milliseconds. Here it actually says 16, but I can tell that is a little optimistic; let's say you should accept 13 milliseconds.

If we had more time, I would show you the Performance tab of Chrome; most SEOs don't know how to use it properly, but there are FPS checks, FPS audits, and layout-threshold types of things. If you want, you can read about this; let me just open it. I guess I can find it, but maybe not this time: "layout threshold". It doesn't have to be from me; I can't find it in this area, I believe some sections explain it, but maybe not. Basically, layout threshold means that you paint an element or a web component on the page and then you repaint it: the previous paint has been wasted, which means you wasted bytes, you wasted milliseconds, and also internet bandwidth.

Anyway, let's continue. I believe you already know these metrics too: as load time goes from one second to three seconds, the probability of a bounce increases by 32 percent, and according to Amazon, one extra second of page load costs them nearly one billion dollars a year. So in this case we try to create a kind of performance threshold. A performance threshold, or performance budget, is something you can also build with Lighthouse; I hope I can find the Lighthouse performance budget article this time so you can see it here too, on web.dev. If you read that article, you will understand what kinds of performance budgets you can create.
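The 16 ms versus 13 ms figures mentioned above come from simple frame arithmetic. This is a minimal sketch of that math, assuming a fixed per-frame browser overhead (the exact overhead varies by device and page; the 4 ms value below is an illustrative assumption, not a measured constant):

```typescript
// At 60 fps the browser has 1000/60 ≈ 16.7 ms per frame, but it needs a few
// milliseconds of that for its own work (style, layout, paint, compositing).
// Whatever is left is the budget for your JavaScript.
function jsFrameBudgetMs(fps: number, browserOverheadMs: number): number {
  const frameMs = 1000 / fps;
  return Math.max(0, frameMs - browserOverheadMs);
}

// With ~4 ms of assumed browser overhead, roughly 12.7 ms is left per frame,
// which is where "you really have about 13 ms, not 16" comes from.
const budgetAt60fps = jsFrameBudgetMs(60, 4);
```

If your script work regularly exceeds this budget, frames are dropped and the animation or scroll visibly stutters.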
Creating a performance budget means you set resource-size and resource-count limits for yourself, including limits for third-party resources. It helps you demonstrate what you should be focusing on, and it helps you communicate with the developers as well. According to Google, your server response time should be under 200 milliseconds. When it comes to HTML web page size, it should be under 1.2 megabytes, and even 1.2 megabytes is high; I would suggest something around 700 kilobytes. One more thing we need to focus on is the DOM size: the HTML DOM should again stay below, let's say, 1,200 DOM elements, but I would suggest around 900 at most. You don't need that many DOM elements; especially if you have a blog, you can create a layout with 400, 420, or 450 DOM elements. If you are able to create a simple, light website, you will have much higher rankings.

This is something I will be publishing about in the future: the cost of retrieval. Imagine that you give this much information on your web page, but this is the cost of retrieval, or let's say the cost of rendering, the cost of fetching, the cost of just downloading your web page. In this case one pixel or one byte is not one anymore, because if I request that web page 50 times a year, that byte is 50 bytes now, and Google might make maybe 10,000 requests a day just for that one image, so you need to multiply by these kinds of numbers. That's why I am telling you to be a little paranoid about SEO; basically, we are playing with these bytes.
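The budget idea above can be written down as a Lighthouse budget file. This is a sketch of the `budget.json` format that Lighthouse's LightWallet feature accepts; the specific numbers here are illustrative mappings of the thresholds discussed above (125 KB document, tight third-party limits), not official recommendations, so adapt them to your own site:

```json
[
  {
    "path": "/*",
    "timings": [
      { "metric": "interactive", "budget": 3000 }
    ],
    "resourceSizes": [
      { "resourceType": "document", "budget": 125 },
      { "resourceType": "script", "budget": 150 },
      { "resourceType": "stylesheet", "budget": 50 },
      { "resourceType": "third-party", "budget": 100 },
      { "resourceType": "total", "budget": 700 }
    ],
    "resourceCounts": [
      { "resourceType": "third-party", "budget": 10 },
      { "resourceType": "total", "budget": 50 }
    ]
  }
]
```

You can then run something like `lighthouse https://example.com --budget-path=budget.json`, and the report will flag every budget the page exceeds, which is an easy artifact to hand to developers.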
To give the developers a conscience about this, you have to use these kinds of methods and visualizations; they will help you, and I have used exactly these methodologies to explain things to them. If you are a new-generation SEO, please go back through the historical articles and check the old version of Google Search Console: it gave much more information. Inside the old Google Search Console there was a News tab, and in it there was an error called "HTML is too long". It doesn't exist anymore inside Google Search Console, but I suggest you look it up, because if your HTML page is over a certain size, Google won't be able to use all of that page or all of that article. They still have something like that: there is a Googlebot size limit for fetching, requesting, or downloading specific resources, but that one is a bit more obscure, let's say. I would suggest you check this "HTML is too large" warning; it still exists in Bing Webmaster Tools, by the way. Basically, a smaller HTML page size is always better, and to achieve that, please be a little paranoid: open your source code, find anything that is not necessary anymore, and clean it all out; get rid of all the unnecessary bytes. If I asked you what the size of a single letter is, most people wouldn't know, but it's just one byte, which means eight bits, and this one here is another eight bits. I know it might sound a little funny, but after a point these eights and the other eights start to become megabytes, even gigabytes, and they drag your website down. This warning, again, is from Bing Webmaster Tools, and I believe I have taken the wording from Bing's GitHub project:
"The size of the HTML is estimated to be over 125 KB and risks not being fully cached. Search engines may not fully acquire the content on a page if the page contains a lot of code; extensive code can push content down in the page source, making it harder for a search engine crawler to get to it. A soft limit of 125 KB is used as guidance to ensure all content and links are available in the page source to be cached by the crawler." So yes, at the beginning the project was using .NET and then .NET Core, and HTML minification for Encazip.com was done in the first month of the project. Further down the line we had to pause this work, as we had some server incapacities during the migration from .NET to .NET Core; even simple things can take time if you don't have a proper back-end structure, and this was one of those situations, but we handled it.

Next, CSS and JavaScript refactoring; I suggest you look into these concepts. Maybe I have written about CSS refactoring, maybe not; I would also suggest Smashing Magazine, it's a really good source. Basically, you express the same CSS rules, the same classes and IDs, with fewer lines: you say the same thing with less code. When it comes to "JavaScript refactoring", that is not really what it is called; it is called JavaScript tree shaking. Basically, you shake the tree to drop the dead code, and if you are able to drop all the dead code, the web page will render faster. And on that note, let me check "how does a browser create a web page". Strictly speaking the browser doesn't "create" it, but that was the title I came up with; this is one of the articles I wrote long ago, around two years back. Just give it a read: you will learn many things you don't know about browsers, for example how they actually fetch and process things.
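The "clean out unnecessary bytes" advice above can be made concrete with a toy example. This is a deliberately naive minification sketch, not a replacement for a real minifier (real tools must handle `<pre>`, inline scripts, attributes, and more); it only shows how comments and inter-tag whitespace turn into measurable wasted bytes:

```typescript
// A toy HTML "minifier": strips comments and collapses whitespace between
// tags. Real minifiers are far more careful; this only illustrates the idea.
function naiveMinify(html: string): string {
  return html
    .replace(/<!--[\s\S]*?-->/g, "") // drop HTML comments
    .replace(/>\s+</g, "><")         // collapse whitespace between tags
    .trim();
}

// How many bytes the naive pass removes from a given document.
function bytesSaved(html: string): number {
  return Buffer.byteLength(html) - Buffer.byteLength(naiveMinify(html));
}

const page = `
<!-- header start -->
<div>
    <p>  Hello  </p>
</div>
`;
// naiveMinify(page) === "<div><p>  Hello  </p></div>"
```

Multiply the saved bytes by every crawl hit and every user request, as in the cost-of-retrieval discussion above, and small savings stop being small.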
As you see, first there are bytes: the browser reads the bytes, then turns them into characters; the characters become tokens, the tokens become nodes, and the nodes become the DOM. This section also explains what a rendering tree is: if you have a proper HTML tree, the DOM tree, and you also have a proper CSSOM, the CSS Object Model, the render tree can be created faster and better, and the structure here explains it. You can also understand the difference between the virtual DOM and the DOM, because I even used Shadow DOM inside this project. Again, this might be a slightly advanced technical topic, but I had to explain these things to be able to transfer them to the development team. I would also, by the way, suggest you read this article, "Advanced concepts for measuring page speed in 2020". I will tell you one thing: I was following the Google developers on GitHub to understand what they focus on, so that I could optimize my projects before they launched something. Those "advanced concepts" are the Core Web Vitals; they didn't call them Core Web Vitals at the time, they were talking about Largest Contentful Paint and so on, but I treated them as advanced concepts. So even 15 months, or maybe a year, before Core Web Vitals were announced, we were already optimizing for them. This is about the holistic SEO mindset again: you don't have to wait for a search engine. If you only learn from the mainstream SEO media, you are not really learning; you have to learn before others, so that you can be the pioneer in the industry, and this is one example of that. If you check these areas, you will see these concepts and how they can be used from the relevance point of view as well. I suggest you check these Performance tab screenshots, they will help you; there are even some examples of JavaScript tree shaking in this area.
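The bytes → characters → tokens → nodes → DOM pipeline described above can be sketched in miniature. This is a toy parser, assuming well-formed `<tag>text</tag>` markup only (real HTML parsing is vastly more forgiving and complicated); it just makes the tokenize-then-build-a-tree idea tangible:

```typescript
// Toy tokens and tree nodes for the parsing pipeline sketch.
type Token = { kind: "open" | "close" | "text"; value: string };
type TreeNode = { tag: string; children: (TreeNode | string)[] };

// Characters -> tokens. (The bytes -> characters step would be a decode,
// e.g. Buffer.from(bytes).toString("utf8").)
function tokenize(html: string): Token[] {
  const tokens: Token[] = [];
  const re = /<\/?([a-z0-9]+)>|([^<]+)/gi;
  let m: RegExpExecArray | null;
  while ((m = re.exec(html)) !== null) {
    if (m[1]) {
      tokens.push({ kind: m[0][1] === "/" ? "close" : "open", value: m[1] });
    } else if (m[2].trim()) {
      tokens.push({ kind: "text", value: m[2] });
    }
  }
  return tokens;
}

// Tokens -> nodes -> tree, using a stack of currently open elements.
function buildTree(tokens: Token[]): TreeNode {
  const root: TreeNode = { tag: "#root", children: [] };
  const stack: TreeNode[] = [root];
  for (const t of tokens) {
    const top = stack[stack.length - 1];
    if (t.kind === "open") {
      const node: TreeNode = { tag: t.value, children: [] };
      top.children.push(node);
      stack.push(node);
    } else if (t.kind === "close") {
      stack.pop();
    } else {
      top.children.push(t.value);
    }
  }
  return root;
}
```

Every extra DOM element is another node this pipeline must tokenize, build, style, and lay out, which is why the DOM-size budgets above matter.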
I created these images with my own hands to explain these concepts. This one explains the Largest Contentful Paint, and it explains why this element is the LCP and not that one: there are multiple elements inside the same div, and even if the div is bigger, because it contains multiple elements, a single element will be the LCP. There are other rules here that you should remember. This is also one of the newer websites that I have managed, and here, as you see, is an example of Cumulative Layout Shift caused by the injection of ads and so on.

Anyway, if I come back to this area, here is an example of CSS specificity. If I need to explain it: we try to use shorter CSS selectors. One of the biggest costs of CSS comes from long selectors; always try to use shorter selectors, it is better. Also try to avoid really big numbers of CSS IDs and classes; that can be achieved by creating a proper DOM tree. This is called the CSS specificity calculation: based on how many IDs and how many classes you are using, the specificity changes, and of course, if you want to override some CSS rules, you need to get closer to that specific DOM element in the selector. Basically, I can tell you that the total size of these web page assets was more than half a megabyte, and then we decreased those bytes to something like 14 kilobytes, or 7 kilobytes. From this number down to this one is a really great improvement, and this is the year-over-year increase, which is exactly a doubling of the traffic; you can see the impression increase and other areas in these sections too.
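The specificity calculation mentioned above can be sketched as a small counter. This is a simplified model covering only `#id`, `.class`, and element selectors (real specificity also handles attribute selectors, pseudo-classes and pseudo-elements, `:not()`, `:is()`, and more), but it is enough to compare the long and short selectors discussed here:

```typescript
// Simplified CSS specificity as an (ids, classes, types) triple.
// Higher-priority positions win: [1,0,0] beats [0,9,9].
function specificity(selector: string): [number, number, number] {
  const ids = (selector.match(/#[\w-]+/g) ?? []).length;
  const classes = (selector.match(/\.[\w-]+/g) ?? []).length;
  // Element names: a letter sequence at the start or after a space/combinator.
  const types = (selector.match(/(^|[\s>+~])[a-z][\w-]*/gi) ?? []).length;
  return [ids, classes, types];
}

// "#nav .menu li a" -> one id, one class, two type selectors: [1, 1, 2]
```

A long selector like the example both costs more bytes and makes overriding rules harder, which is exactly why shorter selectors and a cleaner DOM tree shrank the CSS here so dramatically.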
When it comes to caching, you should understand that the search engine crawlers are also caching your documents, but they use aggressive caching. Aggressive caching means that even if you tell them not to cache something, they will cache it; they don't care about your preferences while caching, but you can help them. For example, please do not use that many numeric values inside your document names; just give them simple things to cache. Just today I was scrolling through a website that had more than 6,000 versioned JavaScript resources. You can't go to a search engine and say, "I have just 30 pages for you to use, but to use these 30 pages you also have to crawl these 6,000 versioned JS files"; it doesn't work. So use a single file and update it; don't be lazy. There are other things here, like CSS chunking. I won't explain these details that much, you can check them later, because I know many people skip around in these videos, but basically: please use CSS chunking. Most of you use WordPress, where it's really easy to do; maybe one day I will create a WordPress course or tutorial. If you just go inside functions.php, you can say "load these resources on this page type, load those resources on that page type"; it's not even hard, and you can also use plugins. You don't have to load everything for every page type. These are the metric changes for the Core Web Vitals, and in this area we are looking at the size changes. I can tell you that once I split the homepage CSS and the sub-page CSS, Core Web Vitals maybe didn't exist yet at that time, but when we look at the change in performance and also the crawl stats, it was really, really good. One advantage of technical SEO is that you can see the effect faster.
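The "use a single file and update it" advice above is usually implemented with content-hashed file names, so there is exactly one URL per content version instead of thousands of `?v=123` variants. This is a sketch of that naming scheme (the file names are placeholders, and it assumes the filename has an extension); most bundlers do this for you, but the logic is simple:

```typescript
import { createHash } from "node:crypto";

// Cache-friendly asset naming: the name changes only when the content
// changes, so browsers and crawlers can cache the file as long as they like.
function hashedName(filename: string, content: string): string {
  const hash = createHash("sha256").update(content).digest("hex").slice(0, 8);
  const dot = filename.lastIndexOf("."); // assumes an extension is present
  return `${filename.slice(0, dot)}.${hash}${filename.slice(dot)}`;
}

// hashedName("app.js", source) -> "app.<8 hex chars>.js"; identical content
// always yields the identical name, changed content yields a new one.
```

With this scheme the crawler sees one stable JS URL per release instead of an ever-growing pile of versioned variants, which is exactly the crawl-budget point made above.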
When it comes to content optimization, the effect of changes, especially site-wide changes, might take a really long time to appear, but technical SEO changes can show their effect directly: in indexation, in crawl stats, and especially in impression increases, so you can use this to your advantage as well. Now, Brotli: we all know it was created by Google, as an alternative to gzip, for compressing the web, so that they can crawl better and make money. It uses the LZ77 algorithm for compression, and I can tell you it compresses around 36 percent better than the other formats; it is still state of the art for compression. Most people don't even know what these compression types are, but it is important to understand that the biggest cost in creating a web page is waiting. What do we wait for? The response from the web server. Once you get the response, the file has to be transferred; before transferring it, your web server compresses it, the file is transferred, and then your browser decompresses it. That's why, when you look at the Google Chrome Network tab, you see two sizes: the transferred size and the resource size. The resource size is the uncompressed size; you should mainly care about the uncompressed size, but when you are checking web server compression, the transferred size is the more important one. In this area I explain why server-side compression is important, and yes, the website was using gzip and we changed it. It wasn't easy, because I needed to prepare lots of resources to help the developers, since we were using a kind of Microsoft system and Microsoft technology there.
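The transferred-size versus resource-size distinction above is easy to see with Node's built-in zlib module, which ships both gzip and Brotli. This is a minimal sketch comparing the two on a repetitive markup sample (exact byte counts will vary by input and compression level):

```typescript
import { gzipSync, brotliCompressSync } from "node:zlib";

// Compare raw (resource) size against gzip and Brotli (transferred) sizes.
function compressionReport(html: string) {
  const raw = Buffer.byteLength(html);
  const gzip = gzipSync(Buffer.from(html)).length;
  const brotli = brotliCompressSync(Buffer.from(html)).length;
  return { raw, gzip, brotli };
}

// Repetitive HTML, like long listing pages, compresses dramatically; this is
// why the Network tab's "transferred" column is so much smaller than "size".
const sample = '<li class="item">entry</li>'.repeat(500);
```

Running `compressionReport(sample)` shows both compressors shrinking the ~13 KB sample to a tiny fraction of its raw size, with Brotli typically a bit smaller than gzip.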
One more thing to think about is HTTP/1.1 and why it matters. I usually use Paint for this kind of explanation: let's say this is your web server, this is the web page, and this is the user. When the user, or let's say the client, makes a request to your web server to render this page, and you have this amount of resources, then each time they go to your web server and come back they are performing a round trip; it is called RTT, round trip time, and per round trip they can only take a certain portion of the page resources. When you use HTTP/2, they can take bigger resource sizes. I can go even deeper: I remember that the first 14,600 bytes, which means about 14.6 kilobytes of the HTML document, is the first thing that can be taken during a TCP connection. That's why I was putting the most important SEO text at the top, so that the search engine can see it directly. I can tell you that every crawl hit has a different purpose; I am not just talking about Discovery or Refresh as shown in Google Search Console. If you check the first Google patent, or the original Google design paper, you will find "fancy hits" and "plain hits"; if you look up what fancy and plain hits are, you will understand their point of view. Of course, some of these things come from experience, unique experience, and I always say that experience is the most important type of knowledge. It is one of the hard things in SEO, because getting that kind of experience is not easy, especially in 2022, so I'm glad I started in this industry seven years ago, because I have seen many things the new generations won't be able to see. You can check these images as well; I believe they will help you. By the way, I even optimized the response headers, because some of them were not necessary, so I calculated even these sizes; compare yourself against competitors for every vertical. And in this area we explain which resources have been fetched.
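The 14.6 KB figure above falls out of TCP slow start: a common initial congestion window is 10 segments of about 1,460 bytes each, and the window roughly doubles each round trip. This back-of-the-envelope model (an idealized sketch that ignores loss, delayed ACKs, and TLS handshakes) estimates how many round trips a resource of a given size needs:

```typescript
// Idealized TCP slow start: the congestion window starts at
// initialSegments * mss bytes (~14.6 KB by default) and doubles per RTT.
function roundTripsFor(bytes: number, initialSegments = 10, mss = 1460): number {
  let delivered = 0;
  let windowBytes = initialSegments * mss;
  let trips = 0;
  while (delivered < bytes) {
    delivered += windowBytes;
    windowBytes *= 2; // exponential growth during slow start
    trips += 1;
  }
  return trips;
}

// A 14 KB HTML document fits in the very first flight; a 100 KB one needs
// about 4 round trips under this model.
```

This is why putting the most important SEO text inside the first ~14 KB of the HTML source matters: it arrives a full round trip or more before everything else.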
I have also written about the TCP slow start concept — maybe I can find it: "TCP slow start holistic SEO", let me just check... yes, here it is. If you read this article, it will help you a lot in understanding how web servers actually work, and — I won't spoil it — you can find a really critical difference between Firefox and Google Chrome related to this concept. I suggest you understand this algorithm, this number as well, the segmenting, the congestion window, etc.; these things will help you deep-dive further into technical SEO. So, server push — what is it? No one talks about it, but I have used it. What did I say just before? The biggest obstacle with a web server is waiting — it's called stall time, or stalling time. If you don't want to be stalled, just push the resource: if there are really critical resources, push them on every request, even if they are not needed at that exact moment. The purpose is delivering important resources in a much faster way. Resource loading order and prioritization: basically, you can optimize three things for web page speed — resource count, resource size, and resource loading order or prioritization — and I have used all of them. I won't explain the differences between preload, dns-prefetch, and preconnect here, but I can tell you that the previous version of preload was actually "subresource"; subresource was a Chrome behavior hint, and they replaced it with preload for certain reasons. And you shouldn't overuse preload — if you preload everything, you are not preloading anything, so only use it for the most important things. I suggest you pay attention to this section and not forget these points. I won't detail everything, as I said.
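Both server push and preload are commonly expressed through the `Link` response header; here is a small sketch that builds one (the file paths are illustrative, not from the case-study site):

```python
def link_header(resources):
    """Build a Link response header advertising critical resources for preload.

    `resources` is a list of (url, as_type) tuples, e.g. a critical stylesheet
    and the main web font -- the two assets most often worth preloading.
    """
    parts = [f"<{url}>; rel=preload; as={as_type}" for url, as_type in resources]
    return ", ".join(parts)

header = link_header([("/css/critical.css", "style"),
                      ("/fonts/main.woff2", "font")])
print("Link:", header)
```

Keeping this list to two or three entries is the practical version of "if you preload everything, you are not preloading anything."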
One more thing: maybe you would like to check these articles too — say, loading CSS async. I believe I have written about it before; I suggest you check that article, because you can actually download CSS files asynchronously too, and I think it's important. Another thing here is web font file optimization — maybe I can find it... yes, here. This is a really long article, even for me, for a technical topic, so I suggest you check this one too; I will be putting it in the description area. There are, I guess, more than 20 ways of optimizing font files in it, with all the details and practical examples — even cleaning glyphs inside the font files. As I said, be paranoid: I did it for this project as well. I cleaned the teeny-tiny corners of the glyphs so that I could save a few more bytes. Okay, cross-browser compatibility — again, no one does these things lately, but just check all of your CSS and JavaScript files for cross-browser compatibility. If they are not compatible with all the major browsers, it means your website is not servable for all users, and in this case some of the things we were using were not servable for everyone, so we used other approaches in those areas. About preload, here I explain many things — what it does, how it does it, and how it caches. I would also suggest you read some other articles, again from me — let me just check — because here I explain something called the HTTP cache hierarchy. If you understand the HTTP cache hierarchy, it will be really useful for you. This is not the article I mentioned... maybe this one — yes, in this article you will learn that hierarchy in a really good way, and that hierarchy section will help you understand these areas: first the browser tab's image cache, then the preload cache, the service worker, the HTTP cache, the HTTP/2 push cache, the edge cache, and the origin server.
If you understand these cache kinds and the hierarchy, you can leverage your server response times in an easy way, and here I explain why I chose certain types of resources for caching. If you preload everything, by the way, you have to be careful about the CPU bottleneck of the browser — I ran these types of tests as well, across multiple browsers and multiple tools. This is an example of what preconnect does: DNS, TCP, and TLS. Basically, DNS resolution checks where your website or resource is located; TCP means we connect to the server and start the communication; and TLS means we are encrypting — let's say we are using your HTTPS certificate to request the resource from the HTTPS server. There are some other sections here you can check later as well. I can even talk about polyfill.io, by the way, which is one that I have used. These are the preconnects that I used — let me explain the difference between preconnect and dns-prefetch, prefetch, etc. The main difference is that preconnect also performs the TLS handshake, which means it is faster — it does some other things too — but I suggest you use it only for the most relevant or necessary origins. Loading CSS files: I think I already mentioned how and why we do it, and these are some representative images explaining why it is useful; I suggest you check these too. And there is an example here, by the way: image placeholders. I also used image placeholders in this area to improve the first meaningful paint — it is not used as a metric anymore, but let's say the first paint of the main element of the web page — at an earlier point. I can also tell you that I have some scripts I had written before: basically, they disable all of the JavaScript files on the website, one by one.
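The preconnect versus dns-prefetch distinction above can be sketched as a tiny hint generator: preconnect (DNS + TCP + TLS done up front) only for the few critical third-party origins, the cheaper dns-prefetch for everything else. The origins below are illustrative, not the case-study site's actual list:

```python
def resource_hints(origins, critical):
    """Emit <link> resource hints: preconnect for critical origins
    (full DNS + TCP + TLS handshake up front), dns-prefetch for the rest
    (name resolution only)."""
    tags = []
    for origin in origins:
        if origin in critical:
            tags.append(f'<link rel="preconnect" href="{origin}" crossorigin>')
        else:
            tags.append(f'<link rel="dns-prefetch" href="{origin}">')
    return "\n".join(tags)

hints = resource_hints(
    ["https://fonts.gstatic.com", "https://cdn.polyfill.io", "https://stats.example.net"],
    critical={"https://fonts.gstatic.com", "https://cdn.polyfill.io"},
)
print(hints)
```

Limiting preconnect to two or three origins mirrors the advice for preload: a full handshake per hinted origin is not free.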
Then I take a screenshot to see, for each JS file, how the content changes when it is blocked. As you see here, when I block swiper.min.js in this area, I don't see this cookie-consent prompt. So, according to that, I removed some files and deferred others. And this polyfill.min.js too — I didn't remove it, even though it increases the request count, because a polyfill is a little important: if a user is on a legacy browser, an old browser, this file keeps the website usable. Right now it is empty, because I am not on a legacy browser; on a legacy browser, the rest of the code is served. So I even cleaned these unused bytes based on dynamic serving. Okay — aggressive image optimization with srcset and the AVIF format. When I wrote this, no one knew what AVIF was; by the way, there are other really good image formats, and from time to time I do this type of research to understand which one is best. AVIF is still getting more popular, and it's better than WebP — please use AVIF. Here there are different types of pixelization algorithms, and there is something we use called image capping. Image capping has been used by Medium, actively, by the way, to increase page speed, and it's a really good thing. Basically, image capping means that in a high-quality image there are certain pixels you can't actually perceive even though they are there, and with certain tools you can remove those pixels without losing any visible quality. So imagine there are these types of pixels in an image — some sections are just there, but you don't notice them — and we simply remove them.
This, by the way, is the founder or inventor of AVIF, Justin Sheamus, and this is from 2020 — I was asking why these browsers don't support AVIF despite it being the best, and he said: prepare for the future. He is right — prepare for the future, and right now there are even better standards, so learn them. I am skipping many sections here because I need to be a bit quicker. Here I talk about exchangeable image file (EXIF) data — and yes, I have used EXIF data. Don't just listen to Google: they say they don't use it, and then they start using it to understand who the real owner of an image is and whether the image is licensed or not — we will have some screenshots for that too. I suggest you use IPTC metadata and EXIF data to show the owner of the resource, and of the main resource, as well. And yes, this is also true: I didn't use different images for desktop and mobile, to keep the HTML shorter and cleaner — I just used a 600-pixel image. Of course that's not good for Google Discover, but I didn't have much interest there in this project, so I used a single simple image for both devices. There are some other things here too: since some browsers don't support AVIF, I used the picture element with source and srcset — meaning if the browser can use AVIF, it uses AVIF; if it can't, WebP; and if it can't use that, JPEG. I tried to optimize for everything in this area, and as you see, I am using semantic HTML here: a figure, and inside the figure an image. There is something else — this area says "no image" because it's the placeholder. Okay, do I really need to talk about alt text or image URLs here? I didn't cover them here because they are the one-oh-ones, but yes, I optimized them too. Maybe I could even talk about alt text for a long time, because image search is a really magical thing — but maybe another day.
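The AVIF → WebP → JPEG fallback chain described above can be sketched as a small markup builder — the browser takes the first `<source>` type it supports, and older browsers fall back to the plain `<img>`. File names and the 600-pixel width are illustrative, not the site's real assets:

```python
def picture_element(stem, alt):
    """Build a <figure><picture> block with a modern-format fallback chain.

    Order matters: the browser stops at the first <source> whose MIME type
    it supports, so AVIF must come before WebP, with JPEG as the final <img>.
    """
    return (
        "<figure><picture>"
        f'<source srcset="{stem}.avif" type="image/avif">'
        f'<source srcset="{stem}.webp" type="image/webp">'
        f'<img src="{stem}.jpg" alt="{alt}" width="600" height="400" loading="lazy">'
        "</picture></figure>"
    )

markup = picture_element("/images/electricity-prices", "Electricity price chart")
print(markup)
```

Note the explicit `width`/`height` on the `<img>`: reserving the box is also what prevents the layout shifts discussed later.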
Intersection Observer for image lazy loading. The Intersection Observer API is something you use, directly or indirectly, for lazy loading, and I used vanilla JavaScript for it, by the way — I miss the old days; I was coding much more than today, back when I worked alone. Anyway, I suggest you check out the Intersection Observer API — it's a really good thing; you can use less code for a really simple task. And I can tell you: many people use the lazy attribute in HTML for Chrome's lazy loading, but back then it didn't work in Firefox, so many people were effectively lazy loading only for Chrome without even realizing it. That's why I used the Intersection Observer API — for much higher compatibility. Here you see the compatibility of the Intersection Observer API — it says 91, or just under 92, percent — while native lazy loading was at only around 70 percent of the population. So just think of every user, and you will get better results in SEO. Image placeholders for a better Speed Index and Largest Contentful Paint — yes, I used these. There are some points here that whoever put these things there didn't think about, but this actually comes from me, because — see here — this is a featured snippet from the website, but there is a placeholder in it, because Google doesn't render or use JavaScript all the time. I can tell you that many times they were showing the actual image, then replacing it with the placeholder, then replacing it back. I call this the loop of indexing: in other words, they re-index the page at the same URL based on the last crawl information, and once you use a placeholder image, it might cause you to lose the featured snippet. That's why I changed these things a little further, too.
But this is a good example of the fact that Google doesn't render JavaScript all the time — so, as much as possible, rely on the HTML, to be honest. Here, the image completed and they brought it back: after a point they removed the image section entirely, but if they find a candidate they can serve with an image, they can replace you, and later they brought that specific image back. So here I talk about the loop of search engine decision trees as well, and there are some explanations I have given for preventing these types of situations. There are also some cumulative layout shift drawings I made in this area, with code, to explain things to clients or developers. When I scroll down, as you see, it delays a lot and creates a kind of shift — any unexpected shift in your web page is a problem for layout shift. To prevent such a situation, you can make the request based on the Intersection Observer API: you can say that if my screen is within, let's say, 100 pixels of the image, make the request, and that will prevent it. I used Progressive Web Apps, by the way — I have a presentation about it; I didn't upload it to my SlideShare yet, but I will later. I gave a talk at the SEO Mastery Summit about this, with every detail — I suggest you check it out if you are a member of the SEO Mastery Summit from Mads Singers. Anyway — yes, I used a PWA, and I would suggest you use one too. The main point of the PWA is that it is also structured data: you are able to communicate with the search engine from multiple layers — from JSON-LD, from your HTML structure, from third-party sources, and also from your manifest.json file — to demonstrate your website name, your website logo, your website sections, your website purposes, the screenshots of the application, etc.
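The 100-pixel Intersection Observer idea above — "start the request once the screen is within 100 px of the image" — boils down to a one-line predicate. This is a toy model of the `rootMargin` behavior, not the browser API itself; the pixel values are made up:

```python
def should_load(scroll_y, viewport_height, element_top, root_margin=100):
    """Toy model of an IntersectionObserver with a rootMargin.

    Start fetching the image once its top edge is within `root_margin` px
    below the visible area. All values are CSS pixels measured from the
    top of the document.
    """
    return element_top <= scroll_y + viewport_height + root_margin

print(should_load(0, 800, 850))   # 50 px below the fold -> fetch now
print(should_load(0, 800, 2000))  # far below the fold -> keep waiting
```

In the real vanilla-JS version, `rootMargin: "100px"` on the observer expresses exactly this margin, so the request fires just before the user reaches the image and no layout shift or empty box is visible.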
It means you are giving the same signal over and over and over again, which increases the search engine's trust, or confidence score, in the information it is getting. Then I made the website installable from the home page, and I used the service worker for better HTML payloads and to relieve the web server, along with cleaning unused code from the third-party trackers. This was an initial screenshot: basically, there are tools to understand which third-party resources a site is using, and I cleaned most of them. Some of these sections also show how these things are used across the entire web. And right now, in Lighthouse, there is already a built-in treemap creator for JavaScript files, so I suggest you check that as well. Correct declared document types, response headers, and HTML files — this was an issue too, because many times the response header was saying "this is HTML" or "text", etc., when it wasn't; and your HTML document's charset and the response header's charset should match. If they don't, the search engine, or the browser, or whatever renderer is being used at that moment, can change some of the characters inside your document, which will affect your relevance to certain queries — so always give them the correct signal. HTML digestion and HTML-based improvements — this is an interesting coincidence, by the way; maybe I can explain it later. HTML digestion is a concept from Search Off the Record: basically, Google takes your HTML document, normalizes it, and then digests it. The purpose is removing unnecessary constructs and unnecessary tags.
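The charset-mismatch check described above is easy to automate; here is a minimal sketch (the regexes cover the common header and `<meta charset>` forms, and the sample values are illustrative):

```python
import re

def charsets_match(content_type_header, html_head):
    """Compare the charset in the Content-Type response header with the one
    declared in the HTML meta tag. A mismatch hands the renderer two
    conflicting signals and can mangle characters in the document."""
    hdr = re.search(r"charset=([\w-]+)", content_type_header, re.I)
    meta = re.search(r'<meta[^>]+charset=["\']?([\w-]+)', html_head, re.I)
    if not hdr or not meta:
        return False  # one side is silent -> treat as a problem to review
    return hdr.group(1).lower() == meta.group(1).lower()

print(charsets_match("text/html; charset=UTF-8", '<meta charset="utf-8">'))
print(charsets_match("text/html; charset=ISO-8859-9", '<meta charset="utf-8">'))
```

Run over a crawl export, this turns "always give the correct signal" into a check you can repeat after every deployment.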
They can then use your HTML files in a lighter way; this will be explained in the Cost of Retrieval book that I will be publishing. Here there are some quotes from Gary Illyes, and we also explain what kinds of canonicalization issues happened during the project — like in August, September, and November. It was a really big event, but no one remembers it right now. Sometimes I am surprised by the memory of people in the SEO field, because Google usually makes really big mistakes, then they just call it an event and tell us their algorithms understood this or that, and then everyone starts to be their messengers. Anyway, these canonicalization issues didn't affect the website that much — maybe a little at the beginning, but not later — because HTML code errors didn't even exist on the website: the HTML was pretty simple and everything was pretty clear, so it must have been really easy for a search engine to digest everything there. When it comes to semantic HTML, I used it: header, nav, aside, article, figure, footer — this type of structure. Right now, because of the workload, I sometimes don't use this anymore, because some clients don't have a high enough level of developers and I don't have enough time to code for them, but I would suggest you use these things as much as possible. These are the semantic HTML tags I used; there are a few more, like time — I suggest you use a time tag for the dates on your page. You wanted the practical things, and I am giving them between the paragraphs without too much detail; I hope it is helping you. I also suggest you understand this concept of bad and mixed signals: do not give mixed signals to the search engines with visual transitions, heading levels, font sizes, colorization, and so on.
By the way, for semantic HTML, the most important tag is this one here. When it comes to decreasing the HTML DOM size, I did many things in that area too, and I was able to decrease it. In this area I explain fewer fonts and less color — this is important too. Why would you use multiple colors for a website, like five colors? Even five is too much, to be honest, but some people use maybe 24 different colors across an entire website. It's not needed: you just need a color for this area, a color for the background, a color for the text, and maybe a few colors for your footer and header. The rest is just image colors, which is about the images, not about CSS. So I used fewer colors and fewer fonts for the entire website. I also found some expert authors to use — yes, I have been using Author Rank, or author authority, for many years. By the way, let me tell you something funny: you know I have written an article about Google Author Rank, which is just here, and when I wrote it to explain how Google author authority helps you with further rankings, one person from a mainstream SEO media website told me that the patents, the research, and the explanations I was using were not that related to Google authors. But now, if you check here or elsewhere, you will see they are using the same patents — just for explaining and ranking, with really few sentences. I don't know why I am not able to open this page quickly, but if I can show you the research, you can also check the video — there is a video about it. By the way, this is the article here; maybe I can add the video here too. Sometimes I run questionnaires like this to use inside articles — this one was about the difference between Twitter and LinkedIn, which was interesting — and I guess this video is somewhere here too: Google Author Rank.
It is here — I suggest you read it and understand author vectors; you can use them in a really good way. My point is: don't learn things only from the mainstream SEO media. You probably won't be able to find 70 percent of the things I am explaining in this video in the mainstream SEO media. Font variables — you won't find these things there either. I won't explain everything about font variables, but imagine you upload one font for everything and just redraw the existing font to change it further — so I used a single font file. There are some things here you should know, like flash of unstyled text and flash of invisible text, and I fixed these problems on the website too with the help of font-display; you can check that later. Using browser-side caching for static resources — we all know this one already, I believe; you can find it in the mainstream SEO media. I used ETags, but there are some things you should know: there are different types of ETag values, and I even optimized this part. If you check the ETag article I have written, you will find the different types of ETag response-header values, and in that area you can also understand which one is better and what this section actually means — it will be easier for you to understand the difference between a strong ETag and a weak ETag. I want to explain everything but, as I say, you can check it later. Structured data usage: this is one of the Organization structured-data blocks I used, and the most important part is here, which is again related to Author Rank — right now I am even creating a kind of website for the founder. Make your founder a thought leader in the industry: I used digital PR a lot.
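The strong-versus-weak ETag distinction above can be sketched as a tiny classifier (the hash values are illustrative):

```python
def classify_etag(value):
    """Classify an ETag response-header value.

    Weak ETags (prefixed W/) only promise semantic equivalence between
    responses; strong ETags promise byte-for-byte identity, which is what
    also enables byte-range requests and stricter revalidation.
    """
    v = value.strip()
    if v.startswith('W/'):
        return "weak"
    if v.startswith('"') and v.endswith('"'):
        return "strong"
    return "invalid"

print(classify_etag('W/"5e15153d-120f"'))  # weak
print(classify_etag('"5e15153d-120f"'))    # strong
```

Checking which variant your server emits (nginx, for example, switches to weak ETags when gzip is applied on the fly) is the practical version of "optimize this part."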
Sometimes I even changed the text of press releases to relate the founder to certain topics. I also used it for github.com, by the way — I didn't even write that into the case study; maybe I can explain it later. Website accessibility — yes, I use accessibility a lot; I really give it importance. (Again, there is an iframe here — maybe my interns or teammates didn't put these things here properly; I will handle it later.) Basically, I optimized everything with ARIA labels: aria-label, aria-labelledby, aria-describedby, and the role attributes. aria-labelledby and aria-describedby are really helpful when used together with semantic HTML — you will realize that they complete each other — and a screen reader can easily recognize which element belongs to which text, so it will be really easy for a search engine to understand too. You can use web accessibility to communicate with the search engine in a better way. The website redesign process — that happened too: we changed the entire website design, especially for the home page and blog pages, based on some observations and, again, the patents, but I won't go into website representation vectors here. I have a special above-the-fold area, and I made really good contributions there, even for the voting system on the website and the author box. Mobile and desktop parity too: you should always check your mobile and desktop mismatch problems. Whatever you have on desktop or mobile should match — in existence, and also in order and prominence of these things. You can't put something at the top on mobile and then take it to the bottom on desktop; they should be aligned. You should check this. As I say, I won't talk about patents here, and I won't talk about read-time calculation either.
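In the spirit of the accessibility checks above, here is a tiny stdlib audit that flags images missing alt text — one small, automatable slice of the ARIA/semantic-HTML review (the markup fed in is hypothetical):

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect <img> tags that are missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # absent or empty alt both fail here
                self.missing.append(attrs.get("src", "?"))

audit = AltAudit()
audit.feed('<img src="a.jpg" alt="Electricity prices chart"><img src="b.jpg">')
print(audit.missing)
```

The same parser skeleton extends naturally to checking for `aria-labelledby`/`aria-describedby` pairs or missing `role` attributes across a crawled set of pages.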
Lately, Google started to use "quick read" labels inside the SERP, and then everyone started talking about it again. So: if you see something in a patent, accept it as a search principle. Do not care about these five-day changes or six- or seven-day updates and trends — just do your thing according to the overall search principles as much as possible. Visual segmentation of web pages based on gaps and text blocks: I won't go deep into this area, but basically, with these learnings, I helped a lot with the design of the website and the HTML tree structure, so that everything can be connected to everything else in a really good way — you can check this later; it is an important concept. Chris Goward — I really like him — and the LIFT model: the LIFT model for web page layout design really explains many things, for semantics, for content marketing, and for user experience. It says something really fundamental: at the top of the page, the middle of the page, and the bottom of the page there are different audiences with different, connected search intents, and you should be able to use that properly. And this is an indexing, re-indexing, and re-ordering system design based on Google's explanations from the Search Off the Record podcast — I suggest you check it out; it's a good design, I can tell you. If they realize that the rankings on the SERP are bad, they just re-trigger the entire system again — as I say, the loop of search engine decision trees. I have done a lot of log analysis for the website, and I used Elasticsearch for it, so you can learn how to use Elasticsearch for log-file analysis on a live web server. If you are able to do live web server log analysis, you will understand things much better — you can do log analysis every day.
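Before reaching for Elasticsearch, the core of that log analysis can be sketched in a few lines of stdlib Python — count which URLs Googlebot requests most as a rough crawl-demand signal. The log lines below are invented samples in common combined format (and note that in production you should verify Googlebot by reverse DNS, not just by user-agent string):

```python
import re
from collections import Counter

LOG = """\
66.249.66.1 - - [12/Nov/2022:10:01:00 +0000] "GET /electricity-prices/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.5 - - [12/Nov/2022:10:01:02 +0000] "GET /blog/ HTTP/1.1" 200 4096 "-" "Mozilla/5.0"
66.249.66.1 - - [12/Nov/2022:10:02:10 +0000] "GET /electricity-prices/ HTTP/1.1" 304 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

def googlebot_hits_per_url(log_text):
    """Count Googlebot requests per URL from access-log text."""
    hits = Counter()
    for line in log_text.splitlines():
        if "Googlebot" not in line:  # naive UA filter -- verify via rDNS in production
            continue
        m = re.search(r'"GET (\S+) HTTP', line)
        if m:
            hits[m.group(1)] += 1
    return hits

print(googlebot_hits_per_url(LOG))
```

Run hour by hour, a counter like this is what lets you see crawl-purpose shifts — refresh hits concentrating on one template, for example — and react the same day.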
If you watch the Google Analytics data hour by hour, and the log files hour by hour, you will see what the search engine is focusing on, what they try, and what kinds of quality tests they are doing — and you can actually respond to them. I call this search engine communication: they test, they give the feedback, I take it, I re-test, or I reorganize or reconfigure the content or the entire website. It is a continuous process, and it requires two big teams, or a really good SEO — usually you won't get that SEO, to be honest, but you can at least get multiple people who can each focus on other segments; that helps a lot in this area. So: branding, digital PR, entitization — I call it the entitization of encazip.com — and every mention matters. As I say: every pixel, every letter, every millisecond, every user, and every mention. In this section, this data comes from Kalicube Pro — I have also written a review of it, which you can check; dear Jason Barnard has really great technology, mind, and vision, and I suggest you check him out. Based on his data — Wikipedia, LinkedIn, Crunchbase, Bloomberg, Fandom — I used this data to entitize encazip.com and the founder of the website. At the same time, there is some other information here — yes, the Knowledge Graph API. You can search "Knowledge Graph API Python holistic SEO" — I am giving you all the keywords — and I suggest you check that article, because there I compare Spotify to Netflix, I explain the theoretical side of the Knowledge Graph, and I also explain how to use the Knowledge Graph API together with Python. If you check these things, I believe it will be really useful for understanding why you should entitize your brand or website. And here I am checking whether encazip.com is an entity or not — sorry, this one is about VavaCars, another project I managed; I made it an entity by creating its Wikipedia page myself, so you can do it too. And here we have some search demand, as you see — trending search events.
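The Knowledge Graph API call mentioned above can be sketched like this — the endpoint is Google's public `kgsearch` entities:search endpoint, the API key is your own credential, and the query ("Spotify", echoing the article's Spotify-versus-Netflix comparison) is just an example. Only the URL is built here; nothing is sent over the network:

```python
from urllib.parse import urlencode

def kg_search_url(query, api_key, limit=1):
    """Build a Google Knowledge Graph Search API request URL.

    Fetching this URL (with urllib, requests, etc.) returns JSON-LD entity
    results, including the resultScore -- a rough confidence signal for how
    well-established an entity is.
    """
    params = {"query": query, "key": api_key, "limit": limit}
    return "https://kgsearch.googleapis.com/v1/entities:search?" + urlencode(params)

url = kg_search_url("Spotify", "YOUR_API_KEY")
print(url)
```

Tracking your own brand's presence and resultScore over time is one concrete way to measure whether the entitization work is landing.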
Then we see that encazip appears as a topic, which means it's an entity. Social media's effect on becoming an entity, and entity-based SEO: I also used hashtags — electric, electricity, electric law, electric vehicle types of hashtags — on social media. Whenever I appeared more and more for these hashtags, and whenever Google indexed them, I became more relevant to these concepts; then I was also ranking in the video carousels, the Facebook results, and other areas, and it was helping me get further brand signals from third-party sources. If I search for one of my competitors' names plus "Facebook", I see 50,000 results; if I search "Facebook" and "encazip", I see 12 million results. This is the difference between brands: real social media existence is a quality and trust signal. The same goes for Instagram, and the same goes for GBP as well in local SEO — all these reviews come from the customers; I found some solutions for that, and I activated the Google Posts area, even with hashtags and other types of things. By the way, I call this exercise the "migration route", to show the past of the website: to be able to perform a kind of design lift, you should understand the previous design of the website — you can't change everything directly. Some of the home page elements, and some of the remaining hidden elements, should stay really similar to the previous version so that the search engine can adjust its systems more easily. Another thing is that I protected the past of the website by recording it to the Wayback Machine — repeatedly and automatically; I have even written some scripts for that, which you can take from my website. The purpose, again, is showing Google the importance of the website, and also the previous versions of the same website, in a continuous way.
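The automated Wayback Machine recording mentioned above hinges on one thing: the archive's public "Save Page Now" endpoint, which snapshots whatever URL you append to it. This sketch only constructs the save URL (fetching it, e.g. with urllib on a schedule, is what triggers the snapshot; the page URL is illustrative):

```python
from urllib.parse import quote

SAVE_ENDPOINT = "https://web.archive.org/save/"

def wayback_save_url(page_url):
    """Build the Wayback Machine 'Save Page Now' URL for a page.

    Requesting the returned URL asks the archive to record a fresh snapshot
    of `page_url`; looping over a sitemap's URLs automates the whole site.
    """
    return SAVE_ENDPOINT + quote(page_url, safe=":/")

print(wayback_save_url("https://example.com/electricity-prices/"))
```

Scheduling this over the sitemap (cron, for instance) is enough to keep a continuous, independently hosted record of every design state of the site.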
And semantic SEO — should I talk about it? You can check these things out later; I won't be able to dive into this area, but check this — okay, this is important: the SMITH algorithm. No one talks about it, still, but it is an improvement over BERT. Google says "we don't use it, for now at least" — they said that a year ago — but as you know, I believe what they do rather than what they say. This is my topical authority case study, which you can check, and this is important: you should get these different contexts within semantic topical graphs. You see, Google calls this structure dynamic content serving: I write an entity here, but for the same entity there are different dimensions — history, types, even animals, examples, news, videos. In SEO, I like not leaving things to chance, so in my topical map I covered everything — even the animals; the history, the scientists, everything — and then, wherever Google looks, whenever they look somewhere, they see me. I like that: whenever they see me, I get more historical data, which means more trust. I am relevant to all the contexts, so they have to rank me; they don't have another chance. That is the strategy of holistic SEO: do everything so well that Google won't have any other chance besides ranking you. Image search and visual content creation: I can tell you that I know a really good number of patents just for image search, but it is a long topic. The patents here are the fundamental image search patents, but there are many more — I won't dive into the patents, as I say, but I suggest you check these things to understand how Google uses query features and matches them to the landing page features and the image features. Okay, here I use an example to explain how the image selections might change based on colorization, or based on objects.
Or based on faces: if there is a human face, it's not just about color anymore — it's about the owner of the face. And yes, this is from the 12th of November, I believe; two days ago it was the anniversary of losing Mustafa Kemal Atatürk — I get sad and sentimental every year whenever I remember him. Anyway: I used an image sitemap, I used image structured data, and I used representative images for the landing pages as well — I call these representative images "headline images". By the way, I have often said that I write for myself, to train myself; that's why these articles are so long and so detailed, and even these concepts are there so that I can go back and remember what happened before, when I did these things, and in what way. I even have many other things that I didn't publish — they are only for me, because they are much more complicated. These are examples of the image sitemap I am using: there are two different XML namespaces here, one for images and one for the regular sitemap. I continue to use lastmod and especially the priority section, and sometimes changefreq. The priority is not actually used by Google, but you can keep it there for the index — and sometimes for Bing; Bing doesn't use it either, according to Fabrice Canel, but you can still keep it there. I use these images as the representative images. There is one more thing: I put all the header images — the representative images — into a separate subfolder, because there are some designs, in a concept called subsite retrieval, where the URL or location of certain resources, and certain URL words, can represent the role of the resource. That's why I put all of the representative images into a separate folder directly. You can read these sections later.
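The two-namespace image sitemap described above can be generated with the stdlib XML module — the namespaces are the standard sitemaps.org and Google image-sitemap ones, while the URLs (including the dedicated headline-images subfolder) are illustrative:

```python
import xml.etree.ElementTree as ET

SM_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMG_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def image_sitemap(entries):
    """Build an image sitemap from (page_url, image_url, lastmod) tuples.

    Each <url> carries the regular sitemap fields plus an <image:image>
    block pointing at the page's representative (headline) image.
    """
    ET.register_namespace("", SM_NS)        # default namespace: regular sitemap
    ET.register_namespace("image", IMG_NS)  # image extension namespace
    urlset = ET.Element(f"{{{SM_NS}}}urlset")
    for page, image, lastmod in entries:
        url = ET.SubElement(urlset, f"{{{SM_NS}}}url")
        ET.SubElement(url, f"{{{SM_NS}}}loc").text = page
        ET.SubElement(url, f"{{{SM_NS}}}lastmod").text = lastmod
        img = ET.SubElement(url, f"{{{IMG_NS}}}image")
        ET.SubElement(img, f"{{{IMG_NS}}}loc").text = image
    return ET.tostring(urlset, encoding="unicode")

xml = image_sitemap([(
    "https://example.com/electricity-prices/",
    "https://example.com/headline-images/electricity-prices.jpg",
    "2022-11-12",
)])
print(xml)
```

Keeping every headline image under one subfolder, as described above, also makes it trivial to regenerate this sitemap from a directory listing.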
So, about the image search: just search for this section, then check what appears there, and then you will be able to rank in that area. This was the image that I am using — I always put text, a logo, and the object, and sometimes I also use these types of labels in this area.

Importance of clear communication and passion for SEO on the customer side: as I said, if you are an SEO, you have to behave like the SEO vice president, and that's why you have to train the client and change their culture as much as possible. But lately, since I own an agency, I have accepted that every brand is a personality, and you can't like every personality, so sometimes you shouldn't do it. And as a confession: in the old days, before the Medic update — everyone knows it already — that's also why I am really skeptical about Google, because they used to lie a good amount. In this section I explained how I educated the client, and I still love all of them, by the way, even the old employees; they're really good people.

Broad core algorithm update strategy for SEO: I have a really good Turkish webinar about this, but I never translated it to English. Basically, the broad core algorithm update strategy is about putting a date on the calendar and telling yourself that on this date there will be a broad core algorithm update — even if there isn't one, you have to accept that there will be an update on that date, and you should start optimizing everything before it. If you can fix everything in those two, three, or four months, there is a high chance that you will gain from the broad core algorithm update. Most of the case studies that I have created are about broad core algorithm updates, because I use topical relevance, authority, gaps, consolidation, and also the re-ranking and initial ranking concepts — all of them are about broad core algorithm updates. Every broad core algorithm update refreshes the authority signals and the source prioritization, all the time.

These are some of the analyses of the ranking changes. This is for December: this is a competitor, they go down; this is another competitor, they also go down — I don't read out their names; at the beginning I just said "first competitor", "second competitor", because the brand was a little scared — and this is our change here. As you see, by the way, the number here is thirty thousand — funny, really funny now, because this is a single page. Anyway, as I said, these are my last updates for these case studies; most probably I will update this one one more time if everything goes well, but my priorities have changed lately. So these changes are for the competitors, and this is for us. Then we have another one, the June 2021 core update: these are again the competitors, they go down; this is us, we go up. Then we have the July update — let me just check the previous one in June — yeah, there was a two-step launch of that update: a bigger jump than before, and another bigger jump. These are the competitors, they continue to go down, and these are our changes, and finally we got over 10,000 clicks. From 700, I say here, but that was a little for protecting the client; it was actually 200, sometimes 300, and so on. Then in November we doubled it again, as you see here — from 140 to 240, nearly double according to SEMrush, and according to Ahrefs it was more than triple. Then came the migration.
We performed a React.js and Next.js migration, and I am coming to the key point now. I know most people won't watch this part, but this is the magic of SEO: only the people who follow everything win. Basically, I have to explain one concept first. Once we migrated to Next.js and React.js, we lost all of the technical SEO improvements — all of them — but we didn't lose the traffic. At the beginning, we were updating and optimizing everything, and it helped us rank higher; but when we lost it, we didn't lose the rankings. Why do you think that happens? I can tell you that if you want to overthrow an authority, you have to do everything right; but if you are already an authority, you can make mistakes. It is called search engine tolerance: they will tolerate you because you are already an authority, and they need you in their SERP, because you explain everything in a perfect way and you have already proven yourself with historical data. You earned the search engine's trust; they trust you. So in these areas, you can have bad technical SEO, or you might not have proper PageRank, or you might not have an author bio — which is silly as a ranking factor already, but I mention it because it's popular. If you are an authority, yes, you can make mistakes; but if you want to overthrow the authority, you have to do everything perfectly, including the technical side.

There is a question here. Imagine: how much historical data do you need before you can drop all of the technical SEO improvements — before the search engine starts to tolerate you? How many months, how many clicks, how many impressions? That part is important, and I sometimes call it the search engine source prioritization requirements. I know that is a long name, but the idea is: to prioritize you as a source on the SERP, what kind of proof does the search engine need — how many clicks, how many queries you rank for, and for how long? That's why, inside the topical authority formula, I told you that you have to cover everything, cover it deeper and more comprehensively than your competitors, but also for a longer time. But how long? If you can calculate that, it is real mastery of SEO. This section is important because even when we lost everything, we didn't lose the traffic.

There are other important things here, because I wanted to use the virtual DOM, but we couldn't do it — that is also one of the issues. If you are a search engine optimization expert or consultant, you will realize that the employees of the client change too, and you have to retrain them. It was hard for me, because I already owned an agency by then and I couldn't train every employee from zero; that's why some of these improvements never came back. You can see the differences between the virtual DOM and the regular, traditional DOM in this area — with the virtual DOM it's like a liquid flow, really easy to carry state from page to page, but here it's not like that. We can explain that later too.

During the technology migration, a couple of things happened and we started to lose some traffic, which you can also see a little here — it was a small warning area. One more thing: there was also a hosting migration. This is not a domain migration, but because we changed our hosting, we had to move to the www version of the hostname, which means that all of the URLs had to be re-indexed. I was able to manage it in a proper way, but at the same time — as you see, this is the without-www version and this is the with-www version — even long after the migration, the old version continued to take impressions. Always check them.
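The hostname migration just described — every without-www URL permanently redirecting to its with-www counterpart so signals consolidate on one canonical host — can be sketched as a small redirect-mapping script. This is a minimal illustration, not the tooling used in the case study; the domain and paths are hypothetical.

```python
from urllib.parse import urlsplit, urlunsplit

def www_redirect_target(url: str) -> str:
    """Return the with-www counterpart of a URL, preserving
    scheme, path, query string, and fragment."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

def build_redirect_map(old_urls):
    """Map each without-www URL to the with-www URL it should 301 to.
    The output can be turned into server redirect rules."""
    return {u: www_redirect_target(u) for u in old_urls}

if __name__ == "__main__":
    # Hypothetical URLs standing in for the case-study site.
    urls = [
        "https://example.com/electricity-prices",
        "https://example.com/energy?city=berlin",
    ]
    for old, new in build_redirect_map(urls).items():
        print(f"301: {old} -> {new}")
```

Crawling the old URL list and asserting that each one answers with a single 301 hop to its mapped target is the natural follow-up check during such a migration.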
If you want to perform these types of changes faster, try to do them around a trending event, or use a little bit of digital PR and popularity, so the search engine can be sure that you are changing all of the internal links. But even if you do that — 25 percent of the website was still taking traffic on the without-www version, through a redirection. To understand convincing the search engine here, imagine that you have already transferred hosts and the search engine continues to keep the old URLs. It happens when those URLs were really authoritative for a long time: if a URL was really high quality and useful, the search engine continues to keep it. At the same time, there were some other issues as well, because changing the design changes the crawl pattern — moving from Next.js to something else, or (sorry, I'm a little tired) from .NET to something else. Once we change things, Google also needs to change its crawling pattern. It takes a longer time, but eventually it happened, and I'm happy about that. You can see the timing here: it took nearly two months, from the 9th to the 11th month, it continued like that. You should also ask why 75 percent of the URLs were transferred directly but the rest were not — that sounds important. Here we have all the indexed URLs on the without-www version, and some of them are also here too; you can check them. There are other things here as well, like the static assets, but I won't explain everything, as I said. We also started to use a CDN on our SSR subdomain, and so on.

Here you will see the image search-related things I created — copyright marks and these types of things, for being an authority; it is about EXIF data, as I said before — and there are some other things here for image search. New content sprints I created, and URL count restriction: I brought this up because, since I had a lot of technical improvements pending, I didn't want to create new pages; I wanted to make the existing pages more authoritative, because whenever you create a new page, you are diluting the ranking signals. If you check this article — "ranking signal delusion"; I know I made a typo there, but Google can get what I mean — you will understand what I mean about ranking signal dilution, or ranking signal consolidation, together with another case study that I published before with JetOctopus; I suggest you check it.

Here, this is the competitor again: they go down. This competitor: they go down. This is also a competitor — by the way, they go to practically zero. This is a competitor, again near zero. Then we come to this area — we are in these sections — and you can see the May core update section too, as it comes back. This is another competitor; they have become a little more stable, I can tell, but they are not taking that much traffic anymore, and some others are also around. Some of these competitors are not direct competitors, by the way: we are an aggregator, and this one is a provider, so they are actually our client — we are a little bit like a competitor, but they again lose a little more traffic in these sections too. By the way, I have some broad core algorithm update analyses and conclusions if you want to go further; I suggest you check the latest case study article that I have published. Let me go to the bottom — it will be easier to show you. Here I haven't put an image yet, and yes, I don't optimize my own website, mostly on purpose, for testing. This specific article, for entity SEO, will be useful for you to check.
Basically, if you check the entity-oriented search engine case study section — the article is directly here, you can check it — at the bottom you will find a broad core algorithm update analysis. I am basically using Google patents to understand what Google is actually doing there. So these are for the May update, and then there were some other improvements: the September update also affected the website in a positive way, and we came to this area in an even better way than before. Here, also, according to the May update, there is a switch between aggregators and providers, and this is that comparison — that's why some of them come slightly back. By the way, maybe one day I can create a video about PageRank sculpting, because this website here has to support us in another way: when you are able to get some indirect links — if you are able to understand the link graphs — you can choose specific pinpoints for your digital PR campaigns. Here we have some other metrics too.

Topical consolidation for the new sectors, and initial ranking: as I said, I will explain this part later, but this is the initial section of the insurance sector. I have shown the existing version already; this is the Google Search Console screenshot for it. The credit section right now is double this — you already know it if I show it again in this area. This is insurance right now, and this is credit right now: 90,000. When I go back to the specific article it was around 44,000, and that was just October, last month — so in one month it doubled itself. These are the latest metrics; I have put some videos in this area to help you understand it, and these are the updated graphics from SEMrush. Again, if I show you these areas — I will be publishing how to expand a topical map while protecting the source authority, the ranking signal consolidation, and the source identity together with it. It is a little different from creating a topical map for a brand new domain; with an existing domain, it works a little differently.

I have tried to explain many different things in this case study. I didn't talk about patents that much — I hope you enjoyed it, because I know you also like the theoretical sections a lot. I love my audience, but there are different levels of people: a beginner finds my videos hard, while an advanced person loves them. When I satisfy the beginners, the advanced level gets angry; when I satisfy the advanced level, the beginners get angry. So I am trying to balance these things with this type of video. I hope it helps you. As I say, subscribe to the newsletter, subscribe to the YouTube channel — I will be more active in this area, we will talk a lot, and I will answer your questions on my YouTube channel too. If you want to ask me a question, instead of sending it as a DM, just leave me a comment anywhere on the YouTube channel and I will answer it; come to the live chats and I will answer there as well. I'm happy to see you here. We are finishing at 90 minutes — I am a little tired; after an hour it's hard to talk the whole time, you see the hour here. I will need to sleep, and there are many other things that I need to do — I probably have piled-up emails. Anyway, love you all, and see you later. [Music]
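The image sitemap setup described in the captions above — one file combining the regular sitemap namespace and the image namespace, keeping lastmod and priority, with representative images served from a dedicated subfolder — can be sketched as follows. This is an illustrative reconstruction, not the site's actual generator, and all URLs are hypothetical.

```python
import xml.etree.ElementTree as ET

# The two XML namespaces mentioned in the video: the standard
# sitemap protocol plus Google's image-sitemap extension.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    """pages: iterable of (page_url, lastmod, priority, image_url).
    Representative ("headline") images live under a dedicated
    /representative-images/ subfolder so the URL path itself can
    signal the role of the resource."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, lastmod, priority, image_url in pages:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        # lastmod is used by Google; priority mostly isn't, but it
        # can stay for other engines such as Yandex.
        ET.SubElement(url, f"{{{SITEMAP_NS}}}lastmod").text = lastmod
        ET.SubElement(url, f"{{{SITEMAP_NS}}}priority").text = priority
        image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
        ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = image_url
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_image_sitemap([
    ("https://www.example.com/electricity-prices", "2022-11-24", "0.8",
     "https://www.example.com/representative-images/electricity-prices.jpg"),
])
print(sitemap_xml)
```

One entry per landing page, each carrying its single representative image, keeps the file small and makes it easy to verify that every page's headline image sits in the dedicated subfolder.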
Info
Channel: Koray Tuğberk GÜBÜR
Views: 13,760
Keywords: seo, seo case study, search engine optimization, seo course, seo tutorial, holistic seo tutorial, seo tutorial for beginners, local seo case study, saas seo case study, technical seo tutorial, holistic seo, koray tuğberk gübür, image seo, best seo case studies, holistic seo case study, seo tutorial advanced, seo guide, seo guide for beginners, advanced seo, advanced seo tips, seo tips, seo tricks, seo success story, seo success factors, ecommerce seo case study
Id: JNCGp7c-Yy0
Length: 87min 44sec (5264 seconds)
Published: Thu Nov 24 2022