SEO for Developers in 100 Seconds

Video Statistics and Information

Captions
The first rule of search engine optimization is: create really good content. The second rule of search engine optimization is: create really good content. If human beings don't want to engage with your content, then Google doesn't want to either. We live in the age of machine learning and quantum computing; you can't just stuff a bunch of keywords into a page and expect to do well.

When Google first came about in the late 90s, it was based on an algorithm called PageRank, which weighted relevance and search ranking based on the number of inbound links that a site had. People quickly learned how to exploit the algorithm by spamming backlinks all over the internet to increase a site's PageRank. Because a high ranking in Google can literally be worth millions of dollars, it brought us an entire industry of SEO experts. The good guys wear white hats, the hackers wear black hats, but the most effective ones wear grey hats. Some say it's a dying industry (y'all want to see a dead body?) because it's becoming harder and harder to manipulate Google's technology. There are over 200 factors that go into a site's ranking, and they're geared mostly towards how useful a user found your site. Did they immediately bounce by clicking the back button, or did they dwell on the page for a long time and click other links, absorbing all kinds of useful content? Content is king.

But the third rule of SEO is to render HTML that can be reliably understood by bots. Your main content goes inside the body tags. When Google crawls your site, it will use semantic HTML elements to understand the content on the page. You might put your main content in an article tag, then put your most important keywords in headings, or h tags, to signal what your page is about. Furthermore, your HTML should be accessible: use alt tags on images and ARIA attributes where needed to make your site usable on assistive devices. In the head of the document we have metadata that's not directly shown to the end user, but bots can use this data to further understand the page and format the actual appearance of your search engine listing.

The fourth rule of SEO is to get your fully rendered HTML loaded fast. If you have megabytes of blocking images, styles, and JavaScript, both users and bots will pass on your site. But going fast is easier said than done, and that's why today we're going beyond 100 seconds to look at the many different strategies we have to render HTML and how they impact search engine optimization. If you're new here, like and subscribe. And today I have a big announcement: my full Next.js Firebase course is now available. If your goal is to build a highly interactive web app that is also fully search engine friendly, then you'll definitely want to check out this course.

Now, the four most important rules for SEO, in my opinion, are: create awesome content, create awesome content, render properly formatted HTML, and load your HTML quickly. The first two rules are very subjective and depend entirely on your audience, but the general goal is that when someone clicks on a link to your site from a search engine results page, they should engage with your site as long as possible. There are a few metrics that you'll want to be aware of here. The first one is the click-through rate, or CTR, which defines how likely a user is to click on your link when displayed in a search engine results page, or SERP. The higher the CTR the better, and a high CTR usually means you have a very relevant title and description. Now, if a user clicks on your link and then immediately clicks the back button, that's called a bounce, and the higher your bounce rate is, the less likely your site is to rank well in the long term, because apparently the content on the page is not very relevant. If the user does stay on the page, Google will keep track of the dwell time, which is the amount of time they spend there before clicking back to the search results. The longer the dwell time the better, but the best possible thing that can happen is that the user never clicks back: their session lasts forever and they never need to go to another website ever again. That doesn't happen very often, so what you keep track of is the average session duration and the average number of pages viewed per session. These are metrics that you want to maximize.

There's no absolute rule for creating engaging content, but the first thing the user sees should hook them in to want to read more. If you look at something like BuzzFeed, all you have to do is put an animated GIF at the top, then maybe a few more in the body, and you should be good.

Let's move on to rule three, where we talk about the actual structure of the HTML. I'll be using my site, fireship.io, as an example. On a lesson or article page you can right-click and hit Inspect Element, or hit Ctrl+Shift+I, to bring up the Elements tab in Chrome DevTools, showing you the fully rendered HTML markup. We have a head and a body. Let's open up the body and find the main element; inside the main element you'll notice we have an article. An article element has semantic meaning, and although it will never be seen by the end user, it tells the search engine, "here is the main content of the page." In addition, you'll notice a couple of extra attributes here: one is itemscope and the other is an itemtype of a schema.org Article. Now, this is totally optional, and whether or not it will improve your search engine ranking is debatable, but what schema.org allows you to do is define a bunch of metadata about the actual content on your page, making it easier for search engines to interpret. It's especially powerful if your content is something like a recipe or a star review, because Google can then take the schema data and format it properly in a SERP.

In this case we have a bunch of metadata that make up an article, and one thing that is known to improve search ranking is when an article is written by a known author. Further down the HTML tree you'll notice we have an itemprop of author, which points to the author's page. That link goes to another page on fireship.io, and on that page we also have an article element, this time with an itemtype of a schema.org author, along with a bunch of links that point to authoritative sites for that author. Outbound links on a page are really important because they further signal what the page is about. In this case Google will first crawl the article, then crawl the author's page, then crawl these other sites to understand who that author is. A good strategy is to use outbound links to other really good sites that are related to the content on a given page.
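To make that structure concrete, here is a minimal sketch of an article page with schema.org microdata, written as a React/Next.js component since that is the stack discussed later in the video. The route, headline, author name, and URLs below are placeholders for illustration, not the actual fireship.io markup.

```tsx
// pages/lessons/example-lesson.tsx -- hypothetical route, for illustration only
export default function LessonPage() {
  return (
    <main>
      {/* itemScope + itemType tell crawlers this subtree describes a schema.org Article */}
      <article itemScope itemType="https://schema.org/Article">
        {/* The most important keywords belong in headings (h tags) */}
        <h1 itemProp="headline">Example Lesson Title</h1>

        {/* itemProp="author" links the article to a dedicated author page,
            which in turn can link out to authoritative sites about that author */}
        <a itemProp="author" href="/contributors/example-author">Example Author</a>

        <p>Main content of the lesson goes here.</p>

        {/* Outbound links to related, high-quality sites further signal what the page is about */}
        <p>
          Further reading:{' '}
          <a href="https://developers.google.com/search/docs">Google Search documentation</a>
        </p>
      </article>
    </main>
  );
}
```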
Now, in addition to schema.org, there are other ways you can add metadata to your content, and this can be very important for SEO and also accessibility. One of the most fundamental techniques is to add an alt attribute to images, which is basically just some text that describes the image. This metadata can be used by search engines and also by screen readers for those with disabilities. For other elements that are a little more complicated, like a progress bar for example, you can use ARIA attributes, which stand for Accessible Rich Internet Applications, and they help provide additional meaning to highly interactive widgets on the page.

At this point we've only been looking in the body of the document, but the head of the document contains all kinds of useful metadata for SEO. Most importantly, this is where you have the title. You should choose your title carefully, because it's displayed in a SERP and will ultimately control your CTR. In addition to the title, you may also want to have meta tags here which define things like the description, featured image, author, canonical URL, and so on. These meta tags are also essential if you want your content to be shared on social media sites like Twitter or Facebook: when you post a hyperlink on social media, it fetches that page and looks for the meta tags to understand what image and title to display. If you want to see how your site is doing right now, you can paste a link into the Twitter Card Validator and it will tell you whether or not it can use your current meta tags.
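As a sketch of that head metadata in the same Next.js stack, the snippet below uses the next/head component; every title, description, image, and URL here is a placeholder you would replace with your own values.

```tsx
// Hypothetical <Head> block for a single article page (Next.js pages router)
import Head from 'next/head';

export function ArticleSeoHead() {
  return (
    <Head>
      {/* The title shows up in the SERP and largely determines your CTR */}
      <title>How Rendering Strategy Affects SEO</title>
      <meta name="description" content="A short, relevant summary shown under the title in search results." />

      {/* The canonical URL tells crawlers which URL is the authoritative version of this page */}
      <link rel="canonical" href="https://example.com/articles/rendering-and-seo" />

      {/* Open Graph and Twitter tags control how the link unfurls when shared on social media */}
      <meta property="og:title" content="How Rendering Strategy Affects SEO" />
      <meta property="og:description" content="A short, relevant summary for social previews." />
      <meta property="og:image" content="https://example.com/img/featured.png" />
      <meta name="twitter:card" content="summary_large_image" />
    </Head>
  );
}
```

Once deployed, the page URL can be pasted into the Twitter Card Validator mentioned above to confirm the tags are being picked up.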
So that gives you some things to think about when it comes to the actual structure of your HTML. But the bigger question is: how do you render that HTML? In other words, what part of your tech stack is responsible for generating the actual HTML markup that is received by a bot or end user? There are three fundamental ways to render HTML.

The first one we'll look at is client-side rendering. If you're building an app with something like React or Angular, the default mode is client-side rendering, or a single-page application. On the initial page load, the user gets a shell of HTML without any meaningful content; the JavaScript code then bootstraps and asynchronously fetches any additional data needed for the UI. Apps like this are great for interactivity, because they give the end user an app-like feel similar to what you'd expect on iOS or Android. The problem is that because the initial HTML is just a shell, search engines may have a hard time understanding and indexing it. If you take a link generated by JavaScript from a single-page application and post it on Twitter, you'll only see the initial shell; you won't see any additional meta tags that were generated by JavaScript after the fact. That's not great for social media. Google, as a search engine, is able to index client-rendered apps, but the reliability is questionable, and personally I wouldn't trust client rendering if SEO was a business-critical requirement.

So another option is to pre-render, or statically generate, HTML in advance. Let's imagine your web app has a hundred different routes or pages. Instead of sending a shell down to the user, we could generate all the HTML for those pages in advance, then upload the static files to a storage bucket that can be cached on a global CDN. The first thing the user sees is fully rendered content; then the JavaScript loads after that and makes the page fully interactive. That's great for SEO, because bots get fully rendered HTML and can easily interpret the content on the page. It's also highly efficient, because if you're fetching data from a database, you only have to do that once at build time; then you can cache the page on a CDN and serve it to millions of people without having to refetch your data. The trade-off with this approach, though, is that the data in the pre-rendered content can become stale, which means bots will be getting outdated information until you rebuild and redeploy the entire site. That's no big deal if you have a few hundred pages that don't change very often, but if you have millions of pages with highly dynamic data, then it doesn't really scale.
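A minimal sketch of that build-time approach in Next.js is shown below. getStaticPaths and getStaticProps run once at build time, and the data-fetching helpers are hypothetical stand-ins for your own database or CMS calls.

```tsx
// pages/posts/[slug].tsx -- hypothetical statically generated route
import type { GetStaticPaths, GetStaticProps } from 'next';

type Post = { slug: string; title: string; body: string };

// Hypothetical stand-ins for a real database or CMS query
async function fetchAllPosts(): Promise<Post[]> {
  return [{ slug: 'hello-world', title: 'Hello World', body: '...' }];
}
async function fetchPost(slug: string): Promise<Post> {
  return { slug, title: 'Hello World', body: '...' };
}

// Runs at build time: enumerate every route that should be pre-rendered
export const getStaticPaths: GetStaticPaths = async () => {
  const posts = await fetchAllPosts();
  return {
    paths: posts.map((post) => ({ params: { slug: post.slug } })),
    fallback: false,
  };
};

// Also runs at build time: fetch the data once, render the HTML once, cache it on a CDN
export const getStaticProps: GetStaticProps<{ post: Post }> = async ({ params }) => {
  const post = await fetchPost(String(params?.slug));
  return { props: { post } };
};

export default function PostPage({ post }: { post: Post }) {
  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
    </article>
  );
}
```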
And that brings us to option number three: server-side rendering. In this paradigm, when the user makes a request to a page, the HTML is generated on the server. This is also great for SEO, because bots get fully rendered HTML on the initial request. In addition, the data will always be fresh, because you're making a new request to the server each time. The drawback here is that it's generally less efficient: you might be fetching and rendering the same HTML over and over again. It is possible to do server-side caching, but that's not as efficient as edge caching on a CDN and will cost a lot more to operate at scale. And if things aren't cached efficiently, that means a slower time to first meaningful content, which can negatively impact SEO.

So basically, between these three methods we have a trade-off between data freshness, performance, and client-side interactivity. But what if there were a way we could have our cake and eat it too? Allow me to introduce you to incremental static regeneration. This is the way. This is a new form of rendering available in the Next.js framework. Remember, earlier I said the drawback with static pages is that the data may become stale and require a redeploy of your site. What ISR does is allow you to statically generate your pages, then rebuild and redeploy them on the fly in the background as new requests come into your site. That means you get all the performance benefits of static pages while ensuring that these pages always contain fresh data, which eliminates all the trade-offs that we've talked about. But it's not without a cost: deploying a static site is as easy as uploading your files to a storage bucket, but incremental static regeneration requires a more complex back-end server deployment. For most of us, that means paying for a host that supports it, like Vercel; hosting anywhere else will likely be much more painful until more companies start adopting these techniques.

Now, one very cool thing going on in the web development world right now is that more frameworks, like Next and Angular, are supporting hybrid rendering. That means you can implement some routes as static pages, configure other routes to use full server-side rendering, and leave other routes fully client-rendered, so you're not pigeonholed into just one rendering technique; you can pick and choose what works best for a given page. In my opinion, that's the future of full-stack web development.

I'm going to go ahead and wrap things up there. If you want to learn how to implement a real hybrid-rendered application with Next.js, consider becoming a pro member at fireship.io. You'll get access to the Next course along with a whole bunch of other pro content. Thanks for watching, and I will see you in the next one.
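For completeness, here is a sketch of how a Next.js route might opt into the incremental static regeneration described above. The route name, data helper, and 60-second revalidation interval are made up for illustration; as noted in the comments, swapping getStaticProps for getServerSideProps would instead give classic per-request server-side rendering, which is exactly the per-route choice that hybrid rendering enables.

```tsx
// pages/products.tsx -- hypothetical route using incremental static regeneration
import type { GetStaticProps } from 'next';

type Product = { id: number; name: string };

// Hypothetical stand-in for a real database or API call
async function fetchLatestProducts(): Promise<Product[]> {
  return [{ id: 1, name: 'Example product' }];
}

// The page is statically generated, then regenerated in the background
// at most once every 60 seconds as new requests come in (ISR).
// Exporting getServerSideProps here instead would switch this route to
// per-request server-side rendering.
export const getStaticProps: GetStaticProps<{ products: Product[] }> = async () => {
  const products = await fetchLatestProducts();
  return { props: { products }, revalidate: 60 };
};

export default function ProductsPage({ products }: { products: Product[] }) {
  return (
    <ul>
      {products.map((product) => (
        <li key={product.id}>{product.name}</li>
      ))}
    </ul>
  );
}
```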
Info
Channel: Fireship
Views: 172,605
Rating: 4.9701791 out of 5
Keywords: webdev, app development, lesson, tutorial, seo, search engine optimization, 100 seconds of code, search ranking, html, html tutorial, seo tutorial, seo developer, js, react, nextjs
Id: -B58GgsehKQ
Length: 11min 51sec (711 seconds)
Published: Mon Feb 08 2021