English Google SEO office-hours from October 29, 2021

Captions
JOHN MUELLER: All right. Welcome, everyone, to today's Google Search Central SEO Office Hours Hangout. My name is John Mueller. I'm a Search Advocate at Google in Switzerland, and part of what we do are these office hours sessions where people can join in and ask their questions around web search and their website. We have a bunch of things submitted on YouTube which we can go through, but if any of you want to get started, feel free to jump in. Oh man, so quiet today. Fine, OK. I hope I don't--

ROBB YOUNG: I don't have a question. I just didn't want to go first and dominate anyone's time.

JOHN MUELLER: Go for it. OK.

ROBB YOUNG: Do you mind taking a look at our new site, John, theexperiencegifs.com? My question is that we have a landing page now for-- because that's a global site. So on the main home landing page, you have links to the US, UK, South Africa, Germany soon. And in the US, the main home landing page seems to be getting picked up rather than the /us one. So what tags or what kind of indexing would you suggest on that main landing page, so that Google starts to learn that it's really just a doorway to the other countries? There's nothing on there really other than the links to the other countries. We've set up the country tags in Webmaster Tools for the other individual country sites. So I'm not sure what more we could do.

JOHN MUELLER: The hreflang would probably be the right approach there. And what is important is that that default page is set as an x-default.

ROBB YOUNG: X-default.

JOHN MUELLER: Yeah. So the idea being there that we understand that it's a part of your set of pages. Whereas if you don't specify an x-default there-- because you say, oh, it's different from these other pages, because the other pages have content and this one is just kind of like a doorway-- then we'll treat it as a separate page, and we'll say, oh, we could show a content page or this main page thing. And then we might show the main page thing.

ROBB YOUNG: So we put the x-default tag on the kind of directory landing page, or on the US home page?

JOHN MUELLER: On the directory page, kind of that default page that applies if none of the specific country versions apply.

ROBB YOUNG: OK. All right, fine. Thanks. Cool.
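For readers who want to see what this setup looks like in markup, here is a minimal sketch of the kind of hreflang set John describes, with the country-selector page marked as x-default. The URLs are hypothetical placeholders, not Robb's actual site structure.

```html
<!-- A minimal sketch, assuming hypothetical URLs. The same full,
     reciprocal set of annotations goes in the <head> of every version,
     the selector page and all country pages alike. -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
<link rel="alternate" hreflang="en-za" href="https://example.com/za/" />
<link rel="alternate" hreflang="de-de" href="https://example.com/de/" />
```

The x-default entry is what tells Google the selector page belongs to the same set and is the fallback when no country version matches.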
JOHN MUELLER: All right. If there are no other questions before we start, I'll just go through the YouTube questions, and feel free to jump in if anything pops up. And of course, raise your hands if you'd like to ask questions later on as well.

All right, let's see. The first question I have here is about the podcast knowledge panels. "I would love to know more about the podcast knowledge panels and the motivation behind making them happen. Are Google Podcasts going to be one of the focuses moving forward?" So I don't actually know anything specific about the podcast knowledge panels, but usually what happens with these kinds of things is we recognize that there's maybe a type of content or a specific type of entity that is getting more and more popular, or that has been popular, I guess, in the case of podcasts, and we recognize it makes sense to kind of package these in a special way. And because of that, we might take something and create something like a knowledge panel for it and try to put it together like that. With regards to Google Podcasts and these knowledge panels, in general, we try not to treat any Google product or service as special in this regard. But rather, if we see that this matches really well to something like a knowledge panel that we have, then we'll try to integrate that there. But that's similar to any other product or service that we find outside. So I don't think there's-- at least as far as I know, I don't think there is any secret strategy to try to get Google Podcasts listed in all of these different places. But it's something that sometimes is just a good fit.

Let's see, next question: "Can noindexed pages affect Google's evaluation of the quality of a website at a site level, that is, as it's used in core updates? 99% of the pages of this website are relatively lower quality and not indexed-- so 300,000 noindexed and 3,000 indexed pages." So I think, first of all, the amount of low quality pages that you're kind of, like, saying to yourself are low quality pages, that feels kind of tricky or problematic to me, just independently of anything with regards to Google core updates. If you find that you have so many pages on your website that are really low quality, in the sense that they're not good pages, they don't have any useful content on them, then it feels like an opportunity for something to clean up there. Because even if we don't index these pages, users might go to those pages. And if that's what they build the perception of your site on, and you know that these are bad pages, then that feels like a recipe for people just not coming back. So kind of outside of anything specific to SEO, it feels like something that would be worth cleaning up.

Sometimes people see things as being lower quality just because of technical reasons. For example, if you have category pages and you can filter them and sort them in different ways, you might say, well, this is lower quality because it's not actual content. From my point of view, that's more a matter of just technically not interesting content. It doesn't mean that it's actually a bad page. So that might be kind of a misunderstanding there.

But going back to the question itself, with regards to the core updates and kind of Google's understanding of the quality of a website overall, we don't take these pages into account. So we really focus on the content that we have indexed for your website, and that's kind of the basis that we have with regards to all of our quality updates and all of our quality algorithms and understanding of the website itself. On the one hand, because that's what we're showing in search-- so if there's something on your website that we're not showing in search, and we're not using it to promise anything to users who are searching, then from our point of view, it's kind of up to you what you do with that. The other point, I think, is a little bit more practical, in the sense that if we don't have these pages indexed, then we don't have any data for these pages, and then we can't aggregate any of that data for our systems across your website. So from that point of view, if these pages are noindexed, we don't take them into account.
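For reference, keeping a page out of the index the way this question describes is typically done with a robots meta tag; this is a generic illustration, not something quoted from the question.

```html
<!-- Placed in the <head> of a page that should stay out of Google's index.
     Googlebot must be able to crawl the page to see this tag. -->
<meta name="robots" content="noindex" />
```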
Then a question about authors, a question regarding the author.url property in structured data: "On our About Us page, we have different paragraphs for every author. Would it be OK to make anchor links for each author on this page and use these anchor links for the author URL property, or does it have to be a dedicated page for every author?" I don't think we have any guidelines specifically around that. So this is something where it's less a matter of there being technical requirements for how you link your authors, and more a matter of, well, it has to work well for users. So that's kind of the focus that I would use there. So if this is a page that works well, where it makes sense that people can find information about the authors on your website, then that seems fine. One thing I might caution here a little bit is that sometimes, for individual authors, it makes sense for us to understand a little bit better how that author fits in overall. And for that, often these authors link their different profiles together, or they take one author profile that they use across the whole web. And for that kind of scenario, I think it does make sense to have individual URLs for each author. But if this is purely within your website and purely for informational reasons for users, then probably that would be perfectly fine like that.
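A minimal sketch of what the asker describes, with the author URL property pointing at an anchor on a shared About Us page; the names and URLs here are hypothetical.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://example.com/about-us#jane-doe"
  }
}
</script>
<!-- The fragment would match an anchor on the About Us page,
     e.g. <h2 id="jane-doe">Jane Doe</h2> above her paragraph. -->
```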
"Is the country code top level domain a ranking factor, especially for local businesses?" On a very rough basis, I would say yes. We do use the country code top level domain as a factor in geotargeting. So in particular, if someone is looking for something local and we know that the website is focused on that local market, then we will try to promote that website in the search results. And we use the top level domain if it's a country code top level domain. And if it's not a country code top level domain, then we'll check the Search Console settings to see if there's any country specified there for international targeting. And if you have a generic top level domain like that, then setting that in Search Console, if you want to focus on a specific country, definitely makes sense. And like I mentioned, we use this for queries where we can tell that the user is looking for something local. So an example that I've used in the past is: if you're searching for something like "washing machine repair manual," then you're probably not looking for something local, whereas if you're just searching for "washing machine repair," then you probably are looking for something local. So those are kind of the different scenarios there. And from that point of view, it's something where sometimes it makes sense to look at your website and think about, well, do I need to target these local queries, or am I more looking to cover the broad range of people who are searching globally? And both of those can be useful strategies. It's more a matter of picking a strategy and then executing on that.

"It's been a while since the product reviews update launched in the US, but I haven't heard anything about other regions. Can updates like this be part of the core updates and therefore be live theoretically?" I think changes like this, that we tend to announce individually, tend not to be part of the core updates, so I wouldn't tie that to any particular core update. But it's very possible that this has rolled out in other regions. Usually, we tend to announce bigger updates when they launch, and when they launch in just individual regions, we'll try to mention that, just to make sure that people are aware of where they launched. But usually, the goal is really to roll this out globally as much as possible, for pretty much any update that we do. And with that in mind, it's something where my estimation is that these kinds of updates have started rolling out in other regions as well, other languages as well. So I would not assume that just because you haven't heard any specific announcement saying, oh, this is also live in Switzerland, that it's not live in Switzerland. Usually, we just start somewhere, and we let you know where we started, but it expands over time. There are very few changes where we explicitly call out individual countries and languages when they launch as well.

"In the summer, Google released an update to title generation for web pages. Could you tell us by what factors this new algorithm decides which titles should be changed? We tried using the new documentation on this, but nothing seems to work. The update affected some of our pages. Most of the time, it's a category page: its title is cut, and the brand domain name is added. We've noticed some other sites in the search results have this problem. At the same time, we see that our main rival in the search results has the same titles they've been using since before the update." Yeah, so I think we have some information in the last blog post that we did about these titles changes, so I would definitely check that out. One of the, I think, bigger changes here that happened is that the titles are no longer tied to the individual query. So it's something that is really on a per-page basis. On the one hand, this means it doesn't adapt kind of dynamically, so it's a little bit easier to test. And on the other hand, it also means that it's easier for you to try different things out, in the sense that you can change things on your pages, and then you could use, like, the submit-to-indexing tool and see what happens in the Google Search results. What does it look like now? And because of that, it's something where I would recommend, if you're seeing weird titles on your pages, just to try different approaches out and see what works best for your website, for your kind of content. And based on that, then expand that to the rest of your website. So that's kind of the direction I would take there: essentially just try it out, and try different approaches out. And because it's really static on a per-page basis, it is something that is a lot easier to kind of experiment with a little bit, and to see, well, what are the different options that I can do here? How can I show maybe my company name or my website's name? How can I show the title that is relevant here, and all of those different things? And from that point of view, I'd just try things out. It's definitely not the case that we have any manual lists on our side, where we say, oh, well, your competitor is on the good list for titles; we'll show nice titles for them, and you'll get the messy titles. It's all algorithmic. So it's not something that is kind of manually held back for individual sites or pushed onto individual sites.
"How does it affect the search rankings when page and search titles don't match? Often we experience that the page title has been shortened and our company name added to the search results title. We do add our company name to the end sometimes, but the concern is that this happens to all our page titles and will limit how much we can write in the title. So the question is really: is it better to have shortened titles that can be displayed in the search results, or is it better to keep the page titles we have already and let Google choose a different title?" I don't think there is any explicit "what is better" from our side. One of the things I think is worthwhile to keep in mind is that we do use titles as a tiny factor in our rankings as well. So it's something where I wouldn't necessarily make titles on your pages that are totally irrelevant, but you can try different things out, kind of like I mentioned before. And it's not a critical issue if the title that we show in the search results-- we call these title links nowadays-- if that doesn't match what is on your page. From our point of view, that's perfectly fine, and we use what you have on your page when it comes to search. So from that point of view, you can put the things in your title tag on your pages, and maybe we'll show that, maybe we'll tweak that a little bit. But essentially, your page is what we use as a basis for the rankings. And with regards to the company name or not, I think that's a little bit up to you, and a little bit also in our algorithms as well, in that we do see that users like to have an understanding of the bigger picture of where this page fits. And sometimes, the company name or brand name for the website makes sense to show there. Some people choose to put it at the beginning or at the end. Some people have different kinds of separators that they use. From my point of view, I think that's more a matter of personal taste and decoration than anything related to how ranking would work.
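For illustration, the pattern under discussion is just the page's own title element with a brand suffix; the wording and separator below are hypothetical.

```html
<!-- A page-specific, descriptive title with the brand name appended.
     Google may show this as-is in the title link, or shorten/tweak it,
     but the tag itself is still what's used as a (small) ranking input. -->
<title>Blue Running Shoes - Men's Trail Collection | Example Store</title>
```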
Let's see, a question on the Disavow tool: "Does using the Disavow tool raise a flag in the algorithm and trigger a soft penalty on a website for possibly engaging in link building in the past? We used this tool to remove hundreds of spammy links, and our site collapsed a few days later. Should we remove the Disavow file, and how long will it take for the site to return to normal traffic and ranking? Or is there a permanent black mark against this website for using the Disavow tool?" Good question. No, there is not any kind of penalty or black flag or mark or anything associated with using the Disavow tool. From our point of view, this is purely a technical tool that you can use if you have any links pointing at your website that you don't want to be taken into account by Google's systems. And it doesn't mean that you created those links. It can be something that you found, where you're really worried that Google might get the wrong picture of your website. It's essentially up to you. It's essentially a technical tool that helps you to kind of manage the external associations with your website with regards to Google Search. In most cases, if you're just seeing random links coming to your website, you don't need to use the Disavow tool. But if you see something where you're saying, well, I definitely didn't do this, and if someone from Google manually were to look at my website, they might assume that I did this, then it might make sense to use the Disavow tool. But from that point of view, it doesn't mean that you did it, and it's not a kind of sign that, oh, you're admitting that you were doing link games in the past. From our point of view, it's really purely a technical tool.

And also in general, with regards to pretty much-- I'd say, like, most manual actions in general: if the manual action is resolved and if the issue is cleaned up, then we're treating your site as we would treat any other website. It's not that we have kind of a memory in our systems that would say, oh, well, this website had a manual action in the past; therefore, it might be shady in the future as well. From our point of view, if you've cleaned up an issue, then you've cleaned up that issue. With some kinds of issues, it does take a little bit longer for things to settle down, just because we have to reprocess everything associated with the website, and that takes a bit of time. But it's not the case that there is any kind of, like, grudge in our algorithms that's holding back a site.

With regards to this particular case, where you're saying you submitted a Disavow file and then the ranking dropped, or the visibility dropped, especially a few days later: I would assume that that is not related. So in particular, with the Disavow file, what happens is we take that file into account when we reprocess the links kind of pointing to your website. And this is a process that happens incrementally over a period of time, where I would expect it would have an effect over the course of, I don't know, maybe three, four, five, six months, kind of step by step going in that direction. So if you're saying that you saw an effect within a couple of days, and it was a really strong effect, then I would assume that this effect is completely unrelated to the Disavow file. That said, it sounds like you still haven't figured out what might be causing this. So it might be worthwhile to maybe jump in on another one of these Hangouts at some point, and maybe we can go through some of the different options that might be kind of affecting your website there. It's really hard to say, because it's definitely not based on the Disavow file, but what else could it be? There are just, like, so many different options.
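For reference, the file uploaded to the Disavow tool in Search Console is a plain text list; a minimal sketch with hypothetical domains looks like this.

```text
# Disavow file sketch (hypothetical entries).
# Lines starting with # are comments.

# Ignore all links from an entire domain:
domain:spammy-example.com

# Ignore a single linking page:
https://another-example.net/some/spammy-page.html
```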
"I'm working on a Tanzanian website whose users search in two languages, English and Swahili. We would like to publish the same content in both languages for a better UX. Would that cause any duplicate content issues? Do the search results in general show a mix of English and Swahili content? How would we best use the canonical tag and hreflang?" So the good news is: anything that is translated is completely different content. So it's definitely not something where we would say this is duplicate content just because it's a translated version of a piece of content. From our point of view, duplicate content is really if the words and everything match and are really duplicates. And then, in cases like that, we might pick one of these pages and show it, and we might not show the other one. But if they're translated, they're completely different words. They're different pages, essentially. So it's definitely not something we would consider duplicate content.

The ideal configuration here is to use hreflang between these pages on a per-page basis. And this is something that I would assume is almost optional in a case like this. So it's something where, before you go off and do a lot of implementation work for hreflang-- especially for a larger website, it's a lot of work-- I would double-check if you're actually seeing any issues, that users with a specific language are going to the wrong page. And you can kind of see that in Search Console, in the Performance report, when you look at the queries that reach your website, especially if you're looking at the top queries. You can kind of, based on your knowledge, estimate which language that query is in, and then look at the pages that were shown in the search results, or that were visited from there. And based on that, you can kind of make an estimation of: is Google showing the right pages in the search results? And if Google is already showing the right pages in the search results, then I think you can probably save yourself the effort with hreflang. But if we're showing the wrong pages in the search results, then definitely the hreflang annotations would help here.

Usually, this is something that is more an issue on almost, I'd say, generic queries, where people are searching for your company, for example. Then, just based on someone searching for a company name, we might not really know which language this user is searching in, and then we might show the wrong version of the page. So it might make sense, especially if you're setting these annotations manually, to first of all double-check: is it a problem at all? And if it is a problem, does it just affect individual pages? And if it does just affect individual pages, then put the hreflang annotations there-- which might be, like, for your home page or your main category pages, you add those annotations, and for everything else, probably it might be working well. So in particular, if someone is looking for something somewhat broad or generic, like, I don't know, for example, "blue running shoes," then obviously in English, they'll be typing in "blue running shoes," and then we can match that to your blue running shoes page in English. If they're searching in Swahili, I don't know what the term is, but I imagine it's a different term. And because it's a different term, we can automatically match that to your existing Swahili pages. So for many cases, you might not need to do anything special here, but I would kind of double-check.
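If the Search Console check does show wrong-language pages, the per-page annotation John describes would look something like this on both language versions of the same page; the domain and paths are hypothetical.

```html
<!-- A minimal sketch: placed identically in the <head> of both the
     English and the Swahili version of the same page, so the pair is
     reciprocal (each version lists itself and the other). -->
<link rel="alternate" hreflang="en" href="https://example.co.tz/en/blue-running-shoes" />
<link rel="alternate" hreflang="sw" href="https://example.co.tz/sw/viatu-vya-kukimbia" />
```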
Let's see. "Does writing comprehensive articles covering a specific subject build trust with Google?" I don't think we have any measure or metric or anything like that where we'd say you have built trust with Google, and you've built that based on writing comprehensive articles. I would see this kind of, I don't know, work as being focused a little bit more on the user side. So does this build trust with your users? Do users appreciate this kind of content? That kind of thing. And probably users appreciate that kind of content if you're actually writing something comprehensive and useful for them. The important part, I think, here is really to figure out which users you want to target, and to make sure that your content actually speaks their language. So for example, if you have technical content and you write a really detailed technical article about that: if your users are looking for something that is more general or more, I don't know, simplified, that explains the basic topics a little bit better, then maybe that highly specialized technical article is not the best thing for them. Whereas if your users are really kind of the specialized technical people, and they want to find all of this kind of highly technical content, then maybe that is the right match. So that's something where you almost need to think about: which users do I want to target, and what kind of content are they looking for? How can I write it in a way that matches what they search for and what they would like to find? And then, based on that, you can kind of build out your website. So don't just blindly go in and say, oh, I would like to have my website rank for rental cars; therefore, I will write long, comprehensive content on rental cars. Because probably, that's not what users are looking for. You almost need to figure out your users first and then work on your content.

Let's see. "We keep coming across sites that scrape our content and republish it on their websites, sometimes including a link to the original article and sometimes not. How does Google handle this? Is a DMCA takedown necessary for every case? What happens if Google indexes the scraped content first? Would this then be seen as the original?" Yeah, I think this is always a bit tricky, because it's kind of a mix of search and almost legal topics here. And it's something that just happens quite a lot, in that some sites don't care about things like copyright, and they just take content from other people and republish that. So the way we handle it is kind of nuanced and includes lots of different things.

The first thing I would consider as a site owner, if you're seeing this with your content, is to think about whether or not this is a critical issue for your website at the moment, kind of on a case by case basis. And if it is a critical issue, then I would recommend trying to see if there are legal things that you can do to kind of help resolve this, even outside of anything SEO-related. And that could be the DMCA. I can't give you advice on legal topics, so that makes it a little bit trickier for me to say, like, you should use a DMCA or not. But in many cases, the DMCA process would be appropriate here and could be something that you could use. So I would, on the one hand, read up on that process, and on the other hand, get local legal advice as well, so that you're sure that you're doing the right things when it comes to the legal side of things.

On Google's side, in the search results, I think there are a few things that come into play here. On the one hand, sometimes copies are also relevant-- let me see, how can I frame this?-- in the sense that, especially when it's not a pure one-to-one copy of something, but rather someone is taking a section of a page and writing about that content. We see that sometimes, for example, when we publish blog posts, other sites will take our blog posts and include either the whole blog post or large sections of it. But they'll also add lots of commentary and kind of try to explain, well, what does Google actually mean here, or what is Google saying between the lines, or maybe give some simpler examples of how this could apply, and essentially build out a bigger picture. And on the one hand, they're taking our content and copying it. But on the other hand, they're creating something newer, bigger, based on that content. So in the search results, if someone were to search for that content, I would expect to see these kinds of other pages ranking as well, because they're providing a slightly different value than just what our pages are providing. And we do see this happening. And sometimes, these pages rank above ours, and that's all fine, I think.

With regards to indexing the scraped content first or not, I think that's something that is kind of tricky, because what we've seen in the past-- especially when I was looking at this, I don't know, maybe like 10 years ago, a little bit more-- what I noticed there is that oftentimes, spammers or scrapers will be technically very [INAUDIBLE], and they'll be able to get content indexed almost faster than the original source. And then, if we were to purely focus on who got this into Google's systems first, it can be that we're accidentally kind of favoring those who are technically better at publishing content and sending it into Google, versus those who are publishing the content naturally. So from that point of view, I think just purely focusing on the publish date doesn't make much sense. What I've seen in our systems over the years is that we tend to look at the bigger picture for a lot of things when it comes to websites. And if we see that a website is regularly copying content from other sources, then it's a lot easier for us to say, well, this website isn't providing a lot of unique value on its own, and we can treat it appropriately based on that. So that's something where usually the ranking side kind of settles down there a little bit.

I feel, though, it's kind of a confusing answer, I don't know. So I think, like, stepping back: the first thing with these kinds of problems, I would always do, is to figure out, is it actually a problem for you? And if you do see that it is a problem for individual pages, then consider if there's a legal solution that you can apply here, because if you can solve this by having the content removed, for example, then you don't really have to worry about the SEO side of things. Then the third one, I think, is: sometimes it's OK for copies to also appear in the search results-- or some kinds of copies, I guess. But essentially, it depends quite a bit on the individual use cases there. And I think also, maybe as a last step, if you're seeing that this is really causing problems, then submitting spam reports to us is also a good way to let us know about these kinds of issues. Maybe that's a little bit clearer.
OK, let's see. "Does presence in social media channels influence SEO? For example, more followers, likes, shares, and social media links equals better PageRank?" No. So for the most part, we don't take into account kind of the social media activity when it comes to rankings. The one exception, I think, that could kind of play a role here is that we don't special-case social media sites, but we do sometimes see them as normal web pages. And if they're normal web pages, and they have actual content on them and links to other pages, then we can see them as any other kind of web page. So for example, if you have, let's say, a social media profile somewhere, and it links to individual pages from your website, then we can see that profile as a normal web page. And if those links are normal HTML links that we can follow, then we can treat those as normal HTML links that we can follow. Also, that profile page, if it's a normal HTML page, can be something that can be indexed as well. It can rank in the search results normally, like anything else. So it's not a matter of us doing anything special for social media sites or those social media profiles, but rather: well, in many cases, these profiles and these pages are normal HTML pages as well, and we can process those HTML pages just like any other HTML page. But we wouldn't go in there and say, oh, this profile has so many likes; therefore, we will rank the pages that are associated with this profile higher. It's more that, well, this is an HTML page, and it has some content, and maybe it's associated with other HTML pages and linked together. And based on this kind of better understanding of this group of pages, we can rank those pages individually. But it's not based on the social media metrics.

"Is the Penguin penalty still relevant at all, or less relevant? Spammy, toxic backlinks are more or less ignored by the ranking algorithms these days." I'd say it's a mix of both. So for the most part, when we can recognize that something is problematic and kind of a spammy link, we will try to ignore it. If our systems recognize that they can't isolate and ignore these links across a website-- if we see a very strong pattern there-- then it can happen that our algorithms say, well, we really have kind of lost trust with this website, and at the moment, based on the bigger picture on the web, we kind of need to be more on almost a conservative side when it comes to understanding this website's content and ranking it in the search results. And then you can see kind of a drop in the visibility there. But for the most part, the web is pretty messy, and we recognize that we have to ignore a lot of the links out there. So for the most part, I think that's fine. Usually, you would only see this kind of a drop if it's really a strong and clear pattern that's associated with the website.
"I work for Travel [INAUDIBLE], and we're wondering why we don't appear in the search results' 'find results on' box for certain search terms. Is there a way to better our chances of appearing in this box, or is it settled outside of SEO? Any information on how the content in this box ends up there, or is decided, would be much appreciated." So I think these are the little links that we sometimes show in the search results where we recognize that a website has more content on a specific topic, and then we start to show that you can also search on these websites for these topics. My understanding is that this is purely algorithmic. And especially if you're seeing this happening for certain search terms, then you're already kind of seeing that algorithm in play. And this is something where we try to understand what kind of content you have on your website and how we can link to that. Probably, if you're already seeing this for certain terms, we've figured out how we can link to that on your website. So that's a good thing. That means the structure of your website is at least understandable to our systems, so that we know which parts to kind of link to if people are looking for more information. But it's not the case that you can, I don't know, just create more content or put the keywords more on your pages, and then we'll start showing these links for other topics as well. It's really something that our algorithms have to figure out and learn over time, with regards to these individual websites.

"At what point should we start worrying about page speed if it's in the red zone? Faster sites increase conversion rates, but we can't spend tons of money on little sites that might not deliver much in the first place." Yeah, I don't know: at what point should you start considering page speed? I do think it's something that pretty much all sites should consider and think about. One of the nice parts of everything around Core Web Vitals, I think, is that because of these very public metrics, a lot of the platforms have also started to think about speed a lot more. That means if you're using a common CMS or popular themes on a website, then almost by default, the speed will have increased as well. And every now and then, someone will do a study and look at the different CMSes and the different hosting platforms and say, oh, Wix has done a lot of work, and their metrics have improved by this much overall.
And that means if you're using one of these platforms, then even if you don't do anything on your website, you're kind of also profiting from all of the work that people are putting into the platforms themselves. And we see that across the board for pretty much all platforms and CMSes. And I would assume, especially if you have smaller business websites, where you tend to use the more default setups, the more default CMSes and hosting platforms, that you would kind of automatically profit from this general shift toward a little bit faster. So from that point of view, it's something where sometimes you don't need to do a ton of work, provided you're actually using a commonly used platform.

With regards to when you should start thinking about speed, I do think that's tricky, because while speed is a ranking factor, it's not the only ranking factor, and relevance is really key when it comes to ranking. So it's hard to say when you should focus on this particular part of ranking with Google. But I would see it as similar to a question like: when should you focus on usability on a website, or when should you focus on making your images so that they can appear well in Google Images? All of these things are individual elements of appearing in search. And when you should focus on any of these individual items is kind of up to you. And the nice part about search is that you don't have to do everything perfectly. You can pick and choose, and you can say, well, I will focus on speed at the moment, and make sure that the images appear well, and make sure that all of my headings are aligned-- all of these things, maybe. And other people will focus on different aspects. And we still kind of have to find a way to show those top 10 rankings, or however many we have at the moment, in the search results. So I would leave it a little bit up to you.

One of the things also, I think, to keep in mind, especially for very small websites, local websites in particular, is that oftentimes, they don't rank for these competitive, generic terms anyway, which means they tend to rank more for things where really only their website is relevant. And that could be for local businesses: like, if you're searching for this business type in this city, and we have 20 business websites that are like that, then you're automatically in those top 20 anyway. So it's not the case that your website would disappear from there if your website is slow. Similarly, if someone is searching for your business name explicitly, because you're a local business and they know you exist and they just want to check the opening hours or whatever, then your website will automatically be relevant for those queries anyway. So it's not something where suddenly this website disappears just because it's not fast enough. And similarly, I would also be cautious with regards to the positive effects of speed. If you're focusing on these kinds of local queries, then just by having a faster website, you're not going to get much more traffic than you already are getting if, for example, most of your traffic is based on people searching for your business name. If there are not more people searching for your business name, then there are not more people that would be able to find your website like that. So kind of the expectations, I think, especially for smaller local businesses, are something that is a bit tricky to manage there.
Of course, if you're working on a larger website that is active globally, where you're trying to rank for more competitive queries, then that is something where you might see more visible changes in the search results over time. Oof, I've been talking for a while. Let's see. Ritu, you have your hand raised.

RITU NAGARKOTI: [INAUDIBLE]

JOHN MUELLER: A little bit.

RITU NAGARKOTI: Now it is fine?

JOHN MUELLER: It's better. Perfect.

RITU NAGARKOTI: So I have a few questions this week related to page experience. My first question, related to page experience: we have some page experience issues coming, like, in Search Console. Still, we have no [INAUDIBLE] issues, no mobile usability issues. It is showing page experience having no [INAUDIBLE]. Europe was in [INAUDIBLE]. I don't know why it is showing this, because still, Search Console is having no issues; we have resolved all Core [INAUDIBLE] issues, mobile usability issues. But it should not show like that. [INAUDIBLE]

JOHN MUELLER: Yeah, I don't know. It's hard to say without kind of seeing more, but I think there might be two things at play. On the one hand, we don't have data for all websites. So especially the Core Web Vitals rely on field data-- so, what people actually see, and what is reported back through, I think, mobile Chrome, that we can kind of aggregate with regards to speed. So we need a certain amount of data before we can say, oh, like, we understand what the individual metrics mean for this website. And if you're not seeing any data at all in Search Console, with regards to the individual Core Web Vitals metrics, then usually that matches that: it's like, we just don't have enough data at the moment. And that means, also from a ranking point of view, we can't really take that into account. So that might be why you're seeing this number where it says, like, zero good URLs-- but because we just have zero URLs that we're tracking for Core Web Vitals at the moment for your website.
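As background on "field data": Core Web Vitals are collected from real visits (for Google's purposes, via Chrome usage data, as John mentions), but a site can observe the same kind of signal itself in the browser. A minimal sketch of watching Largest Contentful Paint with the standard PerformanceObserver API; this illustrates what field data measures, not how Google collects it.

```html
<script>
  // Log Largest Contentful Paint (LCP) candidates during a real page load.
  // The last candidate before user interaction is the page's LCP.
  new PerformanceObserver((entryList) => {
    const entries = entryList.getEntries();
    const latest = entries[entries.length - 1];
    console.log('LCP candidate at', Math.round(latest.startTime), 'ms');
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>
```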
RITU NAGARKOTI: So my second question is related to Web Stories. Like, I have seen some brands having Web Stories in their Google SERP, but I'm not seeing them for my brand. So I was exploring how it will come for my brand. So can user just be [INAUDIBLE], like, to have your Web Stories for a brand in Google [INAUDIBLE]?

JOHN MUELLER: OK, so I think there are two aspects here. On the one hand, Web Stories are normal pages, so they can appear in the normal search results as well. From a technical point of view, they're built on AMP, but they're normal HTML pages, essentially. And that also means that you can link to them normally within your website. So that's really, I think, critical for us: that we understand these are part of your website, and maybe they're an important part of your website. And if you think that they're important, they should be linked in an important way. That means maybe link them from your home page, or from some other pages which are very important for your website, so that we can understand this is an important page. Then the other aspect here is that because these are normal HTML pages, we need to find some text on these pages that we can use to rank them. And especially with Web Stories, I think that's tricky, because they're very visual in nature, and it's very tempting to just say, oh, I will show a video, or I will show a large image in my Web Stories. And if you do that without also providing some textual content, then we basically have very little that we can use to rank these pages. So on the one hand, they have to be integrated within your website like a normal HTML page would be. And on the other hand, they also need to have some amount of textual content so that we can rank them for queries.

And then I think another aspect here is that in some locations, we show Web Stories slightly differently in the search results, when we can recognize that there's maybe a block of Web Stories that we can show. I don't know, I think India is one of those places, or the US is also where we show them slightly differently. And there, I think, you almost have an advantage, because then we would try to find more Web Stories to show for your queries. So from that point of view, it's almost like a good situation to be in, but you still need to make sure that you have the basics covered. We also have the Google Creators channel-- I don't know if you've seen that; it's a separate YouTube channel-- and they also have a Google Creators blog, and they have a lot of content on Web Stories. And they also have some guides for optimizing Web Stories for SEO that I would recommend kind of going through.
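To make the "textual content" point concrete, here is a heavily trimmed sketch of a Web Story page (the surrounding AMP document boilerplate is omitted, and all titles, paths, and copy are hypothetical). The point is the real text in the grid layer, which gives Google something to rank.

```html
<script async custom-element="amp-story"
        src="https://cdn.ampproject.org/v0/amp-story-1.0.js"></script>

<amp-story standalone
           title="Five quick tips for faster pages"
           publisher="Example Brand"
           publisher-logo-src="/logo.png"
           poster-portrait-src="/poster.jpg">
  <amp-story-page id="tip-1">
    <amp-story-grid-layer template="vertical">
      <!-- Actual text, not just imagery, so the page can rank for queries. -->
      <h1>Tip 1: Compress your images</h1>
      <p>Smaller images are often the quickest page speed win.</p>
    </amp-story-grid-layer>
  </amp-story-page>
</amp-story>
```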
RITU NAGARKOTI: OK, thank you so much, sir. Sir, one more question-- actually, I earlier asked this question with you; you have written blogs on it. Recently, I'm using my brand name as an author name, and I'm not using a real name [INAUDIBLE] it because that is generic, and generic names we can't use [INAUDIBLE]. That's why I'm [? holding ?] it. So is it an appropriate strategy, using the brand name as an author name, or should we leave it as a [? flag? ?]

JOHN MUELLER: I mean, ultimately, you can choose how you want to do that. I think, for users, for certain topics, it makes sense to really have names associated with it, and for other kinds of topics, it's less a matter of having clear names associated with it. So in particular, if you look at our quality rater guidelines, for things like medical topics, you want to make sure that it's actually someone who's qualified-- a qualified medical expert-- writing this content, and not just a brand. So for those kinds of topics, I think it definitely makes sense to have a name associated with it. For a lot of other topics, if you just need to have an author link there, then maybe it's OK to have a brand name there. The one thing I would avoid is using something like "admin" or, I don't know, like, a generic name as an author, because that really doesn't tell users anything at all. But a brand name at least says, like, our company stands behind this.

RITU NAGARKOTI: OK, so we can use the brand name, but we can't use a generic "admin," and we should avoid these types of things.

JOHN MUELLER: It's not that you can't use it. It's more that I would recommend not to. So it's not a requirement that you have to do it in any of these ways. But especially from a usability point of view, if you're going to have a link that says this was written by someone, then make it someone or something. Yeah.

RITU NAGARKOTI: Yeah. Sir, one more question-- like, we are posting some news on third party sites, and that is appearing in the top stories in the Google SERP. And I guess it is appearing for my search queries; people using those search queries are seeing that news come up as highlights in the Google SERP, maybe. But we are also posting news ourselves, and what could be the reason it is not appearing in top stories in the Google SERP? Maybe it can be the quality of the content.

JOHN MUELLER: Sure, yeah. I mean, top stories takes into account various factors. So it's not automatically anything news-related that you publish that will appear there, and it's not automatically every website's content that appears there. But it sounds like you're on the right track, in that some of these things are showing up in the top stories.

RITU NAGARKOTI: OK. OK, so [INAUDIBLE]. Thank you so much, and happy weekend.

JOHN MUELLER: Thanks. You too. Let's see. Teresa.

TERESA HUNTER: Hello. Hi, John.

JOHN MUELLER: Hi.

TERESA HUNTER: Hi. So I'm working on a recruitment website, and they've got these jobs that are kind of almost evergreen, because they have a bank of staff and they have continuous work-- it's like, just for an ongoing contract. And so I've got these jobs on the website. And the question is, do I-- I can't just leave that job listing, because then it just looks like it's super old. So should I just be creating a new page, like, every 30 days? I can't really change the structured data on there, because of the CMS that they're working with. So what should I do, I guess?

JOHN MUELLER: And it's basically, I guess, like an ongoing open position, or--

TERESA HUNTER: Yeah, it's like-- they've got a contract for a council, and they have to keep doing this particular task. And they have a bank of staff, but they need more people. And so they're always looking for this particular role. And it's just an ad hoc role that they can do remotely. So it's always available, in a way, at the moment, and it's the same role in the same area.

JOHN MUELLER: Yeah, I think purely from an SEO point of view, you could just leave it open. I don't think you would see any big, I don't know, effects if you were to delete that page at some point and just create a new one, kind of on a monthly or yearly basis. I don't think you would really see any effects there. The one thing I'm not sure about is how Google Jobs would deal with this-- if you have to do anything like having specific dates on there for the job listings. So if it's something specific to Google Jobs, it sounds like you probably would need to double-check their guidelines. But purely from an SEO point of view, I don't see any issues with keeping this live like that.

TERESA HUNTER: OK.

JOHN MUELLER: And from my point of view, I also don't see any issues if you were to update the dates on those pages from time to time, where you could say last updated--

TERESA HUNTER: [INAUDIBLE]

JOHN MUELLER: Sorry?

TERESA HUNTER: I can't change the dates on it. It's like this custom--

JOHN MUELLER: OK.

TERESA HUNTER: --locked-down thing. So when I create a new job, it creates a new URL, and I've got no-- I can't change the URL. And yeah, so if I leave it as the same page, then the job listing will come up as, like--

JOHN MUELLER: The last one.

TERESA HUNTER: --a month old or something.

JOHN MUELLER: Yeah.

TERESA HUNTER: I [INAUDIBLE] just--

JOHN MUELLER: Yeah, I mean, from that point of view, it sounds like it might make sense to update this page from time to time, or create a new version of it. I don't think you would see any SEO effect either way. I could imagine, if you created new pages very regularly, then it would be hard for us to understand which of these pages to show in search. But if you do this, I don't know, monthly or quarterly or yearly, then we would pick that up as a new page, and we'd have enough time to build signals for those pages and show those in search. And I think that would be fine.

TERESA HUNTER: And should I just-- [INAUDIBLE] the expert, but the page that's gone, should I just redirect that to maybe the new page, or--

JOHN MUELLER: If--

TERESA HUNTER: --to the--

JOHN MUELLER: Yeah, I think if you can do that, that's optimal.

TERESA HUNTER: Yeah.

JOHN MUELLER: It sounds like your CMS is a little bit limited there, so maybe it's not always possible. But if you can do that, that definitely makes sense.

TERESA HUNTER: OK. OK, thank you.

JOHN MUELLER: Sure. Cool.
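Since the Google Jobs side came up: job pages are typically described with JobPosting structured data, where validThrough is the expiry-date field Teresa would need to check against Google's guidelines. A minimal sketch with entirely hypothetical values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "JobPosting",
  "title": "Remote Data Entry Clerk (ongoing)",
  "description": "Ad hoc remote role supporting an ongoing council contract.",
  "datePosted": "2021-10-01",
  "validThrough": "2021-11-01",
  "employmentType": "CONTRACTOR",
  "jobLocationType": "TELECOMMUTE",
  "applicantLocationRequirements": {
    "@type": "Country",
    "name": "United Kingdom"
  },
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Recruitment Ltd"
  }
}
</script>
```

And when a new page replaces an old one, a permanent (301) redirect from the old URL to the new one, as John suggests, is what carries the old page's signals over.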
JOHN MUELLER: Let me take a break here with the recording. I still have more time, and [? Neeraj, ?] I see your hand is up as well, so we can get to you too. If you're watching this on YouTube, thanks for watching to the end, I guess. If you'd like to join one of these Hangouts in the future, feel free to watch out for the link and jump in when it pops up. So with that, thank you all for joining in. Thanks for all the questions that were submitted, and hope to see you all in one of the future episodes.
Info
Channel: Google Search Central
Views: 7,389
Keywords: Webmaster Central, Google, SEO, Search Engines, Websites, Search Console, Webmaster Tools, crawling, indexing, ranking, mobile sites, internationalization, duplicate content, sitemaps, pagination, structured data, rich results, English Webmaster Office Hours, Office Hours English, Search, office-hours, office-hour, SEO office hours, English SEO office hours, Search Central, Google Search Central, GSC, SC
Id: mrKcjUn27bI
Length: 59min 49sec (3589 seconds)
Published: Sat Oct 30 2021