This video and transcript are from the English Google Webmaster Central session that was held on 31st May 2019.

The session is open to anything webmaster-related and is an informal chat about topics such as crawling, indexing, mobile sites, internationalization, duplicate content, sitemaps, Search Console, pagination, and multilingual/multi-regional sites.

Video transcript

All right, welcome everyone to today's webcast, the Webmaster Central office-hours hangout. My name is John Mueller; I'm a Webmaster Trends Analyst here at Google in Switzerland, and part of what we do are these office-hours hangouts, where webmasters, publishers, and SEOs can jump in and ask their questions all around web search and their websites. It looks like a bunch of questions were submitted already, but if any of you who are here live want to jump in with the first question, feel free.

Hi John, can you hear me? Hi, I'm Nick. If you wouldn't mind; it's 1:00 in the morning where I am, so I'm getting ready for bed very soon. Oh my gosh, okay. Anyway, I submitted a question a couple of sessions ago and it was pretty broad, so it wasn't possible to go into depth, which I understand, so I'll try to frame it like this. The website that I run has been around for quite a long time, over ten years, and it was getting what I would call pretty consistent performance in search until about the past year, when we started seeing precipitous drops, and now it's at basically 5% of the traffic it had a year ago. That being said, I do think it's algorithmic, because nothing is showing up under manual actions or security issues. One of the tell-tale signs that I briefly touched on is the fact that if I search for the actual brand name (it's a very specific sequence of words, so I could say with 99% certainty that if somebody types in the brand, they're looking for my site), the site is actually showing up on page two of the results. It's being outranked by the LinkedIn page or a Glassdoor page; it's being outranked by other sites referring to the main website, but the main website itself is nowhere to be found on the first page. To me that means that for some reason the website ended up on a bad list or something; I don't know how else to describe it, but that's what it feels like to me.
So if it would be okay with you, I would love to share a bit more detail in the chat window for you to go back to later. Okay. I guess for someone in my situation, with a very broad description like this; I mean, we're definitely trying to clean up some of the content, there are a lot of pages, it's a community site, so there's lots of user-generated stuff; have you seen other websites go through something like this in the last year?

I mean, changes can always happen, so seeing websites lose traffic is something that we do see over time, and I guess it kind of depends on the website itself and the kind of content it has, with regards to what direction it might take from there. There are certainly situations where I'd say maybe the website's model is not as sustainable as it was in the past. That comes into play, for example, if you have a website directory where you're just collecting links to other companies, and you have the phone numbers and addresses, but essentially you don't have that much more information. It feels like that kind of model has had its day: that information is available everywhere now, pretty much every company has their own website, so there is no need, or less need, I guess, for these kinds of directory pages. So that's one aspect, where I'd say maybe the overall model is something that worked really well in the past, because it was really needed in the past, but things have changed over time. And then of course there are other kinds of websites where I'd say maybe our algorithms in the last year or so have been focusing more, and where they're a bit more critical. For example, one area that I've seen mentioned a lot is the whole medical space, where I see people posting online talking about authority and expertise, those kinds of things.
That definitely makes sense: in the past it was really hard for us to judge the quality of a medical or medically oriented site, and over time our algorithms have gotten better in that regard. That's an area where I'd say, if you had a low-quality affiliate site that was focusing on these medical topics, then maybe you would be seeing changes there, even though perhaps over the last ten years or so you had a really good run. So that's another area where, from an algorithmic point of view, you might see bigger changes. But I really don't know about your specific case, so if you want to drop some more information into the chat, I can take a look at it and see if there's something more specific I can tell you.

That would be great, thank you. It is a medical website, it turns out. Okay, no coincidence there. But yeah, I will share more details in the comments pertaining to our situation. Okay, great, thank you. Thanks.

All right, anyone else that wants to jump in with the first question? Yeah, I can jump in. Okay. I'm interested in the following: in, for example, Facebook Ads or YouTube, when I set up a campaign, I can play with the audience's age or behavior. For example, we provide Russian videos about marketing, and I was surprised: I thought that my audience was older than 25 years, and that it was evenly split male and female, but it's not like that; our audience is less than 25 years old, I think from 18 to 25. In those tools you have access to this information, but what about Google? If you have some keyword, do you combine it with user intentions? Of course, I know about geolocation, and that Google considers it, but what about other signals, like behavior and age?
Do you consider those parameters? So I guess the question is whether we use demographic information in our search results ranking. Yeah; I know that you use geolocation for local search, but what about user behavior, their age, their interests: does Google combine this information, like personalization data?

I don't think so. I'm pretty sure we don't have anything explicit in our ranking systems where we would say this person is a younger person, therefore we should send them to websites that are friendly to younger people; I don't think we have anything specific there. It might be something that you would see from a personalization point of view, on a per-user basis, but I don't think we have anything on a demographic level when it comes to search ranking factors. It's an interesting question, though; I never thought about that, but I don't think we have anything specific in that area.

Yeah, because in paid ads you can choose which audience you want to show your ads to, but in Google search you can't. I can explain why I asked this question: we write blog posts, and we want to write information for our persona, for our readers; to target one persona, not a crowd. And I'm interested in how I can match my blog posts to this persona.

Yeah, I think that's something that you can definitely do. It's not something specific that we would use as a ranking factor, but if you can target your users better, and write for your users as well, I think that's something where you will have success regardless of how we use it as a ranking factor. I think it's a good approach to take those different things into account, especially if you have a YouTube channel that aligns with your blog.
Then maybe you get some more information on the demographics of the users there. I suspect you also get some of that in Google Analytics or in other analytics packages, but I don't know the details there.

You know, I see that the audiences are different: I see different audiences on YouTube, on Facebook, and on Google; I don't see one audience. Okay. They have different moods, and that has an influence. Of course I can check the Google Analytics data, but I wanted to provide value to my readers, whether Google considers this or doesn't consider it. I got it, okay, thank you. Yeah, cool.

All right, let's see what kind of questions we got. Patrick asks where his favicon is. I think that's one question that's been going around for a while. We recently started showing the favicon in the search results, in the mobile search results, and it looks like people are trying different things out. I think in general that's okay; we have some guidelines with regards to what to watch out for, and we generally work to update the favicons automatically, so if you decide you want to change your logo or your favicon, that's totally up to you.

I have a client website which is in the plastic-surgery industry, and the home page contains a gallery of before-and-after photos to help sell potential results that they can achieve. I found that the home page is almost non-existent in the index when the gallery is on the page, and after removing it the home page ranks again. Can you elaborate on the level of nudity or skin showing that the systems allow, or what we can do to avoid this?

So I don't know exactly how it is with your specific site, but I've seen this kind of theme come up before, especially with regards to plastic-surgery websites, because that seems to be where this tends to come up more.
In talking with the SafeSearch team, who try to figure out which pages should be shown with the SafeSearch filter on and which with the SafeSearch filter off, they feel that this is working as intended. When we see images where we're a bit wary with regards to, maybe, the nudity, or the amount of skin, or however you would frame it with regards to plastic-surgery sites; essentially, with regards to the images on the page; then that's something where our filters might kick in and say, we're unsure here, so we're going to stay on the safe side. That's probably what you're seeing there, and especially if you're saying that when you include the gallery or remove the gallery you see these changes, then that kind of points in that direction.

With regards to the direction you can go from there, I think that's up to you. If you feel that this gallery of images is really critical for your homepage, then maybe it's worth taking that into account and saying, well, I want these images on the homepage, and maybe my ranking with SafeSearch turned on might not be as good. Or you could say maybe there is a middle ground, where you have a button, something like "show gallery", that takes you to the gallery directly from the homepage, so that it's not the homepage itself that's affected. Yeah, that was my own proposed solution. So I think you can find that. Cool.

All right, a question about privacy policies, terms of service, contact-us and about pages. If you set up a website and such pages are not published, and a search quality rater rates your site negatively due to this, will the raters later re-judge your site when they're added at a later date? Is there anything the webmaster needs to do once these pages are added?
So I think you're probably mixing up two things here. On the one hand, the search quality raters are people who do look at the search results and the pages that we show in the search results, and they follow the search quality evaluator guidelines that we've been publishing for a while now, and those do include things like watching out for terms of service, privacy policies, those kinds of things. However, those raters are there primarily to judge different algorithms that we want to test. What usually happens is that one of the teams comes up with an idea and says, I think ranking would be better if we took this factor into account, or if we ranked things slightly differently with this new algorithm that they're working on. Then, to double-check that their suspicions are actually right, we generate a set of search results pages, essentially an A version without that change and a B version with that change, and we send all of these to the search quality raters. The raters look at these search results, they look at the websites that are linked from there, and then they say, well, in this case the version with the algorithmic change is better, or, in the other case, maybe the version without the change was better, and based on that we try to refine our algorithms. So it's not that the raters would look at your site and say, this is a bad site, it should rank lower in the search results; it's more that the raters would look at the general algorithm and say, this version is better or that version is better. What tends to happen there is that our algorithms kind of take into account the feedback that we give raters with regards to rating things. So that's something where, potentially, indirectly, our algorithms could be watching out for things like privacy policies, terms of service, or about-us pages.
If we were to pick that up; I don't know if we would have an explicit algorithm just for that, I think that probably wouldn't make much sense, but it's possible that there is some effect from our algorithms taking that into account. And if we were to look at that explicitly and look for those kinds of pages, what would happen is that, over time, as we re-crawl the pages and see that these pages now exist and are linked on the website, our algorithms would automatically take that into account. So there is nothing special that you would need to do. If you make changes on your website in a direction where you think it actually improves your website overall, our algorithms would try to recognize the new state of the website as we find it and take that into account when it comes to ranking things. You definitely don't need to wait for some rater to come and visit your website again; this is something that our algorithms tend to do automatically over time.

John, can I ask a question on search quality as well? Sure. So this is something that some of the folks from the Romanian SEO community brought to me, and it's regarding certain queries where, after the March update, the results seem; how can I put it; I'll just give you an example, so it's easier to look at. There's one very large retailer here that has a lot of pages indexed, including search pages; they seem to have indexed these kinds of e-commerce search pages that contain all kinds of keywords; and those seem to show up even though it's basically the same kind of page, the content is almost the same. For example, for bed sheets, a certain type of bed sheet, you'll notice that the first three results are basically the same thing, just with the keyword singular or plural, things like that. So it's kind of the same thing, and it's search pages, so I'm not sure that's the best user experience, so to speak.
And this happens for a lot of queries; I've got a lot of examples like this, and this is one of the more extreme ones: the first eight results are from the same website, which is kind of dominating the high positions now. Yeah, I bet they're happy, though. But I mean, I'm just wondering: is there really no other website that's good enough to, you know, go up against three almost-duplicate pages?

Yeah, I have heard this before, I think specifically about Romanian websites, maybe some other countries as well, especially with some of the quality updates that we made, I think, over the course of this year, earlier this year, where people have been complaining and saying that the quality overall isn't as good as it used to be. I don't know what the plans are there; I do know that the search team is aware of these issues, so it's something that they've also seen; they've also seen the complaints that we passed on to them. So it's not something where, from our point of view, I'd say this is the way it should be, but rather this is maybe something that we could still be working on.

Okay, that means they're aware of these kinds of things. Yeah; and these kinds of search queries are really useful for us, so if you run across queries where you see that this is actually really bad, or that the search results are kind of okay but they're filled with essentially copies of the same thing over and over again, that's really useful for us to have. Yeah, I'm sure people are buying from that website; it just doesn't seem organic to have the same multiple copies, as you mentioned, with basically one page outranking everything. I'll leave you a link in the chat if you have any updates. Cool.

Sorry, can I follow up on this question? Sure. Isn't it like a rule that you should not index tag pages and search pages as well, that you shouldn't index them?
We do have a guideline that you should block search results pages, and there are two reasons for that. On the one hand, what can easily happen is that we crawl a lot of pages and overload the server; on the other hand, what often happens is that these pages tend to be lower-quality pages, and they're not as useful for users as something like a clean category page would be. With regards to tag pages, the question there is whether the tag page is a search page, or whether it's more like a category page, and that really depends on the way the website is set up and the way the website tends to handle that. For both of these, I tend to see it as a strong recommendation to block these kinds of search results pages from being indexed, but not as something where we would say that if we ever see a search results page, we would apply a manual action to that website. Because what we've also seen a lot of times is that there are some really good search results pages, where you look at them and say, well, this is actually not an auto-generated page; it's not like you just generated a page based on a million items and filtered for keywords; it's actually more like a category page. In those cases I think it makes sense to have those indexed, because maybe people are looking for, I don't know, blue running shoes, and you have a page for blue running shoes; technically it's a search page and not a category page, but it contains all of the blue running shoes that you have, which is maybe a good result for users.

So if the search page provides value to your users, it's okay to index those pages? Yeah. Yeah, but I feel there is a huge opportunity for manipulation there, to automatically generate these pages for all of these long-tail keywords.
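As an aside, the guideline John mentions here, keeping internal search results pages out of the index, is commonly implemented with a robots.txt rule or a `noindex` robots meta tag. A hypothetical sketch (the `/search` path and the `q` parameter name are assumptions; substitute whatever your site actually uses):

```
# robots.txt at the site root: stop crawling of internal search results
# (this also reduces the server load John mentions)
User-agent: *
Disallow: /search
Disallow: /*?q=

# Alternatively, allow crawling but keep the pages out of the index
# by adding this tag to the <head> of each search results page:
#   <meta name="robots" content="noindex">
```

Note that robots.txt blocks crawling while the meta tag blocks indexing: a page that is blocked in robots.txt can never have its `noindex` tag seen, so pick one approach deliberately rather than combining both on the same URLs.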
Yeah, I think that's something that has always been around; it's always a kind of worry. But for the most part, I think our algorithms are generally pretty good at recognizing the quality of a page. So if we see that a site is automatically generating search results pages, and they're generating millions of them, then our algorithms would probably look at that and say, well, these are pretty low-quality pages, so maybe more of the website is also low quality, and maybe we should be more critical with regards to ranking this site overall. So that tends to balance things out a little bit. Sometimes things get by our filters, or by our algorithms, and we show them in the search results more visibly, so we're not perfect in that regard. Thank you.

Okay, John, I have a follow-up question on that, coming at it from the quality perspective: whether the pages are good enough to be indexed. My question is regarding these search pages, or the category pages you were talking about: how can we ensure that the quality on those pages is really high, considering that we're only going to provide the product details there, things like the price details and so on? How can Google actually decide, from that quality perspective, which pages have better intent compared to others, and how can we diagnose that?

I think that's hard, yeah. There is no simple answer there, especially when you're talking about pages that are automatically being generated. That's something where you almost have to take your own expert knowledge of the topic area and apply it. What I've sometimes seen is sites trying to find other metrics that they can follow to get a rough understanding of how users would perceive the quality. So I've seen sites use things like analytics information.
For example, the number of visitors that actually go to these pages, or the behavior of the users on those pages; those are sometimes useful ways of looking at this. I've seen some really large sites say that any search results page that has more than, I don't know, two keywords in the query is something that they would automatically block. There are different approaches that you can take; it's not something where I'd say there is one simple answer that works for everyone. I think overall, if you have good or reasonable category pages, then I would block the search results pages from being indexed, just so that you don't have to worry about all of these fine details, like whether this search results page is good enough or not, and how to quantify that. If you have clean, good category pages for your website, then you don't need to rely on the search results pages also being indexed. Sure, thanks, John. Okay, on to the next question.

All right, hi. Sorry to jump in as the second one, but I've got a website that I've been battling with for about the last six months; I've just posted the question and a link to the website in the chat. So I have 301 redirects to the main domain: if you go to the old .com.au home page today, or any variation, there are 301 redirects, and I've completed the change of address in Google Search Console. Yet if you do a site: query on the old .com.au domain, the home page and a lot of the other pages are still in the index. Can you help me diagnose that, whether there's something we can do now, or something we can do privately? Because it's been breaking my brain.

That's probably normal. The confusing part there is that when we process a redirect like this, we still understand that the old site is associated with these new URLs. So what tends to happen there is that, with a site: query,
when you explicitly look for the old site, we'll say, well, we know this old site, and we still have a bunch of these URLs in our memory, so we'll show them to you, because you're explicitly asking for them. In practice, though, when you look at the pages themselves, for example when you look at the cached page, or when you look at the page with the Inspect URL tool, you'll see that the canonical is actually the new URL. With the cached page it's kind of hard to tell, because I think we overwrite part of the top, but in general what you'll see is that we've moved on to the new URL as the canonical URL, which means we've processed the site move; but if someone explicitly asks us for the old domain, we will still show it. So essentially it's working; it's probably working okay, I don't see anything offhand; and just because you're seeing the old URLs in the site: query doesn't mean that the old site is still the one that's being indexed. A simple way you can double-check is to just search for the page title, and when I try that, I get the .com version. So I think your site move is okay, and the site: query is just confusing, because on the one hand we're trying to be helpful and say, well, you're looking for this site and we know about the site, so let's show it to you; on the other hand, you're looking at those results trying to diagnose a specific situation, and that's not something that you can do with a site: query.

Okay, so a follow-up to this, which I've also just posted in the chat: the old website is still ranking. For example, for the keyword "strata management software", when googled in Australia, it is in both position three for the .com.au and position six for the new domain. This is the kind of main point of contention with the client, because I handled the migration for them, and they currently have two
different results ranking, when they should have had everything consolidated into one after the 301 redirects.

Okay; I mean, let me just try that query. I do see the .com; oh okay, further down. So what probably happened is that we just don't have that one URL moved over yet, so let me just double-check. I mean, if it's redirecting, then that's generally not a problem. So my theory behind this is that currently, if you click on the URL in the search results for the .com.au, it 301-redirects to /home on the .com.au, and that then 301-redirects to the .com home page, so there might be a redirect chain in there. It's in positions three and six, so ultimately the client's not really complaining, but I want to make sure that the .com.au isn't seen anywhere in the actual search results at the end of the day.

Yeah, I think what you could probably do is use the Inspect URL tool and submit that URL to be recrawled and reprocessed; that would probably pick it up. What I expect would probably happen here, though, is that that extra listing that you have in sixth place with the .com.au would disappear in favor of the .com version, which, I don't know; I mean, the ranking would be okay, I think that would be fine; it's just that you wouldn't have multiple listings shown for that query anymore. So I don't know if you would be much better off by forcing that over, but it will happen anyway over time.

Okay, so the website was migrated about six months ago, and the .com.au, the old domain, is still in position three. So you're saying that if the change of address was completed successfully and the 301 redirect is still in place correctly, even if there is a redirect chain, eventually Google will pick it up; and if I want to force it across, I should inspect the URL in Google Search Console on the new domain and request indexing, test the live URL. Is that correct? Yep, yep. Thanks for that. Cool.
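The redirect chain described in this exchange (old URL to old /home to new home page) is worth collapsing to single hops where possible, since each extra hop is one more thing a crawler has to follow before reaching the canonical URL. A small illustrative sketch; the URLs and the redirect map here are hypothetical stand-ins, not taken from the site under discussion, where the real redirects would live in the server configuration:

```python
def collapse_redirects(redirects):
    """Given {old_url: next_url}, return {old_url: final_destination},
    following each chain to its end (with loop protection)."""
    collapsed = {}
    for start in redirects:
        seen = {start}
        target = redirects[start]
        # Keep following while the target itself redirects somewhere new.
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        collapsed[start] = target
    return collapsed

# Hypothetical two-hop chain like the one described above.
chain = {
    "http://example.com.au/home": "https://example.com.au/home",
    "https://example.com.au/home": "https://example.com/",
}

print(collapse_redirects(chain)["http://example.com.au/home"])
# → https://example.com/
```

Rewriting the server rules so every old URL 301-redirects straight to its collapsed destination gives crawlers a single hop, which is what John suggests Google will eventually work out on its own anyway.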
Sometimes it's a bit tricky with individual URLs, and site moves can take a bit of time; depending on how things were set up, and when they were set up, it might take a bit. What I also noticed is that your site, I think, is being indexed with mobile-first indexing, so you need to make sure that the mobile version also redirects and has the canonical, all of that, set up as well; or maybe it's even responsive design, which would make it a lot easier. Yes, it's responsive, and there's no m-dot subdomain or anything like that. It also does have some domains which are still active on the old domain, and there are all sorts of other things, but if you say that I can probably get the old results gone just by requesting indexing on the pages, that will probably solve my problem for the majority of it. So thank you for that. Cool.

All right, let me run through some of the other questions that were submitted, and then we can do some more from you all. We have an archive page that has images quite similar to the child pages that are listed on the archive. The archive page is banned from SafeSearch, while the child pages are considered safe. What factors do the SafeSearch algorithms use, apart from images and nudity, to consider whether a page is safe? I don't know what the exact factors would be, but we take a number of different things into account, and it can certainly happen that part of your site is seen as being fine with regards to SafeSearch, while part of your site is seen as something that we would filter with SafeSearch activated. Usually what I recommend doing there is making sure that you have a clear separation of these kinds of content, so that it's easier for our algorithms to say, well, everything in this folder here can be filtered by SafeSearch, and everything in the rest of the site, or in this other folder here, is generally family-friendly or SafeSearch-friendly content, and we'd be able to show that appropriately.
So that's probably the direction I would head there. It sounds like your content is kind of borderline with regards to how our SafeSearch algorithms might look at it, which is fine; I mean, sometimes there's content out there like that; but if you want to take it into your own control a little bit more, then I'd recommend finding a way to separate those two parts a little more clearly, so that it's a little more certain, with regards to SafeSearch, which parts would be shown and which parts would not be shown.

My website gets hundreds of links that seem to be spammy. I suspect maybe one of my competitors is trying to decrease my rankings. Do I need to keep disavowing these links week after week, or should I only be worried if I get an unnatural-links manual action? So in general, we do automatically take these into account, and we try to ignore them automatically when we see them happening, and for the most part I suspect that works fairly well; I see very few people with actual issues around that, so I think that's mostly working well. With regards to disavowing these links: if these are just normal spammy links that are popping up for your website, then I wouldn't worry about them too much; we probably figure that out on our own. If you're worried about them regardless, if it's something that you're not sure about, you're losing sleep over these links, and you just want to make sure that Google handles them properly, then using the disavow tool is perfectly fine. The disavow tool is not an admission of guilt or anything like that; you're essentially just telling our systems that these links should not be taken into account for your website, and there are multiple reasons why you might want links not to be taken into account; that's not something that our algorithms would try to judge for your website. So if you're seeing spammy links from certain sites, using the domain: directive makes it easy to handle those in the disavow file, and you can just submit it there.
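For reference, a disavow file is a plain-text list with one entry per line, where the `domain:` prefix John mentions covers an entire site; a hypothetical example (the domains here are placeholders, not real sites):

```
# Disavow file, uploaded through Google's disavow-links tool.
# Lines starting with # are comments.

# Ignore all links from entire spammy domains:
domain:spammy-example-one.com
domain:spammy-example-two.net

# Ignore a single specific linking page:
http://example-blog-network.com/some-spammy-post.html
```

The file is uploaded through the disavow tool itself, which, as John notes next, sits deliberately apart from the rest of Search Console.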
And you can just submit those there. On the other hand, if you feel that these links are pretty normal spammy links, something that any algorithm would figure out, then you can just leave them alone and move on. I think for most websites out there, really the largest majority of websites, you don't need to use the disavow tool. And that's also why we have the disavow tool so separate from Search Console: so that you don't get tempted into using the disavow tool because it looks like a normal part of Search Console that everyone should be using. It's really something that you only need to use in really extreme cases.

We received a notification from the Trust and Safety team regarding low-quality search results pages on our site being crawled and indexed. While we're working to solve this problem, is there any short-term risk of a manual penalty being applied, an algorithmic penalty, or Google taking any other action to remove those pages from the index?

I don't know which notification you received there, but in general, when you receive a notification like this, it's because someone from the manual webspam team has taken a look and seen that there are real issues with regards to those search results pages. And usually what also happens at the same time is that a manual action is placed on specific URL patterns from your website, so that we don't index more of those pages. So probably, if you received a notification like that, we're also already trying to filter those out of the search results, so the kind of risk that you're worrying about there has probably already happened. Solving that, and going through the reconsideration flow to remove that manual action, I think makes sense regardless. Like I mentioned before, with search results pages there are two aspects there.
On the one hand, the quality side, which it sounds like you ran into there, and on the other hand, the technical side, that we end up crawling a lot more pages than we actually need to. Both of those can play a critical role for some websites. So that's something where, even if a manual action is already in place to take these pages out of the index, I'd certainly try to find a way to clean up the technical side as well, so that we don't have to crawl and index a ton of different pages just to keep up with your website, when these pages wouldn't actually be that useful for users anyway.

Yes, I have a clarification question on the manual action. I've recently been working on a website, kind of taking charge of it, and it's targeting international markets as well, with all the country targeting done subfolder-wise. Going through the manual actions, I see it has a manual action for unnatural links which is affecting some pages of the website; the sites involved are third parties, maybe Blogger or something like that, and apart from that we don't have any backlinks as such. I just wanted to know: what is the reason for having a manual action for those kinds of backlinks? Apart from that, I'm seeing this manual action notification in every country section; I've verified the different folders for the different countries individually, so I'm getting the manual action for each and every country. So is a manual action for domain.com going to affect my other folder structures?

So are they subfolders, or are they different domains?

Yeah, they are subfolders.

Okay, yeah. I mean, if they're subfolders, then it kind of makes sense to show it like that, because it's essentially the same domain, so showing the same manual action there, I think, makes sense. There are two places where, as I
remember, we take this kind of targeted manual action with regards to specific pages and specific links on a site. On the one hand, that could be if a site is buying links, for example, for very specific elements within the site, or doing something kind of sneaky with regards to links in that regard. That's something where we sometimes take very targeted action and say: well, we'll just ignore this specific set of links, because we know that it's pretty well defined, it's something that we can isolate fairly well, and we'll just work to ignore those on a manual basis.

Another place where I've also seen that happen is specifically around reputation management. That's one area where we'll often see that people, usually SEOs or reputation-management companies, build up links to pages that they find favorable. And these might be completely legitimate websites, talking about maybe one person or one company in a reasonably favorable way, and in order to try to hide the less favorable content in the search results, they'll go out and buy links or build links for specific pieces of content on essentially legitimate websites. From our point of view, that's also problematic, because these are unnatural links to those pages, and in order to resolve that, sometimes we have to do that manually: we will do a manual action specifically regarding those links and those pages. So that's something where, as the owner of a website that might be affected by something like this, there's nothing really that you need to do there. It's not that you need to clean up those links that someone else placed pointing at articles that you have on your site; rather, we've kind of taken care of that on our side with a manual action. So from our point of view, if this is isolated like that, then it wouldn't be affecting the rest of your website; it would really be
affecting that specific set of links pointing at a specific set of pages on your website, and we're essentially just neutralizing those, so that you don't have to worry about that or do anything with that.

Sure, thanks, John. One other thing: can we get a set of those examples? Basically, we don't know what those pages actually are. If somebody knows who's actually building the links, they know what pages they're targeting, but when it's coming through a third party, or anyone is building backlinks on a third-party website, we don't know. So we're just kind of, you could say, in a state where we don't know what pages to diagnose. We have a very huge website, millions of pages out there, and we don't know where it's getting affected.

Yeah, I know. The manual webspam team tries to provide examples whenever they can, but sometimes that's not that easy. So the usual recommendation that I'd have there is to file a reconsideration request and give some information on what you're seeing, or what you're confused about, and sometimes the manual webspam team will be able to come back to you and say: well, it's affecting this set of queries, or this set of pages on your website, or on the external website. So sometimes they can come back with that kind of information; it's not guaranteed that they'd be able to.

Okay, yeah, thanks, John.

Hey, I was actually playing with Google PageSpeed Insights recently, and I saw that at least one thing was around 1 to 1.5 seconds, which is very terrible. So I was thinking to try some new hosting, or to move my hosting. My question is: does changing the IP affect ranking or not?

No, changing the IP is fine.

Okay. One more thing: this is the fourth time with the same question, and the images are still not indexed, and I'm unable to find any reason. I'm serving the latest version of
the image, the WebP version, and everything is fine, but still the images are not getting into the Google search results. The old Search Console was able to show us whether the images were indexed or not, but in the new Search Console there is no option from which we can look at that. So what can I do? Right now it's been more than one month.

What do you mean, more than one month?

All the images are not indexed; only the author images are. The site is on WordPress, and only the author images are available on Google Images; there are no other images available on Google, and that's why my ranking is being affected, and it's going down every day.

That would only affect traffic from Google Images, so the normal web search ranking wouldn't be affected by that. But if you can copy the website and some sample images into the chat here, I can take a look at that afterwards.

Okay, I've put the site URL in the comments; it's the xyz.com one.

Okay, cool, I see it.

Yeah, and there is no manual action and no messages in Google Search Console regarding this.

All right. Next: in reference to the recent announcements on mobile-first indexing for new sites, does this come into play for new sites or only for new domains? Important to know, since the default state for new websites will be mobile-first indexing, and I am considering redesigning my site.

It's specific to new domains. Essentially, what is happening is that we're shifting from a model where we have a list of all the domains that are already on mobile-first indexing, to a model where we'll have a list of all the domains that are not on mobile-first indexing. So anything that is new will be processed with mobile-first indexing by default. If you're doing a redesign, and if you're launching or kind of revamping in a separate folder on your website, or even on a separate
subdomain of your website, all of those would still be seen as being part of the old domain. So whatever your old domain's status is would play a role here: if it's already moved to mobile-first indexing, then that redesign will be moved as well; if it hasn't moved to mobile-first indexing, then that redesign, that change on your website, would also not be under mobile-first indexing. And as we've talked about before, it's not that mobile-first indexing is a ranking factor, so it's not that you need to be in mobile-first indexing in order to be competitive; rather, this is more of a technical thing on our side with regards to how we crawl and index the content from your website.

I have a publisher who has a site on WordPress. They're considering disabling a heavy plugin just to improve PageSpeed scores; the plugin makes no visual changes. Would that be advised or not recommended?

If you have a plugin that doesn't make any changes, then personally I would remove it, because usually the less overhead you have with regards to code that's running on your server, the less you have to worry about, specifically with regards to maintenance, but also speed, like you mentioned here. Those are two aspects that I think are pretty critical, so if you have a plugin that isn't doing anything useful for your site, then it might make sense to disable it.

I'm working for a publisher. With regards to URL structure, we're using very short URLs, like domain.com/1234; they're easy to share, but the URLs for news articles don't contain keywords or similar information. Many SEO agencies have pushed us to have speaking URLs, with keywords in them. In cases where speaking URLs get generated from titles which contain numbers that change after a while, or where the editor made a typo, a few minutes later the URL could change as well; this is why we decided to use just IDs. Which URL structure would you
recommend for news articles, and is it advisable to change the whole URL structure to speaking URLs? It would be necessary to implement redirects for all existing articles.

So generally, in talking with the team on our side, they've always said that you don't need to have keywords in URLs, and they're essentially equivalent on our side; we use URLs primarily as an identifier. There is a very small factor with regards to keywords in URLs, but if you're working on a larger website, you probably won't see that effect at all. The bigger thing that I would watch out for here, which I think is the one that you're worrying about, is: if you change from one URL format to another, what will happen then? In general, what happens is that we need to recrawl and reprocess the whole website in order to understand what the new state is. That's something where there's definitely a longer period of time where you'll see fluctuations across your website, until we've been able to settle things down again and understand that this new URL on your website is linked from these different parts of your website, and the context of these pages is like this or like that. So you'd definitely see a reasonably long time of things in fluctuation there, and it's a lot harder for us to reprocess something like this than, for example, a site move: if you move from one domain to another and you map your URLs one-to-one to the new domain, then we can process that pretty quickly, sometimes within a day or so. But if you change all of the URLs across your whole website, that's something where I imagine you'd see changes on the order of several months until everything has settled down. So that's kind of the one aspect there, that you'd see these fluctuations; the other aspect, that you might see very minimal change in the end, kind of also comes into play. So that's
more of a question on your side: are you willing to accept that maybe for a couple of months things will be fluctuating quite severely, just so that you have this potential improvement with regards to some keywords in your URLs? So that's just kind of tricky; I don't know which direction I would take there. If you're sure that keywords in URLs make sense, for example if you can separate pages on your website out better based on folders, and you can separate things out a little more clearly so that you can track specific parts of your website more easily, kind of like you mentioned here with /news/politics: if you want to track the performance of your politics content, then having that in a separate folder makes it a lot easier compared to everything in a flat structure where you just have one number. So from that point of view, I'd try to find the different pros and cons of the new structure and the old structure, and think about how many pros you need to cancel out this potential multi-month period of fluctuation. Or maybe you're saying: well, this is actually more of a long-term goal, we'd like to get to more keywords in URLs, but we don't need to go there from one day to the next, and we have the engineering resources to do it incrementally. Then that might be something where you'd say: all new content gets the new-format URLs, and the old content stays on the old ones until a time when we're sure that things are working out with the new URLs, and at that point we migrate the old URLs as well. And if it's a news website, then usually the older URLs are less visible in search anyway, so if there's some fluctuation with regards to how the old URLs are shown in search, that's probably not something that would be critical for your website. So that might be another approach: to take it incrementally to a state where you're saying, well, this is more of what I'd like to have.
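To make that incremental approach a bit more concrete, here is a minimal sketch in Python. The function names, URL pattern, and example article are my own illustration, not anything Google or the publisher specified: it generates speaking URLs that keep the stable numeric ID, so a later title correction only changes the cosmetic slug part, and builds the one-to-one redirect map for the old ID-only URLs.

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn an article title into a lowercase, ASCII, hyphen-separated slug."""
    # Strip accents, then replace every run of non-alphanumerics with a hyphen.
    ascii_text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    return re.sub(r"[^a-z0-9]+", "-", ascii_text.lower()).strip("-")

def build_redirect_map(articles):
    """Map each old ID-based path to its new 'speaking' path, one-to-one.

    Keeping the numeric ID in the new URL means the server can still route
    by ID even if the slug part later changes (e.g. a typo fix in the title).
    """
    return {
        f"/{article_id}": f"/news/{section}/{slugify(title)}-{article_id}"
        for article_id, section, title in articles
    }

redirects = build_redirect_map([
    (1234, "politics", "Election Results: 2019 Update!"),
])
print(redirects["/1234"])  # -> /news/politics/election-results-2019-update-1234
```

Each entry in that map would then be served as a permanent 301 redirect, which is exactly the clean one-to-one mapping described above as being fairly quick to process.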
Oh wow, we still have a bunch of questions left and almost no time; let me see if I can run through a bunch of these, and we can run over a little bit as well.

You mentioned hidden content and ranking, and not being able to use that kind of content for snippets; what about tabs and mobile-first indexing? I don't know what the current status is there, I'd need to double-check. My understanding is that with regards to mobile-first indexing we would see this as content that's normally visible as well, but I need to double-check, I'm not a hundred percent sure.

I'm working on an e-commerce site that regularly removes some category pages, leaving behind a lot of blank pages with a 200 result code. The behavior can't be changed, so I need to do something there, and I'm thinking of redirecting them through Google Tag Manager with JavaScript redirects. Do you think that would work, and which one would you prefer, JavaScript or meta refresh?

I think a JavaScript redirect would be preferred there; we'd probably pick that up. But especially if you're going through Google Tag Manager and a JavaScript redirect, that's something that would take a bit longer to process than a clean server-side redirect. So depending on how big of a problem this actually is for your website, it might be worth trying to find a way to do that server-side; if this is more of a cleanup thing that you'd like to do in the long run, then maybe it's okay to just leave it like this.

I social-bookmarked my site, and after one week I noticed that the site where I bookmarked it has been marked as spam. Do those pages have any effect on my ranking?

I don't think those pages would have any effect on your ranking, because we're probably also ignoring those social-bookmark sites, the links from there. That's a really, really old SEO strategy, and we have a lot of practice in recognizing those kinds of links and just ignoring them.
When will the speed and Search Console tools get updated with the evergreen Googlebot? Any timeline?

I don't know. I know the team is working on that, so I wouldn't expect it to take too much longer, but I don't have any specific timeline.

A question about Google News: I saw a big drop in news since the second week of March, and it has kept declining ever since.

I don't know about Google News and how that would play together with regards to being penalized from Google News. I suspect, if that were the case, you would not see your site at all in Google News; it wouldn't be that it would just be ranking lower in Google News. But I don't really have much insight into the Google News side.

Can you confirm that 'Links > Top linked pages externally' in Search Console does not include 404 URLs linked externally? I think it would be very useful to discover broken links from Search Console, something that we can currently only do with external tools.

As far as I know, the linked pages would not include links to 404 pages, at least not to 404 pages that we know about, because what needs to happen with regards to links is: we have to have one page that is linking, where the link is coming from, and a link destination, and if we don't have either of those, then essentially that link is not something that we can track in our systems, or that would be useful to track in our systems. So for that, using external tools is probably an option.

Is it a good idea to use my PPC landing pages to rank for local areas?

Maybe; that's kind of up to you and what kind of landing pages you have. But what I've seen sometimes is people using PPC or similar setups to test landing pages and see how to create landing pages that perform really well, and then updating the normal search-ranking pages to that newer model as well.

Half of my pages are shown as 'Crawled, currently not indexed'; it's been quite a long time now. I'm guessing it's because of low-quality content,
and I'm not really sure. Is there any way to check the reason why it's not indexed, and how to resolve it?

No, there's usually not really a way to double-check the reason why it wouldn't be indexed. It's completely normal that we don't index all pages on a website, so just because you see a large chunk of pages that are either 'Crawled, currently not indexed' or, I think the other one is, 'Submitted and not indexed', that can be completely normal. If you're worried about the quality of your website, usually that means there is something with regards to quality that you can improve, and we do try to take quality into account when it comes to picking which pages we want to have indexed, so that could be something to focus on as well. Obviously that's kind of tricky, because if we're not indexing those pages, we're not taking the quality of those pages into account either; so you'd need to focus on the quality of your website overall, not just the quality of the pages that are currently not indexed.

Can anyone tell me how different free hosting, like Blogspot, is from paid hosting? Will it have any effect on my search-results ranking? Is Blogspot considered inferior to hosting with WordPress?

So these are all different ways of hosting a site. Usually the big difference between the free or simpler platforms and something like WordPress is the flexibility and the customisation opportunities that you have there. If you just want to have a website out there, and you just need to have your content visible at all, then using any of the platforms out there is usually a good idea. But if you need to do something very specific, if you want to track things in a very specific way, if you're, I don't know, collecting leads in a very specific way, if you need to do something fancy with regards to multilingual users or users in different countries, then
that's something where having your own hosting, maybe on WordPress, maybe on some of the other platforms out there, can make a lot of sense. So that's usually the way that I would look at it. If you're not sure, what I'd recommend doing is setting up your own domain in any case; that's something that you can do with most of these free platforms. And when you're using your own domain, if you later switch to a more customized, a little more complex platform like WordPress, or anything else, you keep all of your content and keep all of the URLs, and you can just move on to the new platform. That makes it a lot easier to migrate, and that way you can work with the free platform, recognize where the limitations are, where you'd like to do more than what is possible, and if you find a different platform that offers all of those opportunities that you're looking for, then maybe at that point it makes sense to move on. But it's certainly not the case that there's any inherent SEO advantage to any of these platforms: your content is hosted, we can access it, users can access it, so from that point of view that's all fine.

Is it a good idea to block a certain URL from being crawled, or is it better to let Googlebot decide, and to configure the URL parameter tool?

The URL parameter tool is a great way to give us hints about which URLs or URL patterns you don't want to have crawled, but it's not the same as a robots.txt file. So if you need to block crawling, for example if your server does something that takes a lot of time to be processed when we access that URL, then that might be something where you want to use the robots.txt file to really be clear and say: I don't want you to access these URLs at all.

How can we report manipulated structured data? This ephemeral page has no relevant category.

There's a structured data / rich snippet spam report form that you can use.

For single pages, like an About Us page, is it
recommended or not to use rich snippets for the single article on the page?

Totally up to you. For something like an About Us page, if you have a structured-data type that fits it, I think that's fine; I don't think you'd see any special handling in the search results with regards to using just the article markup, for example, on a page like that.

Our website has 140 language- and country-localized versions, all connected correctly with hreflang, but Google is often showing the wrong versions in the search results. Is there any limit to the number of hreflang alternates that we can include? Does Google treat this as a suggestion rather than a strict directive?

There's no limit to the number of alternates that you could provide; however, there are practical limitations, in the sense that if you have a hundred and forty copies of all of your content, we have to crawl and index all of those copies, and we might not be able to crawl and index all of them, depending on the website. So that's something that sometimes plays a role there. Does Google treat this as a suggestion or a directive? We treat it more as a suggestion; it gives us a little bit more information about those pages. We can't treat it as a strict directive, because we really need to make sure that it actually works out well. Or rather, it's not that we can't; it's more that, most of the time when we don't take it into account, there are technical reasons why we don't do that. So, for example, maybe canonicalization plays a role there as well, and the general setup of the hreflang constellation is something that often plays a role there too. It's not that I'd say we treat it as a subtle hint; rather, I'd say there are a few things that need to align for us to be able to use it properly, and just because we see the hreflang link on one page doesn't mean that we will always show exactly that link.
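For reference, and with made-up URLs, the hreflang annotations being discussed here are typically declared in the head of each page like this; every version carries the full set of alternates, including a self-referencing entry, and each listed page has to link back for the annotation to count:

```html
<!-- Hypothetical URLs. The same block appears on every language/region
     version; x-default names the fallback for users who match none. -->
<link rel="alternate" hreflang="en" href="https://example.com/en/widgets" />
<link rel="alternate" hreflang="de" href="https://example.com/de/widgets" />
<link rel="alternate" hreflang="de-CH" href="https://example.com/de-ch/widgets" />
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets" />
```

With a hundred and forty versions, that's well over a hundred link elements on every single page, which is part of why fewer, stronger versions are easier to keep in alignment.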
We really need to make sure that the rest of the signals also align before we can show that hreflang URL in the search results, and especially if you have a hundred and forty versions, that's a lot of versions that can subtly fall out of alignment. So I imagine, especially with a bigger setup like that, it gets tricky. I don't know which website this is, but in general I'd recommend using fewer variations rather than more. In particular, if you have the same-language content for a number of different countries, and you don't really have anything specifically country-dependent on those pages, then I'd recommend folding those together and just having one really strong language version, rather than having all of the individual country versions. Obviously, for larger websites, that's a lot easier said than done.

Okay. I imagine there are still more questions that got submitted; we're way over time now, so I'll set up the next batch of Hangouts, and you're welcome to drop things there. Or, if there's something more general that you'd like answered, we're doing another set of Q&A videos. For those, you essentially just need to tweet the question to the Google Webmasters account, @googlewmc, with the hashtag #AskGoogleWebmasters, and we'll be picking out a bunch of those questions soon and start recording videos. The aim is to do that on a regular basis, so that it's not just a one-time shot, but something we actually do regularly. These are specifically the more general questions that are easier to answer, where the question fits into a tweet, which are not site-specific, and where we can formulate a general answer that would be useful for everyone. So you're welcome to do that as well. And of course, there's also the webmaster help forum, where a bunch of volunteers spend their time helping out, and they have a lot of experience with a lot of these webmaster
questions. Pretty much everything that comes up in the webmaster Hangouts, they've seen as well, so if you drop questions in the forum, that's also a good place to get advice. And if you've been following these Hangouts for a while, you can probably also help out in those forums and help to answer a lot of the questions that are dropped there too.

Okay, so I think with that, let's take a break here. It's been great having you all here; thanks for all of the tons of questions that were submitted, and I wish you all a great weekend, and hopefully we'll see each other again in one of the next Hangouts. All right, bye everyone.

Thanks, John! Bye, have a great day. Thanks, John, bye-bye. Thank you.