
How to Fix Index Coverage Issues Detected in Google Webmaster Tools / Search Console

Recently Google released a new beta Search Console that gives webmasters more insightful information and activity reports for their websites. Yesterday morning I got an email from Google stating that some Index coverage issues had been found on my website, and it told me to fix them to avoid a negative impact on my SEO ranking. I was shocked and thought something had gone wrong, so in response to this email I decided to check what exactly had happened. I opened my Webmaster Tools / Search Console and found the following message (the same as the email).

Google Webmaster Tools


"Search Console has identified that your site is affected by 1 new Index coverage related issue. This means that Index coverage may be negatively affected in Google Search results. We encourage you to fix this issue."
Nobody wants to lose organic traffic from Google Search, so this is very scary for every webmaster. After researching and understanding the problem, I managed to fix the "Index coverage issues" and submitted my site to Google for validation, and now I am going to tell you how I fixed this issue. So please stay with me, stay with Mastisky, and see how I solved this. Let's begin...

How I Fixed Index Coverage Issues Detected in Google Webmaster Tools

I went to Google Webmaster Tools, looked for the message about the Index coverage issues, clicked the view details link, and saw how many pages of my website were impacted.

I recently changed my website's hosting (the domain is the same), so all the old links became outdated and are now blocked by the new robots.txt of my newly hosted site, but those links still appear in Google Search. Google Search Console warned me because all those old pages are indexed but are now blocked by my new robots.txt file. I got a list of old, outdated links that are still indexed even though they are now blocked by robots.txt.
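If you want to check this on your own site, here is a minimal sketch using Python's standard library (the domain and URL list below are placeholders, not my real ones) that reports which URLs your live robots.txt blocks for Googlebot:

```python
# Minimal sketch: test which indexed URLs the live robots.txt blocks
# for Googlebot. Replace the placeholder domain and URL list with yours.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"                       # placeholder domain
URLS = [SITE + "/old-page-1/", SITE + "/old-page-2/"]  # URLs from the report

parser = RobotFileParser(SITE + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt file

for url in URLS:
    ok = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if ok else "blocked by robots.txt")
```

Any URL reported as blocked here but still showing up in Google Search is exactly the kind of page the Index coverage report complains about.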

After understanding the problem, I removed all the old links (URLs) by requesting removal in Search Console through the Remove outdated content tool.
Remove outdated content
Once I confirmed that all the old URLs were removed, I resubmitted the site to Google for further validation.
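One note: as far as I know, the Remove outdated content tool only accepts URLs whose content is really gone, so before requesting removal you can confirm that each old URL now returns 404 or 410. A rough sketch (the URL list is a placeholder):

```python
# Rough check: confirm the old URLs return 404/410 before requesting
# removal; a 200 means the page is still live. Replace the list below.
import urllib.error
import urllib.request

OLD_URLS = ["https://www.example.com/old-page-1/"]  # placeholder URLs

for url in OLD_URLS:
    try:
        with urllib.request.urlopen(url) as resp:
            print(url, "-> still live, HTTP", resp.getcode())
    except urllib.error.HTTPError as err:
        print(url, "-> HTTP", err.code)  # 404/410: safe to request removal
```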


That was the issue with my site, but your issue may be different. Don't worry, because now I am going to discuss other issues you may have.

Your issue is likely one of the following; identify it first.
  • Your robots.txt file is not allowing Google to index particular pages. That was the issue with my site, and I described above how I fixed it.
  • Your pages are not getting fetched by Googlebot. For me, this worked as expected. If you have this issue, use the Fetch and Render tool under the Crawl section in Google Search Console to fetch your site's URLs.
  • A noindex meta tag is set for particular pages. If you have this issue, go to the impacted pages and check the source code; you will notice a noindex,nofollow or noindex,follow meta tag. Edit this tag on every impacted page and change it to index,follow (see the meta tag sketch after this list).

  • Your sitemap is incorrect and needs resubmission. Have this issue? Check your website's sitemap and resubmit it to Google Search Console under the Sitemaps option in the Crawl section (a quick sitemap check also follows below).
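Here is a small sketch for the noindex case above, again with Python's standard library (the page URL is a placeholder): it prints the robots meta tag of each page so you can spot stray noindex values:

```python
# Sketch: fetch each page and report its robots meta tag, if any.
import urllib.request
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of <meta name="robots" ...>, if present."""
    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "")

for url in ["https://www.example.com/impacted-page/"]:  # placeholder URL
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    finder = RobotsMetaFinder()
    finder.feed(html)
    print(url, "->", finder.robots or "no robots meta (defaults to index,follow)")
```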
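And for the sitemap case, a quick way to sanity-check a sitemap before resubmitting it is to fetch it and confirm that every listed URL still resolves. A sketch, assuming a standard sitemap.xml at the site root (the domain is a placeholder):

```python
# Sketch: read sitemap.xml and report the HTTP status of every URL.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(urllib.request.urlopen(SITEMAP).read())
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        print(url, "-> HTTP", urllib.request.urlopen(url).getcode())
    except urllib.error.HTTPError as err:
        print(url, "-> HTTP", err.code)  # dead URLs should leave the sitemap
```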
If you still have any doubt or problem, please feel free to ask me in the comment section below. I will always be happy to help you.

I hope you found this post helpful. If so, please subscribe to our email newsletter so that you never miss a post from Mastisky. Thank you, friends; hope to see you again in our next post: When and how to use noindex, nofollow meta tags for SEO.

2 comments:

  1. I am having a similar issue - the error is because we switched hosting providers and the URL is outdated - but I've set up 301 redirects for all URLs, including the one in question. I attempted to have the URL removed, but it says it's still live - however when I navigate to it, it redirects to the new page. It's strange, because I'm only getting the error for one URL, but there are over 40 that have been 301 redirected.

    Replies
    1. Hello there,
      You mean that you are having a problem removing just one URL, and you have successfully removed the other URLs, even though all of them are outdated and redirected to the new site, right? First of all, check the source code of that outdated/old URL's page. If the page still exists, delete it and then try again to remove the URL from Google Search Console.

      But if there is nothing on that page because you have already deleted it, then I suggest you temporarily disable the redirect for the URL you are having trouble with. When you try to remove that URL from Google Search, Google checks the outdated URL, but because you redirected it to the new URL, Google also gets redirected and finds that the page is still alive. Once you have disabled the redirect for that URL, you can try again to remove it from Google Search Console.
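      If you want to see exactly what Google sees, here is a rough sketch (standard-library Python; the URL is just a placeholder) that shows the raw status of the old URL without following the redirect; a 301/302 means the removal tool still finds a live target:

```python
# Sketch: report the raw HTTP status of the old URL without following
# redirects; 301/302 means the removal tool still sees a live target.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None surfaces the redirect as an HTTPError

opener = urllib.request.build_opener(NoRedirect)
try:
    print("status:", opener.open("https://www.example.com/old-url/").getcode())
except urllib.error.HTTPError as err:
    print("status:", err.code)
```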

      I hope I have managed to explain the problem you are facing. If not, please describe your problem in detail here, and if I am missing any point of your problem, please feel free to let me know.

      Thank You.
