Google released a new beta version of Search Console that helps webmasters get more insightful information and reports about their website activity. Yesterday morning I got an email from Google stating that some “Index coverage issues” had been found on my website, and warning me to fix them to avoid a negative impact on my SEO ranking. I was shocked and thought something had gone wrong, so in response to this email I opened Google Search Console to see what exactly had happened and found the following message (the same as in the email).
Nobody wants to lose organic traffic from the Google Search engine, so a warning like this is scary for every webmaster, website owner, and blogger. After researching and understanding the problem, I managed to fix the index coverage issues and submitted my site to Google for validation. Now I am going to tell you how I fixed them, step by step. So let’s begin….
How I Fixed the Index Coverage Issues Detected in Google Search Console
I went to Google Search Console, looked for the message about the index coverage issues, clicked the “view details” link, and saw how many pages of my website were affected.
I had recently moved my website to new hosting (the domain stayed the same). Because of the move, all of the old links became outdated and are now blocked by the robots.txt file of the newly hosted site, yet those links still appeared in Google Search. Google Search Console warned me because all of those old pages were indexed but were now blocked by my new robots.txt file. So I had a list of old, outdated links that were still indexed even though robots.txt now blocked them. A quick way to verify this yourself is shown in the sketch below.
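If you suspect the same problem, you can check which indexed URLs your current robots.txt blocks before touching anything in Search Console. This is just a minimal Python sketch using the standard library’s urllib.robotparser; the domain and the URL list are placeholders, so substitute your own site and the affected URLs reported by Search Console.

```python
from urllib.robotparser import RobotFileParser

# Placeholders: replace with your own domain and the indexed URLs
# that Search Console reported as affected.
SITE = "https://www.example.com"
indexed_urls = [
    "https://www.example.com/old-page/",
    "https://www.example.com/blog/old-post/",
]

# Load and parse the live robots.txt of the site.
parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()

# Print every indexed URL that Googlebot is no longer allowed to crawl.
for url in indexed_urls:
    if not parser.can_fetch("Googlebot", url):
        print("Blocked by robots.txt:", url)
```

Any URL this flags is a page that Google has indexed but can no longer crawl, which is exactly the condition the warning describes.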
After understanding the problem, I removed all of the old links (URLs) by requesting their removal through Search Console’s Remove Outdated Content tool.
Once I confirmed that all the old URLs had been removed, I resubmitted my site to Google for validation.
That was the issue with my site, but yours may be something different. Don’t worry, because I am now going to discuss the other issues you may have.
Your issue is most likely one of the following; identify it first.
1. Your robots.txt file is not allowing Google to index particular pages. That was the issue with my site, and I explained above how I fixed it (the robots.txt sketch earlier can help you confirm it).
2. Pages are not getting fetched by Googlebot. For me, this worked as expected. If you have this issue, use the Fetch and Render tool under the Crawl section of Google Search Console to fetch your site’s URLs (a rough do-it-yourself check is sketched after this list).
3. Pages are not getting indexed because of incorrect “noindex, nofollow” or “noindex, follow” meta tags on particular pages. If you have this issue, open the affected pages and check the source code; you will notice a noindex, nofollow or noindex, follow meta tag. Edit this tag on every affected page and change it to index, follow (a scan for such tags is sketched after this list).
4. Your sitemap is incorrect and needs resubmission. Do you have this issue? Okay, check your website’s sitemap and resubmit it to Google Search Console using the Sitemaps option under the Crawl section (a quick sitemap check is sketched below).
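For issue 2, the Fetch and Render tool is the authoritative test, but as a rough first check you can request a page with a Googlebot-style User-Agent and look at the HTTP status. This is a small sketch with a placeholder URL; note that a self-declared “Googlebot” header is not the real crawler, so some servers may treat it differently, and the result is only a hint.

```python
import urllib.error
import urllib.request

# Placeholder URL: replace with an affected page from your site.
url = "https://www.example.com/some-page/"

# Request the page with a Googlebot-like User-Agent string.
request = urllib.request.Request(
    url,
    headers={
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)"
    },
)
try:
    with urllib.request.urlopen(request, timeout=10) as response:
        print(url, "->", response.status)  # 200 means the fetch succeeded
except urllib.error.HTTPError as err:
    print(url, "-> HTTP error", err.code)
except urllib.error.URLError as err:
    print(url, "-> fetch failed:", err.reason)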
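For issue 3, the offending markup in a page’s head section looks like <meta name="robots" content="noindex, nofollow">, and the fix is to change the content value to index, follow. If many pages are affected, a sketch like this one (standard library only; the page list is a placeholder) can flag every page that still carries a noindex directive:

```python
import urllib.request
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

# Placeholder list: replace with the affected pages from Search Console.
pages = [
    "https://www.example.com/page-one/",
    "https://www.example.com/page-two/",
]

for url in pages:
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    finder = RobotsMetaFinder()
    finder.feed(html)
    for directive in finder.directives:
        if "noindex" in directive.lower():
            print(url, "still has a noindex directive:", directive)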
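For issue 4, before resubmitting it is worth confirming that the sitemap parses as valid XML and that the URLs it lists actually resolve. Here is a minimal sketch, assuming a standard sitemap.xml at the site root (the domain is a placeholder):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder: replace with your own sitemap location.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Download and parse the sitemap; a parse error here means broken XML.
with urllib.request.urlopen(SITEMAP_URL, timeout=10) as response:
    root = ET.fromstring(response.read())

# Spot-check every listed URL with a lightweight HEAD request.
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=10) as page:
            print(url, "->", page.status)
    except Exception as err:
        print(url, "-> failed:", err)
```

If every URL returns 200 and the XML parses cleanly, resubmit the sitemap in Search Console and wait for revalidation.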
If you still have any doubt or problem, please feel free to ask me in the comment section below. I will always be happy to help you.
I hope you found this post helpful. If so, please subscribe to our email newsletter so that you never miss a post from Mastisky. Thank you, friends; I hope to see you again in our next post: When and how to use noindex, nofollow meta tags for SEO.