Google recently released a new beta Search Console, which gives webmasters more insightful information and reports about their website's activity. Yesterday morning I got an email from Google stating that some “Index coverage issues” had been found on Mastisky, and that I should fix them to avoid a negative impact on my SEO rankings.
I was shocked and thought something had gone wrong. After a while, in response to the email, I opened Google Search Console to see exactly what had happened, and I found the following message (the same as in the email).
No site owner wants to lose organic traffic from Google Search, so a message like this is scary for every webmaster, website owner, or blogger. After researching and understanding the problem, I managed to fix the index coverage issues and submitted my site to Google for further validation. Now I am going to show you exactly how to fix index coverage issues in Google Search Console. So let’s begin…
How did I fix the Index coverage issues detected in Google Webmaster Tools?
I went to Google Webmaster Tools, looked for the message stating “Index coverage issues”, and clicked the View details link. That showed me how many pages of my website were impacted by the issue.
I had recently migrated from Wapka to Blogger, so all the old links became outdated and are now blocked by my new robots.txt file. However, those links still appeared in Google Search, which is why Search Console notified me: all those old pages were indexed, but they are now blocked by my new robots.txt file. I got a list of old, outdated links that were still indexed though blocked by robots.txt.
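To illustrate, a robots.txt rule like the one below blocks crawling of a whole section of a site. This is only a sketch; the `/old-pages/` path is a hypothetical stand-in for whatever outdated URLs your own migration left behind, not the actual rule on Mastisky:

```
# Hypothetical robots.txt on the new site.
# Any URL under /old-pages/ can no longer be crawled by any bot,
# but pages already indexed under those URLs stay in Google's index
# until they are removed -- which is what triggers the warning.
User-agent: *
Disallow: /old-pages/
```

If Google indexed a page before such a rule was added, it cannot recrawl it, and Search Console reports it as “Indexed, though blocked by robots.txt”.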
After understanding the problem, I removed all the old links (URLs) using the Remove outdated content tool in Search Console.
Once I confirmed that all the old URLs were removed, I resubmitted my site to Google for further validation.
That was the issue with my site, but yours may be something different. Don’t worry, because now I am going to discuss the other issues you may have.
How to fix Index coverage issues?
1. Your robots.txt file is not allowing Google to index particular pages. That was the issue with my site, and I described above the steps I took to fix it.
2. Pages are not getting fetched by Googlebot. For me, fetching worked as expected. If you have this issue, use the Fetch and Render tool under the Crawl section in Google Search Console to fetch your site’s URLs.
3. Pages are not getting indexed due to incorrect “noindex, nofollow” or “noindex, follow” meta tags. If you have this issue, open the impacted pages and check their source code; you will notice a noindex, nofollow or noindex, follow meta tag. Change the tag to “index, follow” on every affected page.
4. Your sitemap is incorrect and needs resubmission. Do you have this issue? If so, check your website’s sitemap and resubmit it to Google Search Console using the Sitemaps option under the Crawl section.
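For the meta tag fix in point 3 above, the change looks like this in the page’s HTML head. This is a generic sketch of the standard robots meta tag, not code taken from any particular theme:

```
<!-- Before: this tag tells search engines not to index the page
     and not to follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- After: allow indexing and link following -->
<meta name="robots" content="index, follow">
```

On Blogger or WordPress, this tag is usually controlled by a theme or plugin setting rather than edited by hand, so check your platform’s search visibility settings first.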
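For point 4, a valid XML sitemap follows the standard sitemaps.org format. The snippet below is a minimal example; the example.com URL and the date are placeholders, so substitute your own pages:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed -->
  <url>
    <loc>https://example.com/sample-post.html</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
</urlset>
```

Make sure every URL in the sitemap is the live, canonical address (no outdated links from a previous platform), then resubmit the sitemap file in Search Console.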
If you still have any doubt or problem, please feel free to ask me in the comment section below. I will always be happy to help you.
I hope you found this post helpful. If so, please subscribe to our email newsletter so that you never miss a post from Mastisky. Thank you, friends; hope to see you again in our next post: When and how to use noindex, nofollow meta tags for SEO.