Google recently released a new beta Search Console that gives webmasters more insightful information and reports about their website activity. Yesterday morning I got an email from Google stating that some “Index coverage issues” had been found on Mastisky, and that I should fix them to avoid a negative impact on my SEO rankings.

I was shocked and thought something had gone wrong. In response to the email, I opened my webmaster tool / Search Console to check what exactly had happened, and I found the following message (the same as the email).

“Search Console has identified that your site is affected by 1 new Index coverage related issue. This means that Index coverage may be negatively affected in Google Search results. We encourage you to fix this issue.”

No site owner wants to lose organic traffic from Google Search, so a message like this is scary for any webmaster, website owner, or blogger. After researching and understanding the problem, I managed to fix the index coverage issues and submitted my site to Google for further validation. Now I am going to show you the exact way to fix Index coverage issues in Google Search Console. So let’s begin…

How did I fix the Index coverage issues detected in Google Webmaster Tools?

I went to Google Webmaster Tools, looked for the message stating “Index coverage issues”, and clicked the View details link. That showed me how many pages of my website were impacted by the issue.


I had recently migrated from Wapka to Blogger, so all the old links became outdated and are now blocked by my new robots.txt file. However, those links still appeared in Google Search: the old pages were indexed, but they are now blocked by robots.txt. That is why Google Search Console notified me. The report gave me a list of old, outdated links that were indexed though now blocked by robots.txt.
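
If you want to check for yourself which URLs your robots.txt blocks for Googlebot, you can test them programmatically. Below is a minimal sketch using only Python’s standard library; the domain and URL list are placeholders that you would replace with your own site and the URLs listed in the Search Console report.

```python
# Test which URLs are blocked for Googlebot by robots.txt.
# Standard library only; the domain and URLs are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"      # placeholder: your site
urls_to_check = [                 # placeholders: URLs from the report
    SITE + "/old-page-1.html",
    SITE + "/old-page-2.html",
]

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()                     # fetch and parse robots.txt

for url in urls_to_check:
    allowed = parser.can_fetch("Googlebot", url)
    print(("ALLOWED " if allowed else "BLOCKED ") + url)
```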


After understanding the problem, I removed all the old links (URLs) using the Remove Outdated Content tool in Search Console.


Remove Outdated URL

Once I confirmed that all the old URLs were removed, I resubmitted my site to Google for further validation.

That was the issue with my site, but yours may be something different. Don’t worry about that, because I am now going to discuss the other issues you may have.

How to fix Index coverage issues?

1. Your robots.txt file is not allowing Google to index particular pages. That was the issue with my site, and I explained above how I fixed it; the robots.txt sketch earlier in this post can help you test your own URLs.

2. Pages are not getting fetched by Googlebot. For me this worked as expected, but if you have this issue, use the Fetch and Render tool under the Crawl section in Google Search Console to fetch your site’s URLs. (You can also do a quick self-check with the sketch after this list.)

3. Pages are not getting indexed in the search engine due to an incorrect “noindex, nofollow” or “noindex, follow” meta tag. If you have this issue, open the impacted pages and check the source code; you will notice a noindex, nofollow or noindex, follow meta tag. Edit that tag on every impacted page and change it to index, follow. (The sketch after this list also scans a page for this tag.)

4. Your sitemap is incorrect and needs resubmission. Do you have this issue? Then check your website’s sitemap and resubmit it to Google Search Console via the Sitemaps option under the Crawl section. (The sketch below also checks that your sitemap is reachable.)
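
Before opening Search Console, you can run a rough first-pass check of points 2–4 yourself: whether a page can be fetched at all, whether it carries a noindex robots meta tag, and whether your sitemap is reachable. The sketch below uses only Python’s standard library; the URLs are placeholders, the regex is a simplification (it assumes the name attribute comes before content), and Search Console remains the authoritative source.

```python
# Rough self-check for points 2-4: page fetchability, noindex
# robots meta tag, and sitemap reachability. Standard library
# only; both URLs are placeholders for your own pages.
import re
import urllib.request
from urllib.error import HTTPError, URLError

PAGE_URL = "https://example.com/some-page.html"   # placeholder
SITEMAP_URL = "https://example.com/sitemap.xml"   # placeholder

def fetch(url):
    """Return (HTTP status, body text); status is None on network errors."""
    req = urllib.request.Request(url, headers={"User-Agent": "coverage-check"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status, resp.read().decode("utf-8", errors="replace")
    except HTTPError as e:
        return e.code, ""
    except URLError as e:
        return None, str(e.reason)

# Point 2: can the page be fetched at all?
status, body = fetch(PAGE_URL)
print("Page status:", status)

# Point 3: look for a robots meta tag containing "noindex".
match = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    body, re.IGNORECASE)
if match and "noindex" in match.group(1).lower():
    print("Blocking meta tag found:", match.group(1))
else:
    print("No noindex robots meta tag found.")

# Point 4: is the sitemap reachable?
status, _ = fetch(SITEMAP_URL)
print("Sitemap status:", status)
```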

If you still have any doubts or problems, please feel free to ask me in the comment section below. I will always be happy to help you.

I hope you found this post helpful. If so, please subscribe to our email newsletter so that you never miss a post from Mastisky. Thank you, friends; I hope to see you again in our next post: When and how to use noindex, nofollow meta tags for SEO.


Categories: Blogger, WordPress

mhreja

I am an Electrical Engineer; I completed my B.Tech at Aliah University. Web design is my hobby and Mastisky is my passion. I love to help others.

2 Comments

mhreja · March 7, 2018 at 9:14 pm

Hello there,
You mean that you are having a problem removing just one URL, and you have successfully removed the other URLs, even though all of them are outdated and redirected to new URLs, right? First of all, check whether that outdated/old URL still serves a page. If it does, delete the page and try again to remove the URL from Google Search Console.

But if there is nothing on that page because you already deleted it, then I suggest you remove the redirect for the URL you are having trouble with. When you try to remove that URL from Google Search, Google checks the outdated URL; because you redirected it to the new URL, Google follows the redirect, lands on the new page, and concludes that the page is still alive. Once you have removed the redirect for that URL, you can try again to remove it from Google Search Console.
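
If you want to see for yourself why Google considers the old URL still live, you can check what it returns without following the redirect. A 301 or 302 response means Google follows it to a live page; a 404 or 410 is what lets the removal request go through. This is a minimal sketch using Python’s standard library, and the URL is a placeholder:

```python
# Inspect what an old URL returns WITHOUT following redirects.
# Standard library only; the URL is a placeholder.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None stops urllib from following the redirect

opener = urllib.request.build_opener(NoRedirect)
OLD_URL = "https://example.com/old-page.html"  # placeholder

try:
    resp = opener.open(OLD_URL, timeout=10)
    print(OLD_URL, "->", resp.status, "(no redirect in place)")
except urllib.error.HTTPError as e:
    # With redirects disabled, 3xx responses surface here as HTTPError.
    print(OLD_URL, "->", e.code, "Location:", e.headers.get("Location"))
```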

I hope I have managed to explain the problem you are facing. If not, please describe your problem in detail here, and if I am missing any point, feel free to let me know.

Thank You.

Unknown · February 7, 2018 at 10:01 pm

I am having a similar issue – the error is because we switched hosting providers and the URL is outdated – but I've set up 301 redirects for all URLs, including the one in question. I attempted to have the URL removed, but it says it's still live – however when I navigate to it, it redirects to the new page. It's strange, because I'm only getting the error for one URL, but there are over 40 that have been 301 redirected.
