The “new” Google Search Console rolled out to webmasters in 2018. You may have received a message called “Introducing the new Search Console (beta)” when this occurred or read about it on Google’s blog here.
Since then, a scary warning has gone out to thousands of webmasters. That warning, or error, reads “New Index coverage issue detected for site www.YourDomain.com” and reports a newly found issue such as “Submitted URL blocked by robots.txt”.
It looks like this:
What does “New Index coverage issue detected for site” mean?
This could mean several things, but in most cases your sitemap lists a file or folder, such as “/images/”, that is blocked by your robots.txt file. The robots.txt file tells search engines not to crawl parts of a website. In many cases a robots.txt file will block all sorts of things, such as an admin login screen. Wannabe hackers are always scanning the web for these pages, and keeping them out of search results prevents unnecessary hack attempts.
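To picture the conflict, here is a minimal, hypothetical robots.txt showing how this happens (the paths below are examples only; your site’s rules will differ):

```
# Example robots.txt -- paths are illustrative, not a recommendation
User-agent: *
Disallow: /wp-admin/     # keeps the admin login screen out of search results
Disallow: /images/       # blocks crawling of the images folder

Sitemap: https://www.YourDomain.com/sitemap.xml
```

If the sitemap referenced on the last line includes any URL under “/images/”, Google sees a submitted URL that it is also told not to crawl, and Search Console reports “Submitted URL blocked by robots.txt”.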
It is, however, always wise to click “Fix index coverage issues” to see which resources have been blocked. Every time I have seen this message, the blocked resource was a single irrelevant file or folder, and I safely ignored it. One error like this is not going to tank your search rankings or harm your web presence in any way, shape, or form.