How to Troubleshoot the Coverage Warning "Indexed, though blocked by robots.txt" | Google Search Console
Before getting into the main part of this discussion, it is good to understand robots.txt a little, just as a reference, because if we want to fix errors or warnings related to robots.txt, the most basic thing is to understand its basic function.
What does robots.txt do?
This "robot" is quite a mystery, because no one has ever seen its real form, and that mysterious existence leaves beginner bloggers a bit confused. To explore millions of websites around the world, search engines like Google use the robots.txt file to learn how they should treat each part of a website. When a search engine like Google visits our website, the first thing it fetches is the robots.txt file, so the crawler knows what to do: which parts it may crawl and which parts it may not.
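To make this concrete, here is a minimal sketch of how a crawler applies robots.txt rules, using Python's standard urllib.robotparser module. The rules and the blog URL below are hypothetical examples, similar in shape to what a Blogger blog uses:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, similar to Blogger's defaults:
# block the /search section, allow everything else.
rules = """User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A label/search page matches "Disallow: /search" -> not crawlable.
print(rp.can_fetch("*", "https://example.blogspot.com/search/label/News"))

# A regular post matches "Allow: /" -> crawlable.
print(rp.can_fetch("*", "https://example.blogspot.com/2023/01/my-post.html"))
```

This is exactly the kind of check Googlebot performs before crawling each URL on the blog.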
Actually, this article is just here to answer the curiosity of beginner bloggers like me, who are often puzzled by the appearance of the warning "Indexed, though blocked by robots.txt". The second reason this article was created is that quite a lot of people search for this keyword; that is why this article appears.
According to Google's help documentation, which I have read dozens of times on this issue, the answer is quite simple: if the warning appears, you can just ignore it, because it comes built into the blog by default.
What is the impact on the blog?
So far no one has complained about any real effect from "Indexed, though blocked by robots.txt". But it does not feel good to open Google Search Console, go to Coverage, and see a yellow "Valid with warnings" status; something feels off, less pleasant to look at. When I first started blogging and saw a warning like that, I panicked instantly, turned into a ninja, opened a new tab, and kept typing searches like "what is it, how to set it, how to check it, how to fix indexed though blocked by robots.txt"; some people even search with keywords like "how to see blah blah blah". Speaking from experience :), stay calm.
Why does the warning appear?
To answer the questions on everyone's mind, there is no harm in looking a little deeper. The warning appears because of the meta tags used, and because label/search pages, which the blog's default robots.txt is supposed to block, end up indexed anyway.
Those sections are not supposed to be indexed; it is a different story if regular pages and other sections show the warning status.
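For reference, the default robots.txt that Blogger serves typically looks something like the fragment below (the sitemap URL is a placeholder for your own blog's address). The "Disallow: /search" line is the one responsible for blocking label and search pages:

```txt
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```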
How to Fix "Indexed, though blocked by robots.txt"?
Steps:
1. Log in to the Blogger dashboard.
2. Select Theme > Edit HTML.
3. In the HTML editor, look for the code (generally located between …..); remove that code from the template, then save the template.
4. Once it is saved, return to Google Search Console and run the validation.
5. Done.
Now just wait for validation from Google; the warning will usually disappear within the next few days.
Is there another way?
There is, and it is quite simple; you do not need to go digging through the theme, just fiddle with the settings on Blogger:
Steps:
1. Since we are already on the dashboard,
2. Go to Settings.
3. Find Custom robots.txt.
4. Click Edit on that section.
5. Look for the line "Disallow: /search"; once you have found it, change the word Disallow to Allow.
Is it done? Finished!
Perform the validation steps and check the results over the next few days; usually the validation process completes in less than 3 days, and the warning will disappear.
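For reference, after the edit in step 5 above, the Custom robots.txt would look something like this (the sitemap URL is a placeholder for your own blog). Note that allowing /search means label and search pages become crawlable, which is what clears the warning:

```txt
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Allow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```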
Is this fix really necessary?
Maybe some bloggers are perfectionists, so anything disturbing and not pleasing to the eye must be corrected immediately. So far, I myself have never experienced decreased traffic or any other negative impact from this issue. So, is it necessary or not? In my personal opinion, this fix is only a precaution against unwanted problems in the future.