Resolving Blocked Resources in Google Search Console

There are several causes of blocked resources in Search Console. People often ignore the symptoms, but blocked resources have real side effects: they can hurt how Google renders and evaluates pages in the SERPs, lowering the ranking of the user's website. The problem should therefore be addressed promptly so it does not cause further damage.
A website that suddenly loses around fifteen percent of its visitors may be suffering from an increase in blocked resources, and Search Console sometimes notifies users by email when the issue escalates.
There are several ways to fix this. None of them works instantly, and each requires some care, but once the cause is identified, the problem can be resolved.
For example, if robots.txt blocks Googlebot from crawling a page's resources, you will see it immediately in the "Blocked Resources" section. If it is only a page or two, that is usually fine. But when hundreds of pages are affected, it becomes a real problem and can explain why visitor numbers have been dropping. Let's go through the ways to fix it, which I describe below.
- Review the website's robots.txt – The most common cause of this problem is an error in robots.txt. Check that the file is working properly by using the robots.txt Tester included in Search Console. The most important thing is to configure it so that Googlebot can crawl the website's content pages and their resources.
- Check external scripts – Examples are ad code or other third-party content placed on the user's website. The site hosting those scripts may not allow Google to crawl them, so a site that embeds them will also show them as blocked resources.
- Remove iframe tag code – This is much the same as the previous point: ads are sometimes served through an iframe, and the site serving the content inside that frame may not allow Googlebot to crawl it.
- Check image URL sources – Hotlinking images from another website into your articles can also cause this, for the same reason as above: the site hosting the image may disallow Googlebot in its robots.txt. So from now on, if you want to use images from other websites, upload them to your own website first. And don't forget that images are also protected by copyright.
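The robots.txt checks above can be sketched with Python's standard library. This is a minimal, illustrative example, not Google's own logic: the rules and URLs are made-up placeholders, and `RobotFileParser` interprets robots.txt the way Python does, which closely tracks how Googlebot reads `Disallow` rules.

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt_lines, url):
    """Check whether Googlebot may fetch url under the given robots.txt rules."""
    rp = RobotFileParser()
    rp.parse(robots_txt_lines)  # parse rules from a list of text lines
    return rp.can_fetch("Googlebot", url)

# Example: a robots.txt that blocks the /private/ directory for all agents.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]
print(googlebot_allowed(rules, "https://example.com/index.html"))    # True
print(googlebot_allowed(rules, "https://example.com/private/a.js"))  # False
```

If the second check prints `False` for a script, stylesheet, or image your pages embed, that resource will show up as blocked in Search Console.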
When the user opens the Blocked Resources section, Search Console displays a list of the hosts affected by this problem. To find out which source is causing it, click one of the entries and it will show the URL responsible. Then inspect the user's website for anything referencing that URL and delete it if found.
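Searching a page for references to the blocking host can be done by hand, but a small script helps when many pages are involved. The sketch below, using only Python's standard library, lists the external hosts behind `script`, `iframe`, and `img` tags in a page's HTML so you can match them against the URL reported in Search Console. The HTML and host names here are placeholders.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalResourceFinder(HTMLParser):
    """Collect hosts of external scripts, iframes, and images in a page."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.external = set()

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "iframe", "img"):
            src = dict(attrs).get("src")
            if src:
                host = urlparse(src).netloc
                # Relative URLs have no netloc; skip those and our own host.
                if host and host != self.own_host:
                    self.external.add(host)

html = '<img src="https://cdn.example.net/pic.png"><script src="/local.js"></script>'
finder = ExternalResourceFinder("mysite.example")
finder.feed(html)
print(sorted(finder.external))  # ['cdn.example.net']
```

Any host this prints that also appears in the Blocked Resources report is a candidate for removal or replacement.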
If the culprit is an ad, it is harder to track down other than by removing the ad code directly. To confirm whether the problem has been resolved, you usually have to wait up to three days, because the Blocked Resources report shows statistics with roughly a three-day delay. The count will not drop to zero immediately, but it should decrease day by day, which is a good sign.
Hopefully this is useful. Good luck!