Google sent out this warning via Search Console, reminding site owners that Googlebot’s inability to access those files may result in “suboptimal rankings”.
That sounds bad, but the good news is there’s an easy fix, and implementing it may end up helping your site.
Here is the full warning:
“Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.”

Blocking CSS and JavaScript has been a Google no-no since it was written into the Webmaster Guidelines last October. It’s only recently that the company has been issuing warnings about it.
If your site has been blocking Googlebot from accessing those files, then it’s a good thing you know about it so you can deal with the issue.
There’s an easy fix for it, which involves editing your site’s robots.txt file. If you’re comfortable editing that file, then go ahead with this fix.
Look through the robots.txt file for any of the following lines of code:
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.php$
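In Google’s robots.txt syntax, * matches any run of characters and $ anchors the end of the URL, which is why a rule like Disallow: /*.js$ blocks every URL ending in .js. Here is a minimal Python sketch of that matching logic; the pattern_to_regex helper is illustrative only and ignores Google’s Allow/Disallow precedence rules:

import re

# Illustrative only: converts a Google-style Disallow path into a regex.
# '*' becomes '.*' and a trailing '$' becomes an end-of-string anchor.
def pattern_to_regex(disallow_path):
    anchored = disallow_path.endswith("$")
    body = disallow_path[:-1] if anchored else disallow_path
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

rule = pattern_to_regex("/*.js$")
for path in ("/assets/app.js", "/assets/app.js?v=2", "/index.html"):
    print(path, "BLOCKED" if rule.match(path) else "allowed")

Note that the $ anchor means the rule does not match a script URL with a query string, which is one reason these patterns behave differently than people expect.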
If you see any of those lines, remove them. That’s what’s blocking Googlebot from crawling the files it needs to render your site the way other users see it.

The next step is to run your site through Google’s Fetch and Render tool, which will confirm whether or not you fixed the problem.
If Googlebot is still being blocked, the tool will provide further instructions on changes to be made to the robots.txt file.
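If a Disallow rule can’t simply be deleted (for example, you still want PHP URLs blocked for other reasons), one common change is to explicitly allow the asset types for Googlebot instead. Here is a minimal robots.txt sketch of that approach; it is an assumption about your setup, not Google’s verbatim recommendation:

# Sketch: explicitly allow script and style assets for Googlebot.
# Google generally gives the most specific matching rule precedence,
# so these Allow lines can win over a broader Disallow.
User-agent: Googlebot
Allow: /*.js$
Allow: /*.css$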
In addition, you can use the robots.txt Tester in Search Console to identify any other crawling issues.
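For a quick local pre-check before reaching for the tester, you can also scan a robots.txt file for Disallow rules that touch script and style assets. A small Python sketch follows; example.com is a placeholder for your own site, and the regex is a rough heuristic, not a full robots.txt parser:

import re
import urllib.request

# Flags Disallow rules that appear to target .js/.css/.inc/.php URLs.
BLOCKING = re.compile(r"^\s*Disallow:\s*\S*\.(js|css|inc|php)\b", re.I)

def find_blocking_rules(robots_url):
    with urllib.request.urlopen(robots_url) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    return [line.strip() for line in text.splitlines() if BLOCKING.match(line)]

# Point this at your own site's robots.txt.
for rule in find_blocking_rules("https://example.com/robots.txt"):
    print("Potentially blocking rule:", rule)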
If anything is unclear, or you know a better way to handle this, please leave a comment below.