The URL in question is a CSS file that the site's robots.txt file blocks search engines from accessing.
Why is this important?
Search engines like Google aim to emulate the user experience by not just crawling pages but also rendering them. This includes CSS files, which control the visual presentation of your webpage. If these files are inaccessible because of robots.txt restrictions, rendering can fail or be incomplete, which can negatively affect ranking signals such as mobile-friendliness.
What does the Optimization check?
The alert is triggered when an internal CSS file URL is blocked by the site's robots.txt file.
Examples that trigger this Optimization
Assume the following CSS file URL: https://example.com/assets/styling.css
This alert would be generated if the site's robots.txt file contains the following disallow rule, which blocks search engine access:
User-agent: *
Disallow: /assets/
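If you want to double-check whether a specific URL is blocked, one quick way outside the tool is Python's standard-library robots.txt parser. This is only a sketch using the example URLs above; note that this parser follows the standard robots.txt rules rather than Google's exact matching logic, so Google's own robots.txt testing tools remain authoritative.
from urllib.robotparser import RobotFileParser
# Point the parser at the site's robots.txt (example URL from above)
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()
# False means the CSS file is disallowed for all user agents ("*")
print(rp.can_fetch("*", "https://example.com/assets/styling.css"))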
How do you resolve this issue?
Because this issue is classified as 'Critical', it should be resolved quickly: it can have a significant negative impact on organic search traffic. To correct it, review the robots.txt file, locate the rule that disallows the CSS file, and remove or adjust that rule (see the example below).
Reviewing the Optimization details and the URL List will reveal the specific robots.txt disallow rule causing the issue.
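If the blocked CSS file is the only problem, simply deleting the Disallow line is enough. If the directory needs to stay disallowed for other reasons, one option is to add an Allow rule for CSS files alongside the existing Disallow rule. The example below continues the /assets/ scenario above and uses the * and $ wildcards, which Googlebot supports but some other crawlers may not:
User-agent: *
Disallow: /assets/
Allow: /assets/*.css$
Because Google applies the most specific (longest) matching rule, the Allow rule takes precedence for CSS files while the rest of the directory remains blocked.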