This means that the URL in question is an image URL that is blocked by a robots.txt disallow rule.
Why is this important?
Googlebot now crawls and renders content as its default behavior. Rendering involves constructing the page and visualizing it as it would appear in a web browser. To do this, the bot must have access to all the resource files that contribute to building the page.
If Googlebot is prevented from accessing these resource URLs by a robots.txt file, it might not render the website's content accurately. Given that rendering plays a crucial role in several of Google's algorithms, particularly the 'mobile-friendly' algorithm, an inability to render content might negatively affect search engine rankings.
What does the Optimization check?
The Optimization is triggered by any internal image URLs that are blocked by a disallow directive in the robots.txt file.
Examples that trigger this Optimization
Take, for example, the image URL: https://example.com/images/image1.jpg
This Optimization would be activated if the site's robots.txt file contains a disallow directive preventing search engines from accessing the URL:
User-agent: *
Disallow: /images/
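If you want to check a rule like this yourself before (or after) editing robots.txt, Python's standard-library robots.txt parser can evaluate a URL against a set of directives. This is a minimal sketch using the example rules and URL above; the rule set and URLs are illustrative, not your site's actual robots.txt:

```python
# Sketch: check whether a robots.txt disallow rule blocks a given URL.
# The rules below mirror the example above; substitute your own robots.txt content.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /images/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard (*) group, so the image URL is blocked.
print(parser.can_fetch("Googlebot", "https://example.com/images/image1.jpg"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))         # True
```

Note that the standard-library parser follows the original robots.txt conventions, so results for more exotic directives may differ slightly from Googlebot's own matching behavior.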
How do you resolve this issue?
Marked as 'Critical', this Optimization indicates a significant error that could drastically affect organic search performance. Critical issues should be prioritized and resolved as soon as possible.
To fix the issue, locate and remove the specific robots.txt rule(s) that are blocking the image URL.
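If other files under the same path genuinely need to stay blocked, removing the rule outright may not be an option. Google's robots.txt handling supports the Allow directive with longest-match precedence, so one possible approach is to add an explicit exception for the image (or image directory) while keeping the broader disallow. The paths below are illustrative, matching the earlier example:

```
User-agent: *
Disallow: /images/
Allow: /images/image1.jpg
```

Because the Allow rule is more specific (longer) than the Disallow rule, Googlebot can fetch image1.jpg while the rest of /images/ remains blocked. Verify the outcome with a robots.txt testing tool before relying on it.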
Navigating from the Optimization to the URL List displays the robots.txt disallow rule causing the issue for the URL.