The referenced URL is an AMP page URL that is currently blocked by the site's robots.txt file.
Why is this important?
Disallowing an AMP page URL in robots.txt prevents search engines from crawling it. A search engine cannot validate or serve an AMP page it cannot crawl, so the page will not appear as an AMP result in search.
What does the Optimization check?
This Optimization is triggered for any AMP page URL that is disallowed in robots.txt.
Examples that trigger this Optimization
Take this AMP URL as an example: https://example.com/page-a/amp/
If the URL is blocked by a rule in robots.txt, the Optimization will be triggered. The rule might look like this:

User-agent: *
Disallow: /page-a/amp/
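You can confirm that a rule like this blocks the AMP URL using Python's standard-library robots.txt parser. This is a minimal sketch; the robots.txt contents and URLs are the example values from this section, not fetched from a live site:

```python
from urllib import robotparser

# Example robots.txt contents matching the rule above.
rules = """\
User-agent: *
Disallow: /page-a/amp/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Any crawler honoring these rules may not fetch the AMP URL.
blocked = not parser.can_fetch("Googlebot", "https://example.com/page-a/amp/")
print(blocked)  # True: the AMP URL is disallowed for all user agents
```

The same `can_fetch` call returns True for paths the rule does not match, which makes it a quick way to spot-check a robots.txt edit before deploying it.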
How do you resolve this issue?
To make AMP URLs indexable, they must be crawlable by search engines. Edit the robots.txt file to remove, or narrow, any Disallow directives that match AMP URLs, then re-test the file before deploying it.
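For the example rule in this section, the simplest fix is to delete the Disallow line. If a broader rule must stay in place, an Allow directive for the AMP path (supported by major crawlers such as Googlebot and Bingbot) can carve out an exception. A sketch, using the hypothetical /page-a/ paths from above:

```
# Before: blocks the AMP page
User-agent: *
Disallow: /page-a/amp/

# After: the AMP page is crawlable again
User-agent: *
Allow: /page-a/amp/
```

Note that when Allow and Disallow rules both match a URL, major crawlers apply the most specific (longest) matching rule, so a more specific Allow can override a broader Disallow.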