The URL in question carries a noindex directive both in the page's HTML and in the HTTP header that accompanies it.
Why is this important?
Declaring robots directives only once per URL minimizes the scope for error: there is a single, unambiguous place to check and update, which reduces the chance of human mistakes in the configuration.
In this case, a noindex directive is present both as an X-Robots-Tag in the HTTP header and as a meta noindex within the HTML <head> section of the page.
Although the two directives currently match, the duplication can cause problems if you later try to change the page's index status. If you update the HTML meta noindex but forget the HTTP header directive, the page will continue to tell search engines not to index it.
According to Google's official guidance, when robots directives conflict, Google applies the more restrictive option, and other search engines are likely to behave the same way.
To preclude such issues, it's essential to centralize the location of your robots directives.
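One common way to centralize is to keep the meta noindex in the HTML and strip the duplicate header at the server level. A minimal sketch for Apache (assuming mod_headers is enabled; adjust for your own server and scoping needs):

```apache
# Hypothetical Apache example: remove the duplicate X-Robots-Tag header
# so the <meta name="robots"> tag in the HTML is the single source of truth.
Header unset X-Robots-Tag
```

On nginx or a CDN the equivalent would be done in that platform's own header configuration; the principle is the same, one authoritative location per directive.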
What does the Optimization check?
This Optimization is triggered by any internal URL that includes noindex directives in both the HTML and the HTTP header.
Examples that trigger this Optimization
This Optimization is triggered by any URL that meets both of the conditions below:
Meta noindex in the <head>:
<!doctype html>
<html lang="en">
<head>
  <title>Example Title</title>
  <meta name="robots" content="noindex,nofollow">
  ...
</head>
<body>...</body>
</html>
AND also in the HTTP header:
HTTP/... 200 OK
...
X-Robots-Tag: noindex
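The two conditions above can be checked programmatically. The sketch below is illustrative, not part of the tool itself: it assumes you already have the response headers as a dict and the HTML body as a string (for example, fetched with any HTTP client), and it flags a URL only when noindex appears in both places.

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "meta" and attr_map.get("name", "").lower() == "robots":
            self.directives.append(attr_map.get("content", "").lower())


def has_duplicate_noindex(headers, html):
    """Return True when noindex is present both in the X-Robots-Tag
    HTTP header and in a robots meta tag -- the condition this
    Optimization flags."""
    # Header names are case-insensitive, so match X-Robots-Tag loosely.
    header_value = ""
    for name, value in headers.items():
        if name.lower() == "x-robots-tag":
            header_value = value.lower()
    header_noindex = "noindex" in header_value

    # Parse the HTML for <meta name="robots"> directives.
    parser = RobotsMetaParser()
    parser.feed(html)
    meta_noindex = any("noindex" in d for d in parser.directives)

    return header_noindex and meta_noindex
```

A real crawler would also need to handle comma-separated X-Robots-Tag values with user-agent prefixes and meta tags outside the head, but this captures the basic duplicate-directive check.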
Why is this Optimization marked 'Potential Issue'?
This Optimization is marked as a 'Potential Issue' because it is unlikely to be affecting the site right now, but it warrants attention since it may cause problems in the future.
Setting robots directives more than once is rarely a deliberate choice, so Loud Interactive flags it to help you head off future problems. Fixing the issue may require technical assistance to modify page templates, plugins, or HTTP headers, removing the redundancy so that robots directives are defined in a single place.