This issue is triggered when a URL has one robots directive in its HTML and another in the HTTP header, and the two do not align.
Why is this important?
When one signal says 'index' while another says 'noindex', the page will typically be treated as 'noindex', overriding any other efforts made to get the page indexed. Google states that when directives conflict, it will pick the most restrictive option, and other search engines most likely behave the same way.
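As a rough sketch of that 'most restrictive wins' behaviour (the function name and inputs here are illustrative, not taken from any particular search engine):

def effective_indexing(signals: list[str]) -> str:
    # Illustrative only: when robots directives conflict, a single
    # 'noindex' anywhere wins over any number of 'index' signals.
    return "noindex" if any("noindex" in s.lower() for s in signals) else "index"

# The conflicting pair from the examples below resolves to 'noindex'.
print(effective_indexing(["index,nofollow", "noindex,nofollow"]))  # -> noindex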
The recommendation is to declare robots directives only once per URL; redundant declarations are exactly what makes conflicts like the one above possible.
What does the Optimization check?
Loud Interactive's Optimization will trigger for any internal URL where the index/noindex directive in the HTML does not match the one in the HTTP header.
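A minimal sketch of how such a check could work, assuming the third-party 'requests' and 'beautifulsoup4' libraries (the URL is a placeholder):

import requests
from bs4 import BeautifulSoup

def robots_conflict(url: str) -> bool:
    # Return True if the meta robots tag and the X-Robots-Tag
    # header disagree about index/noindex.
    resp = requests.get(url, timeout=10)

    # Directive from the HTTP header, if any.
    header = resp.headers.get("X-Robots-Tag", "")
    header_noindex = "noindex" in header.lower()

    # Directive from the HTML <head>, if any.
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_content = (meta.get("content") or "").lower() if meta else ""
    meta_noindex = "noindex" in meta_content

    # It is only a conflict if both places declare a directive
    # and they disagree.
    return bool(header) and meta is not None and header_noindex != meta_noindex

print(robots_conflict("https://example.com/"))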
Examples that trigger this Optimization
Any URL would trigger the Optimization if it had both of the following. For instance, a meta noindex within the <head>:
<!doctype html>
<html lang="en">
<head>
  <title>Your Example</title>
  <meta name="robots" content="noindex,nofollow">
  ...
</head>
<body>...</body>
</html>
AND an index directive present in the HTTP header:
HTTP/... 200 OK
...
X-Robots-Tag: index,nofollow
Conversely, it would also trigger if there were:
A meta index within the <head>:
<!doctype html>
<html lang="en">
<head>
  <title>Your Example</title>
  <meta name="robots" content="index,nofollow">
  ...
</head>
<body>...</body>
</html>
AND a noindex directive in the HTTP header:
HTTP/... 200 OK
...
X-Robots-Tag: noindex,nofollow
How do you resolve this issue?
To fix the conflict, work out which directive is correct and update the incorrect instance so the two match. To stop the problem recurring, adjust your page templates so that robots directives are set through a single mechanism only.
These changes may require developer input, since they can involve editing page templates, plugins, or HTTP header configuration to ensure robots directives are declared only once, as in the sketch below.
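As a rough illustration of a single point of declaration, assuming a Python/Flask application (the route, template, and directive values are all hypothetical):

from flask import Flask, make_response, render_template_string

app = Flask(__name__)

# The template deliberately contains no meta robots tag.
TEMPLATE = """<!doctype html>
<html lang="en">
<head><title>Your Example</title></head>
<body>...</body>
</html>"""

@app.route("/example")
def example():
    resp = make_response(render_template_string(TEMPLATE))
    # Single source of truth: the directive is set only in the
    # HTTP header and never duplicated in the HTML.
    resp.headers["X-Robots-Tag"] = "noindex,nofollow"
    return resp

Whether the header or the meta tag is the better single home for the directive depends on your stack; the point is to pick one and remove the other.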