The URL has a nofollow directive in both its HTML markup and its HTTP response headers.
Why is this important?
As a best practice, a URL should declare its robots directives in only one place, because multiple declarations increase the chance of mistakes.
In this case, your URL declares nofollow in the X-Robots-Tag HTTP header as well as in a robots meta tag in the head section of your HTML document.
The two directives agree today, but if you later want to switch the URL from nofollow to follow, the change can fail if the HTTP headers are overlooked while the HTML templates or configuration are being updated.
Despite your changes, the page could then remain 'nofollow' without you realizing it. Google has stated that when directives conflict, it follows the more restrictive one, and other search engines generally do the same.
To avoid this, set robots directives in a single location only.
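As a rough illustration of the "more restrictive wins" behaviour described above, the sketch below combines directive strings from several sources. The simplified rules are an assumption made for illustration, not search engines' actual logic.

def resolve_robots_directives(*sources):
    """Combine directive strings (for example from a robots meta tag and an
    X-Robots-Tag header), keeping the more restrictive outcome."""
    directives = set()
    for source in sources:
        for token in source.lower().split(","):
            directives.add(token.strip())
    # Simplified assumption: noindex beats index, nofollow beats follow.
    if "noindex" in directives:
        directives.discard("index")
    if "nofollow" in directives:
        directives.discard("follow")
    return directives

# The HTML says "index,follow", but the header says "nofollow":
# the page effectively remains nofollow.
print(resolve_robots_directives("index,follow", "nofollow"))
# prints a set containing "index" and "nofollow"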
What does the Optimization check?
This Optimization triggers for any internal URL that declares nofollow in both its HTML and its HTTP headers.
Examples that trigger this Optimization
The Optimization is triggered when a URL includes:
A nofollow meta tag in the head section:

<!DOCTYPE html>
<html lang="en">
<head>
  <title>Example Page</title>
  <meta name="robots" content="noindex,nofollow">
  ...
</head>
<body>...</body>
</html>
PLUS a nofollow directive in the HTTP header:
HTTP/... 200 OK
...
X-Robots-Tag: nofollow
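If you want to reproduce this check for a single URL, a minimal sketch along these lines could work. It assumes Python with the requests library; the URL at the bottom is a hypothetical placeholder.

from html.parser import HTMLParser

import requests


class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.values = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.values.append(attrs.get("content") or "")


def find_nofollow_sources(url):
    """Return the locations that declare nofollow for the given URL."""
    response = requests.get(url, timeout=10)
    sources = []

    # Location 1: the X-Robots-Tag response header.
    if "nofollow" in response.headers.get("X-Robots-Tag", "").lower():
        sources.append("X-Robots-Tag header")

    # Location 2: a robots meta tag in the HTML.
    parser = RobotsMetaParser()
    parser.feed(response.text)
    if any("nofollow" in value.lower() for value in parser.values):
        sources.append("robots meta tag")

    return sources


# The hint fires when both locations declare nofollow for the same URL.
sources = find_nofollow_sources("https://example.com/some-page")  # hypothetical URL
if len(sources) > 1:
    print("nofollow declared in more than one place:", sources)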
Why is this Optimization marked 'Potential Issue'?
Loud Interactive flags this hint as a 'Potential Issue' because, while it is not causing harm right now, it needs attention to prevent problems later.
Duplicate robots directives are often unintentional, so they are worth reviewing to rule out any risk. You may need a developer's help to adjust HTTP headers, page templates, and plugins so that the directives are defined in only one place.