A URL that receives the nofollow directive from more than one source (e.g., in both the HTML markup and the HTTP header) can complicate SEO configuration and maintenance.
Why is this important?
Setting each robots directive in a single place is standard practice because it prevents errors. Consolidated directives remove ambiguity and make SEO management straightforward.
Picture this scenario: you install an SEO plugin that can set robots directives and use it to mark a particular page as nofollow. Later, a second plugin with similar functionality is added, and it sets nofollow on the same page, duplicating the directive.
At first this causes no problem, because the two directives do not conflict. However, if you later decide to make the page follow and update only one of the plugins, the page remains nofollow because of the overlooked setting in the second plugin.
As a result, the page's links would still not be followed, contradicting your intention. This follows from Google's behaviour of applying the most restrictive directive when it encounters conflicting signals. Keeping directives in a single place ensures search engines interpret your preferences as you intend.
What does the Optimization check?
The Optimization identifies any internal URL that receives a nofollow directive from more than one source, whether in the HTML (meta robots tags) or in the HTTP response header (X-Robots-Tag).
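If you want to spot-check a single page yourself, its nofollow sources can be inspected with a short script. The sketch below is illustrative only and is not how the Optimization itself is implemented; it assumes the Python 'requests' library is available, uses a hypothetical example URL, and only looks at meta robots tags and the X-Robots-Tag header.

import re
import requests

def nofollow_sources(url):
    """Return a list describing where a nofollow directive was found (illustrative sketch)."""
    sources = []
    response = requests.get(url)

    # Check the X-Robots-Tag header in the HTTP response (may hold several comma-separated values).
    for value in response.headers.get("X-Robots-Tag", "").split(","):
        if "nofollow" in value.strip().lower():
            sources.append("HTTP header: X-Robots-Tag: " + value.strip())

    # Check every meta robots tag in the HTML (simple regex; assumes name comes before content).
    for content in re.findall(
            r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
            response.text, re.IGNORECASE):
        if "nofollow" in content.lower():
            sources.append("HTML meta tag: " + content)

    return sources

# Example usage with a placeholder URL.
sources = nofollow_sources("https://example.com/some-page/")
if len(sources) > 1:
    print("nofollow is set in more than one place:")
    for source in sources:
        print(" - " + source)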
Examples that trigger this Optimization
Triggering examples include:
Duplicate nofollow meta tags in the <head>:
<!doctype html>
<html lang="en">
<head>
    <title>Sample Title</title>
    <meta name="robots" content="index,nofollow">
    ...
    <meta name="robots" content="noindex,nofollow">
</head>
<body>...</body>
</html>
OR a single nofollow meta tag in the <head>:
<!doctype html>
<html lang="en">
<head>
    <title>Sample Title</title>
    <meta name="robots" content="index,nofollow">
    ...
</head>
<body>...</body>
</html>
AND an HTTP header directive:
HTTP/... 200 OK
...
X-Robots-Tag: nofollow
Why is this Optimization marked 'Potential Issue'?
This Optimization is marked 'Potential Issue' because the duplicate directives are not causing a problem right now, but they warrant investigation before they do. Duplication of this kind is rarely intentional, so it is worth auditing how each directive is being set. Resolving it may require technical work to adjust templates, plugins, or server settings so that each URL states its robots directives in a single, unambiguous place.
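For example, if the duplicate nofollow turns out to come from an X-Robots-Tag header added at the server level, and you want the page-level meta tag to remain the single source of truth, the header could be removed in the server configuration. The snippet below is a minimal sketch assuming Apache with mod_headers enabled; the equivalent change in Nginx, a CDN, or a plugin's settings would look different.

# Hypothetical Apache (mod_headers) example: stop sending the duplicate
# X-Robots-Tag header so the meta robots tag is the only nofollow source.
<IfModule mod_headers.c>
    Header unset X-Robots-Tag
</IfModule>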