Brent D. Payne

Multiple noindex directives

At Loud Interactive, we have observed that a URL can carry a noindex directive in more than one location, such as in both the HTML and the HTTP header.


Why is this important?

Best practice is to declare robots directives only once per URL, which reduces the chance of human error.


Consider a scenario where you integrate an SEO tool into your website that lets you configure robots directives. You set a page to noindex. Later, another tool with the same capability is added, and you mark the page as noindex there as well.

If you later choose to index this page and modify only one tool's settings, you might overlook the other's configuration. Despite your intention, the page would remain noindexed.


To avoid these issues, it's crucial to ensure that robots directives are declared just once. Google emphasizes that when directives conflict, it applies the most restrictive option, and other search engines can be expected to behave the same way.
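To make that resolution concrete, here is a minimal Python sketch of how the most restrictive outcome falls out for the index/follow pair. It assumes the directive strings have already been collected from the HTML and the HTTP header, and it illustrates the principle rather than any search engine's actual implementation.

def effective_robots(declarations: list[str]) -> str:
    """Resolve multiple robots declarations to the most restrictive result."""
    # Pool every token from every declaration, wherever it was found.
    tokens = {t.strip().lower() for d in declarations for t in d.split(",")}
    # The restrictive token wins each pair: noindex beats index,
    # and nofollow beats follow.
    index_part = "noindex" if "noindex" in tokens else "index"
    follow_part = "nofollow" if "nofollow" in tokens else "follow"
    return f"{index_part},{follow_part}"

# Two conflicting declarations resolve restrictively:
print(effective_robots(["noindex,nofollow", "noindex,follow"]))  # noindex,nofollow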


What does the Optimization check?

This Optimization is triggered when an internal URL declares noindex more than once, whether in the HTML, in the HTTP header, or in both.


Examples that trigger this Optimization

Either of the following, for instance, would trigger this Optimization:


Duplicate meta noindex tags in the <head> section:

<!doctype html>
<html lang="en">
<head>
  <title>example</title>
  <meta name="robots" content="noindex,nofollow">
  ...
  <meta name="robots" content="noindex,follow">
</head>
<body>...</body>
</html>


OR a meta noindex in the <head>,

<!doctype html>
<html lang="en">
<head>
  <title>example</title>
  <meta name="robots" content="noindex,nofollow">
  ...
</head>
<body>...</body>
</html>


AND within the HTTP headers:

HTTP/... 200 OK
...
X-Robots-Tag: noindex
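As an illustration of how such cases can be spotted, here is a minimal sketch (not Loud Interactive's actual crawler) that fetches a URL and reports every location where a noindex directive is declared. The example URL is hypothetical, and the third-party requests and beautifulsoup4 packages are assumptions of this sketch.

import requests
from bs4 import BeautifulSoup

def find_noindex_declarations(url: str) -> list[str]:
    """Collect every location where a noindex directive is declared."""
    found = []
    response = requests.get(url, timeout=10)

    # HTTP header: requests folds repeated X-Robots-Tag headers into one
    # comma-separated value, so a single check covers duplicates there too.
    header = response.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        found.append(f"HTTP header -> X-Robots-Tag: {header}")

    # HTML: every <meta name="robots"> tag in the document.
    # (Exact-case attribute match; real crawlers also normalize case.)
    soup = BeautifulSoup(response.text, "html.parser")
    for tag in soup.find_all("meta", attrs={"name": "robots"}):
        content = tag.get("content", "")
        if "noindex" in content.lower():
            found.append(f'HTML -> <meta name="robots" content="{content}">')
    return found

declarations = find_noindex_declarations("https://example.com/page")  # hypothetical URL
if len(declarations) > 1:
    print("Multiple noindex directives found:")
    for d in declarations:
        print(" -", d)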


Why is this Optimization marked 'Potential Issue'?

At Loud Interactive, we mark this as a 'Potential Issue': while it may not currently impact the site, it warrants investigation to prevent possible future complications.


Redundant robots directives are often unintentional. We flag these findings so they can be corrected proactively, which may involve backend modifications, with a developer's help, to remove duplicates and leave a single, accurate directive declaration.
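As one hedged illustration of that cleanup, assuming the duplication lives in generated HTML rather than in two separate tools' settings, a script could collapse the duplicate meta robots tags. Which declaration is the accurate one is a human decision; this simply demonstrates the mechanical deduplication using the beautifulsoup4 package.

from bs4 import BeautifulSoup

def keep_single_meta_robots(html: str) -> str:
    """Drop every meta robots tag after the first, leaving one declaration."""
    soup = BeautifulSoup(html, "html.parser")
    robots_tags = soup.find_all("meta", attrs={"name": "robots"})
    for extra in robots_tags[1:]:
        extra.decompose()  # remove each duplicate from the document tree
    return str(soup)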
