Brent Payne

Multiple noindex directives

At Loud Interactive, we have observed that a URL may carry a noindex directive in more than one location, such as in the HTML and in the HTTP header.


Why is this important?

Best practice is to declare robots directives only once per URL, which reduces the chance of human error.


Consider a scenario where you integrate an SEO tool into your website that lets you configure robots directives, and you set a page to noindex. Later, another tool with the same capability is added, and you mark the page as noindex there as well.

If you later decided to index this page and updated only one tool's settings, you might overlook the other's configuration. Despite your intention, the page would remain noindexed.


To avoid these issues, it's crucial to ensure that robots directives are declared only once. Google states that when directives conflict, the most restrictive option wins, and other search engines can be expected to behave the same way.
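
To illustrate the "most restrictive wins" rule, here is a minimal Python sketch; this is our own illustration of the behavior, not Google's actual implementation:

def effective_indexing(directives):
    # Each item is the content of one meta robots tag or X-Robots-Tag header.
    # "none" is shorthand for "noindex, nofollow", so it also blocks indexing.
    for directive in directives:
        tokens = {token.strip().lower() for token in directive.split(",")}
        if "noindex" in tokens or "none" in tokens:
            return "noindex"
    return "index"

# Two tools disagree; the restrictive directive wins:
print(effective_indexing(["noindex,nofollow", "index,follow"]))  # noindex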


What does the Optimization check?

This Optimization is triggered when an internal URL contains more than one noindex directive, whether in the HTML, in the HTTP header, or spread across both.
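
As a rough sketch of the kind of check involved (assuming the third-party requests and beautifulsoup4 packages; this is not the Optimization's exact implementation), you could count the noindex sources for a URL like this:

import requests
from bs4 import BeautifulSoup

def count_noindex_sources(url):
    resp = requests.get(url, timeout=10)
    count = 0

    # HTTP headers: requests merges repeated X-Robots-Tag headers into
    # one comma-separated string, so split before matching.
    header = resp.headers.get("X-Robots-Tag", "")
    count += sum(1 for value in header.split(",") if "noindex" in value.lower())

    # HTML: inspect every <meta name="robots"> tag in the document.
    soup = BeautifulSoup(resp.text, "html.parser")
    robots_metas = soup.find_all(
        "meta", attrs={"name": lambda v: v and v.lower() == "robots"}
    )
    for tag in robots_metas:
        if "noindex" in (tag.get("content") or "").lower():
            count += 1

    return count  # anything above 1 means duplicate noindex directives

A count greater than one for any internal URL is what this Optimization flags.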


Examples that trigger this Optimization

Either of the following would trigger this Optimization:


Duplicate meta noindex tags in the <head> section:

<!doctype html>
<html lang="en">
<head>
  <title>example</title>
  <meta name="robots" content="noindex,nofollow">
  ...
  <meta name="robots" content="noindex,follow">
</head>
<body>...</body>
</html>


OR a meta noindex in the <head>,

<!doctype html>
<html lang="en">
<head>
  <title>example</title>
  <meta name="robots" content="noindex,nofollow">
  ...
</head>
<body>...</body>
</html>


AND within the HTTP headers:

HTTP/... 200 OK
...
X-Robots-Tag: noindex


Why is this Optimization marked 'Potential Issue'?

At Loud Interactive, we mark this as a 'Potential Issue': while it may not be causing problems right now, it warrants investigation to prevent future complications.


Redundant robots directives are often unintentional. We flag these findings so they can be corrected proactively, which may involve backend changes, with a developer's help, to remove the duplicates and leave a single, accurate directive declaration.
