Brent D. Payne

Internal Disallowed URLs

This refers to internal URLs that search engine bots are blocked from crawling by a disallow rule in your site's robots.txt file.


Why is this important?

When a URL is disallowed, search engines won't crawl it, which means the content on that page isn't read and the links on it aren't followed. Disallowing certain URLs can be a deliberate strategy to control what search bots crawl, keeping them out of specific parts of a site such as user-specific areas.


While disallowed URLs are often intentional, Loud Interactive brings them to your attention to ensure there are no inadvertent blocks on important content.


What does the Optimization check?

This Optimization will activate for any internal URL that is blocked by a disallow directive in the robots.txt file.
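
As a rough illustration of the kind of check involved, the sketch below uses Python's standard urllib.robotparser module to test whether internal URLs are blocked by a site's robots.txt. The example.com URLs and the wildcard user agent are assumptions for demonstration only; the actual crawler applies its own logic.

import urllib.robotparser

# Load and parse the live robots.txt file (hypothetical site).
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# Internal URLs discovered during a crawl (hypothetical examples).
internal_urls = [
    "https://example.com/members/profile",
    "https://example.com/blog/latest-post",
]

# Flag any URL that a generic bot ("*") is not allowed to fetch.
for url in internal_urls:
    if not parser.can_fetch("*", url):
        print(f"Disallowed by robots.txt: {url}")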


Examples that trigger this Optimization

Take for instance the following URL: https://example.com/members/profile

This URL would activate the Optimization if the site's robots.txt includes a rule preventing search bots from crawling it:


User-agent: *
Disallow: /members/
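
For illustration, the same rule can be checked programmatically. This small sketch parses the directives above directly (rather than fetching a live robots.txt) and confirms that the example URL would be blocked for a generic bot:

import urllib.robotparser

# Parse the example directives directly instead of fetching robots.txt.
parser = urllib.robotparser.RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /members/",
])

# Prints False: a generic bot may not crawl any URL under /members/.
print(parser.can_fetch("*", "https://example.com/members/profile"))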


Why is this Optimization marked 'Insight'?

An 'Insight' signifies that this Optimization doesn't necessarily call for immediate action when it's detected. It's informational and prompts a closer look at what's being disallowed to ensure it aligns with your search engine optimization goals. It is always good practice, though, to confirm that these disallowed URLs are intentionally blocked.


If an essential URL has been disallowed by mistake, the correction is straightforward: amend the robots.txt to remove the specific disallow directive that's causing the block.
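
For instance, if the /members/ rule from the earlier example were unintentionally blocking an important page, you could either delete that Disallow line entirely or narrow it to the specific area that genuinely needs blocking. The narrower path below is purely illustrative:

User-agent: *
Disallow: /members/account-settings/

This keeps the truly private area blocked while allowing the rest of /members/ to be crawled.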


Navigating from the Optimization to the URL List will display the specific robots.txt disallow directive related to the URL in question.
