Brent Payne

Disallowed JavaScript file

This indicates that a JavaScript file URL is being blocked due to a robots.txt disallow directive.


Why is this important?

As Googlebot has evolved into a render-capable crawler, it requires access to all of the resources that construct a page. If resources such as JavaScript files are blocked by robots.txt, Googlebot may not render the page as intended, which impacts algorithms that depend on rendering, such as the mobile-friendly assessment, and can ultimately affect search rankings.


What does the Optimization check?

Loud Interactive's audit will flag any internal JavaScript resource that is blocked due to a disallow directive in robots.txt.


Examples that trigger this Optimization

For example, assume the blocked resource is https://example.com/assets/script.js

A disallow rule in robots.txt that would trigger this Optimization:


User-agent: *
Disallow: /assets/
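
If you want to verify the effect of such a rule yourself, here is a minimal sketch using Python's standard-library robotparser. The robots.txt and script URLs are the hypothetical ones from the example above, so substitute your own site's URLs.


from urllib import robotparser

# Point the parser at the site's robots.txt
# (hypothetical URL from the example above)
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Assuming the site serves the disallow rule shown above, this prints
# False, i.e. Googlebot is not permitted to fetch the JavaScript file.
print(rp.can_fetch("Googlebot", "https://example.com/assets/script.js"))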


How do you resolve this issue?

Labeled 'Critical', this Optimization signals a significant issue that can negatively influence SERP performance and requires immediate attention. To resolve it, locate the disallow rule in robots.txt that blocks the JavaScript file and remove that specific rule.
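
If other files under the same path still need to stay blocked, one alternative, assuming your goal is only to unblock JavaScript, is to keep the disallow but add a more specific Allow rule. Google's crawler honors the most specific matching rule and supports the * wildcard, so a robots.txt like the following (using the hypothetical /assets/ path from the example above) would unblock the script while leaving the rest of the directory disallowed:


User-agent: *
Disallow: /assets/
Allow: /assets/*.js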


Reviewing the Optimization details and associated URL List will display the exact robots.txt disallow directive causing the issue.
