Site Audit: Webpage Resources Checks

March 7, 2018

Today we’re excited to let you know that we’ve added webpage resources checks to the Site Audit tool. They alert you when resources such as JavaScript, CSS, and image files are in poor shape, and advise you on how to fix the issues.

Why should you be aware of these issues?

In short, they can lead to a drop in Google rankings by causing the following problems.

Indexing failure by Googlebot

Google needs to see all of your webpage resources to fully understand and properly index your page. If you block resources such as CSS files from Googlebot, it may not even recognize that your site is mobile-friendly.

You also need to ensure that your page resources function properly if you expect them to have any effect. A script that has stopped running, or a broken CSS or image file, may not only cause indexing failures but also spoil your user experience.

Low website speed

Starting in July 2018, page speed will become a ranking factor for mobile searches. Optimizing content efficiency is critical to delivering an instant web experience, so it pays to take care of every byte on your website.

If you don’t enable caching for CSS and JavaScript files, a user’s browser has to re-download them on every visit to your page. Other issues, such as uncompressed CSS and JavaScript files, also make your page load slower, which negatively affects user experience.

Taking all of the above into account, we’ve come up with four webpage resources checks to help you provide a fast web experience for your users.

How to check your page resources with Site Audit

First, open the project for which you’ve set up Site Audit. If you don’t have one yet, head to the tool and set it up.

Next, go to the ‘Issues’ tab and click on ‘Select an Issue’ on the right. This is where the page resources checks can be accessed if they are triggered for your website.


Let’s look in detail at each check we have at your disposal.

Blocked resource in robots.txt

This check is triggered if your webpage resources are blocked from crawling by a "Disallow" directive in your robots.txt file. Click on the check’s name to see a list of blocked resources.


Check the “Why and how to fix it” section on the right to find out how to solve the issue.
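For illustration, here is what a blocking rule can look like in robots.txt, and one way to resolve it. The paths below are hypothetical; check which directories your own Disallow rules cover:

```
# These rules prevent Googlebot (and other crawlers) from fetching CSS and JS:
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/

# To unblock the resources, remove those lines or explicitly allow the paths:
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
```

Googlebot supports the "Allow" directive, so you can keep other sections disallowed while still exposing the resource folders your pages depend on.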

Broken JavaScript and CSS files

This check is triggered if Site Audit has found broken JavaScript or CSS files hosted on your website. Click on the check’s name to see a list of broken resources along with their HTTP status codes.
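As a rough sketch of what "broken" means here: any resource that comes back with a 4xx or 5xx HTTP status code cannot be loaded by the browser. The file paths and status codes below are made up for illustration:

```python
# Minimal sketch: classify resources by HTTP status code.
# A 4xx or 5xx response means the browser (and Googlebot) cannot load the file.
def is_broken(status_code: int) -> bool:
    return status_code >= 400

# Hypothetical crawl results: resource path -> status code returned
resources = {
    "/js/app.js": 200,          # loads fine
    "/css/old-theme.css": 404,  # file was deleted or moved
    "/js/tracker.js": 500,      # server-side error
}

broken = sorted(path for path, code in resources.items() if is_broken(code))
print(broken)  # → ['/css/old-theme.css', '/js/tracker.js']
```

A 404 usually means the reference in your HTML is stale, while a 5xx points to a server problem; both show up as broken resources in the report.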


To detect the other issues, you can either find the checks for them in the same ‘Select an Issue’ category or head to the ‘Overview’ tab and click on ‘Performance report’.


Uncached JavaScript and CSS files

This issue is triggered if browser caching is not specified in the response header.
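"Specified in the response header" refers to caching headers such as Cache-Control or Expires. A response that allows the browser to cache a stylesheet might look like this (the values are illustrative; pick a max-age that fits how often the file changes):

```
HTTP/1.1 200 OK
Content-Type: text/css
Cache-Control: public, max-age=31536000
```

Without a Cache-Control or Expires header, most browsers will re-request the file on subsequent visits instead of serving it from the local cache.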


Uncompressed JavaScript and CSS files

This issue is triggered if compression is not enabled in the HTTP response.
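"Compression enabled" means the server sends the file gzip-encoded (or similar) when the browser advertises support for it. In nginx, for example, this can be switched on roughly like so; the directives are real nginx options, but the exact setup depends on your server:

```nginx
# Enable gzip compression for text-based assets (nginx sketch)
gzip on;
gzip_types text/css application/javascript;
```

Apache offers the equivalent via mod_deflate. Text formats such as CSS and JavaScript typically shrink substantially under gzip, which directly reduces transfer time.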


Don’t forget to hover over the ‘Why and how to fix it’ section on the right in each report to get actionable recommendations on how to deal with these issues.

Spoiler alert: we’ve started working with webpage resources more actively, and these are only the first checks in the Performance report. Stay tuned for more handy checks and further improvements to the report!

What do you think about this update? Share your feedback and thoughts on how to improve Site Audit by dropping us an e-mail at