Scheduling crawls in Screaming Frog is very useful, but the magic happens when that automatic data is dumped into a Looker Studio SEO dashboard.
To get a Looker Studio report that stays updated with data from your Screaming Frog crawls, you need to follow a series of steps.
What are you going to need?
- A licensed version of Screaming Frog
- A Google account for Looker Studio
- A Google Drive account
You can view all the data from the scheduled Screaming Frog crawls in a Looker Studio template like this one.
If you don’t have this template, don’t worry, you can still follow the tutorial.
For the template to work correctly, you must have Screaming Frog configured in English, which is the default language.
You can schedule daily, weekly, or monthly crawls, and analyze the evolution in a very visual way.
Setting all this up may seem complicated, but it isn’t. Just follow the steps.
Automating these reports is worth it: you save hours of work and can spend that time on better analysis.
Let’s look in detail at the steps you have to take:
Step 1: Create a Configuration File in Screaming Frog
This involves configuring the crawl with everything you need. I’m going to tell you what I recommend you activate.
Go to Configuration > Crawl Config

In Spider – Crawl, configure the Sitemaps section: activate the options and add the sitemaps you want to include in the crawl.

In Spider – Extraction, activate JSON-LD, Schema.org Validation, and Google Rich Result Feature Validation.

Now, in Content – Duplicates, activate “Enable Near Duplicates”, and in Spelling & Grammar activate the options and choose the language (or leave it on automatic).
Now we jump to User-agent, where I recommend you choose Googlebot (Smartphone).

The next step is to connect the APIs that we are going to use: Google Search Console and Page Speed Insights.
(You can connect others if you need to, but these are the two used by the template linked above.)

Google Search Console API
Once you connect it, choose the last 12 months in the date range and in the URL Inspection tab, activate the “Enable URL Inspection” option.
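If you are curious about the kind of data Screaming Frog pulls in through this connection, here is a minimal, optional Python sketch that queries the Search Console Search Analytics endpoint directly. The credentials file, property URL, and dates are placeholders you would swap for your own.

```python
# Minimal sketch: query the Search Console API directly to see the kind of
# data Screaming Frog pulls in. Assumes google-api-python-client and
# google-auth are installed; "credentials.json" is a placeholder for a
# service account key that has access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Clicks and impressions per page over a 12-month range (the same range
# recommended above), limited to 10 rows for the example.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # your verified property (placeholder)
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-12-31",
        "dimensions": ["page"],
        "rowLimit": 10,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```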

Page Speed Insights API
If you have never activated it, you can do it here.
Once activated and the API Key has been entered, check the “Auto Connect on Start” option.
In the Metrics tab you can check all the options (not all of them will be used, but it is quicker than picking them one by one).
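As a side note, if you want to see what the PageSpeed Insights API returns for a URL, this minimal Python sketch calls the public v5 endpoint with the same API key you entered above (the URL and key are placeholders):

```python
# Minimal sketch: call the PageSpeed Insights v5 API for one URL, using the
# same API key you entered in Screaming Frog (URL and key are placeholders).
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {
    "url": "https://www.example.com/",
    "key": "YOUR_API_KEY",
    "strategy": "mobile",  # matches the Googlebot (Smartphone) user-agent above
    "category": "performance",
}
data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()

# Lab score from Lighthouse plus the field (Core Web Vitals) metrics, if any.
print("Performance score:",
      data["lighthouseResult"]["categories"]["performance"]["score"])
print("Field metrics:",
      list(data.get("loadingExperience", {}).get("metrics", {}).keys()))
```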

The last thing to configure in this part is within Crawl Analysis.
Make sure every item is ticked, including the “Auto-analyse at End of Crawl” box.
This way the crawl analysis will run automatically.

Step 2: Save the Configuration File
Once you have everything configured, you must save the configuration file.
Go to Configuration > Profiles > Save As.
Save it in a place where you can find it easily because you are going to need it in step 3.

Step 3: Schedule the Crawl
Now you are going to schedule the crawl to run whenever you choose.
Go to File > Scheduling

Click on +Add and configure the scheduled crawl:
Name the task and the project.
Set the crawl frequency: the date of the first crawl and how often it will repeat (once, daily, weekly, or monthly).

In Start Options, leave the Crawler Mode set to Spider.
In Crawl Seed, enter the domain of the website you want to analyze.
In Crawl Config, include the file you configured in step 1 and saved in step 2.

In API, check Google Search Console and PageSpeed Insights, which are the APIs you configured in step 1.

In Export:
Check the “Headless” option.
In Google Drive Account, connect the Google Drive account where the Google Sheets with the crawl data will be created.
⚠Important: the Google account of this Google Drive must be the same one with which you connect to Looker Studio.
Check “Create timestamped folder in output”.

Finally, check “Export for Looker Studio” and, in Configure, select all the metrics by clicking the double arrow.

Click OK to save everything and the schedule will be active.
I recommend checking that everything has worked correctly after the first crawl. To do this, schedule a crawl for shortly after you finish the configuration.
Whenever you want to check whether a scheduled crawl is working well, go back to File > Scheduling and then to History.
Here you have more information about scheduled crawls.
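For reference, the scheduler is essentially running Screaming Frog headlessly from its command-line interface. The Python sketch below shows a rough equivalent of such a run under a few assumptions: the executable name and the paths vary by operating system, only a handful of well-known CLI switches are shown, and the Google Drive / Looker Studio export is left to the scheduler UI described above.

```python
# Rough equivalent of the scheduled task as a headless CLI run, launched from
# Python. The executable name/path is an assumption (it differs per OS), and
# only well-known switches are used; the Google Drive / Looker Studio export
# stays in the scheduler UI described above.
import subprocess

cmd = [
    "screamingfrogseospider",               # adjust to the install path on your OS
    "--crawl", "https://www.example.com/",  # the Crawl Seed from Start Options
    "--headless",                           # no UI, same as the "Headless" checkbox
    "--config", "/path/to/crawl-config.seospiderconfig",  # file saved in step 2
    "--output-folder", "/path/to/exports",
    "--timestamped-output",                 # mirrors "Create timestamped folder in output"
    "--save-crawl",                         # keep the crawl file for later review
]
subprocess.run(cmd, check=True)
```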
Step 4: Connect to Looker Studio
When the first crawl is performed, a Google Sheets document is created in the Google Drive account you have chosen.
This file will have the name you gave the Task in the previous step.
Check that it has been created.
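If you prefer to verify this programmatically instead of browsing Drive, here is a hedged Python sketch using the gspread library with a service account. The credentials file and sheet name are placeholders, and the sheet must be shared with the service account's email for this to work.

```python
# Hedged sketch: confirm that the Google Sheets export exists and has rows,
# using gspread with a service account. "service_account.json" and the sheet
# name are placeholders; share the sheet with the service account first.
import gspread

gc = gspread.service_account(filename="service_account.json")
sheet = gc.open("my-scheduled-crawl")  # the Task name you set in step 3

for ws in sheet.worksheets():
    print(ws.title, "-", ws.row_count, "rows")
```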
In Looker Studio, click on Create data source.

Choose the Google Sheets connector that you will find in Google Connectors.

Now select the Google Sheets document that corresponds to the Screaming Frog crawl and click Connect at the top right.

On the next screen you will see the fields that the report contains.
Click on Create report.
The data source is now created. You can close and delete the report that opens.
Before moving on to step 5, you are going to create another data source. The connector is “Chrome UX Report”.
Go back to Create data source and search for the connector.

Configure the steps and connect this data source.
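In case you want to see the raw field data this connector draws from, the Chrome UX Report also exposes a public REST API. This minimal Python sketch queries it for one origin; the API key and origin are placeholders, and the CrUX API must be enabled in your Google Cloud project.

```python
# Minimal sketch: query the Chrome UX Report API directly for one origin.
# The API key and origin are placeholders; enable the CrUX API in your
# Google Cloud project before using it.
import requests

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
payload = {"origin": "https://www.example.com", "formFactor": "PHONE"}
resp = requests.post(CRUX_ENDPOINT, params={"key": "YOUR_API_KEY"},
                     json=payload, timeout=30)
record = resp.json().get("record", {})

# Print the 75th percentile for each Core Web Vital reported for this origin.
for metric, data in record.get("metrics", {}).items():
    print(metric, "p75:", data.get("percentiles", {}).get("p75"))
```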

Step 5: Duplicate the Template and Connect Your Data Sources
If you have purchased the template I mentioned at the beginning, duplicate it from this section:

Now you must choose your data sources to replace the original ones in the template:
In the first field, you must select the Google Sheets data source that you have just created.
In the second field, select the Chrome UX Report data source that you have just created in step 4.

Click on Copy report and you will have your copy with your own data.
Keep in mind that the Google Sheets document grows with each crawl, so the Looker Studio report accumulates more and more data and lets you see the evolution over time.
During the first few days the report will have hardly any historical data, especially if you schedule crawls weekly or monthly.
For example, this template can give you data like this:
Evolution of the Core Web Vitals over time:

Problems with titles and meta descriptions:

Data such as total URLs, images, CSS, JavaScript, status codes…

Detail and evolution of indexability:

Configuring all this may take you a few minutes, but the time you save is enormous, and it is very useful for keeping the on-page SEO of your projects under control.
Alex Serrano
12 years in digital marketing. Creator of Chartud and Looker Studio specialist. SEO consultant for 8 years. Creator and publisher of content in different formats such as 300Segundos (newsletter) and SEOdesdeCero (podcast). Also co-founder of RankPulse.app, a local SEO tool.