Our Looker Power Tools: Automatic Content Validation and Dashboard Recovery

William Horel
May 18, 2020
Read: 4 min

If your organisation is using Looker, great – you’re probably on your way to higher data literacy across teams and a more data-driven culture. But as uptake increases and more users start producing their own content, modifying dashboards, and adding or deleting views and models, you may also have run into two common issues:

  • a dashboard or tile broke and you can’t figure out why
  • something important accidentally got deleted

Part of the beauty of Looker is that the data team does not – and should not – control and review everything. But you still need to ensure that your coworkers have quick and reliable access to this data, and that you can anticipate and prevent problems before they happen. In this blog we’ll talk about two tools we developed precisely for that purpose.

1. Content Validation

With an increasing number of users, it becomes easy to miss when something goes wrong. For example, a BI developer might get a request to modify a table or create a new one, but forget to commit their modifications afterwards.
This can result in incorrect data in downstream dashboards, or in tiles that simply stop working. Developers tend to hear about these issues post hoc, when a business user has already complained about missing or misleading data.
Looker does provide its own solution to this: the Content Validator. It is handy when you want to check the impact of specific modifications, but you need to remember to run it regularly, and to do so while on the correct branch. As content multiplies, this becomes unscalable and unreliable.

Our solution

We created a tool that automatically runs the Content Validator, extracts any errors and sends the results through the messaging platform of your choice on a regular basis.


First, a Python script connects to the Looker API. Then, it checks for any error reported in a dashboard or a Look with the content_validation() function. Looker returns a JSON response with the details of each error, such as the folder it is in, hyperlinks to the dashboards it affects, and a description of the error.
After all the errors are collected, we send the chosen details as a report to your messaging platform, such as Slack, or via email.
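As a minimal sketch of those steps, assuming the official looker-sdk Python package with credentials configured in a looker.ini file (the exact response fields may vary slightly between API versions):

```python
def format_report(errors):
    """Turn a list of {folder, title, message} dicts into a readable report."""
    if not errors:
        return "Content validation passed: no broken content found."
    lines = [f"{len(errors)} content validation error(s) found:"]
    for err in errors:
        lines.append(f"- {err['folder']} / {err['title']}: {err['message']}")
    return "\n".join(lines)


def collect_validation_errors(sdk):
    """Run Looker's Content Validator via an initialised SDK client
    (e.g. sdk = looker_sdk.init40()) and flatten the result into dicts."""
    validation = sdk.content_validation()
    report = []
    for item in validation.content_with_errors or []:
        content = item.dashboard or item.look  # each item wraps one piece of content
        for error in item.errors or []:
            report.append({
                "folder": content.folder.name if content and content.folder else "unknown",
                "title": content.title if content else "unknown",
                "message": error.message,
            })
    return report
```

The resulting text can then be forwarded to your messaging channel; scheduling is handled by whatever runs the script (cron, Airflow and so on).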

Reporting errors via Slack

Your team will receive an instant message and will be able to detect and correct errors faster.
The error checker can run at intervals of your choice; the beginning or end of the workday tends to work best.
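For delivery, one simple option is a Slack incoming webhook. A sketch using only the standard library (the webhook URL is a placeholder you would generate in your own Slack workspace):

```python
import json
import urllib.request


def build_slack_payload(text):
    """Wrap the report text in the JSON body Slack incoming webhooks expect."""
    return json.dumps({"text": text}).encode("utf-8")


def post_to_slack(webhook_url, text):
    """POST the report to a Slack incoming webhook; returns True on success."""
    req = urllib.request.Request(
        webhook_url,
        data=build_slack_payload(text),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status == 200


# Example (placeholder URL):
# post_to_slack("https://hooks.slack.com/services/T000/B000/XXXX", report_text)
```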

2. Dashboard recovery

Mistakes happen: someone may delete a dashboard they thought would not be needed in the future. The same can occur if you delete, rather than disable, the account of a user who has left your organisation or no longer needed Looker access, but whose resources others still rely on.
Unfortunately, Looker does not currently provide a way to retrieve erased dashboards.

Our solution

To remedy this, we created a way of storing every dashboard (and its tiles) in GitHub.


Each day, a Python job connects to the Looker API and pulls details about all dashboards along with relevant metadata, such as the dashboard owner’s full name, the dashboard title and ID, and so on. This information is then added to a dedicated GitHub repository for every new or modified dashboard. When a dashboard is accidentally deleted, it can be restored using the backup file from GitHub.
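A hedged sketch of such a backup job, calling the Looker REST API directly (the endpoint paths assume API 4.0; base_url, the credentials and repo_dir are placeholders, and committing and pushing the files is left to the surrounding job):

```python
import json
import re
import urllib.parse
import urllib.request
from pathlib import Path


def backup_filename(dashboard_id, title):
    """Build a stable, filesystem-safe name like '42_sales_overview.json'."""
    slug = re.sub(r"[^a-z0-9]+", "_", (title or "untitled").lower()).strip("_")
    return f"{dashboard_id}_{slug}.json"


def looker_login(base_url, client_id, client_secret):
    """Exchange API3 credentials for a short-lived access token."""
    body = urllib.parse.urlencode(
        {"client_id": client_id, "client_secret": client_secret}).encode()
    with urllib.request.urlopen(f"{base_url}/api/4.0/login", data=body) as resp:
        return json.load(resp)["access_token"]


def backup_dashboards(base_url, token, repo_dir):
    """Write the raw JSON of every dashboard into the local backup repo clone."""
    headers = {"Authorization": f"token {token}"}

    def get(path):
        req = urllib.request.Request(f"{base_url}/api/4.0{path}", headers=headers)
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    out = Path(repo_dir)
    out.mkdir(parents=True, exist_ok=True)
    for dash in get("/dashboards"):
        detail = get(f"/dashboards/{dash['id']}")  # full definition incl. tiles
        path = out / backup_filename(dash["id"], dash.get("title"))
        path.write_text(json.dumps(detail, indent=2))
```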

GitHub backup file

What to do with a Dashboard.json?

If there were any unexpected deletions, you can re-upload your dashboard with Gazer. After installing Ruby, add the Gazer gem with gem install gazer.

Get your access keys to the Looker API through the Users page in the Admin panel. First, click on “Edit”.

Then select “Edit Keys” under API3 Keys.

API3 keys

Create a file ~/.netrc in your home directory.
Paste in the API3 credentials as follows:

machine foo.bar.mycompany.com
login AbCdEfGhIjKlMnOp
password QrStUvWxYz1234567890

The login is your Client ID and the password is your Client Secret. Once you have them, get a clone of your Dashboard.json file on your local machine and run the following command, where 1 is the ID of the folder where you want the dashboard to be restored.


gzr dashboard import Dashboard.json 1 --host foo.bar.mycompany.com


What else?

The approaches described above can be extended to a variety of use cases. For instance, as well as checking for errors, we can set up automatic checks for any soft-deleted dashboards, Looks or folders with the search_dashboards(), search_folders() and search_looks() functions, and send the list to your chosen messaging channel, so you won’t be caught by surprise at the last minute.
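A sketch of that check, assuming the SDK’s search endpoints accept a deleted filter as on recent API versions (search_folders() could be added analogously):

```python
def summarise_soft_deleted(dashboards, looks):
    """Build the alert text from lists of soft-deleted content titles."""
    if not dashboards and not looks:
        return "No soft-deleted content found."
    parts = []
    if dashboards:
        parts.append("Soft-deleted dashboards: " + ", ".join(dashboards))
    if looks:
        parts.append("Soft-deleted Looks: " + ", ".join(looks))
    return "\n".join(parts)


def check_soft_deleted(sdk):
    """Query Looker for soft-deleted content via an initialised SDK client."""
    dashboards = [d.title for d in sdk.search_dashboards(deleted="true")]
    looks = [l.title for l in sdk.search_looks(deleted="true")]
    return summarise_soft_deleted(dashboards, looks)
```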

Alternatively, to get a detailed breakdown of Looker users’ activity, we can also use the Python script to check for activities such as logins, content creation and queries, and send these as a regular report or compile them into a tile or dashboard. Looker also provides some out-of-the-box activity dashboards; read more about them on our blog.

Thank you, @Nina, for proofreading and upgrading this article!
