
Looker is an enterprise platform for cloud-based business intelligence. It helps you understand your data in real time so you can make decisions that matter. Looker offers a complete view of your data, which you can explore and filter as needed. With LookML, its modelling language, you can create sophisticated queries and dashboards to analyse data in the way that matters most to you.

Google Cloud Platform’s BigQuery service lets you create, manage, share, and query data, and offers data warehousing, analytics, and machine learning capabilities. With BigQuery, you can analyse vast amounts of data and meet your big data needs through cloud-based parallel processing.

A data warehouse like BigQuery lets you store and query data in the cloud. You can also consolidate multiple data sources into a single location and analyse them with Looker or other business intelligence tools. Looker helps engineering teams visualise their data and see the bigger picture, so they can make effective data-driven decisions.

In this blog, we will use Looker’s BigQuery integration to query data from BigQuery and show how you can use Looker to optimise your Google Cloud spend. We will also look at how you can capitalise on some critical Looker features to understand your billing, pricing, and recommendations. You can use our BigQuery Pricing calculator to estimate your BigQuery expenses.

Getting started

Before getting started, you need to have the following prerequisites satisfied:

1. Administrator access to a BigQuery project.

2. Access to an up-and-running Looker instance.

You will also need a working knowledge of the Google Cloud Platform and BigQuery. You can connect BigQuery to your Looker instance via two methods:

1. Connecting to BigQuery standard SQL or legacy SQL.

2. Connecting BigQuery to Looker using the OAuth API.

Connecting to BigQuery standard SQL or legacy SQL

As outlined earlier, you need admin privileges on a Google Cloud project to create a service account. Open the credentials page in the GCP API manager and click the “Create credentials” button. Choose “Service Account” as the account type and provide a name and description for the account. Click the “Create” button once you are done.

When prompted for roles, select the BigQuery Data Editor role as the first option and the BigQuery Job User role as the second. Click “Continue”, then click the “Create Key” button to generate a private key. Download the key as a JSON file to your machine. You can now set up a temporary dataset in BigQuery.

Navigate to your Google Cloud BigQuery console and select a project. Click “Create Dataset”, provide the Dataset ID and data location, and click the “Create Dataset” button to create the dataset. You can now integrate the dataset into your Looker instance.
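
If you prefer to script this step, the sketch below creates the temporary dataset with the google-cloud-bigquery Python client. This is a minimal sketch, not the only way to do it: the key file path (looker-sa-key.json), project ID (my-gcp-project), and dataset name (looker_scratch) are placeholder values, and the library is installed with pip install google-cloud-bigquery.

```python
from google.cloud import bigquery

# Authenticate with the service account key downloaded earlier.
# "looker-sa-key.json" and "my-gcp-project" are placeholders.
client = bigquery.Client.from_service_account_json(
    "looker-sa-key.json", project="my-gcp-project"
)

# "looker_scratch" is a placeholder name for the temporary dataset.
dataset = bigquery.Dataset("my-gcp-project.looker_scratch")
dataset.location = "US"  # must match the location you would pick in the console
# Optional housekeeping: expire scratch tables automatically after 24 hours.
dataset.default_table_expiration_ms = 24 * 60 * 60 * 1000

client.create_dataset(dataset, exists_ok=True)
print(f"Created dataset {dataset.dataset_id}")
```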

Navigate to Looker’s admin section, open the Connections page, and click the “New connection” option. Looker will take you to a form where you can provide the following details:

  • Project Name: Provide the project ID of your BigQuery project.
  • Dialect: Select Google BigQuery standard SQL as the dialect.
  • Dataset: Enter the name of the dataset you created in the previous step.
  • OAuth: Select this option to let Looker users authenticate to BigQuery with their own credentials.
  • Service Account Email: Enter the email address of the service account you created earlier.
  • Password: Upload the JSON key file you downloaded for the service account; Looker uses it as the connection’s credential.
  • Temp Dataset: Enter the name of the temporary dataset you created earlier.

The rest of the options can be configured to suit your needs or left at their defaults. Click the “Update connection” button to create the connection.
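
Before testing the connection in Looker, you can sanity-check the service account itself by running a trivial query with its key. A minimal sketch, reusing the same placeholder file and project names as above; if the BigQuery Job User role is missing, this call will raise a permission error:

```python
from google.cloud import bigquery

# "looker-sa-key.json" and "my-gcp-project" are placeholders, as before.
client = bigquery.Client.from_service_account_json(
    "looker-sa-key.json", project="my-gcp-project"
)

# A trivial query confirms the account can create and run query jobs.
row = next(iter(client.query("SELECT 1 AS ok").result()))
print(f"Query succeeded, ok = {row.ok}")  # prints: Query succeeded, ok = 1
```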

Connecting BigQuery to Looker using the OAuth API


You can use the OAuth API to connect BigQuery to Looker. This lets you monitor which Looker users are running queries on BigQuery and apply role-based access controls. To get started, navigate to your BigQuery project and select it. Go to the APIs & Services option, open the Credentials tab, and click the “Create credentials” button.

Select the OAuth client ID option and provide the following details:

  • Application type: Select the application type as a Web application and provide a name for the application.
  • Authorised JavaScript origins: Enter the URL of your Looker instance.
  • Authorised redirect URIs: Enter your Looker instance’s redirect URL, in the form https://<instance_name>.looker.com/external_oauth/redirect.

You can now configure your OAuth consent screen. Enter the application name, support email, and the authorised domains that can access your Looker instance on the consent screen.

Finally, navigate to your Looker instance, go to the BigQuery configuration page, and check the OAuth checkbox. Provide the client ID and client secret of the OAuth client you created in the previous step. You can now test the connection between your Looker instance and BigQuery.
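
If you prefer to script the test, Looker’s official Python SDK (looker_sdk) exposes a connection test endpoint. A hedged sketch, assuming your API credentials live in a looker.ini file and that the connection was saved under the hypothetical name bigquery_oauth:

```python
import looker_sdk

# init40() reads base_url, client_id, and client_secret from looker.ini.
sdk = looker_sdk.init40()

# "bigquery_oauth" is a placeholder for whatever you named the connection.
for test in sdk.test_connection(connection_name="bigquery_oauth"):
    print(f"{test.name}: {test.status} - {test.message}")
```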

Using the dashboards

Once your Looker and BigQuery integration is configured, you can use Looker dashboards to visualise the data in your BigQuery data warehouse. To do so, search the Looker Marketplace to find and install a Looker Block, which serves as a great starting point for analysing your BigQuery data.

To better understand your cloud spend, you can use the Project Deep Dive dashboard to see how much your project spends on compute, storage, network, and more. You can then narrow your analysis to specific services that can be optimised further using the Service Deep Dive dashboard.

Looker also provides Recommendation Insights that surface quick optimisations for cutting costs immediately. You can further use labels to designate cost centres and refine them down to specific services, which lets administrators understand how to optimise costs and how their teams are optimising their Google Cloud services.
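
If you have enabled the Cloud Billing export to BigQuery, you can reproduce this kind of label-based breakdown directly with a query. A hedged sketch, where the export table name (my-gcp-project.billing.gcp_billing_export_v1_XXXXXX) and the cost-centre label key are placeholders for your own export and labelling scheme:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project

# Sum the last 30 days of cost per cost-centre label and service.
# The table name below is a placeholder for your billing export table.
query = """
SELECT
  (SELECT value FROM UNNEST(labels) WHERE key = 'cost-centre') AS cost_centre,
  service.description AS service,
  ROUND(SUM(cost), 2) AS total_cost
FROM `my-gcp-project.billing.gcp_billing_export_v1_XXXXXX`
WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY cost_centre, service
ORDER BY total_cost DESC
"""

for row in client.query(query).result():
    print(f"{row.cost_centre or 'unlabelled'} / {row.service}: ${row.total_cost}")
```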

GCP’s newest update, BigQuery Studio (designed for big data management in organisations using AI, ML, and LLMs), is an all-in-one analytics workspace that streamlines end-to-end analytics workflows, from data ingestion and transformation to sophisticated predictive analysis.

Conclusion

If you don’t have access to Looker, you can request a demonstration of the platform by filling out the form on Looker’s website. You can also upskill your team on Looker with these getting-started guides.

Abijith

A FinOps enthusiast and experienced writer with over 7 years of expertise in content creation. Through the Economize blog, he delivers expert-driven content on FinOps strategies, cloud cost management, and related topics, blending technical insight with actionable advice for SaaS and cloud companies.
