Organizations increasingly turn to AI to transform work processes, but this rapid adoption of models has amplified the need for explainable AI. Explainable AI helps us understand how and why models make the predictions they do. For example, a financial institution might wish to use an AI model to automatically flag credit card transactions for fraudulent activity. While an accurate fraud model would be a first step, accuracy alone isn’t sufficient. Banks and regulators are often required to explain why an AI model is making a specific prediction. Was a fraud decision based on the transaction amount? The cardholder’s gender? Their spend history? Explainable AI helps answer these types of questions, promotes fair business practices, assists with regulatory requirements, and protects against bias and discrimination.
Implementing explainable AI in Google Cloud is an increasingly easy and common practice. Data scientists can use Google Cloud’s Vertex AI to understand what factors contribute to predictions for even the most complex deep learning models. But what about citizen data scientists?
In this post we’ll look at how data analysts can also take advantage of explainable AI by creating models in SQL using BigQuery ML and then explain those model predictions to stakeholders and domain experts using “What-If Scenario Dashboards” in Looker.
Building a Fraud Model in SQL Using BigQuery Machine Learning
BigQuery Machine Learning (BQML) allows analysts to create a variety of machine learning models entirely in SQL. In addition to democratizing data science capabilities, BQML benefits organizations by allowing models to be trained and predictions made without moving any data, eliminating many data governance and MLOps challenges.
In this example, a retail bank has a dataset of credit card transactions, cardholder details, and merchant information. A query creates the model training data, including transaction amount, the distance between a merchant and the customer’s home, and the transaction time of day. These features are generated entirely in SQL, taking advantage of BigQuery’s support for geospatial functions.
The sample dataset is publicly available to query. Make sure to create a dataset named retail_banking in your project to store the resulting ML datasets and models.
Example BigQuery SQL to Prepare a Model Training Dataset
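The feature query might look like the following sketch. The table and column names (`transactions`, `merchants`, `cardholders`, and their fields) are illustrative assumptions and should be adapted to the actual schema of the public sample dataset:

```sql
-- Sketch of a training-data query. Table and column names are
-- illustrative; adapt them to the sample dataset's actual schema.
CREATE OR REPLACE TABLE retail_banking.fraud_training AS
SELECT
  t.amount AS transaction_amount,
  -- Distance (km) between the merchant and the cardholder's home,
  -- computed with BigQuery's geospatial functions (ST_DISTANCE
  -- returns meters between two GEOGRAPHY points).
  ST_DISTANCE(
    ST_GEOGPOINT(m.longitude, m.latitude),
    ST_GEOGPOINT(c.home_longitude, c.home_latitude)
  ) / 1000 AS merchant_distance_km,
  -- Hour of day as a simple time-of-day feature.
  EXTRACT(HOUR FROM t.transaction_ts) AS transaction_hour,
  t.is_fraud
FROM retail_banking.transactions AS t
JOIN retail_banking.merchants AS m USING (merchant_id)
JOIN retail_banking.cardholders AS c USING (card_id);
```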
After creating the training data, a short query fits a logistic regression model to predict whether a transaction is fraudulent. BQML includes robust defaults along with many options for specifying model behavior. BigQuery provides model fit metrics, training logs, and other model details.
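A minimal `CREATE MODEL` statement for this step could look like the following sketch (the dataset, model, and column names assume the training table above; adjust them to your own):

```sql
-- Fit a logistic regression model on the prepared training table.
-- BQML defaults handle data splitting and regularization unless
-- overridden in OPTIONS.
CREATE OR REPLACE MODEL retail_banking.fraud_model
OPTIONS (
  model_type = 'LOGISTIC_REG',
  input_label_cols = ['is_fraud']
) AS
SELECT
  transaction_amount,
  merchant_distance_km,
  transaction_hour,
  is_fraud
FROM retail_banking.fraud_training;
```

Once trained, `SELECT * FROM ML.EVALUATE(MODEL retail_banking.fraud_model)` returns fit metrics such as precision, recall, and ROC AUC.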
BigQuery ML also supports other model types, such as boosted trees (XGBoost) and deep neural networks, and explainability works with those techniques as well.
Explainable AI in BQML
Beyond training and inspecting a model, BQML makes it easy to access explainable AI capabilities. Users can provide new hypothetical transactions and view the model’s prediction and explanation.
For example, the following query creates three hypothetical transactions with varying amounts, distances, and times of day. The model predicts the first transaction is fraudulent because of the large monetary value and early hour.
BQML Explainable AI Query and Result
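The explanation query could be sketched as follows. The three hypothetical transactions and their feature values are illustrative, and `top_k_features` limits the output to the most influential features per prediction:

```sql
-- Score three hypothetical transactions and return per-feature
-- attributions alongside each prediction. Values are illustrative.
SELECT *
FROM ML.EXPLAIN_PREDICT(
  MODEL retail_banking.fraud_model,
  (
    SELECT 12000.00 AS transaction_amount,
           5.0      AS merchant_distance_km,
           3        AS transaction_hour
    UNION ALL
    SELECT 45.00, 2.5, 13
    UNION ALL
    SELECT 80.00, 60.0, 20
  ),
  STRUCT(3 AS top_k_features)
);
```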
Creating a “What-If Scenario Dashboard” in Looker
While BQML unlocks a rich set of capabilities, it can be more valuable to bring explainable AI to non-technical stakeholders such as business domain experts or executives. These individuals are often better qualified to interpret and validate a model’s explanation. Providing an interface to explainable AI improves the trust, adoption, and overall success of an AI initiative.
Looker helps bring BQML’s explainable AI to stakeholders. Looker is a modern BI tool and data platform that is deeply integrated with BigQuery. With Looker, analysts create governed dashboards and data experiences using a unique semantic model. In this example we use the semantic model to parameterize the BQML SQL statement and create a dashboard. Once built, end users can enter their own transaction details using dashboard filters and view the prediction and model explanation – all without writing any code!
Looker Explainable AI Dashboard
BQML Predictions in Looker’s Semantic Model
The LookML pattern below creates the “What-If Scenario” dashboard. Define a parameter for each hypothetical user input. Build a derived table using ML.EXPLAIN_PREDICT on a subquery with the user’s input parameters. This pattern should be modified based on your dataset, trained model, and desired user inputs. Alternatively, you can reference the existing BQML Looker blocks on the Looker marketplace for an end-to-end guide to using BigQuery Machine Learning with Looker.
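A condensed version of that LookML pattern might look like the sketch below. The parameter names, model reference, and output column are assumptions tied to the example model above; the actual column returned by `ML.EXPLAIN_PREDICT` is named after your model's label column:

```lookml
# Sketch of the "What-If Scenario" pattern. Parameter names and the
# model reference are illustrative; match them to your trained model.
view: fraud_prediction {
  parameter: amount_input {
    type: number
  }
  parameter: distance_input {
    type: number
  }
  parameter: hour_input {
    type: number
  }

  derived_table: {
    sql:
      SELECT *
      FROM ML.EXPLAIN_PREDICT(
        MODEL retail_banking.fraud_model,
        (SELECT
           {% parameter amount_input %}   AS transaction_amount,
           {% parameter distance_input %} AS merchant_distance_km,
           {% parameter hour_input %}     AS transaction_hour)
      ) ;;
  }

  # Prediction column is named after the model's label (is_fraud here).
  dimension: predicted_is_fraud {
    type: yesno
    sql: ${TABLE}.predicted_is_fraud ;;
  }
}
```

Dashboard filters bound to these parameters let end users supply their own transaction details, and the derived table re-runs the explanation query with each change.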
Historically model interpretation has been limited to data science teams. Collaborating with business stakeholders has required significant back-and-forth or the development of custom data science applications. Looker and BigQuery ML provide an alternative approach that empowers SQL analysts and enables business collaboration.
You can start with BigQuery Machine Learning and Explainable AI by writing a short query. Or you can learn more about how teams are doing data science with Looker, jumpstart your own use case with the Looker Marketplace BQML blocks, or explore how AI-powered data experiences are possible using Vertex AI and the Looker platform.