

End-to-end ML workflow with Google Cloud Platform and BigQuery

Access this AI accelerator on GitHub

DataRobot integrates directly with your Google Cloud Platform (GCP) environment, helping you accelerate machine learning across GCP services.

In this notebook accelerator, you can use Google Colaboratory or another notebook environment to source data from BigQuery, build and evaluate an ML model with DataRobot, and write predictions from that model back to BigQuery for use elsewhere in GCP.
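
Before the steps below, the notebook needs credentials for both platforms. The following is a minimal connection sketch, assuming you run it from Colab and already have a DataRobot API token; the endpoint, token, and GCP project ID are placeholders, not values from the accelerator.

```python
# Sketch: authenticate to GCP from Colab and connect the DataRobot Python client.
# The endpoint, token, and project ID below are placeholders.
import datarobot as dr
from google.cloud import bigquery
from google.colab import auth  # omit outside of Colab

auth.authenticate_user()  # GCP credentials for the BigQuery client
bq_client = bigquery.Client(project="your-gcp-project")

dr.Client(
    endpoint="https://app.datarobot.com/api/v2",  # or your DataRobot instance URL
    token="YOUR_DATAROBOT_API_TOKEN",
)
```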

This accelerator covers the following:

  1. Prepare data and ensure connectivity: In the first section of the notebook, you will load a sample modeling dataset into BigQuery and then connect that BigQuery data to DataRobot (see the first sketch after this list).

  2. Build and evaluate a model: Using the DataRobot Python API, you will have DataRobot build close to 50 machine learning models and evaluate how each performs on this dataset (see the second sketch after this list).

  3. Score and host: In the final section, you will score the entire dataset with the selected model and write the prediction data back to BigQuery for use in your GCP applications (see the third sketch after this list).
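
The first sketch below illustrates step 1, continuing from the connection setup above; the CSV path, GCP project, and BigQuery dataset and table names are placeholders rather than the accelerator's actual identifiers.

```python
# Sketch of step 1: load a sample CSV into BigQuery, read it back, and register
# it with DataRobot as a modeling dataset. Names below are placeholders.
import pandas as pd
import datarobot as dr
from google.cloud import bigquery

bq_client = bigquery.Client(project="your-gcp-project")
table_id = "your-gcp-project.demo_dataset.training_data"

# Load the sample dataset into BigQuery (DataFrame uploads require pyarrow)
df = pd.read_csv("sample_training_data.csv")
bq_client.load_table_from_dataframe(
    df,
    table_id,
    job_config=bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE"),
).result()

# Pull the table back out of BigQuery and register it with DataRobot
training_df = bq_client.query(f"SELECT * FROM `{table_id}`").to_dataframe()
dataset = dr.Dataset.create_from_in_memory_data(data_frame=training_df)
```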

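The second sketch covers step 2, continuing from the dataset registered above; the project name and target column are hypothetical.

```python
# Sketch of step 2: create a project from the registered dataset, run Autopilot,
# and inspect the leaderboard. The target column name is a placeholder.
import datarobot as dr

project = dr.Project.create_from_dataset(
    dataset.id, project_name="GCP BigQuery accelerator"
)
project.analyze_and_model(target="target_column", worker_count=-1)
project.wait_for_autopilot()

# Leaderboard models come back sorted by the project's optimization metric
for model in project.get_models()[:5]:
    print(model.model_type, model.metrics[project.metric]["validation"])
```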

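The third sketch covers the final step, continuing from the project above; the deployment label and output table name are placeholders, and predictions are written back with the BigQuery client rather than a dedicated output adapter.

```python
# Sketch of step 3: deploy the top leaderboard model, score the full dataset
# with the batch prediction API, and write predictions back to BigQuery.
import datarobot as dr
from google.cloud import bigquery

best_model = project.get_models()[0]
prediction_server = dr.PredictionServer.list()[0]
deployment = dr.Deployment.create_from_learning_model(
    best_model.id,
    label="GCP BigQuery accelerator",
    default_prediction_server_id=prediction_server.id,
)

# Score the full dataset; returns the job and a DataFrame with predictions
job, scored_df = dr.BatchPredictionJob.score_pandas(deployment, training_df)

# Write the scored rows back to BigQuery for downstream GCP applications
bq_client = bigquery.Client(project="your-gcp-project")
bq_client.load_table_from_dataframe(
    scored_df,
    "your-gcp-project.demo_dataset.training_data_scored",
    job_config=bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE"),
).result()
```
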
Updated September 28, 2023