About BigQuery
Google Cloud's serverless warehouse, where GA4 and Google Ads data already live.
BigQuery was announced by Google in May 2010 and became generally available in November 2011. It grew out of the Dremel paper Google published in 2010 and remains the public face of that lineage: a serverless SQL warehouse where compute scales per query and you do not pick a node size or a cluster shape. Storage is decoupled from compute, partitioning and clustering are first-class, and a free tier of 10 GB of storage and 1 TB of query scanning per month makes it easy to start.
Pricing splits in two. On-demand billing is metered per terabyte scanned, which is generous for sporadic workloads and a foot-gun for a curious analyst running a SELECT * on a 4 TB table. BigQuery Editions (Standard, Enterprise, Enterprise Plus) bill committed slot-hours with autoscaling, which most BE/NL teams move to once monthly on-demand spend gets unpredictable.

On top of the warehouse sit BigQuery ML for SQL-defined models, BigQuery Omni for querying data that lives in S3 or Azure Blob Storage, and Gemini in BigQuery for natural-language SQL and code assist. The reason BigQuery wins so often in the BE/NL mid-market is Google data gravity: the free GA4 export, the Google Ads transfer, and YouTube reporting all land in the same project. We add the rest of the business so the warehouse is more than a marketing data lake.
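To make the foot-gun concrete, here is a minimal back-of-envelope sketch of on-demand query cost. The $6.25-per-TiB rate is an assumption (the US list price at the time of writing, up from the long-standing $5); check the BigQuery pricing page for your region, and note the 1 TiB monthly free tier offsets the first queries.

```python
TIB = 1024 ** 4  # on-demand pricing is metered per TiB of data scanned

def on_demand_cost(bytes_scanned: int, price_per_tib: float = 6.25) -> float:
    """Estimated USD cost of a single on-demand query.

    price_per_tib is an assumed rate (US multi-region list price at the
    time of writing); verify against current BigQuery pricing. The 1 TiB
    monthly free tier is ignored here for simplicity.
    """
    return (bytes_scanned / TIB) * price_per_tib

# The SELECT * scenario from the text: a full scan of a 4 TiB table.
print(f"${on_demand_cost(4 * TIB):.2f}")  # prints $25.00
```

One such query is cheap; a dashboard refreshing it hourly is not, which is the usual trigger for moving to Editions' committed slot-hours.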