BigQuery connector

Land business data in BigQuery and finally join GA4 to revenue.

Data Panda lands the data from your CRM, ERP, ecommerce and finance systems into BigQuery alongside your existing GA4 and Google Ads exports. The warehouse becomes the place where marketing, finance and product look at one set of numbers.

About BigQuery

Google Cloud's serverless warehouse, where GA4 and Google Ads data already live.

BigQuery was announced by Google in May 2010 and became generally available in November 2011. It grew out of the Dremel paper Google published in 2010 and remains the public face of that lineage: a serverless SQL warehouse where compute scales per query and you do not pick a node size or a cluster shape. Storage is decoupled from compute, partitioning and clustering are first-class, and a free tier of 10 GB of storage and 1 TB of query scanning per month makes it easy to start.

Pricing splits in two. On-demand bills per terabyte scanned, which is generous for sporadic workloads and a foot-gun for a curious analyst with a SELECT * on a 4 TB table. BigQuery Editions (Standard, Enterprise, Enterprise Plus) bill committed slot-hours with autoscaling, which most BE/NL teams move to once monthly on-demand spend gets unpredictable.

On top of the warehouse sit BigQuery ML for SQL-defined models, BigQuery Omni for querying data that lives in S3 or Azure Blob, and Gemini in BigQuery for natural-language SQL and code assist. The reason BigQuery wins so often in the BE/NL mid-market is Google data gravity: the free GA4 export, the Google Ads transfer and YouTube reporting all land in the same project. We add the rest of the business so the warehouse is more than a marketing data lake.

What your BigQuery data is for

What you get once BigQuery is connected.

GA4 joined to revenue

Marketing data from GA4 and Google Ads finally sits next to invoiced revenue from finance.

  • Channel and campaign tied to actual paid invoices, not just sessions
  • One customer record across GA4 user_pseudo_id, CRM and billing
  • Looker Studio and Power BI read the same warehouse facts

Reverse ETL out of the warehouse

Audiences, scores and account fields built in BigQuery push back into the operational tools.

  • Lookalike audiences synced into Google Ads and Meta
  • Lifetime value field on every HubSpot or Salesforce account
  • Churn-risk flag back into the support and CS tools

BigQuery ML on real data

SQL-defined models on the same warehouse tables your dashboards read.

  • Forecast, classification and clustering models in pure SQL
  • Vector search and embeddings on Gemini-backed models
  • Predictions written back as a column other reports can use
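
To make the first bullet concrete, here is the shape of a BigQuery ML forecast definition: a minimal Python sketch that builds the CREATE MODEL statement. The project, dataset and column names are hypothetical; `model_type='ARIMA_PLUS'` is BigQuery ML's built-in time-series forecaster, and in production you would submit this DDL through the BigQuery console or client.

```python
def forecast_model_ddl(model: str, source: str, ts_col: str, data_col: str) -> str:
    """Build a CREATE MODEL statement for a BigQuery ML time-series forecast."""
    return (
        f"CREATE OR REPLACE MODEL `{model}`\n"
        f"OPTIONS(model_type='ARIMA_PLUS',\n"
        f"        time_series_timestamp_col='{ts_col}',\n"
        f"        time_series_data_col='{data_col}') AS\n"
        f"SELECT {ts_col}, {data_col} FROM `{source}`"
    )

ddl = forecast_model_ddl(
    "analytics.models.daily_revenue_forecast",  # hypothetical model path
    "analytics.marts.daily_revenue",            # hypothetical source table
    "day", "revenue",
)
```

Because the model is defined in SQL over the same mart your dashboards read, the nightly refresh is just another scheduled query.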

Apps on a Google Cloud back end

Custom internal apps and customer-facing portals fed from BigQuery without standing up a separate API layer.

  • Looker Studio dashboards published to clients with row-level filters
  • Internal tools on Cloud Run reading governed BigQuery views
  • Operational lookups served from materialized views

Use cases

Use cases we deliver with BigQuery data.

A list of concrete reports, automations and AI features we have built on BigQuery data. Pick the one that matches your situation.

GA4 plus revenue: GA4 events joined to invoiced revenue from finance, by channel and campaign.
Google Ads ROAS: Ad spend from Google Ads tied to pipeline and closed revenue.
Marketing-mix reporting: Paid, organic and email channels on one P&L view per period.
Customer 360: GA4 pseudo-id, CRM contact and billing customer reconciled.
Lifetime value scoring: LTV per cohort and per acquisition channel, written back as a field.
Audience activation: Warehouse-built audiences pushed to Google Ads and Meta.
Product-and-revenue join: Product usage from the app database next to subscription revenue.
Forecasting in SQL: Sales and demand forecasts via BigQuery ML, refreshed nightly.
Looker Studio dashboards: Free-tier dashboards on warehouse views, governed centrally.
Embedded customer portal: Per-tenant data slices served from BigQuery into a portal.
Cross-cloud Omni queries: Read S3 or Azure Blob data from BigQuery without copying it first.

Real business questions

Answers you will finally get.

Why does our Google Ads ROAS not match what finance booked as revenue?

Because Google Ads stops at the click and finance starts at the invoice. With CRM, ERP and Google Ads all landed in BigQuery, the path from click to opportunity to invoice is one SQL chain. ROAS reads off invoiced revenue, not off GA4 conversions, and the marketing report ties to the close pack.
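
That one SQL chain can be sketched with stand-in tables. The illustration below uses Python's built-in sqlite3 in place of BigQuery SQL, and every table and column name is a hypothetical stand-in for your landed Google Ads, CRM and finance data; the point is the join shape, spend to opportunity to paid invoice.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE ads_spend (campaign TEXT, cost REAL);
CREATE TABLE crm_opps  (campaign TEXT, opp_id TEXT);
CREATE TABLE invoices  (opp_id TEXT, amount REAL, paid INTEGER);

INSERT INTO ads_spend VALUES ('brand', 1000.0), ('generic', 4000.0);
INSERT INTO crm_opps  VALUES ('brand', 'o1'), ('generic', 'o2'), ('generic', 'o3');
INSERT INTO invoices  VALUES ('o1', 9000.0, 1), ('o2', 5000.0, 1), ('o3', 7000.0, 0);
""")

# ROAS reads off paid invoices, not ad-platform conversions.
rows = con.execute("""
SELECT s.campaign,
       s.cost AS spend,
       COALESCE(r.invoiced, 0.0) AS invoiced,
       COALESCE(r.invoiced, 0.0) / s.cost AS roas
FROM ads_spend s
LEFT JOIN (
    -- paid invoices rolled up to campaign through the CRM opportunity
    SELECT o.campaign, SUM(i.amount) AS invoiced
    FROM crm_opps o
    JOIN invoices i ON i.opp_id = o.opp_id AND i.paid = 1
    GROUP BY o.campaign
) r ON r.campaign = s.campaign
ORDER BY s.campaign
""").fetchall()

roas = {campaign: ratio for campaign, _, _, ratio in rows}
```

Note that the unpaid invoice (`o3`) contributes nothing: the 'generic' campaign reports ROAS on the 5,000 actually collected, which is exactly where ad-platform ROAS and finance diverge.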

Our BigQuery bill jumped 4x last month. How do we get this back under control?

Almost always one of three things: an analyst running SELECT * on the GA4 events table, an unpartitioned hot table being scanned in full per query, or a dashboard view that re-scans raw data instead of a materialized aggregate. We audit the top spending queries, partition and cluster the hot tables, and move predictable workloads to a Standard or Enterprise reservation.
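
The arithmetic behind that audit is simple. A back-of-envelope sketch, assuming the published on-demand rate of roughly $6.25 per TiB scanned and the 1 TiB monthly free tier; verify current Google Cloud pricing for your region before relying on the numbers.

```python
TIB = 2**40                # bytes per tebibyte
RATE_PER_TIB = 6.25        # $ per TiB scanned, on-demand (assumed rate)
FREE_TIB_PER_MONTH = 1     # on-demand free tier (assumed)

def monthly_cost(bytes_scanned_per_run: int, runs_per_month: int) -> float:
    """Estimated monthly on-demand cost for one recurring query."""
    tib = bytes_scanned_per_run * runs_per_month / TIB
    return max(tib - FREE_TIB_PER_MONTH, 0) * RATE_PER_TIB

# An unpartitioned 4 TiB table scanned in full by a daily dashboard refresh:
full_scan = monthly_cost(4 * TIB, 30)
# The same refresh against a date-partitioned table reading ~2% of the data:
pruned = monthly_cost(int(0.02 * 4 * TIB), 30)
```

The full scan lands around $744 a month for a single dashboard; with partition pruning the same question costs under $10, which is why partitioning the hot tables is usually the first fix.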

Can we keep GA4 in BigQuery and have the warehouse only on top, or do we need to move GA4 too?

Keep GA4 where it is. The free GA4 export already lands in BigQuery as a daily and optional streaming dataset. We add CRM, ERP and finance into the same project and join on top, so the GA4 raw stays untouched and the warehouse layer is the only thing your reports read.

Value for everyone in the organisation

Where each function gets value.

For finance leaders

The CFO finally sees marketing spend tied to invoiced revenue, not to last-click conversions. BigQuery joins Google Ads, GA4 and the accounting system so margin per channel reads the same on the close pack and on the marketing dashboard.

For sales leaders

Sales sees pipeline next to GA4 attribution and Google Ads cost per lead, sourced from one warehouse. Reps stop arguing with marketing about lead quality because the data lives in the same place and the definitions match.

For operations

Ops monitors warehouse cost, query latency and dataset freshness in BigQuery itself. Predictable workloads sit on reservations, exploratory work stays on-demand, and a runaway query gets caught the same day instead of on the monthly invoice.

Your existing tools

Your data lands in a warehouse. Your BI tools read from it.

You keep the reporting tool you already have. We connect it to the warehouse where your BigQuery data lives.

Power BI (Microsoft)
Fabric (Microsoft)
Snowflake (data warehouse)
BigQuery (Google)
Tableau (visualisation)
Excel (sheets & pivots)

Three steps

From BigQuery to answers in three steps.

01

Connect securely

OAuth authentication. Read-only by default. We sign a DPA and your admin keeps the keys.

02

Land in your warehouse

Data flows into your warehouse on your schedule. Near real time or nightly, your call. You own the data.

03

Reporting, automation, AI

We build the first dashboard, workflow or AI feature with you, then hand over the keys. Or we stay on for ongoing delivery.

Two ways to work with us

Pick the track that fits how you work.

Track 01

Self-serve

We set up the foundation. Your team builds on top.

  • BigQuery connector configured and running
  • Warehouse set up in your cloud account
  • Clean access for your Power BI, Fabric or Tableau team
  • Documentation on what's in the data model
  • Sync monitoring so you're warned before reports break

Best fit: Teams that already have a BI analyst or data engineer and want to own the build.

Track 02

Done for you

We build the whole thing, end to end.

  • Everything in Self-serve
  • Dashboards built to the questions your team actually asks
  • Automations between your systems
  • AI workflows scoped to real tasks your team runs
  • Custom apps where a dashboard does not cut it
  • Ongoing delivery at a pace that fits your team

Best fit: Teams without in-house BI or dev capacity. You tell us what you need and we deliver it.

Before you book

Frequently asked questions.

Who owns the data?

You do. It lands in your warehouse, on your cloud account. We don't resell or aggregate it. If you stop working with us, the warehouse stays yours and keeps running.

How fresh is the data?

Near real time for most operational systems. For heavier sources we schedule hourly or nightly. You pick based on what the reports need.

Do I need a warehouse already?

No. If you don't have one, we help you pick one and set it up as part of the first delivery. Common starting points are Snowflake, Microsoft Fabric, or a small Postgres setup to start.

Should we run BigQuery on-demand or move to Editions and slots?

On-demand at per-terabyte-scanned billing is the right starting point and stays right for sporadic workloads. Once monthly spend gets past a few hundred euro and the same dashboards run every day, BigQuery Editions (Standard, Enterprise, or Enterprise Plus) with a slot reservation gives a predictable bill and autoscaling. We run a one-week query audit before recommending a switch.
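
The comparison the audit feeds is straightforward. A sketch with illustrative rates, both assumptions to be checked against current Google Cloud pricing: roughly $6.25 per TiB scanned on demand versus roughly $0.04 per slot-hour on the Standard edition.

```python
ON_DEMAND_PER_TIB = 6.25    # $ per TiB scanned, on-demand (assumed rate)
SLOT_HOUR_STANDARD = 0.04   # $ per slot-hour, Standard edition (assumed rate)

def on_demand_monthly(tib_scanned: float) -> float:
    """Monthly on-demand cost for a given scan volume."""
    return tib_scanned * ON_DEMAND_PER_TIB

def editions_monthly(slot_hours: float) -> float:
    """Monthly Editions cost for autoscaled slot-hours actually consumed."""
    return slot_hours * SLOT_HOUR_STANDARD

# A workload scanning 100 TiB/month on demand:
scan_cost = on_demand_monthly(100)
# The same workload consuming a hypothetical 10,000 autoscaled slot-hours:
slot_cost = editions_monthly(10_000)
```

Which side wins depends entirely on how many slot-hours your actual queries consume per TiB scanned, which is what the one-week audit measures before we recommend a switch.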

How does the free GA4 to BigQuery export fit into this?

GA4 exports event-level data into BigQuery for free on standard properties, with a daily limit of one million events; Analytics 360 lifts that limit substantially. Storage and query cost still apply on the BigQuery side. We treat the GA4 export as a source, never overwrite it, and build joined views on top in a separate dataset so the raw events stay clean.
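
The export's nested shape is worth seeing. In the GA4 export schema, event_params is a repeated key/value record where each value sets exactly one typed field, which you would UNNEST in BigQuery SQL; below is a pure-Python sketch of the same lookup on a hypothetical event.

```python
def param_value(event: dict, key: str):
    """Pull one typed parameter from a GA4-export-style event_params array."""
    for p in event.get("event_params", []):
        if p["key"] == key:
            # the export sets one of these typed fields per value record
            for field in ("string_value", "int_value", "float_value", "double_value"):
                if field in p["value"]:
                    return p["value"][field]
    return None

# Hypothetical purchase event mirroring the export schema
event = {
    "event_name": "purchase",
    "user_pseudo_id": "123.456",
    "event_params": [
        {"key": "transaction_id", "value": {"string_value": "T-1001"}},
        {"key": "value", "value": {"double_value": 49.95}},
    ],
}
```

In the warehouse layer this flattening happens once, in a view in the separate dataset, so every downstream report joins on clean columns like transaction_id instead of re-unnesting raw events.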

Do we use BigQuery ML or Vertex AI for the model layer?

BigQuery ML covers forecasting, classification, clustering and embedding lookups in plain SQL, on the same tables your reports read, which is enough for most BE/NL mid-market models. Vertex AI fits when you need a custom model, MLOps tooling or training on data outside BigQuery. The two interoperate: a Vertex model can be registered and called from a SQL query.

GDPR-compliant
Data stays in the EU
You own the warehouse

A first deliverable live in four to six weeks.

We review your BigQuery setup and the systems around it. Together we pick the first thing worth building.