AWS connector

Pull AWS cost, security and observability data into the warehouse, then read your cloud the same way you read the rest of the business.

Data Panda lifts AWS Cost and Usage Reports, CloudWatch metrics, CloudTrail events and IAM inventory into the same warehouse as your CRM, ERP and product systems. FinOps, security and platform teams stop opening the AWS console for a number and start reading it next to revenue, headcount and customer activity.

About AWS

The public cloud Amazon launched in 2006, now 39 regions and 123 availability zones.

Amazon Web Services started inside Amazon and went public as a paid platform in 2006, with Amazon S3 in March and Amazon EC2 in beta later that summer. The catalog has grown into hundreds of services across compute, storage, database, networking, analytics, machine learning, security and developer tooling, all on the same IAM and billing surface. AWS publishes a global footprint of 39 launched regions and 123 availability zones, with announced expansions in Saudi Arabia and Chile. The EU is covered by regions in Ireland, Frankfurt, Paris, Stockholm, Milan, Spain and Zurich among others, which is what BE/NL procurement contracts care about for data residency.

This connector page covers the cross-service umbrella, not one product. The AWS Cost and Usage Reports and the newer Data Exports give you billing data at the line-item level, refreshed at least daily, ready to land next to a Cost Explorer pull for trend and forecast. CloudWatch carries metrics, logs and traces for compute, storage, databases, Lambda and the AI workloads on top. CloudTrail records the management and data-event audit trail that SOC, ISO and DORA reviews ask for. IAM holds the user, role and policy inventory you need to prove least-privilege. GuardDuty surfaces the threat findings on EC2, S3, EKS and RDS. We pull these into one warehouse layout so the FinOps view, the security register and the platform dashboard read from the same tables, instead of three teams each scraping the console for the same answer.

What your AWS data is for

What you get once AWS is connected.

FinOps next to revenue

AWS spend lands next to invoiced revenue and headcount, so the cloud bill reads as a unit cost.

  • Cost and Usage Reports loaded daily into the warehouse
  • Spend per service, account, tag and team in one model
  • Unit cost per customer, per order or per pipeline run instead of a flat monthly total
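As a sketch of that roll-up, with simplified field names rather than the exact CUR column set, the unit-cost model is a tag-keyed sum of line items read next to revenue:

```python
from collections import defaultdict

def unit_cost_per_customer(cur_rows, revenue_by_customer):
    """Roll CUR line items up to cloud cost per customer, next to revenue.

    cur_rows: line items carrying a customer cost-allocation tag and an
    unblended cost. Field names here are illustrative, not exact CUR columns.
    """
    spend = defaultdict(float)
    for row in cur_rows:
        customer = row.get("tag_customer") or "untagged"
        spend[customer] += float(row["unblended_cost"])

    report = {}
    for customer, cost in spend.items():
        revenue = revenue_by_customer.get(customer)
        report[customer] = {
            "cloud_cost": round(cost, 2),
            "revenue": revenue,
            # unit cost as a share of revenue, the number finance reads
            "cost_pct_of_revenue": round(100 * cost / revenue, 1) if revenue else None,
        }
    return report

rows = [
    {"tag_customer": "acme", "unblended_cost": "12.50"},
    {"tag_customer": "acme", "unblended_cost": "7.50"},
    {"tag_customer": None, "unblended_cost": "3.00"},
]
report = unit_cost_per_customer(rows, {"acme": 400.0})
```

In practice this is a SQL join in the warehouse; the sketch just shows why the cost-allocation tag is the hinge the whole unit-cost view turns on.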

Cost, security and ops on one cadence

CUR, CloudTrail, CloudWatch and IAM pulls run on a known schedule instead of ad-hoc exports.

  • Daily CUR or Data Exports drop into the warehouse before the morning report
  • CloudTrail and GuardDuty findings replicated for the security register
  • Tag drift and missing-owner reports caught upstream of finance month-end

AI on cloud-operational data

Models read the same warehouse tables as humans for FinOps and security questions.

  • Anomaly models on CUR catch a runaway service before the next invoice
  • RAG over CloudTrail explains what changed across an account in plain language
  • Forecast pulls from Cost Explorer feed budget conversations with finance
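A minimal sketch of the runaway-service check: a toy z-score against the trailing days, not the model we actually deploy, with an assumed (date, cost) input shape:

```python
import statistics

def runaway_service_check(daily_spend, z_threshold=3.0):
    """Flag the latest day if its spend breaks from the trailing baseline.

    daily_spend: list of (date, cost) for one service, ordered by date.
    A toy z-score check; the production model runs on the full CUR
    history in the warehouse, per service and per account.
    """
    if len(daily_spend) < 4:
        return None  # not enough history for a baseline
    baseline = [cost for _, cost in daily_spend[:-1]]
    mean = statistics.fmean(baseline)
    stdev = statistics.stdev(baseline)
    date, latest = daily_spend[-1]
    if stdev > 0 and (latest - mean) / stdev > z_threshold:
        return {"date": date, "spend": latest, "baseline_mean": round(mean, 2)}
    return None

history = [(f"2024-06-{d:02d}", 100.0 + d % 3) for d in range(1, 11)]
quiet = runaway_service_check(history + [("2024-06-11", 101.0)])
spike = runaway_service_check(history + [("2024-06-11", 480.0)])
```

The point of running this in the warehouse rather than the console is that the alert can carry the tag, owner and team columns along with the number.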

Tag and policy hygiene

Resource tagging, IAM policies and bucket exposure get checked from the warehouse, not the console.

  • Untagged or mistagged resources flagged per cost-allocation rule
  • IAM users, roles and policies inventoried for access reviews
  • S3 Block Public Access drift surfaces in the same dashboard as cost
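As a sketch of the tagging check, assuming records shaped like the Resource Groups Tagging API output and an illustrative required-tag rule:

```python
REQUIRED_TAGS = {"owner", "cost-center", "environment"}  # illustrative allocation rule

def tag_violations(resources, required=REQUIRED_TAGS):
    """Report resources missing required cost-allocation tags.

    resources: records shaped like the Resource Groups Tagging API output;
    in the connector these rows are read from the warehouse copy, not a
    live API call.
    """
    violations = []
    for res in resources:
        present = {tag["Key"].lower() for tag in res.get("Tags", [])}
        missing = sorted(required - present)
        if missing:
            violations.append({"arn": res["ResourceARN"], "missing": missing})
    return violations

sample = [
    {"ResourceARN": "arn:aws:s3:::reports-bucket",
     "Tags": [{"Key": "Owner", "Value": "finance"}]},
    {"ResourceARN": "arn:aws:ec2:eu-west-1:111122223333:instance/i-0abc",
     "Tags": [{"Key": "owner", "Value": "platform"},
              {"Key": "cost-center", "Value": "4200"},
              {"Key": "environment", "Value": "prod"}]},
]
violations = tag_violations(sample)
```

The same pattern drives the missing-owner report: each violation row lands in the dashboard with the account and service it belongs to.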

Use cases

Use cases we deliver with AWS data.

A list of concrete reports, automations and AI features we have built on AWS data. Pick the one that matches your situation.

  • FinOps reporting: AWS spend per service, account and tag landed daily next to revenue and headcount.
  • Unit-cost dashboards: Cost per customer, order, tenant or pipeline run instead of a flat monthly bill.
  • Cost anomaly detection: Models on CUR catch a runaway service before the next invoice closes.
  • Forecast next to budget: Cost Explorer forecasts loaded next to the finance budget for monthly review.
  • CloudTrail audit register: Account events queryable in the warehouse for SOC, ISO and DORA reviews.
  • GuardDuty findings tracker: Threat findings landed alongside ticket and remediation status.
  • IAM access reviews: Users, roles and policies inventoried for periodic least-privilege checks.
  • Tagging compliance: Untagged and mistagged resources reported per cost-allocation rule.
  • CloudWatch capacity view: Metric trends on EC2, RDS and Lambda in the same warehouse as cost.
  • Multi-account roll-up: Organizations with several AWS accounts read one consolidated FinOps model.
  • EU-region inventory: Service and data inventory pinned to EU regions for residency reporting.

Real business questions

Answers you will finally get.

We have ten AWS accounts under one Organization. Can you consolidate the FinOps reporting?

Yes, and that is the usual entry point. We pull the consolidated Cost and Usage Reports from the management account, then enrich with per-account tags, owners and cost-allocation rules in the warehouse. The output is one FinOps model that finance reads per legal entity, per product line and per environment, instead of ten Cost Explorer tabs reconciled by hand.
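The enrichment step is a plain join in the warehouse. As a sketch in code, with illustrative field names and made-up account metadata:

```python
from collections import defaultdict

def consolidated_spend(cur_rows, account_meta):
    """Roll consolidated CUR line items up per legal entity and environment.

    cur_rows come from the management account's consolidated export;
    account_meta maps an account id to owner metadata maintained in the
    warehouse. Field names are illustrative, not exact CUR columns.
    """
    totals = defaultdict(float)
    for row in cur_rows:
        meta = account_meta.get(row["account_id"], {})
        key = (meta.get("entity", "unmapped"), meta.get("environment", "unmapped"))
        totals[key] += float(row["unblended_cost"])
    return dict(totals)

rows = [
    {"account_id": "111111111111", "unblended_cost": "40.0"},
    {"account_id": "222222222222", "unblended_cost": "10.0"},
    {"account_id": "333333333333", "unblended_cost": "5.0"},  # not yet mapped
]
meta = {
    "111111111111": {"entity": "BE BV", "environment": "prod"},
    "222222222222": {"entity": "BE BV", "environment": "dev"},
}
spend = consolidated_spend(rows, meta)
```

Unmapped accounts surface as their own bucket instead of disappearing, which is how the roll-up catches new accounts before finance does.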

Do we still need Cost Explorer if we land Cost and Usage Reports in the warehouse?

Cost Explorer keeps its place for the quick console check and for the AWS-managed forecast and anomaly view. The warehouse copy is what makes spend joinable to revenue, customer, tag and ticket data, which is what unit-cost reporting and finance month-end need. Most BE/NL teams keep Cost Explorer for the platform engineers and put the warehouse model in front of finance and the steering committee.

We already use AWS S3 as our data lake. Where does this AWS connector fit on top of that?

AWS S3 is the destination for the lake itself; this connector is about the operational data of AWS the platform: cost, audit, identity and observability. The two sit side by side. The S3 connector lands business data into a curated lake; this one lands the cloud's own metadata into the same warehouse so platform teams can answer FinOps, security and capacity questions on the same model.

Value for everyone in the organisation

Where each function gets value.

For finance leaders

The CFO sees AWS spend per legal entity and per product line in the same model as revenue and headcount. The cloud bill stops being a single line on the OPEX sheet and becomes a unit cost that survives a steering-committee question.

For sales leaders

For SaaS and managed-service teams, the warehouse joins AWS unit cost to customer revenue, so account managers know which accounts are gross-margin healthy and which lean on infrastructure that nobody priced in.

For operations

Platform and FinOps leads track CUR, CloudWatch and tagging drift in one view. The console stays open for action; the warehouse holds the trend, the forecast and the per-team breakdown that finance reads with them.

Data model

Tables we make available.

These are the 9 tables we currently pull from AWS into your warehouse. Query them directly in SQL, join them to the rest of your stack, or build reports on top.

  • EBS Volumes
  • EC2 Instances
  • IAM Account Summary
  • IAM MFA Devices
  • IAM Password Policy
  • IAM Users
  • RDS Instances
  • S3 Buckets
  • Security Groups

Missing a table you need? We can extend the sync. Tell us what is missing and we will build it for you.

Your existing tools

Your data lands in a warehouse. Your BI tools read from it.

You keep the reporting tool you already have. We connect it to the warehouse where your AWS data lives.

Power BI Microsoft
Fabric Microsoft
Snowflake Data warehouse
BigQuery Google
Tableau Visualisation
Excel Sheets & pivots

Three steps

From AWS to answers in three steps.

01

Connect securely

A dedicated IAM role, read-only by default. We sign a DPA and your admin keeps the keys.

02

Land in your warehouse

Data flows into your warehouse on your schedule. Near real time or nightly, your call. You own the data.

03

Reporting, automation, AI

We build the first dashboard, workflow or AI feature with you, then hand over the keys. Or we stay on for ongoing delivery.

Two ways to work with us

Pick the track that fits how you work.

Track 01

Self-serve

We set up the foundation. Your team builds on top.

  • AWS connector configured and running
  • Warehouse set up in your cloud account
  • Clean access for your Power BI, Fabric or Tableau team
  • Documentation on what's in the data model
  • Sync monitoring so you're warned before reports break

Best fit Teams that already have a BI analyst or data engineer and want to own the build.

Track 02

Done for you

We build the whole thing, end to end.

  • Everything in Self-serve
  • Dashboards built to the questions your team actually asks
  • Automations between your systems
  • AI workflows scoped to real tasks your team runs
  • Custom apps where a dashboard does not cut it
  • Ongoing delivery at a pace that fits your team

Best fit Teams without in-house BI or dev capacity. You tell us what you need and we deliver it.

Before you book

Frequently asked questions.

Who owns the data?

You do. It lands in your warehouse, on your cloud account. We don't resell or aggregate it. If you stop working with us, the warehouse stays yours and keeps running.

How fresh is the data?

Near real time for most operational systems. For heavier sources we schedule hourly or nightly. You pick based on what the reports need.

Do I need a warehouse already?

No. If you don't have one, we help you pick one and set it up as part of the first delivery. Common starting points are Snowflake, Microsoft Fabric, or a small Postgres setup.

What is the difference between Cost and Usage Reports and Cost Explorer for what you land?

Cost and Usage Reports (and the newer Data Exports) give the line-item ledger: per resource, per usage type, per tag, refreshed at least daily. Cost Explorer is the AWS-managed analytics surface on top, with up to 12 months of history, forecasts and anomaly insights, accessible by API. We land CUR for the granular FinOps model and pull Cost Explorer for the AWS-managed forecast next to it, so finance reviews trend and forecast on the same screen.
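The Cost Explorer pull flattens the GetCostForecast response into rows that land next to the CUR actuals. A sketch of that step, with placeholder values rather than real API output:

```python
def forecast_rows(ce_response):
    """Flatten a Cost Explorer GetCostForecast response into warehouse rows
    that sit next to the CUR actuals. The sample below uses placeholder
    values, not real API output.
    """
    return [
        {"period_start": r["TimePeriod"]["Start"],
         "forecast_mean": float(r["MeanValue"])}
        for r in ce_response["ForecastResultsByTime"]
    ]

sample = {
    "Total": {"Amount": "2469.12", "Unit": "USD"},
    "ForecastResultsByTime": [
        {"TimePeriod": {"Start": "2024-07-01", "End": "2024-08-01"},
         "MeanValue": "1234.56"},
        {"TimePeriod": {"Start": "2024-08-01", "End": "2024-09-01"},
         "MeanValue": "1234.56"},
    ],
}
rows = forecast_rows(sample)
```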

Can we keep all AWS data we land inside the EU?

Yes. AWS publishes EU regions including Ireland (eu-west-1), Frankfurt (eu-central-1), Paris (eu-west-3), Stockholm (eu-north-1), Milan (eu-south-1), Spain (eu-south-2) and Zurich (eu-central-2). The buckets, accounts and warehouse all sit inside EU regions, and the connector pulls run inside the EU footprint. Data-residency clauses in BE/NL procurement contracts read cleanly against this setup.
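The residency report itself is a check run against the warehouse inventory. As a sketch, with an illustrative record shape:

```python
# The launched EU regions named above; eu-south-2 is Spain, eu-central-2 is Zurich.
EU_REGIONS = {"eu-west-1", "eu-central-1", "eu-west-3", "eu-north-1",
              "eu-south-1", "eu-south-2", "eu-central-2"}

def non_eu_resources(inventory):
    """Return resources sitting outside the EU footprint, for the
    residency report. Record shape is illustrative."""
    return [r["arn"] for r in inventory if r["region"] not in EU_REGIONS]

inventory = [
    {"arn": "arn:aws:s3:::reports-bucket", "region": "eu-west-1"},
    {"arn": "arn:aws:rds:us-east-1:111122223333:db:legacy", "region": "us-east-1"},
]
offenders = non_eu_resources(inventory)
```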

What permissions does the connector need on our AWS accounts?

Read-only on the cost, observability and audit surfaces it pulls from: the CUR or Data Exports S3 bucket, the Cost Explorer API, CloudWatch GetMetricData, CloudTrail LookupEvents plus S3 read on the trail bucket, IAM list-and-get and GuardDuty list-and-describe. We work from a dedicated IAM role assumed by the connector, scoped per account, with no write access. The role definition is reviewable next to your other IAM policies.
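A sketch of what that read-only policy looks like; the bucket name is a placeholder and the action list is trimmed for illustration, not the exact policy we ship:

```python
import json

# Illustrative read-only policy for the connector role. Bucket names are
# placeholders; the real policy is scoped per account and reviewed with you.
CONNECTOR_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {"Sid": "BillingExports", "Effect": "Allow",
         "Action": ["s3:GetObject", "s3:ListBucket"],
         "Resource": ["arn:aws:s3:::example-cur-bucket",
                      "arn:aws:s3:::example-cur-bucket/*"]},
        {"Sid": "CostExplorer", "Effect": "Allow",
         "Action": ["ce:GetCostAndUsage", "ce:GetCostForecast"],
         "Resource": "*"},
        {"Sid": "Observability", "Effect": "Allow",
         "Action": ["cloudwatch:GetMetricData", "cloudtrail:LookupEvents"],
         "Resource": "*"},
        {"Sid": "Inventory", "Effect": "Allow",
         "Action": ["iam:List*", "iam:Get*", "guardduty:List*", "guardduty:Get*"],
         "Resource": "*"},
    ],
}

policy_json = json.dumps(CONNECTOR_POLICY, indent=2)
```

Nothing in the statement list grants write access, which is the property your security review checks first.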

GDPR-compliant
Data stays in the EU
You own the warehouse

A first deliverable live in four to six weeks.

We review your AWS setup and the systems around it. Together we pick the first thing worth building.