Executive Summary

BigQuery's on-demand pricing charges $6.25 per terabyte of data scanned. A single poorly optimised dashboard scanning 500 GB per refresh at hourly intervals costs $27,375 per year. Mid-sized to large enterprises without proper governance see $500,000 to $5,000,000 annually in unexpected BigQuery costs.

Our analysis of 50+ enterprise BigQuery implementations reveals five key findings:

  1. 73% of BigQuery customers on on-demand pricing are past the break-even for Editions. Most organisations scanning more than 1.5 to 2.5 TB daily should move to slot-based pricing.
  2. The top 5% of queries generate 60 to 70% of BigQuery on-demand cost. A handful of poorly optimised dashboards and reports often drive the entire bill.
  3. Editions slot commitments are negotiable 15 to 35% below published rates. Volume discounts and multi-year commitments unlock further savings.
  4. Autoscaling without caps consumes 20 to 40% of total BigQuery budget. Most organisations configure autoscale but set no monthly or per-query limits.
  5. BigQuery governance is a data engineering problem, not a procurement problem. Cost reduction requires query optimisation, partitioning, and per-project budgets—not just negotiating rates.

How BigQuery Pricing Works

BigQuery operates on two pricing models: on-demand and Editions. Understanding the mechanics of each is essential to building the right cost governance strategy.

On-Demand Pricing

On-demand charges $6.25 per terabyte (TB) of data scanned, regardless of query complexity or duration. A query scanning 500 GB costs $3.13. A dashboard refreshing that same 500 GB every hour across an 8-hour business day costs $25.04. Run the same dashboard across 260 business days, and the cost is $6,510 for a single dashboard, before any other workload.
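
The arithmetic is worth sanity-checking in code. A minimal sketch, assuming an 8-hour business day and 260 business days per year (the figures above include rounding):

  # On-demand cost model: $6.25 per TB scanned.
  ON_DEMAND_PER_TB = 6.25

  def query_cost_usd(gb_scanned: float) -> float:
      """On-demand cost of one query scanning gb_scanned GB."""
      return gb_scanned / 1000 * ON_DEMAND_PER_TB

  refresh = query_cost_usd(500)   # $3.125 per refresh
  day = refresh * 8               # hourly refreshes, 8-hour day: $25
  year = day * 260                # 260 business days: $6,500/year
  print(f"${refresh:.2f}/refresh  ${day:.2f}/day  ${year:,.0f}/year")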

BigQuery Editions

Editions shift to slot-based pricing: instead of paying per byte scanned, you buy slots, units of compute capacity billed per slot-hour. Queries are unlimited; capacity is the constraint. Three editions exist:

  • Standard Edition: $0.04 per slot-hour (~$29.20 per slot per month)
  • Enterprise Edition: $0.06 per slot-hour (~$43.80 per slot per month)
  • Enterprise Plus Edition: $0.08 per slot-hour (~$58.40 per slot per month)

Break-Even Analysis

The break-even point where Editions become cheaper than on-demand is approximately 1.5 to 2.5 TB daily scanned. At 1.5 TB/day:

  • Annual on-demand cost: 1.5 TB × 365 × $6.25 ≈ $3,422
  • Standard Edition (10 slots running continuously): 10 × $0.04 × 8,760 hours ≈ $3,504

For organisations scanning 2.5 TB+ daily, Editions typically deliver 30 to 50% savings compared to on-demand.
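
A sketch of the same break-even arithmetic, assuming the committed slots run continuously at the published Standard rate:

  # Compare annual on-demand spend with an always-on Standard baseline.
  ON_DEMAND_PER_TB = 6.25
  STANDARD_PER_SLOT_HOUR = 0.04
  HOURS_PER_YEAR = 8760

  def on_demand_annual(tb_per_day: float) -> float:
      return tb_per_day * 365 * ON_DEMAND_PER_TB

  def standard_annual(slots: int) -> float:
      return slots * STANDARD_PER_SLOT_HOUR * HOURS_PER_YEAR

  def break_even_tb_per_day(slots: int) -> float:
      """Daily scan volume at which a slot baseline pays for itself."""
      return standard_annual(slots) / (365 * ON_DEMAND_PER_TB)

  print(on_demand_annual(1.5))      # ~$3,422
  print(standard_annual(10))        # ~$3,504
  print(break_even_tb_per_day(10))  # ~1.54 TB/day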

The Hybrid Model

Many organisations use a hybrid approach: a baseline of committed slots for predictable workload, plus on-demand or autoscale for spikes. This balances cost predictability with flexibility.

On-Demand vs. Editions Economics

Choosing between on-demand and Editions requires understanding your workload profile:

Choose On-Demand If:

  • You scan less than 1.5 TB daily on average
  • Workload is highly variable day-to-day
  • You require no upfront commitment

Choose Editions If:

  • You scan 1.5 to 2.5 TB daily or more
  • Workload is predictable (within 20% variance)
  • You can commit to 1-year or 3-year terms
  • You want cost predictability

The Cost Governance Gap

Most organisations implement BigQuery without adequate governance. Cost control requires four pillars:

1. Query-Level Cost Attribution

Extract per-query bytes billed, labels, and user attribution from INFORMATION_SCHEMA.JOBS_BY_PROJECT. Assign each query to a cost centre, business unit, or application. Without this, you cannot identify the top cost drivers.
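
A sketch of that extraction with the Python client; the region qualifier and the cost_centre label key are assumptions to adapt to your environment:

  from google.cloud import bigquery

  client = bigquery.Client()  # assumes application-default credentials

  # Top spenders over 30 days, priced at the $6.25/TB on-demand rate.
  sql = """
  SELECT
    user_email,
    (SELECT value FROM UNNEST(labels) WHERE key = 'cost_centre') AS cost_centre,
    COUNT(*) AS runs,
    SUM(total_bytes_billed) / POW(10, 12) AS tb_billed,
    SUM(total_bytes_billed) / POW(10, 12) * 6.25 AS est_cost_usd
  FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
  WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    AND job_type = 'QUERY'
  GROUP BY user_email, cost_centre
  ORDER BY est_cost_usd DESC
  LIMIT 20
  """
  for row in client.query(sql).result():
      print(row.user_email, row.cost_centre, f"${row.est_cost_usd:,.2f}")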

2. Per-Project Budgets & Alerts

Use Google Cloud's Budgets API to set monthly spend thresholds per project. Budgets alert rather than block by default; hard stops require wiring the alerts to automation. On-demand projects should have strict limits (e.g., $10K/month). Editions projects need slot management, not spend caps.
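
A sketch using the google-cloud-billing-budgets client; the billing-account and project identifiers are placeholders:

  from google.cloud.billing import budgets_v1

  client = budgets_v1.BudgetServiceClient()

  # $10K/month budget scoped to one project, alerting at 80% and 100%.
  budget = budgets_v1.Budget(
      display_name="bq-on-demand-monthly-cap",
      budget_filter=budgets_v1.Filter(projects=["projects/123456789"]),
      amount=budgets_v1.BudgetAmount(
          specified_amount={"currency_code": "USD", "units": 10_000}
      ),
      threshold_rules=[
          budgets_v1.ThresholdRule(threshold_percent=0.8),
          budgets_v1.ThresholdRule(threshold_percent=1.0),
      ],
  )
  client.create_budget(
      parent="billingAccounts/ABCDEF-012345-6789AB", budget=budget
  )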

3. Custom Quotas

Set per-project daily scan quotas. If a data engineering team has a 50 GB daily quota and hits it by 2 PM, further queries fail for the rest of the day; that immediate feedback drives optimisation long before the waste shows up in a month-end bill.

4. Query Validator & Dry-Run Mandate

Require all new queries to pass a dry run first (bq query --dry_run, or dry_run in the API). Document the expected scan size. Reject queries that scan more than 50% of a table when a partition filter would narrow the scan.
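
A sketch of both guardrails with the Python client; the 100 GB budget is illustrative:

  from google.cloud import bigquery

  client = bigquery.Client()

  def vet_query(sql: str, max_gb: float = 100.0) -> float:
      """Dry-run a query and reject it if the scan exceeds a budget."""
      job = client.query(
          sql,
          job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False),
      )
      gb = job.total_bytes_processed / 1e9
      if gb > max_gb:
          raise ValueError(f"query would scan {gb:.1f} GB (limit {max_gb} GB)")
      return gb

  # For the real run, maximum_bytes_billed makes BigQuery fail the job
  # outright instead of billing past the cap.
  run_config = bigquery.QueryJobConfig(maximum_bytes_billed=100 * 10**9)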

5 Patterns Driving BigQuery Overspend

Pattern 1: Full Table Scans on Unpartitioned Data

A 5 TB unpartitioned table queried 20 times per day costs 5 × 20 × $6.25 = $625 per day, or $228,000 per year. Partition the table by date, and the same query costs $25 per day, or $9,000 per year. The fix: partition every table larger than 1 GB by date or a natural key.
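
A sketch of the fix with the Python client; project, dataset, and schema are illustrative:

  from google.cloud import bigquery

  client = bigquery.Client()

  table = bigquery.Table(
      "my-project.analytics.events",
      schema=[
          bigquery.SchemaField("event_date", "DATE"),
          bigquery.SchemaField("user_id", "STRING"),
          bigquery.SchemaField("payload", "STRING"),
      ],
  )
  table.time_partitioning = bigquery.TimePartitioning(
      type_=bigquery.TimePartitioningType.DAY, field="event_date"
  )
  table.require_partition_filter = True  # unfiltered full scans now fail
  client.create_table(table)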

Pattern 2: Dashboard Refresh Frequency Mismatch

A 200 GB dashboard refreshing every 15 minutes costs $120 per day (96 refreshes × $1.25). The same dashboard refreshing every 4 hours costs $7.50 per day. Most dashboards don't need sub-hour refresh. Align refresh cadence to source data latency: if data updates daily, refresh daily, not hourly.

Pattern 3: SELECT * Instead of Column Selection

BigQuery's storage is columnar, and on-demand charges are based on the bytes in every column a query references, not on rows returned. SELECT * on a 100-column table scans all 100 columns, even if you need only 5. Always specify the exact columns you need.
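
The effect is easy to demonstrate with free dry runs against any wide table (the table name here is hypothetical):

  from google.cloud import bigquery

  client = bigquery.Client()
  cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

  # Dry runs cost nothing, so the gap is visible before spending a cent.
  wide = client.query(
      "SELECT * FROM `my-project.analytics.events`", job_config=cfg
  )
  narrow = client.query(
      "SELECT event_date, user_id FROM `my-project.analytics.events`",
      job_config=cfg,
  )
  print(wide.total_bytes_processed, narrow.total_bytes_processed)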

Pattern 4: Editions Autoscaling Without Caps

44% of Editions customers in our survey had no autoscale cap. When autoscaling is enabled without limits, unexpected spikes can trigger expensive overages. In one case, a runaway query added 50 slots, costing $15,000 in autoscale charges. Always set autoscale caps: typically 150 to 200% of baseline slots.
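
A sketch of a capped reservation, assuming the bigquery_reservation_v1 client; project, location, and slot counts are illustrative:

  from google.cloud import bigquery_reservation_v1 as reservation

  client = reservation.ReservationServiceClient()

  # 100 baseline slots, autoscaling capped at 150 additional slots
  # (a 150% cap, per the guidance above).
  res = reservation.Reservation(
      slot_capacity=100,
      autoscale=reservation.Reservation.Autoscale(max_slots=150),
  )
  client.create_reservation(
      parent="projects/my-project/locations/us",
      reservation_id="analytics-baseline",
      reservation=res,
  )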

Pattern 5: Cross-Region Query Charges

Querying data in us-west1 from a job running in europe-west1 triggers cross-region networking charges: $0.08 to $0.12 per GB. These appear in networking costs, not BigQuery costs, and are easily overlooked. Co-locate datasets and the jobs that query them in the same region.

Slot Reservation & Commitment Strategy

Slot management is where cost and governance intersect. Three commitment levels exist:

Commitment Tiers

  • No Commitment: Slots consumed at $0.04 to $0.08 per slot-hour (pay-as-you-go)
  • 1-Year Commitment: 25% discount vs. no commitment
  • 3-Year Commitment: 50% discount vs. no commitment

Baseline-to-Autoscale Ratio

For predictable workloads (e.g., ETL pipelines), use a 90:10 ratio (90% baseline, 10% autoscale). For variable workloads (e.g., ad-hoc analytics), use 60:40. The principle: a slot that would be busy most of the time is cheaper as committed baseline than at autoscale rates, so commit the steady load and autoscale only the spikes.
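
A sketch of the underlying trade-off, using illustrative rates (a committed slot bills every hour of the year; an autoscale slot bills only while in use):

  COMMITTED = 0.048  # assumed 1-year committed $/slot-hour, billed 24/7
  PAYG = 0.06        # Enterprise pay-as-you-go $/slot-hour, billed when busy

  def cheaper_committed(busy_fraction: float) -> bool:
      """True if a slot busy this share of hours costs less as baseline."""
      return COMMITTED < PAYG * busy_fraction

  crossover = COMMITTED / PAYG   # 0.8: busy >80% of hours -> commit it
  print(crossover)
  print(cheaper_committed(0.9))  # steady ETL: True, put in baseline
  print(cheaper_committed(0.3))  # spiky ad-hoc: False, autoscale it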

7 Negotiation Levers

BigQuery pricing is negotiable. During contract discussions, pull these levers:

Lever 1: Per-Slot Rate Reduction

Standard pricing: $0.04 per slot-hour. At 500+ slots, expect 15 to 35% reductions. At 1,000+ slots, expect 25 to 40% reductions. Volume matters.

Lever 2: Autoscale Rate Cap

Negotiate a 10 to 20% discount on autoscale overage rates and a monthly spend cap. Autoscale should not add more than 20 to 30% to baseline slot cost.

Lever 3: Commitment Adjustment Rights

Secure quarterly adjustment rights of 15 to 20% without penalty. If you underestimated consumption, you can scale up. If you overestimated, you can scale down.

Lever 4: Edition Tier Downgrade Rights

Negotiate quarterly downgrade rights without penalty. Start with Enterprise, downgrade to Standard if demand doesn't justify the cost.

Lever 5: BigQuery in GCP Committed Spend

Include BigQuery Editions in your broader GCP Committed Use Discount (CUD) agreement. This compounds savings.

Lever 6: Storage Pricing

BigQuery storage is often overlooked. Standard rates are $0.025/GB/month. At 100+ TB, negotiate $0.015/GB. At 500+ TB, target $0.012/GB.

Lever 7: Migration Incentives

If migrating from an on-premises data warehouse, Google often offers free slots for 3 to 6 months or credits of $50K to $250K for migration costs.

7 Priority Actions for Cost Reduction

1. Audit Cost Distribution

Extract 90 days of query data from INFORMATION_SCHEMA.JOBS_BY_PROJECT. Identify the top 10 queries by cost. These likely drive 50 to 70% of your bill. Focus optimisation here first.

2. Move to Editions Once You Cross Break-Even

If you're scanning 1.5 TB or more daily, Editions typically delivers 30 to 50% savings. Run the math with your actual usage, then commit to a 1-year term.

3. Configure Autoscale Caps on Day One

Set autoscale caps at 150 to 200% of baseline slots: a baseline of 100 slots should allow at most 150 to 200 additional autoscale slots. This prevents runaway costs.

4. Partition Every Table Larger Than 1 GB

Partitioning by date or another natural key reduces query costs by 80 to 90%. This is the single highest-impact optimisation.

5. Implement Query-Level Cost Attribution and Per-Project Budgets

Set monthly spend limits per project. Use labels to track costs by business unit. Make engineers accountable for their queries.

6. Negotiate BigQuery Into Your GCP Committed Spend

BigQuery Editions should be part of your overall GCP CUD agreement, not a separate contract. This unlocks additional discounts.

7. Align Dashboard Refresh Frequency to Source Data Update Cadence

If source data updates daily, refresh daily. If hourly, refresh hourly. Don't refresh more frequently than data changes.


How Redress Can Help

Redress Compliance is 100% independent. We have no commercial relationships with Google, AWS, Microsoft, or any other vendor. Our advice is always in your interest.

We have completed 50+ BigQuery cost audits and negotiations, representing over $480 million in analytics spend under review.

Our Services Include:

  • BigQuery Cost Audit: Extract and analyse 90 days of query data. Identify top cost drivers. Quantify savings from partitioning, edition migration, and optimisation.
  • Editions Migration & Negotiation: Plan your move to Editions. Negotiate rates 15 to 40% below published. Manage baseline and autoscale configuration.
  • Query Optimisation Programme: Partner with your data engineering team. Refactor top-cost queries. Implement partitioning and clustering strategies.
  • BigQuery Governance Framework: Design cost attribution, per-project budgets, and query validators. Embed cost discipline into your data platform.
  • Data Warehouse Migration Advisory: If moving from on-premises or a competing cloud data warehouse, we guide platform selection, cost modelling, and vendor negotiation.
  • Ongoing FinOps for BigQuery: Monthly cost review, trend analysis, and optimisation recommendations. Stay under budget year-round.

Download the BigQuery Cost Governance White Paper

Deep-dive into slot strategy, pricing economics, and 7 negotiation playbooks. Includes real case studies from 50+ implementations.
