
Cloud data platforms are now a critical enabler for every enterprise. Yet, CFOs and CTOs are finding that their Snowflake bills often grow faster than expected — primarily due to compute credits spent on ELT/ETL workloads.
Our proposition is simple: keep Snowflake for what it does best (BI and analytics) and offload compute-heavy ETL/ELT pipelines to Databricks, where they run at a fraction of the cost.
For many enterprises, Snowflake’s pricing appears simple at first glance, but the reality is more complex: costs can quickly spiral if workloads aren’t carefully managed. Snowflake charges in two “currencies”:
- Compute credits, consumed by virtual warehouses and serverless features
- Storage, billed per TB per month
While storage is predictable, most customers find that 70–80% of their spend comes from compute credits, with only 20–30% from storage. (Source: Cloudchipr)
The issue isn’t understanding Snowflake’s model — it’s how workloads behave within it. Compute-heavy tasks like ETL pipelines often run longer, overlap with BI queries, or require oversized warehouses, making costs balloon in ways storage never does. On top of that, serverless features and egress fees quietly add recurring expenses that are easy to overlook.
Note: Reserved capacity discounts, regional price differences, and promotional credits can impact costs and should be factored into budgeting.
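As a rough sanity check, this split is easy to model. The sketch below uses assumed on-demand rates (a $3 credit and $23/TB/month; actual prices vary by edition, region, and contract) to show how compute comes to dominate a typical bill:

```python
# Illustrative sketch: rough monthly Snowflake bill, split into compute and storage.
# Both rates are assumptions for illustration; plug in your own contract numbers.

CREDIT_PRICE_USD = 3.00       # assumed $/credit (e.g., a higher-tier edition on AWS)
STORAGE_PRICE_USD_TB = 23.00  # assumed $/TB/month for on-demand storage

def snowflake_monthly_cost(compute_credits: float, storage_tb: float) -> dict:
    compute = compute_credits * CREDIT_PRICE_USD
    storage = storage_tb * STORAGE_PRICE_USD_TB
    total = compute + storage
    return {
        "compute_usd": compute,
        "storage_usd": storage,
        "compute_share": compute / total,
    }

# Example: 2,000 credits and 87 TB -> compute is ~75% of the bill.
print(snowflake_monthly_cost(2_000, 87))
```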
Snowflake charges in compute credits (for warehouses and serverless features) and per TB/month (for storage). Databricks, by contrast, bills in DBUs (Databricks Units), a per-second unit whose rate depends on workload type, plus the cost of the underlying cloud virtual machines.
This difference is crucial: because DBU rates vary by workload type, Databricks lets you match the price you pay to the job you run, rather than burning a flat credit rate for every warehouse-hour.
Databricks compute refers to the clusters of virtual machines (VMs) that execute your workloads — jobs, SQL queries, or ML models. The cost equation is simple:
Databricks Cost = DBUs consumed × DBU rate + Cloud VM cost
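To make the equation concrete, here is a minimal worked example. Every number in it (the DBU rate, the cluster’s DBU burn rate, and the VM price) is an assumption for illustration:

```python
# Minimal sketch of the Databricks cost equation above. All inputs are assumed
# values; actual rates vary by compute type, cloud, and pricing tier.

dbu_rate = 0.15      # assumed $/DBU for Jobs Compute (per the table below)
vm_hourly = 1.50     # assumed blended $/hour for the cluster's cloud VMs
job_hours = 4        # pipeline runtime
dbus_per_hour = 20   # assumed DBU burn rate of the chosen cluster size

dbu_cost = job_hours * dbus_per_hour * dbu_rate      # 4 * 20 * 0.15 = $12.00
vm_cost = job_hours * vm_hourly                      # 4 * 1.50      = $6.00
print(f"Total job cost: ${dbu_cost + vm_cost:.2f}")  # $18.00
```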
Each workload type consumes DBUs differently, which gives flexibility but also requires understanding to avoid cost inefficiency.
Databricks offers workload-specific compute types, making it easier to align cost with purpose.
| Compute Type | Best For | DBU Cost Range | Key Notes |
|---|---|---|---|
| Jobs Compute / Jobs Photon | ETL/ELT pipelines, batch jobs, production workloads | ~$0.15/DBU (AWS Premium) | Lowest cost. Photon improves throughput per DBU → cheaper pipelines. |
| All-Purpose Compute | Interactive notebooks, collaboration, data exploration | ~$0.40–$0.55/DBU | 3–4× more expensive than Jobs. Use only for exploration, not production. |
| SQL Warehouses | BI dashboards, analytics queries | Varies by edition; Serverless is pricier | “Classic” or Serverless (no cluster management). Serverless trades control for convenience. |
| Model Serving | Deploying ML models as APIs (real-time inference) | DBUs + per-request cost | For ML/AI workloads; auto-scales serverless endpoints. |
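The Jobs-versus-All-Purpose gap in the table is worth quantifying, since it is the most common source of avoidable spend. A small sketch, using the indicative rates above as assumptions:

```python
# Sketch comparing the same 100-DBU workload on Jobs vs All-Purpose compute.
# Rates are the indicative figures from the table above, not quoted prices.

WORKLOAD_DBUS = 100
RATES = {"jobs": 0.15, "all_purpose": 0.45}  # assumed $/DBU

for compute_type, rate in RATES.items():
    print(f"{compute_type}: ${WORKLOAD_DBUS * rate:.2f}")

# jobs: $15.00 vs all_purpose: $45.00 -> running production pipelines on
# All-Purpose compute triples the DBU bill for identical work.
```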
Note: While this cost advantage is compelling, workloads with complex SQL transformations or compliance restrictions may face migration challenges, and full evaluation should precede any changes.
The most effective model isn’t choosing between Snowflake or Databricks — it’s combining them, with each platform doing what it does best.
That said, a hybrid environment has real costs of its own: integration work, cross-team coordination, and the overhead of maintaining two systems, their connectors, and their monitoring.
To illustrate the savings potential, let’s model a mid-market customer running ETL pipelines and BI workloads on Snowflake today.
Baseline: All-in Snowflake
In this setup, ETL pipelines consume oversized warehouses, running for hours alongside BI queries — driving most of the compute spend.
By offloading ETL pipelines to Databricks Jobs/Photon, Snowflake credits are cut in half, and Databricks executes pipelines at a fraction of the cost.
The Impact
Over a year, this equals $46,000+ in savings for a mid-sized deployment — without losing any Snowflake functionality for BI users.
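A back-of-the-envelope version of this model is sketched below. The inputs are illustrative assumptions chosen to show the mechanics, not the exact figures behind the estimate; substitute your own credit counts and rates:

```python
# Illustrative model of the baseline vs hybrid scenario above.
# Every number here is an assumption for demonstration purposes.

CREDIT_PRICE = 3.00   # assumed $/Snowflake credit
DBU_RATE = 0.15       # assumed $/DBU, Jobs Compute
VM_COST = 350.00      # assumed monthly cloud VM cost for the Databricks cluster

# Baseline: all ETL + BI running on Snowflake.
baseline = 3_400 * CREDIT_PRICE                     # $10,200/month

# Hybrid: ETL offloaded to Databricks; Snowflake credits roughly halved.
hybrid_snowflake = 1_700 * CREDIT_PRICE             # $5,100/month
hybrid_databricks = 6_000 * DBU_RATE + VM_COST      # $900 + $350 = $1,250/month

monthly_savings = baseline - (hybrid_snowflake + hybrid_databricks)
print(f"Monthly savings: ${monthly_savings:,.0f}")       # $3,850
print(f"Annual savings:  ${monthly_savings * 12:,.0f}")  # $46,200
```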
Actual savings depend on workload specifics, regional costs, reserved pricing contracts, and execution efficiency. Using pricing calculators and testing with your own workloads is essential.
Tools to Validate Your Numbers
Every organization’s rates and regions differ. To test this model with your own workloads, use:
- Snowflake’s published pricing and credit-consumption tables for your edition and region
- The Databricks pricing calculator for DBU rates by compute type and cloud
Moving to a Databricks + Snowflake hybrid is not a risky overhaul but a fast, structured path to savings:
- Profile your Snowflake compute spend to identify the warehouses dominated by ETL/ELT pipelines
- Offload those pipelines to Databricks Jobs/Photon compute
- Keep BI dashboards, governance, and analytics on Snowflake, unchanged for end users
Through this approach, you cut idle credit waste, separate ETL from BI, and unlock efficiency — without changing how analysts use Snowflake.
What stays the same: BI workflows, governance, and security.
What improves: Faster queries, lower ETL costs, and open data formats for future flexibility.
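For a sense of what an offloaded pipeline looks like in practice, here is a minimal PySpark sketch of a Databricks job; the paths and column names are hypothetical placeholders:

```python
# Minimal sketch of an offloaded pipeline as a Databricks job: PySpark reads
# raw files, applies a transformation, and writes a curated Delta table.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("offloaded-etl").getOrCreate()

# Read raw events from cloud object storage (hypothetical path).
raw = spark.read.parquet("s3://example-bucket/raw/events/")

# Transformation formerly run as SQL inside a Snowflake warehouse.
daily = (
    raw.filter(F.col("event_type") == "purchase")
       .groupBy(F.to_date("event_ts").alias("day"))
       .agg(F.sum("amount").alias("revenue"))
)

# Write as Delta; BI users keep querying curated tables from Snowflake.
daily.write.format("delta").mode("overwrite").save(
    "s3://example-bucket/curated/daily_revenue/"
)
```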
At Nallas, we develop playbooks, financial models, and migration frameworks to help enterprises achieve ~30% cost savings in just 3 months — while keeping analytics uninterrupted.
The takeaway: The challenge with Snowflake isn’t storage, it’s runaway compute. The solution isn’t to replace Snowflake, but to complement it with Databricks. Together, they deliver the best of both worlds — cost efficiency, performance, and future readiness.

Lead - Strategy