The Problem With Databricks Billing Visibility
Databricks bills in DBUs — Databricks Units — which map to dollars via per-SKU pricing. A single runaway job, an accidentally left-on SQL warehouse, or a pipeline running on All-Purpose compute instead of Jobs compute can easily add hundreds of dollars before anyone notices.
The official billing UI lives in the Databricks console under Account > Usage. It's fine for finance teams doing monthly reviews. It's not designed for engineers who want to check their spend while they're writing code.
That's the gap CatalystOps fills.
How It Works
CatalystOps queries the system.billing.usage Unity Catalog system table directly,
joining against system.billing.list_prices to convert DBUs into dollar figures.
The query runs through your existing SQL warehouse — no extra infrastructure, no external services.
SELECT
  CAST(u.usage_date AS STRING) AS date,
  u.billing_origin_product AS workloadType,
  u.identity_metadata.run_as AS runAs,
  u.usage_metadata.job_id AS jobId,
  u.sku_name AS skuName,
  SUM(u.usage_quantity) AS dbus,
  SUM(u.usage_quantity * lp.pricing.default) AS dollars
FROM system.billing.usage u
LEFT JOIN system.billing.list_prices lp
  ON u.sku_name = lp.sku_name
  AND u.usage_start_time >= lp.price_start_time
WHERE u.usage_date >= '${startDate}'
  AND u.usage_date <= '${endDate}'
  AND u.record_type = 'ORIGINAL'
GROUP BY date, workloadType, runAs, jobId, skuName
ORDER BY dollars DESC
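From a TypeScript extension host, a query like the one above can be dispatched through Databricks' SQL Statement Execution API (POST /api/2.0/sql/statements) against the chosen warehouse. This is a sketch, not CatalystOps' actual internals — the function names, config shape, and elided statement body are illustrative:

```typescript
// Sketch of dispatching the billing query via the Databricks SQL
// Statement Execution API. Names and shapes here are illustrative.
interface BillingQueryConfig {
  warehouseId: string;
  startDate: string; // ISO date, e.g. "2024-05-01"
  endDate: string;
}

function buildStatementRequest(cfg: BillingQueryConfig) {
  // Named parameter markers keep the dates out of the SQL string itself.
  return {
    warehouse_id: cfg.warehouseId,
    statement:
      "SELECT /* billing query elided, see above */ ... " +
      "WHERE u.usage_date >= :startDate AND u.usage_date <= :endDate",
    parameters: [
      { name: "startDate", value: cfg.startDate },
      { name: "endDate", value: cfg.endDate },
    ],
    wait_timeout: "30s", // block briefly; billing results are small
  };
}

async function runBillingQuery(
  host: string,
  token: string,
  cfg: BillingQueryConfig
) {
  const res = await fetch(`https://${host}/api/2.0/sql/statements`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildStatementRequest(cfg)),
  });
  if (!res.ok) throw new Error(`Statement API returned ${res.status}`);
  return res.json();
}
```

Using the Statement Execution API means the extension never needs a JDBC driver or cluster access of its own — any warehouse the user can query is enough.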
Results are aggregated into a BillingSummary object and surfaced in the sidebar tree view,
broken down three ways: by user, by workload type, and by job.
Each view is sorted by dollars descending so the most expensive items surface immediately.
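The three-way breakdown can be sketched as a single pass over the query rows. The row and summary shapes below are assumptions that mirror the SQL aliases above, not CatalystOps' actual types:

```typescript
// Hypothetical shapes for billing query rows and the summary behind the
// tree view; field names mirror the SQL column aliases.
interface UsageRow {
  runAs: string;
  workloadType: string;
  jobId: string | null;
  dollars: number;
}

interface BillingSummary {
  byUser: Map<string, number>;
  byWorkload: Map<string, number>;
  byJob: Map<string, number>;
}

function summarize(rows: UsageRow[]): BillingSummary {
  const add = (m: Map<string, number>, key: string, v: number) =>
    m.set(key, (m.get(key) ?? 0) + v);

  const summary: BillingSummary = {
    byUser: new Map(),
    byWorkload: new Map(),
    byJob: new Map(),
  };
  for (const r of rows) {
    add(summary.byUser, r.runAs, r.dollars);
    add(summary.byWorkload, r.workloadType, r.dollars);
    if (r.jobId) add(summary.byJob, r.jobId, r.dollars); // non-job usage has no jobId
  }
  return summary;
}

// Sort a view by dollars descending so the priciest items come first.
function topEntries(m: Map<string, number>): [string, number][] {
  return [...m.entries()].sort((a, b) => b[1] - a[1]);
}
```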
What You See in the Dashboard
Time periods
The dashboard supports three built-in periods (last 24 hours, last 7 days, and last 30 days) plus custom date ranges. Your period selection is remembered between sessions, and a 1-hour result cache means the SQL isn't re-run every time the panel opens.
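The caching behavior described above amounts to a small TTL cache keyed by the selected period. A minimal sketch, with illustrative names:

```typescript
// Minimal 1-hour TTL cache keyed by the selected period. A sketch of the
// caching behavior, not CatalystOps' actual implementation.
const ONE_HOUR_MS = 60 * 60 * 1000;

interface CacheEntry<T> {
  value: T;
  fetchedAt: number; // epoch millis when the result was stored
}

class ResultCache<T> {
  private entries = new Map<string, CacheEntry<T>>();

  constructor(private ttlMs: number = ONE_HOUR_MS) {}

  // Returns undefined when the entry is missing or older than the TTL,
  // signalling that the billing query should be re-run.
  get(periodKey: string, now: number = Date.now()): T | undefined {
    const e = this.entries.get(periodKey);
    if (!e || now - e.fetchedAt > this.ttlMs) return undefined;
    return e.value;
  }

  set(periodKey: string, value: T, now: number = Date.now()): void {
    this.entries.set(periodKey, { value, fetchedAt: now });
  }
}
```

Injecting `now` as a parameter keeps the expiry logic trivially testable.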
Warehouse auto-discovery
CatalystOps automatically selects a SQL warehouse to run the billing query, preferring serverless
warehouses that are already running. If you want to pin a specific warehouse, set
catalystops.billing.warehouseId in your VS Code settings.
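The preference order described above — pinned warehouse first, then running serverless warehouses, then any running warehouse, then whatever is available — can be sketched like this. The `Warehouse` shape is an assumption for illustration:

```typescript
// Sketch of warehouse auto-discovery. The Warehouse shape and function
// name are illustrative, not CatalystOps' actual code.
interface Warehouse {
  id: string;
  state: "RUNNING" | "STOPPED";
  enableServerlessCompute: boolean;
}

function pickWarehouse(
  warehouses: Warehouse[],
  pinnedId?: string // catalystops.billing.warehouseId, if the user set it
): Warehouse | undefined {
  // A pinned warehouse always wins, even if it is stopped.
  if (pinnedId) return warehouses.find((w) => w.id === pinnedId);
  return (
    // Prefer running serverless: no cold start, per-second billing.
    warehouses.find((w) => w.state === "RUNNING" && w.enableServerlessCompute) ??
    warehouses.find((w) => w.state === "RUNNING") ??
    warehouses[0]
  );
}
```

Preferring an already-running warehouse matters here: starting a stopped warehouse just to ask about billing would itself add to the bill.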
CatalystOps shows a confirmation prompt before fetching billing data to avoid unexpected warehouse charges.
The query runs against system.billing.usage — a Unity Catalog system table — which requires Unity Catalog to be enabled on your workspace.
Why This Matters for Engineers
Finance teams already have billing dashboards. This feature is for the engineer who just submitted a job and wants to know if it's eating $50 or $500 before it becomes a Slack message from their manager.
- Catch All-Purpose compute being used where Jobs compute should be (3x cost difference)
- Spot runaway jobs before they finish their full run
- Attribute spend to users so teams can self-govern their usage
- Correlate cost spikes with code changes — the data is right next to your editor
MCP Integration
The billing data is also exposed through the CatalystOps MCP server via the
get_billing_summary and refresh_billing tools.
This means you can ask Claude directly: "Which of my jobs cost the most last week?"
and get a real answer backed by live data from your workspace.
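Under the MCP spec, a tool call like get_billing_summary returns its answer as text content blocks. A dependency-free sketch of shaping that response — the handler's wiring into the MCP server is omitted, and these names are illustrative:

```typescript
// Shapes cached billing data into the { content: [{ type: "text", ... }] }
// result form that MCP tool calls return. Names are illustrative.
interface JobCost {
  jobId: string;
  dollars: number;
}

function formatBillingToolResult(period: string, jobs: JobCost[]) {
  const lines = jobs
    .slice() // don't mutate the caller's array
    .sort((a, b) => b.dollars - a.dollars)
    .map((j) => `job ${j.jobId}: $${j.dollars.toFixed(2)}`);
  return {
    content: [
      {
        type: "text" as const,
        text: `Top jobs by spend (${period}):\n${lines.join("\n")}`,
      },
    ],
  };
}
```

Because the tool reads from the same cache as the sidebar, asking Claude about last week's spend doesn't trigger an extra warehouse query.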
Getting Started
- Install CatalystOps from the VS Code Marketplace
- Connect your Databricks workspace (CatalystOps: Configure Connection)
- Open the CatalystOps sidebar panel
- Click Fetch Billing Data — confirm the prompt
- Switch between Last 24h / 7d / 30d to explore your spend
Unity Catalog with System Tables must be enabled on your workspace. If you see a "TABLE_OR_VIEW_NOT_FOUND" error, contact your Databricks workspace admin to enable system tables.
Free and open source
See where your Databricks money is going
Install CatalystOps and get your billing breakdown in under a minute.