Snowflake Query Tags: Implementation Guide with Python, dbt & SQL Examples
Snowflake query tags let you attach meaningful metadata to every SQL query you run. That sounds simple, but used the right way, query tags unlock visibility most teams never get: fine-grained cost attribution, performance tracking, and clear workload classification.
Here’s how they work, why they matter, and how to get value from them quickly.
What Are Snowflake Query Tags?
A query tag in Snowflake is a string value (up to 2,000 characters) you can attach to any query. Think of it like a label or GPS tag for your compute work. Once set, Snowflake surfaces the tag in your QUERY_HISTORY views so you can group, filter, and report on queries with that label.
Query tags are not the same as object tags. Object tags live on tables, views, functions, and other database objects to help with governance. Query tags live on the work itself – the actual SQL queries consuming your credits.
Why Query Tags Matter
Most Snowflake spend comes from queries executed in virtual warehouses. If you can’t categorize that spend, you can’t explain it, optimize it, or reduce it. Here’s why query tags are a game changer:
1. Fine-Grained Cost Attribution
User-level attribution only gets you so far. When you attach tags like “daily_report” or “etl_load_v2” to queries, you can bucket cost and runtime exactly where it matters:
- Queries powering a specific dashboard
- Pipelines tied to particular features
- SaaS tenants, environments, or teams
- AI/ML workloads vs. operational queries
That lets you answer questions like “how much did our analytics layer cost this week?” or “which pipelines burn the most credits?” without guesswork.
2. Better Performance Monitoring
You can group performance metrics (like runtime or execution count) by tag, not just by user. That helps you track:
- SLA trends and violations
- Performance regressions after code changes
- Query acceleration service impact
- Warehouse sizing effectiveness
3. Linking to External Metadata
Query tags can contain structured JSON, meaning metadata from your tools (like dashboard IDs, CI/CD build numbers, pipeline names, or tenant IDs) can be surfaced directly in Snowflake. That opens up powerful joins between your Snowflake query history and your observability or billing systems.
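For instance, if a BI tool writes a dashboard ID into the tag, you can join query history back to your own metadata tables. The sketch below uses the QUERY_HISTORY view and JSON parsing covered later in this guide, and assumes a hypothetical analytics.dashboard_metadata table with dashboard_id, dashboard_name, and owner_team columns:
SELECT
    d.dashboard_name,
    d.owner_team,
    COUNT(*) AS query_count,
    SUM(q.total_elapsed_time) / 1000 AS total_runtime_s
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY q
JOIN analytics.dashboard_metadata d  -- illustrative table, not a Snowflake object
    ON TRY_PARSE_JSON(q.query_tag):dashboard_id::STRING = d.dashboard_id
WHERE q.query_tag IS NOT NULL
GROUP BY 1, 2
ORDER BY query_count DESC;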
How to Set Query Tags in Snowflake
Query tags are session-level parameters, but you can also set defaults at the user or account level.
Session Level
ALTER SESSION SET QUERY_TAG = 'etl_load_v2';
After this runs, every subsequent query in the session carries that tag.
User or Account Defaults
Set a tag once and avoid repeating ALTER SESSION. For example, you can set a default tag for a user:
ALTER USER alice SET QUERY_TAG = '{"team":"analytics","env":"prod"}';
Now Alice’s queries always carry context.
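An account-level default works the same way and acts as a fallback for any session that never sets its own tag. Run it with a role that can alter account parameters; the tag value here is just an example:
ALTER ACCOUNT SET QUERY_TAG = '{"env":"prod","source":"untagged_default"}';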
JSON Tags
Using JSON for tags makes filtering and parsing easier downstream. Example:
ALTER SESSION SET QUERY_TAG = '{"pipeline":"day_end","env":"prod","version":"v2.1"}';
When you query QUERY_HISTORY, you can segment by JSON keys and values using Snowflake’s semi-structured data functions.
Query History: Where Your Tags Live
Snowflake stores query tags in two places:
- SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY – 365 days of history, up to 45 minutes latency
- INFORMATION_SCHEMA.QUERY_HISTORY() – 7 days of history, near real-time
For cost analysis and trend reporting, use ACCOUNT_USAGE. For operational monitoring, use INFORMATION_SCHEMA.
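For a quick operational check, the table function version might look like this; it accepts an optional RESULT_LIMIT of up to 10,000 rows:
SELECT query_id, query_tag, total_elapsed_time, execution_status
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 1000))
WHERE query_tag = 'etl_load_v2'
ORDER BY start_time DESC;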
Quick Examples with SQL
Track Cost by Tag
SELECT
    query_tag,
    COUNT(*) AS query_count,
    SUM(credits_used_cloud_services) AS total_credits,
    AVG(total_elapsed_time) AS avg_runtime_ms
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE start_time > DATEADD(day, -7, CURRENT_TIMESTAMP())
    AND query_tag IS NOT NULL
GROUP BY query_tag
ORDER BY total_credits DESC;
This shows credit use by logical workload, not just user.
Performance by Workload
SELECT
    query_tag,
    COUNT(*) AS runs,
    AVG(total_elapsed_time) AS avg_runtime_ms,
    MAX(total_elapsed_time) AS max_runtime_ms,
    SUM(CASE WHEN execution_status = 'FAIL' THEN 1 ELSE 0 END) AS failures
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE start_time > DATEADD(day, -30, CURRENT_TIMESTAMP())
    AND query_tag IS NOT NULL
GROUP BY query_tag
ORDER BY avg_runtime_ms DESC;
You get performance patterns by workload, not by query text.
Parse JSON Tags
WITH parsed_tags AS (
    SELECT
        query_id,
        start_time,
        total_elapsed_time,
        credits_used_cloud_services,
        -- TRY_PARSE_JSON returns NULL for plain-string tags instead of erroring
        TRY_PARSE_JSON(query_tag) AS tag_json
    FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
    WHERE query_tag IS NOT NULL
        AND start_time > DATEADD(day, -7, CURRENT_TIMESTAMP())
)
SELECT
    tag_json:pipeline::STRING AS pipeline,
    tag_json:env::STRING AS environment,
    COUNT(*) AS query_count,
    SUM(credits_used_cloud_services) AS total_credits
FROM parsed_tags
WHERE tag_json:pipeline IS NOT NULL
GROUP BY 1, 2
ORDER BY total_credits DESC;
Using Query Tags in Automation Tools
Modern tools can and should leverage query tags automatically.
dbt
Configure query tags in your dbt_project.yml:
models:
  your_project:
    +query_tag: "dbt_transformation"
    marts:
      +query_tag: '{"layer":"marts","env":"{{ target.name }}"}'
Or use a macro for dynamic tags based on model metadata. See dbt Snowflake configurations for details.
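You can also set the tag per model in the model file itself; the model and ref names below are just illustrations:
-- models/marts/fct_orders.sql (hypothetical model)
{{ config(query_tag='{"layer":"marts","model":"fct_orders"}') }}

select *
from {{ ref('stg_orders') }}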
Python
When creating a Snowflake connection, set QUERY_TAG in your session parameters:
import snowflake.connector
import json

query_tag = {
    'app': 'data_pipeline',
    'environment': 'prod',
    'pipeline': 'customer_etl',
    'version': 'v2.1'
}

conn = snowflake.connector.connect(
    user='your_user',
    password='your_password',
    account='your_account',
    session_parameters={
        'QUERY_TAG': json.dumps(query_tag)
    }
)
This makes every query in that session automatically tagged with rich metadata.
Power BI, Tableau, Looker
Most BI tools connect via JDBC/ODBC. Set the query tag in the connection string:
jdbc:snowflake://account.snowflakecomputing.com/?query_tag={"source":"powerbi","dashboard":"sales"}
Or use pre-SQL/init scripts in your tool’s connection settings.
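If your tool supports an initial SQL or session-setup statement, it's the same ALTER SESSION you would run by hand (the tag values here are illustrative):
ALTER SESSION SET QUERY_TAG = '{"source":"tableau","workbook":"sales_overview"}';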
Best Practices for 2026
Here’s how successful teams use query tags:
1. Standardize Tag Formats
Use structured JSON with predictable keys like team, env, pipeline, app, tenant, version. That way you can filter and group consistently across all tools and teams.
Good:
{"team":"data_eng","env":"prod","pipeline":"daily_agg","version":"v2.1"}
Bad:
DataEng-Prod-Pipeline
2. Set Sensible Defaults
Where possible, set account or user defaults so you don’t have to set every session manually. This ensures coverage even when developers forget.
3. Use Tool-Native Integrations
Leverage helpers like dbt-snowflake-query-tags or native integrations so tagging is automatic and consistent.
4. Combine With Query History Materializations
Many teams materialize enriched versions of QUERY_HISTORY in their own schemas, joining with business metadata. Query tags become the backbone of cost allocation dashboards.
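A minimal sketch of that pattern, with analytics.query_history_enriched as a placeholder name of your choosing:
CREATE OR REPLACE TABLE analytics.query_history_enriched AS
SELECT
    query_id,
    start_time,
    warehouse_name,
    total_elapsed_time,
    credits_used_cloud_services,
    -- TRY_PARSE_JSON returns NULL instead of failing on plain-string tags
    TRY_PARSE_JSON(query_tag):team::STRING AS team,
    TRY_PARSE_JSON(query_tag):pipeline::STRING AS pipeline,
    TRY_PARSE_JSON(query_tag):env::STRING AS env
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE start_time > DATEADD(day, -90, CURRENT_TIMESTAMP());
From there you can join in business metadata such as team budgets or tenant records.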
5. Track AI/ML Workloads Separately
In 2026, AI and ML workloads consume significant credits. Tag them distinctly:
ALTER SESSION SET QUERY_TAG = '{"workload":"ml_training","model":"churn_v3","framework":"snowpark"}';
This isolates AI spend from operational queries for better budgeting.
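With that convention in place, isolating ML spend is a filter on the parsed tag (the workload and model keys are the ones set above):
SELECT
    TRY_PARSE_JSON(query_tag):model::STRING AS model,
    COUNT(*) AS runs,
    SUM(credits_used_cloud_services) AS cloud_services_credits
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE TRY_PARSE_JSON(query_tag):workload::STRING = 'ml_training'
    AND start_time > DATEADD(day, -30, CURRENT_TIMESTAMP())
GROUP BY 1
ORDER BY cloud_services_credits DESC;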
6. Monitor Query Acceleration Usage
If you use Snowflake’s query acceleration service, tag queries that benefit from it to measure ROI:
SELECT
    query_tag,
    SUM(query_acceleration_bytes_scanned) AS accel_bytes,
    AVG(query_acceleration_upper_limit_scale_factor) AS avg_scale_factor
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE query_acceleration_bytes_scanned > 0
GROUP BY query_tag;
Tagging vs. Commenting Queries
Some teams prefer to add comments at the end of queries instead of, or alongside, query tags. Comments can carry more data (no 2,000 char limit) and don’t require ALTER SESSION, but:
- Snowflake strips comments at the beginning of queries
- Comments aren't a structured, dedicated column in QUERY_HISTORY the way tags are
- Filtering QUERY_HISTORY by comment content means pattern-matching the raw query_text, which is slow and fragile
Verdict: Query tags plus structured logging wins for clarity and tooling.
Real Value, Quickly
With query tags in place you can:
- Attribute compute spend to logical workloads, not just users
- Surface performance regressions early by tracking tagged workloads
- Monitor dashboards, pipelines, and environments independently
- Integrate usage data with your own analytics and cost allocation systems
- Identify AI/ML credit consumption separately from operational queries
Teams that adopt query tags consistently save time, reduce guesswork, and unlock visibility most Snowflake users live without.
Looking to take your Snowflake cost and performance monitoring to the next level? Seemore Data provides advanced analytics and dashboards specifically built for Snowflake optimization, cost attribution, and performance tracking. Feel free to reach out for a quick Snowflake optimization analysis.
FAQ
What is the character limit for Snowflake query tags?
Snowflake query tags can be up to 2,000 characters long. This is enough for structured JSON with multiple metadata fields.
Can I set query tags at the account level?
Yes. You can set default query tags at the account level, user level, or session level using ALTER ACCOUNT, ALTER USER, or ALTER SESSION commands respectively.
How long does Snowflake store query tags in QUERY_HISTORY?
Query tags are stored in SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY for 365 days. The INFORMATION_SCHEMA.QUERY_HISTORY() table function provides 7 days of history with near real-time data.
Can I use JSON in query tags?
Yes. JSON is recommended for query tags because it allows structured metadata that you can parse using Snowflake’s PARSE_JSON() function and extract specific fields for analysis.
What’s the difference between query tags and object tags?
Query tags are attached to SQL queries and track workload metadata. Object tags are attached to database objects (tables, views, schemas) and are used for governance, classification, and data discovery.
Do query tags affect query performance?
No. Query tags are metadata only and have no impact on query execution performance or cost.
Can I change a query tag mid-session?
Yes. Use ALTER SESSION SET QUERY_TAG = 'new_tag' at any point in your session. All subsequent queries will use the new tag.
How do I remove a query tag from a session?
Use ALTER SESSION UNSET QUERY_TAG; to remove the tag. Subsequent queries will run without a tag.