Alternatively, we could try using a range join strategy to reduce the runtime. It's likely the cartesian join in the `query_hours` CTE that's killing things.
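A minimal sketch of what that range join could look like, assuming `query_hours` currently cross joins queries against an hours spine and filters afterwards (the table and column names here are hypothetical, not the model's actual ones):

```sql
with query_hours as (
    select
        q.query_id,
        h.hour_start
    from queries as q
    inner join hours_spine as h
        -- range predicate in the join condition itself, instead of a
        -- cross join followed by a where-clause filter
        on h.hour_start >= date_trunc('hour', q.start_time)
        and h.hour_start < q.end_time
)

select * from query_hours
```

Keeping the range predicate inside the join condition lets the engine prune hour rows per query rather than materializing the full cartesian product first.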
For a query history with >100M queries, the model takes 8 min to run on an XSMALL.
If someone runs this hourly on an XSMALL, it will cost ~$2.3K/year ($2/credit × 8/60 hours per run × 24 runs/day × 365 days ≈ $2,336).

Also:

- Add a `ran_on_warehouse` variable instead of doing `warehouse_size is not null`.
- Improve cost per query freshness by using the latest rates, closes #57 #62
- Make the `execution_start_time` logic consistent across all queries. Do we need to add in `blocked_time`? Anything else?
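The back-of-envelope cost estimate above can be checked with a quick calculation (assuming an XSMALL consumes 1 credit/hour at $2/credit, which is what the figure implies):

```python
# Annual cost of running the model hourly on an XSMALL warehouse.
dollars_per_credit = 2
credits_per_hour = 1          # XSMALL warehouse
hours_per_run = 8 / 60        # 8-minute runtime
runs_per_year = 24 * 365      # hourly schedule

annual_cost = dollars_per_credit * credits_per_hour * hours_per_run * runs_per_year
print(round(annual_cost))  # → 2336
```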