Replies: 4 comments 1 reply
-
Hi @roshkins! The query is tough for several reasons:
Sometimes it helps to take only a limited slice of time ( Could you please re-check the idea of the query? You use
-
@roshkins Well, slow queries are going to be slow, and we cannot afford to let them run forever (a PostgreSQL replica limitation also comes into play here: long-running queries block new data from being synced from the master database instance). You will need to slim your query down and compute it in chunks (e.g. compute the stats per day/hour and then sum those values together). See https://github.com/telezhnaya/near-analytics/blob/main/aggregations/db_tables/daily_gas_used.py for inspiration.
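The chunking suggestion above can be sketched roughly like this (a minimal, self-contained Python sketch; the helper names are hypothetical and the per-day query is stubbed out — the real implementation lives in the near-analytics repo linked above):

```python
from datetime import date, timedelta

def daily_chunks(start: date, end: date):
    """Yield (day_start, day_end) half-open pairs covering [start, end)."""
    day = start
    while day < end:
        yield day, day + timedelta(days=1)
        day += timedelta(days=1)

def gas_used_between(day_start: date, day_end: date) -> int:
    """Stand-in for one short, bounded SQL query, e.g.
    SELECT SUM(...) ... WHERE block_timestamp >= %s AND block_timestamp < %s.
    Stubbed with a constant so the sketch is runnable."""
    return 100

def total_gas_used(start: date, end: date) -> int:
    # Each per-day query is small and finishes quickly, so no single
    # statement runs long enough to hit the replica's timeout or to
    # block replication from the master instance.
    return sum(gas_used_between(a, b) for a, b in daily_chunks(start, end))
```

The point is that the aggregation is associative: summing seven cheap per-day results equals one expensive seven-day query, without any statement running long.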
-
@frol @telezhnaya This query:

```sql
SELECT t.signer_account_id
FROM transaction_actions AS ta
INNER JOIN transactions AS t
    ON ta.transaction_hash = t.transaction_hash
WHERE t.receiver_account_id LIKE '%sputnik-dao.near'
    AND ta.args->>'method_name' = 'add_proposal'
--GROUP BY t.receiver_account_id
LIMIT 10;
```

returns in a few seconds, making me think it's unlikely to be the things you are describing.
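One caveat worth noting: with `LIMIT 10` the scan can stop after the first ten matches, so a fast response here does not prove the full query is cheap. Separately, a leading-wildcard `LIKE '%sputnik-dao.near'` defeats an ordinary b-tree index. A minimal sketch of that effect, using SQLite as a stand-in for PostgreSQL (the table and index names mirror the query above but the schema here is made up):

```python
import sqlite3

# SQLite's planner behaves analogously to PostgreSQL's here: a
# leading-wildcard LIKE cannot use an ordinary b-tree index, so every
# row must be scanned; an exact (or prefix) match can use the index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (receiver_account_id TEXT)")
conn.execute("CREATE INDEX idx_receiver ON transactions (receiver_account_id)")

def plan(where_clause: str) -> str:
    """Return SQLite's query plan for a WHERE clause as one string."""
    rows = conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM transactions WHERE " + where_clause
    ).fetchall()
    return " ".join(str(row) for row in rows)

# Suffix match: forces a scan of every row.
scan_plan = plan("receiver_account_id LIKE '%sputnik-dao.near'")
# Exact match: the index is searched directly.
search_plan = plan("receiver_account_id = 'x.sputnik-dao.near'")
print(scan_plan)
print(search_plan)
```

On PostgreSQL the usual workarounds for suffix matches are a `pg_trgm` GIN index or an expression index on `reverse(receiver_account_id)` queried with a reversed prefix.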
-
Might adding some indices to the JSONB (see: https://stackoverflow.com/a/17808864/1136754) help?
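The idea can be demonstrated end to end with SQLite as a stand-in, since it also supports expression indexes over JSON. This is only a sketch with an invented one-column schema; on PostgreSQL the equivalent would be an expression index such as `CREATE INDEX ON transaction_actions ((args ->> 'method_name'));`, or a GIN index on the jsonb column as in the Stack Overflow answer above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transaction_actions (args TEXT)")
# Expression index over the extracted JSON field, mirroring the
# args->>'method_name' filter from the query in this thread.
conn.execute(
    "CREATE INDEX idx_method_name ON transaction_actions "
    "(json_extract(args, '$.method_name'))"
)
conn.execute(
    "INSERT INTO transaction_actions VALUES "
    "('{\"method_name\": \"add_proposal\"}')"
)
# The WHERE expression matches the index expression exactly, so the
# planner can search the index instead of scanning the table.
rows = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM transaction_actions "
    "WHERE json_extract(args, '$.method_name') = 'add_proposal'"
).fetchall()
plan_text = " ".join(str(row) for row in rows)
print(plan_text)
```

The key constraint carries over to PostgreSQL: the expression in the WHERE clause has to match the indexed expression for the index to be usable.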
-
I ran this query:
and got this after trying a few times and getting a timeout after 30 seconds:
I am thinking this might be an indexing problem. I tried it with the GROUP BY and the JSON constraint commented out and still got the error.
I am willing to poke around and see whether adding indices fixes the issue. I am no DB expert, though, and would love help and advice.