This repository accompanies the online course Modern Data Warehouse in Snowflake. The course uses the following tech stack:
- Data Transformation: dbt
- Workflow Orchestration: Apache Airflow
- Data Platform: Snowflake (OLAP)
Prerequisites:
- Docker and Docker Compose installed
- A Snowflake account with the ACCOUNTADMIN role for this hands-on (sign up for the free 30-day Snowflake trial, no credit card required)
- VS Code or another IDE installed
In this hands-on you will:
- Set up dbt with Snowflake as the data platform
- Learn the ELT data architecture and data modelling
- Create dbt models
- Create Airflow DAGs that run dbt seed and the dbt models
- Go to the Snowflake query editor and run all of the queries below in a worksheet, or copy and paste the contents of setup_snowflake_dbt.sql:
```sql
-------------------------------------------
-- dbt credentials
-------------------------------------------
USE ROLE securityadmin;

-- dbt role
CREATE OR REPLACE ROLE dbt_dev_role;

-- dbt user (replace <your_secret_password> with your own password)
CREATE OR REPLACE USER dbt_user PASSWORD = '<your_secret_password>';
GRANT ROLE dbt_dev_role TO USER dbt_user;
GRANT ROLE dbt_dev_role TO ROLE sysadmin;

-------------------------------------------
-- dbt objects
-------------------------------------------
USE ROLE sysadmin;

CREATE OR REPLACE WAREHOUSE dbt_dev_wh WITH
    WAREHOUSE_SIZE = 'XSMALL'
    AUTO_SUSPEND = 60
    AUTO_RESUME = TRUE
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 1
    INITIALLY_SUSPENDED = TRUE;
GRANT ALL ON WAREHOUSE dbt_dev_wh TO ROLE dbt_dev_role;

CREATE OR REPLACE DATABASE dbt_dev;
GRANT ALL ON DATABASE dbt_dev TO ROLE dbt_dev_role;
GRANT ALL ON ALL SCHEMAS IN DATABASE dbt_dev TO ROLE dbt_dev_role;
```
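With the role, user, warehouse, and database in place, point dbt at them in profiles.yml. Below is a minimal sketch; the profile name dbt_dev, the account locator placeholder, the public target schema, and threads: 4 are assumptions to adjust, while the user, role, warehouse, and database match the objects created above.

```yaml
# profiles.yml -- minimal sketch of a dbt-snowflake connection.
# The user, role, warehouse, and database below match the objects
# created in setup_snowflake_dbt.sql; everything else is a placeholder.
dbt_dev:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <your_account_locator>   # e.g. ab12345.ap-southeast-1
      user: dbt_user
      password: <your_secret_password>
      role: dbt_dev_role
      warehouse: dbt_dev_wh
      database: dbt_dev
      schema: public                    # assumption: default target schema
      threads: 4
```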
In this course, data modelling follows Kimball's method: the star schema.
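As a flavour of what a star-schema model looks like in dbt, here is a minimal sketch of a dimension model. The file path, the raw_customers seed, and its columns are hypothetical, for illustration only; the actual models live in the warehouse and mart folders below.

```sql
-- models/warehouse/dim_customer.sql
-- Hypothetical dimension model: raw_customers (a dbt seed) and its
-- columns are illustrative assumptions, not files from this repo.
{{ config(materialized='table') }}

select
    customer_id         as customer_key,   -- key used by the fact tables
    customer_name,
    customer_city,
    current_timestamp() as loaded_at       -- load audit column
from {{ ref('raw_customers') }}
```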
- Copy the warehouse and mart model folders to your local dbt project
- Copy all of the DAG code in the dags folder to your local dags folder (a minimal DAG sketch follows this list)
- Enable the Airflow DAG, then check the data in Snowflake (see the example verification query below)
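For reference, below is a minimal sketch of what such a DAG can look like, using BashOperator to run dbt seed and then dbt run. The dag_id, the schedule, and the /opt/airflow/dbt project path are assumptions; use the actual DAGs from the dags folder.

```python
# dags/dbt_dag.py -- sketch of a DAG that seeds raw data, then builds models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_pipeline",        # assumption: pick your own id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the seed files into Snowflake.
    dbt_seed = BashOperator(
        task_id="dbt_seed",
        bash_command="cd /opt/airflow/dbt && dbt seed",
    )
    # Build the warehouse and mart models.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/airflow/dbt && dbt run",
    )

    dbt_seed >> dbt_run  # seed first, then run the models
```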
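To check the data after the DAG has run, execute a query like the one below in a Snowflake worksheet; dim_customer and the public schema are illustrative assumptions, so substitute your own schema and model names.

```sql
-- Verification sketch: confirm the dbt models were materialized.
USE ROLE dbt_dev_role;
USE WAREHOUSE dbt_dev_wh;
SELECT * FROM dbt_dev.public.dim_customer LIMIT 10;  -- assumed model name
```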