predictive-maintenance-example

Welcome to Continual's predictive maintenance example. For a full walkthrough, please visit our documentation, and check out our other guided examples as well!

Note: This project is designed to run with Snowflake. You should be able to adapt it fairly easily to other warehouse vendors, but let us know if you encounter any issues.

Running the example

Download the source data from Kaggle:

kaggle datasets download -d arnabbiswas1/microsoft-azure-predictive-maintenance 
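The dataset arrives as a zip archive (the file name below assumes Kaggle's default naming for this dataset); extract the CSVs before loading them, for example:

unzip microsoft-azure-predictive-maintenance.zip -d data/

The data/ target directory is just an example; any location works as long as you remember where the CSVs live for the upload step.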

You can then upload the data to your cloud data warehouse of choice using your preferred mechanism. For convenience, we've included a few short scripts that you can use to load this data into Snowflake manually.

  1. First, run the ddl.sql file to create your base tables.
  2. Afterwards, upload the CSVs downloaded from Kaggle by executing the snowsql_staging.sql script through snowsql (refer to SnowSQL's documentation; example commands are sketched below).
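For example, with SnowSQL installed and configured, both scripts can be run from the command line; the account and user values below are placeholders for your own connection details:

snowsql -a <account_identifier> -u <username> -f ddl.sql
snowsql -a <account_identifier> -u <username> -f snowsql_staging.sql

Depending on where you saved the Kaggle CSVs, you may also need to adjust any local file paths referenced inside snowsql_staging.sql.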

For dbt users

If you're using dbt, you can simply use the dbt project provided. dbt_project.yml is configured to use the continual profile, so you'll either need to change the profile name accordingly or create this profile in your ~/.dbt/profiles.yml file.
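If you need to create the continual profile from scratch, a minimal Snowflake entry in ~/.dbt/profiles.yml might look like the sketch below; every connection value is a placeholder to replace with your own details, and the exact fields you need may vary with your Snowflake setup:

cat <<'EOF' >> ~/.dbt/profiles.yml
# Minimal example dbt profile for Snowflake -- replace all placeholder values.
continual:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <your_account_identifier>
      user: <your_username>
      password: <your_password>
      role: <your_role>
      database: <your_database>
      warehouse: <your_warehouse>
      schema: <your_schema>
      threads: 4
EOF

With the profile in place, you can then execute: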

dbt run

This command will build all the required tables and views. Then, once dbt has finished, you can execute the following command to push the necessary configuration to Continual and kick off the model training process:

continual run

Please make sure that you've installed the Continual CLI, created an account in Continual, and logged in to the CLI with a default project configured. If you haven't, do so before running the commands above.
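If you still need to set up the CLI, the setup looks roughly like the following sketch; the subcommands shown are illustrative, <your_project> is a placeholder, and the Continual CLI documentation is the source of truth for the exact installation and login steps:

pip install continual                          # install the Continual CLI
continual login                                # authenticate with your Continual account
continual config set-project <your_project>    # configure a default project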

You're now done! You can navigate to the Continual Web UI to monitor the progress of your model training and observe the results as it finishes.

Note: This whole process can take around 2 hours to finish.

For non-dbt users

We highly recommend using dbt for your transformations. If that's not feasible, we've provided the feature_engineering.sql and prediction_engineering.sql scripts, which you can run in Snowflake directly to build out all required tables and views.
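If you'd rather run these scripts from the command line than paste them into a Snowflake worksheet, SnowSQL can execute them directly; the connection values below are placeholders for your own details:

snowsql -a <account_identifier> -u <username> -f feature_engineering.sql
snowsql -a <account_identifier> -u <username> -f prediction_engineering.sql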

Afterwards, from the command line, you can simply execute the following (from the home directory of your locally cloned project):

continual push ./continual/featuresets ./continual/models

Please make sure that you've installed the Continual CLI, created an account in Continual, and logged in to the CLI with a default project configured. If you haven't, do so before running the command above.

Note: If you modify any of the table names or schema in the .sql scripts, make sure to update the queries in the corresponding .yaml files in continual/featuresets and continual/models accordingly.

You're now done! You can navigate to the Continual Web UI to monitor the progress of your model training and observe the results as it finishes.

Note: This whole process can take around 2 hours to finish.

Having issues?

Feel free to contact us if you run into any issues, or open a PR directly with any suggested modifications.

Thank you for your time and we hope you enjoy this guided example!
