ℹ️ This plugin is only available to self-hosted users. As an alternative, PostHog Cloud users should look into S3 batch exports and connecting Databricks to S3.
- Enter the domain name of your cluster, as provided by Databricks.
- Generate an API key by following this documentation.
- Provide a temporary file path for saving the raw data.
- Enter your cluster ID (you can find it in your Databricks cluster settings).
- Provide the name of the database where you want to store the data.
- Enter a comma-separated ( , ) list of events to ignore (an example configuration follows this list).
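
For illustration, filled-in option values might look like the sketch below. Every hostname, token, and ID is a placeholder, and the field names are hypothetical rather than the plugin's actual config keys:

```ts
// Hypothetical example of the plugin configuration described above.
// All names and values are illustrative, not taken from a real workspace.
const pluginConfig = {
    databricksHost: 'https://adb-1234567890123456.7.azuredatabricks.net', // domain name of your Databricks workspace
    apiKey: 'dapiXXXXXXXXXXXXXXXXXXXXXXXXXXXX', // personal access token generated in Databricks
    dbfsPath: '/tmp/posthog/events.json', // temporary file path for the raw data
    clusterId: '0123-456789-abcde012', // ID of the cluster that runs the scheduled job
    databaseName: 'posthog_events', // database where the exported data is stored
    eventsToIgnore: '$feature_flag_called,$pageleave', // comma-separated events to skip
}
```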
- Pushes data from PostHog to Databricks every minute (sketched below).
- Creates a table and runs a scheduled job that migrates the data from DBFS into the database.
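
The plugin's internals aren't documented here, but a minimal sketch of such a per-minute push, assuming the Databricks DBFS REST API (`POST /api/2.0/dbfs/put`) and Node 18+'s global `fetch`, could look like this. The function and parameter names are illustrative, not the plugin's actual code:

```ts
// Minimal sketch of pushing a batch of events to DBFS, assuming the
// DBFS REST API and the hypothetical config fields shown earlier.
interface PostHogEvent {
    event: string
    distinct_id: string
    properties: Record<string, unknown>
    timestamp: string
}

async function pushBatchToDBFS(
    host: string,
    apiKey: string,
    dbfsPath: string,
    events: PostHogEvent[]
): Promise<void> {
    // Serialize the batch as newline-delimited JSON; the DBFS `put`
    // endpoint expects the file contents base64-encoded in the JSON body.
    // Note: uploads via this JSON endpoint are limited to 1 MB; larger
    // batches need the create/add-block/close streaming calls instead.
    const ndjson = events.map((e) => JSON.stringify(e)).join('\n')
    const response = await fetch(`${host}/api/2.0/dbfs/put`, {
        method: 'POST',
        headers: {
            Authorization: `Bearer ${apiKey}`,
            'Content-Type': 'application/json',
        },
        body: JSON.stringify({
            path: dbfsPath,
            contents: Buffer.from(ndjson).toString('base64'),
            overwrite: true, // replace the temp file; the scheduled job drains it
        }),
    })
    if (!response.ok) {
        throw new Error(`DBFS put failed: ${response.status} ${await response.text()}`)
    }
}
```

The scheduled job on the cluster then reads this temporary DBFS file and loads it into the configured database table.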
- You can't sync historical data.
- You can't change how frequently data is pushed to Databricks.
- You can't reduce the push interval below one minute.
We're here to help you with anything PostHog!