Deployment: Kubernetes cluster (with ZooKeeper) using druid-operator
Metadata store: MySQL
Deep storage: multi-attach PVC
Druid version: 30.0.0
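For reference, deep storage is configured via `common.runtime.properties` in the DruidCluster spec, roughly like this (a sketch based on the tiny-cluster.yaml example; the `/druid/deepstorage` mount path is an assumption from my setup):

```properties
# "local" deep storage only works in a cluster if every pod sees the same
# directory, i.e. the multi-attach PVC must be mounted at this path on
# MiddleManagers AND Historicals.
druid.storage.type=local
druid.storage.storageDirectory=/druid/deepstorage
```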
I am running a cluster using tiny-cluster.yaml, with the addition of 3 MiddleManager nodes. When I do batch ingestion through the Tasks API, segments are not being created or published, even though the task reports success. The deep storage PVC is empty, the Historical load/drop queue is empty, and I don't see any errors in the logs either.
I am able to create directories in the PVC, so I don't think it's a problem with the PVC itself. What am I doing wrong here?
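For what it's worth, this is roughly how I'm checking whether segments were actually published (a minimal sketch; the router URL and task id are placeholders for my setup):

```python
import requests

ROUTER = "http://druid-router:8888"  # placeholder for my router service URL

# Ask the Coordinator (via the router) which datasources have published
# segments in the metadata store. If publishing worked, the datasource
# should show up here even before Historicals load anything.
resp = requests.get(f"{ROUTER}/druid/coordinator/v1/metadata/datasources")
print(resp.json())

# Task report for a specific task id, to see row counts and ingestion stats.
task_id = "index_parallel_..."  # placeholder
report = requests.get(f"{ROUTER}/druid/indexer/v1/task/{task_id}/reports")
print(report.json())
```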
Also, I am only ingesting JSON, using a Python library. Are there any guidelines on the correct way to format data to pass into the ingestion spec?
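This is roughly what my ingestion call looks like, in case the data formatting is the issue (a simplified sketch; the router URL and datasource name are placeholders):

```python
import json
import requests

ROUTER = "http://druid-router:8888"  # placeholder

# With inputFormat "json", the inline data must be newline-delimited
# JSON objects, one row per line.
rows = [{"ts": "2024-01-01T00:00:00Z", "name": "a", "value": 1}]
inline_data = "\n".join(json.dumps(r) for r in rows)

spec = {
    "type": "index_parallel",
    "spec": {
        "ioConfig": {
            "type": "index_parallel",
            "inputSource": {"type": "inline", "data": inline_data},
            "inputFormat": {"type": "json"},
        },
        "dataSchema": {
            "dataSource": "my_datasource",  # placeholder
            "timestampSpec": {"column": "ts", "format": "iso"},
            "dimensionsSpec": {"dimensions": ["name"]},
            "granularitySpec": {
                "segmentGranularity": "day",
                "queryGranularity": "none",
            },
        },
        "tuningConfig": {"type": "index_parallel"},
    },
}

# Submit the native batch task to the Overlord (proxied through the router).
resp = requests.post(f"{ROUTER}/druid/indexer/v1/task", json=spec)
print(resp.json())  # should contain the task id
```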
In druid-io/druid-operator#277, it is suggested to use HDFS instead of local deep storage for a clustered environment. Is that still the case?