Add log forwarding Pebble layer to sparkd #114
Conversation
Add a log-forwarding layer YAML file, which is used in sparkd.sh and configured via environment variables. The variables used are:
- LOKI_URL - URL of either Loki itself or Grafana Agent
- SPARK_APPLICATION_ID - Spark application ID
- SPARK_USER - user running these Pods / Jobs
- SPARK_EXECUTOR_POD_NAME - Pod name

Signed-off-by: Robert Gildein <[email protected]>
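For context, a minimal sketch of what such a Pebble log-forwarding layer could look like, based on the labels and environment variables discussed in this PR. The layer file name, target name (`loki`), and the inclusion of `user`/`pod` labels are assumptions, not taken verbatim from the change; the `$VARIABLE` placeholders are assumed to be substituted by sparkd.sh before the layer is added:

```yaml
# Hypothetical Pebble layer sketch: forward all service logs to Loki.
# $VARIABLE placeholders are assumed to be filled in from the
# environment by sparkd.sh before `pebble add` is called.
summary: log forwarding layer
log-targets:
  loki:
    override: merge
    type: loki
    location: $LOKI_URL
    services: [all]
    labels:
      product: charmed-spark
      flavour: $FLAVOUR
      app: spark
      app_id: $SPARK_APPLICATION_ID
```

Pebble's `log-targets` section with `type: loki` ships each service's stdout/stderr to the given `location`, attaching the listed labels to every log stream.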
Force-pushed from a6e71e2 to ff76be4
Force-pushed from ff76be4 to ef4eb73
The changes look fine. I only have some comments about the names of the labels. I would like to keep the labelling as close as possible to the one we use for the metrics.
@welpaolo @theoctober19th could you please check (and give it some thought) that the label names are consistent with the ones we push?
product: charmed-spark
flavour: $FLAVOUR
app: spark
app_id: $SPARK_APPLICATION_ID
todo: this should be `spark_job_id`, to be consistent with the label in the metrics; see here.
+1
services: [all]
labels:
  product: charmed-spark
  flavour: $FLAVOUR
todo: this should be `role`, I believe.
Looks good to me once you address the comments made by Enrico about the names of the labels.
Good to merge with the change in the labels.
Signed-off-by: Robert Gildein <[email protected]>
LGTM
LGTM
As we discussed, it will be more appropriate to use `pod` as a label instead of `hostname`. The HOSTNAME environment variable represents the Pod name.
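Since Kubernetes sets `HOSTNAME` to the Pod name inside each container, the `pod` label can be populated from it. A small sketch of how sparkd.sh might do this; the mapping itself is an assumption, only the variable names come from this PR:

```shell
# Hypothetical: expose the Pod name under an explicit variable so the
# log-forwarding layer can use it as the `pod` label. In Kubernetes,
# HOSTNAME defaults to the Pod name.
export SPARK_EXECUTOR_POD_NAME="${HOSTNAME}"
echo "pod label: ${SPARK_EXECUTOR_POD_NAME}"
```

This keeps the label semantics explicit (`pod`, not `hostname`) even though the underlying value comes from the same environment variable.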
This PR provides simple tests for checking that Loki contains some logs from a Spark job. Update the TF plan and bundles to use history-server rev33 due to #53 and integration-hub rev22, since this revision provides the new relation. related: [charmed-spark-rock#114](canonical/charmed-spark-rock#114)
Signed-off-by: Robert Gildein <[email protected]>
Signed-off-by: Robert Gildein <[email protected]>
Add integration tests.
How I tested it: