This repository was archived by the owner on Feb 18, 2025. It is now read-only.

Commit 66ce3c3

Merge pull request #28 from raddaoui/master
DOC review and cleaning
2 parents 205e812 + ccd172f

File tree

3 files changed (+6, -12 lines)


README.md

Lines changed: 4 additions & 10 deletions
@@ -115,19 +115,13 @@ The terraform configuration takes two parameters to determine where the Kubernet
 * project
 * zone
 
-For simplicity, these parameters should be specified in a file named terraform.tfvars, in the terraform directory. To generate this file based on your gcloud defaults, run:
-
-./generate-tfvars.sh
-
-This will generate a terraform/terraform.tfvars file with the following keys. The values themselves will match the output of gcloud config list:
+For simplicity, these parameters should be specified in a file named terraform.tfvars, in the terraform directory. To generate this file based on your gcloud defaults, run the provided script `./scripts/generate-tfvars.sh`; it produces a `terraform/terraform.tfvars` file with the following keys. The values themselves will match the output of gcloud config list:
 ```
 # Contents of terraform.tfvars
 project="YOUR_PROJECT"
 zone="YOUR_ZONE"
 ```
 
-If you need to override any of the defaults, simply replace the desired value(s) to the right of the equals sign(s). Be sure your replacement values are still double-quoted.
-
-
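The diff does not show the script itself; as a rough sketch (an assumption, not the script's actual contents), `scripts/generate-tfvars.sh` might mirror the active gcloud defaults into the tfvars file like this, falling back to the placeholder values shown above when gcloud is unavailable:

```shell
#!/bin/sh
# Hypothetical sketch of what scripts/generate-tfvars.sh might do.
# Reads the active gcloud defaults and writes them into
# terraform/terraform.tfvars in the key="value" form shown above.
if command -v gcloud >/dev/null 2>&1; then
  PROJECT="$(gcloud config get-value project 2>/dev/null)"
  ZONE="$(gcloud config get-value compute/zone 2>/dev/null)"
else
  # Placeholders used when gcloud is unavailable.
  PROJECT="YOUR_PROJECT"
  ZONE="YOUR_ZONE"
fi
mkdir -p terraform
cat > terraform/terraform.tfvars <<EOF
project="${PROJECT}"
zone="${ZONE}"
EOF
```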
 #### Deploying the cluster
 
 There are three Terraform files provided with this example. The first one, `main.tf`, is the starting point for Terraform. It describes the features that will be used, the resources that will be manipulated, and the outputs that will result. The second file is `provider.tf`, which indicates which cloud provider and version will be the target of the Terraform commands--in this case GCP. The final file is `variables.tf`, which contains a list of variables that are used as inputs into Terraform. Any variables referenced in the `main.tf` that do not have defaults configured in `variables.tf` will result in prompts to the user at runtime.
@@ -161,7 +155,7 @@ Using the IP:Port value you can now access the application. Go to a browser and
 
 ### Logs in the Stackdriver UI
 
-Stackdriver provides a UI for viewing log events. Basic search and filtering features are provided, which can be useful when debugging system issues. The Stackdriver Logging UI is best suited to exploring more recent log events. Users requiring longer-term storage of log events should consider some the tools in following sections.
+Stackdriver provides a UI for viewing log events. Basic search and filtering features are provided, which can be useful when debugging system issues. The Stackdriver Logging UI is best suited to exploring more recent log events. Users requiring longer-term storage of log events should consider some of the tools in the following sections.
 
 To access the Stackdriver Logging console perform the following steps:
 
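For readers who prefer a CLI to the Logging console, `gcloud logging read` can fetch recent entries; as a hedged sketch (the `resource.type` filter value below is an assumption about this tutorial's log entries, not something the diff confirms):

```shell
# Hypothetical CLI counterpart to browsing the Stackdriver Logging UI.
# The filter value is an assumption; adjust it to match your logs.
FILTER='resource.type="container"'
if command -v gcloud >/dev/null 2>&1; then
  gcloud logging read "$FILTER" --limit 5 > recent-logs.txt 2>/dev/null \
    || echo "gcloud logging read failed (not authenticated?)" > recent-logs.txt
else
  echo "gcloud unavailable; would run: gcloud logging read $FILTER --limit 5" > recent-logs.txt
fi
```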
@@ -220,7 +214,7 @@ To access the Stackdriver logs in BigQuery perform the following steps:
 
 ![BigQuery](docs/bigquery.png)
 
 5. Click on the **Query Table** towards the top right to perform a custom query against the table.
-6. This opens the query window. You can simply add an asterisk (*) after the **Select** in the window to pull all details from the current table. **Note:**A 'Select *' query is generally very expensive and not advised. For this tutorial the dataset is limited to only the last hour of logs so the overall dataset is relatively small.
+6. This opens the query window. You can simply add an asterisk (*) after the **Select** in the window to pull all details from the current table. **Note:** A 'Select *' query is generally very expensive and not advised. For this tutorial the dataset is limited to only the last hour of logs so the overall dataset is relatively small.
 7. Click the **Run Query** button to execute the query and return some results from the table.
 8. A popup window will ask you to confirm running the query. Click the **Run Query** button on this window as well.
 9. The results window should display some rows and columns. You can scroll through the various rows of data that are returned, or download the results to a local file.
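In the spirit of the note above, one way to keep such queries cheap is to name the columns you need instead of using `*`; a sketch (the dataset and table names below are hypothetical placeholders, not the names the Terraform export actually creates):

```shell
# Hypothetical: build a column-limited query instead of 'SELECT *'.
# DATASET and TABLE are placeholders, not the real export names.
DATASET="gke_logs_dataset"
TABLE="stderr_20230101"
QUERY="SELECT timestamp, textPayload FROM \`${DATASET}.${TABLE}\` LIMIT 100"
echo "$QUERY"
# With the bq CLI installed this could be run as:
#   bq query --nouse_legacy_sql "$QUERY"
```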
@@ -239,7 +233,7 @@ Since Terraform tracks the resources it created it is able to tear them all down
 
 ### Next Steps
 
-Having used Terraform to deploy an application to Kubernetes Engine, generated logs, and viewed them in Stackdriver, you might consider exploring [Stackdriver Monitoring](https://cloud.google.com/monitoring/) and [Stackdriver Tracing](https://cloud.google.com/trace/). Examples for these topics are available [here](../README.md) and build on the work performed with this document.
+Having used Terraform to deploy an application to Kubernetes Engine, generated logs, and viewed them in Stackdriver, you might consider exploring [Stackdriver Monitoring](https://cloud.google.com/monitoring/) and [Stackdriver Tracing](https://cloud.google.com/trace/). Examples for these topics are available [here](https://github.com/GoogleCloudPlatform?q=gke-tracing-demo++OR+gke-monitoring-tutorial) and build on the work performed with this document.
 
 ## Troubleshooting
 
scripts/validate.sh

Lines changed: 1 addition & 1 deletion
@@ -59,7 +59,7 @@ EXT_IP=""
 for ((i=0; i < RETRY_COUNT ; i++)); do
   EXT_IP=$(kubectl get svc "$APP_NAME" -n default \
     -ojsonpath='{.status.loadBalancer.ingress[0].ip}')
-  [ ! -z "$EXT_IP" ] && break
+  [ -n "$EXT_IP" ] && break
   sleep 2
 done
 if [ -z "$EXT_IP" ]
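The change swaps `[ ! -z "$EXT_IP" ]` for the equivalent, more idiomatic `[ -n "$EXT_IP" ]`; both test for a non-empty string, as a quick sketch shows:

```shell
# [ -n "$s" ] (string is non-empty) is equivalent to [ ! -z "$s" ].
check() {
  if [ -n "$1" ]; then echo "non-empty"; else echo "empty"; fi
}
check ""            # prints: empty
check "10.0.0.1"    # prints: non-empty
```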

terraform/main.tf

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ limitations under the License.
 //
 // This configuration will create a GKE cluster that will be used for creating
 // log information to be used by Stackdriver Logging. The configuration will
-// also create the resources and Stackdriver Logging exports for Cloud Storage
+// also create the kubernetes resources and Stackdriver Logging exports for Cloud Storage
 // and BigQuery.
 //
 ///////////////////////////////////////////////////////////////////////////////////////

0 commit comments