
Update references to ODF
theckang committed Nov 18, 2021
1 parent f2f5b91 commit fd71612
Showing 4 changed files with 10 additions and 6 deletions.
4 changes: 4 additions & 0 deletions workshop/content/lab1.5_kafka.md
@@ -150,10 +150,14 @@ Stop viewing the logs after you are done.
### Create Topic
Now that Kafka is running, we can create the Kafka topic. First, set your USER_NUMBER in both terminals.

First terminal:

```execute
USER_NUMBER=$(oc whoami | sed 's/user//')
```

Second terminal:

```execute-2
USER_NUMBER=$(oc whoami | sed 's/user//')
```
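The `sed` pipeline above strips the literal `user` prefix from the `oc whoami` output, leaving only the number. You can verify the extraction locally without a cluster; the `user42` value below is an invented stand-in for a workshop username:

```shell
# Simulate `oc whoami` output for a workshop user (hypothetical value).
WHOAMI_OUTPUT="user42"

# Same extraction as above: drop the literal "user" prefix.
USER_NUMBER=$(echo "$WHOAMI_OUTPUT" | sed 's/user//')

echo "$USER_NUMBER"   # prints: 42
```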
8 changes: 4 additions & 4 deletions workshop/content/lab2.2_ml_prediction.md
@@ -1,14 +1,14 @@
# Build machine learning API

- In this lab, you will build and deploy the NLP Prediction Service using OpenShift Serverless. The prediction service needs a Natural Language Processing (NLP) model to verify if a message describes a legitimate disaster. To make things easier, we pre-created this model for you and stored this in [OpenShift Container Storage][1].
+ In this lab, you will build and deploy the NLP Prediction Service using OpenShift Serverless. The prediction service needs a Natural Language Processing (NLP) model to verify if a message describes a legitimate disaster. To make things easier, we pre-created this model for you and stored this in [OpenShift Data Foundation][1] (ODF).

The model was trained on a [Twitter dataset][2] originally used for a Kaggle competition, in which tweets were labeled **1** (the tweet is about a real disaster) or **0** (the tweet is not about a real disaster). If you're curious, the model uses a scikit-learn [Multinomial Naive Bayes classifier][3] to make its predictions. The training code is [here][4] if you want to take a look.

Don't worry too much about the ML details. The model isn't perfect (it's not super accurate and the data is skewed in favor of 'tweet' messages), but it's a good starting point. More importantly, we gave you a model you can use to run the prediction service!
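For the curious, the core idea of a Multinomial Naive Bayes classifier fits in one line: under a bag-of-words model, it picks the label with the highest (log) posterior, using smoothed per-class word frequencies. Assuming Laplace-style smoothing with parameter $\alpha$ (scikit-learn's default is $\alpha = 1$):

$$\hat{y} = \arg\max_{c \in \{0,1\}} \Big( \log P(c) + \sum_{i} f_i \log \hat{\theta}_{c,i} \Big), \qquad \hat{\theta}_{c,i} = \frac{N_{c,i} + \alpha}{N_c + \alpha n}$$

where $f_i$ is the count of word $i$ in the message, $N_{c,i}$ is the count of word $i$ across class-$c$ training tweets, $N_c$ is the total word count in class $c$, and $n$ is the vocabulary size.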

## NLP Model

- First, let's make sure a storage bucket was created in OCS:
+ First, let's make sure a storage bucket was created in ODF:

```execute
oc get objectbucket
```

@@ -64,7 +64,7 @@ prediction-1-build 0/1 Completed 0 2m4s

After the container image is ready, we need to prepare the external configuration for the service. This includes:

- 1. Credentials to access OCS
+ 1. Credentials to access ODF
2. Storage bucket and endpoint (where the model is hosted) and file name (the name of the model file itself)

The secret `serverless-workshop-ml` already exists for #1. For #2, we'll set these directly as environment variables when you deploy the service.
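With this split, the credentials arrive via the secret and the model location via plain environment variables, so the running container can assemble the model's address entirely from its environment at startup. A local sketch of that pattern, with invented variable names and values (the real service may use different names):

```shell
# Hypothetical configuration values injected at deploy time.
export MODEL_BUCKET="serverless-workshop"
export MODEL_ENDPOINT="s3.openshift-storage.svc"
export MODEL_FILE_NAME="model.pkl"

# The service would derive the model location from these at startup.
MODEL_URL="https://${MODEL_ENDPOINT}/${MODEL_BUCKET}/${MODEL_FILE_NAME}"
echo "$MODEL_URL"   # prints: https://s3.openshift-storage.svc/serverless-workshop/model.pkl
```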
@@ -121,7 +121,7 @@ The NLP model should have predicted that this is a legitimate message. What hap

You built the NLP Prediction Service and deployed it using OpenShift Serverless with our pre-built ML model. However, the prediction service seems to be broken. We will debug and figure out what is going on in the next lab.

- [1]: https://www.redhat.com/en/technologies/cloud-computing/openshift-container-storage
+ [1]: https://www.redhat.com/en/technologies/cloud-computing/openshift-data-foundation
[2]: https://www.kaggle.com/vbmokin/nlp-with-disaster-tweets-cleaning-data
[3]: https://scikit-learn.org/stable/modules/generated/sklearn.naive_bayes.MultinomialNB.html
[4]: https://github.com/RedHatGov/serverless-workshop-code/blob/workshop/model/training/train.py
2 changes: 1 addition & 1 deletion workshop/content/lab2.3_ml_bug.md
@@ -7,7 +7,7 @@ In this lab, you will debug the NLP Prediction Service using [CodeReady Workspac
Get the endpoint to CodeReady and our devfile:

```execute
- echo $'\n'$(oc get route codeready -n openshift-workspaces --template='{{.spec.host}}')/f?url=https://github.com/RedHatGov/serverless-workshop-code/tree/workshop$'\n'
+ echo $'\n'https://$(oc get route codeready -n openshift-workspaces --template='{{.spec.host}}')/f?url=https://github.com/RedHatGov/serverless-workshop-code/tree/workshop$'\n'
```
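The `$'\n'` ANSI-C quoting pads the URL with blank lines so it is easy to copy from the terminal, and the full link is built by concatenating the `https://` scheme, the route host from the command substitution, and the factory path. The same assembly can be checked locally with an ordinary variable standing in for the route host (the `codeready.example.com` hostname is made up):

```shell
# Stand-in for the `oc get route` output (hypothetical hostname).
HOST="codeready.example.com"

# Same pattern as above: newline, scheme, host, path, newline.
echo $'\n'https://${HOST}/f?url=https://github.com/RedHatGov/serverless-workshop-code/tree/workshop$'\n'
```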

Open the link in your browser. Log in using your username and password. Authorize access to your account when requested.
2 changes: 1 addition & 1 deletion workshop/content/lab2.4_ml_twilio.md
@@ -31,7 +31,7 @@ kn service update prediction --min-scale 1
Make note of your prediction endpoint:

```execute
- echo $(oc get route.serving.knative.dev prediction --template='{{.status.url}}/predict')
+ echo $'\n'$(oc get route.serving.knative.dev prediction --template='{{.status.url}}/predict')$'\n'
```

Here are the instructions: