docs: clarify gRPC quickstart example #322

Merged 1 commit on Feb 3, 2023
7 changes: 3 additions & 4 deletions docs/quickstart.md
@@ -191,7 +191,7 @@ since a normal Service has issues load balancing gRPC requests. See more info

### gRPC request

- To test out **gRPC** inference requests, you can port-forward the headless service _in a separate terminal window_:
+ To test out gRPC inference requests, you can port-forward the headless service **in a separate terminal window**:

```shell
kubectl port-forward --address 0.0.0.0 service/modelmesh-serving 8033 -n modelmesh-serving
```

@@ -200,10 +200,9 @@ kubectl port-forward --address 0.0.0.0 service/modelmesh-serving 8033 -n modelmesh-serving
Then a gRPC client generated from the KServe [grpc_predict_v2.proto](https://github.com/kserve/kserve/blob/master/docs/predict-api/v2/grpc_predict_v2.proto)
file can be used with `localhost:8033`. A ready-to-use Python example of this can be found [here](https://github.com/pvaneck/model-serving-sandbox/tree/main/grpc-predict).
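As a rough sketch of what such a client sends, the v2 inference protocol's `ModelInferRequest` carries a model name plus a list of input tensors, each with a name, shape, datatype, and contents. The snippet below builds that request body as JSON, the same shape of payload `grpcurl -d` accepts; the input name `predict`, the `FP32` datatype, and the 64-pixel flattened MNIST shape are assumptions for the SKLearn MNIST model, not taken from this diff, so check your model's metadata for the real values:

```python
import json


def build_infer_request(model_name, pixels):
    """Build a KServe v2 ModelInferRequest as JSON.

    The input name "predict" and FP32 datatype are assumptions for the
    SKLearn MNIST example model; adjust them to match your model.
    """
    return json.dumps({
        "model_name": model_name,
        "inputs": [{
            "name": "predict",
            "shape": [1, len(pixels)],  # one flattened 8x8 image
            "datatype": "FP32",
            "contents": {"fp32_contents": pixels},
        }],
    })


# A blank 8x8 image, flattened to 64 floats.
payload = build_infer_request("example-sklearn-isvc", [0.0] * 64)
```

The resulting string can be passed directly to `grpcurl -d "$payload"` or serialized into the equivalent protobuf message by a generated stub.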

- Alternatively, you can test inference with [grpcurl](https://github.com/fullstorydev/grpcurl). This can easily be installed with `brew install grpcurl` if on macOS.
+ Alternatively, you can test inferences using [grpcurl](https://github.com/fullstorydev/grpcurl). This can be installed easily with `brew install grpcurl` if on macOS.

- With `grpcurl`, a request can be sent to the SKLearn MNIST model like the following. Make sure that the `MODEL_NAME`
- variable below is set to the name of your `InferenceService`.
+ An example that uses `grpcurl` to send a request to the SKLearn MNIST model is provided below. The example should be run from `modelmesh-serving`'s root directory and `MODEL_NAME` should be set to the name of the deployed `InferenceService`.

```shell
MODEL_NAME=example-sklearn-isvc
```