[Bug]: TensorFlow RunInference signature dtype issue #23756
Comments
This is related to #23584, which is caused by https://github.com/tensorflow/tfx-bsl/blob/49baf317527cc53d2dfb7d6c37b59757cd6d83a8/tfx_bsl/beam/run_inference.py#L626: tfx-bsl's implementation of its model handler requires a string input. You can work around it by adding a tf.function, like we do in our notebook, to convert from string records to uint8 records, and then converting your inputs to serialized string tf.Examples: https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow.ipynb. It's ugly, but it should be workable.
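The workaround above can be sketched roughly like this. This is a hedged, self-contained toy (the `Doubler` module and feature key `"x"` are stand-ins for the real uint8-input model and its features, not anything from the notebook): a tf.function with a string input signature parses serialized tf.Example records and hands uint8 tensors to the wrapped model.

```python
import tensorflow as tf

# Toy stand-in for a model whose serving signature expects uint8 tensors.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 2], tf.uint8)])
    def __call__(self, x):
        return tf.cast(x, tf.int32) * 2

model = Doubler()

# Wrapper with a string signature, which is what tfx-bsl's handler feeds:
# parse serialized tf.Example strings, then call the uint8 model.
@tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
def serve_examples(serialized):
    features = tf.io.parse_example(
        serialized, {"x": tf.io.FixedLenFeature([2], tf.int64)})
    return model(tf.cast(features["x"], tf.uint8))

# Build one serialized tf.Example and run it through the wrapper.
example = tf.train.Example(features=tf.train.Features(feature={
    "x": tf.train.Feature(int64_list=tf.train.Int64List(value=[3, 4]))}))
result = serve_examples(tf.constant([example.SerializeToString()]))
```

Exporting `serve_examples` as the `serving_default` signature (via `tf.saved_model.save(..., signatures=...)`) gives you a SavedModel that accepts the string records tfx-bsl sends.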
Thanks for the info. This confirms what I was seeing. @AnandInguva also pointed me to his PR #23456, which introduces a helper function to assist with this conversion.
To fix this, use the function below: https://github.com/apache/beam/blob/8f85a6cdf8d0d71218137fbec149d13ae6fc595b/sdks/python/apache_beam/examples/inference/tfx_bsl/build_tensorflow_model.py#L71. Make sure you have the saved_model of the TF model, then run the following in your Python shell.
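The input-conversion step can also be done by hand. A minimal sketch (this is not the helper from `build_tensorflow_model.py`, whose exact signature isn't shown here; the function name and feature key are hypothetical): pack a raw uint8 array into a serialized tf.Example string, which is the input type the tfx-bsl handler expects.

```python
import numpy as np
import tensorflow as tf

def to_serialized_example(image: np.ndarray, key: str = "image") -> bytes:
    # Store the raw uint8 bytes under a single bytes feature.
    feature = tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[image.astype(np.uint8).tobytes()]))
    example = tf.train.Example(
        features=tf.train.Features(feature={key: feature}))
    return example.SerializeToString()

# Round-trip a tiny dummy image to show the record is well-formed.
img = np.zeros((2, 2, 3), dtype=np.uint8)
record = to_serialized_example(img)
parsed = tf.train.Example.FromString(record)
```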
I think we can close this issue. Please reopen it if there is anything else.
What happened?
I am trying to use the TensorFlow MobileNet v2 320x320 model in RunInference, using the API from tfx_bsl, but I'm running into an error; please see #23754. It looks like a signature dtype issue: the expected type is a string (which has the DataType enum value of 7), but the dtype of the MobileNet model is uint8 (which corresponds to the enum value of 4).

Original command:
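The mismatch above can be reproduced with a toy SavedModel (this is an illustrative sketch, not MobileNet): inspect the serving signature's input dtype, and check the DataType enum values the error message refers to (`DT_STRING = 7`, `DT_UINT8 = 4` in TensorFlow's `types.proto`).

```python
import tempfile
import tensorflow as tf

# Toy model whose serving signature expects uint8 image tensors.
class ToyModel(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 320, 320, 3], tf.uint8)])
    def serve(self, images):
        return tf.cast(images, tf.float32) / 255.0

model = ToyModel()
export_dir = tempfile.mkdtemp()
tf.saved_model.save(model, export_dir,
                    signatures={"serving_default": model.serve})

# Reload and inspect the signature: the input dtype is uint8, while
# tfx-bsl's handler expects a string-typed signature.
loaded = tf.saved_model.load(export_dir)
spec = loaded.signatures["serving_default"].structured_input_signature[1]["images"]
print(spec.dtype)  # tf.uint8
print(tf.string.as_datatype_enum, tf.uint8.as_datatype_enum)  # 7 4
```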
Issue Priority
Priority: 2
Issue Component
Component: run-inference