feat: Add GRPC error codes to GRPC streaming if enabled by user. #7499
There is a draft TEP (it was not required, but I thought it would be a good idea): https://docs.google.com/document/d/1TfNAMYLsPuLrduBtAKo64YqV55IWQhcn6zayc_CBY18/edit?pli=1
Headers seemed logical after discussing all possible options with the SA; I'm not sure how we would do it via request parameters. We need the triton_grpc_error flag set only once when the stream starts, so headers seemed logical, and they also have no backward-compatibility issues.
Updated the PR with the TEP link.
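For illustration, a minimal sketch of how a client might opt in, assuming the Python gRPC client (tritonclient.grpc) and its start_stream headers parameter; the model name my_model, its input, and the header value "true" are placeholders, not taken from this PR:

```python
import numpy as np
import tritonclient.grpc as grpcclient

def callback(result, error):
    # With triton_grpc_error enabled, a failure should surface here as a
    # gRPC error carrying a proper status code rather than a generic
    # in-band stream message.
    if error is not None:
        print("stream error:", error)
    else:
        print("response:", result.get_response())

client = grpcclient.InferenceServerClient(url="localhost:8001")

# The flag is sent exactly once, as gRPC metadata, when the stream opens.
client.start_stream(callback=callback, headers={"triton_grpc_error": "true"})

# Placeholder request; a real model would define its own inputs.
inp = grpcclient.InferInput("INPUT0", [1], "FP32")
inp.set_data_from_numpy(np.array([1.0], dtype=np.float32))
client.async_stream_infer(model_name="my_model", inputs=[inp])

client.stop_stream()
```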
Is it possible to move the model to the L0_backend_python/lifecycle folder? I think it might be easier this way to track which test the models belong to.
We don't have any models or a models folder in https://github.com/triton-inference-server/server/tree/main/qa/L0_backend_python/lifecycle. In test.sh we copy over from python_models and create the models folder with versions inside. I have kept this new model parallel to the existing models used in L0_backend_python/lifecycle.
You can always create a models subfolder under the L0_* tests.
thanks Olga, that's what I meant.
In the existing test.sh we do `rm -fr *.log ./models` before we start the test (server/qa/L0_backend_python/lifecycle/test.sh, line 37 in 5320009). For me to add models to L0_backend_python/lifecycle, I would need to remove this. We clean up the models folder before every test, so I would need to remove every instance where it is deleted (server/qa/L0_backend_python/lifecycle/test.sh, lines 103 and 170 in 5320009). I'm not sure I should change the existing design; it might impact other tests too.
Name the folder differently and copy from it to models?
@tanmayv25 suggested I move all the changes from L0_lifecycle to a new L0_* test dedicated to this feature going forward.
The reason I made the changes here is that L0_lifecycle/models already had models that send errors programmatically, which let me reuse all of them.
Will resolve this comment, along with the original comment by @tanmayv25, in a new PR after the cherry-pick.
Keeping it unresolved for now; I will mark this resolved after the new PR.