Add OVMS on Kserve GPU test #1212
Conversation
Signed-off-by: Luca Giorgi <[email protected]>
Verified in rhods-ci-pr-test/2512, but need to figure out why the nvidia prometheus query is returning 0.
I left one comment, otherwise LGTM, but I don't have much knowledge behind this 🙂
${node}=    Get Node Pod Is Running On    namespace=${PRJ_TITLE_GPU}
...    label=serving.kserve.io/inferenceservice=${MODEL_NAME_GPU}
${type}=    Get Instance Type Of Node    ${node}
Should Be Equal As Strings    ${type}    "g4dn.xlarge"
Is there a way to check that the node has a GPU without pinning to the instance type? We may run on different GPU types.
The only other option I can think of is looking at the labels, but that depends on the NFD operator / NVIDIA label, which wouldn't work for other accelerator types.
We'll need to make the test compatible with different GPU node types. I think it can be done in another PR once we understand how.
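One vendor-agnostic option, sketched below as a hypothetical Python helper (not part of this PR's Robot code): instead of asserting on the instance type or on NFD labels, inspect the node's `status.allocatable` map (as returned by `oc get node <name> -o json`) for known accelerator extended-resource names. The list of resource names here is an assumption and would need to be extended for other vendors.

```python
# Hypothetical sketch: decide whether a node exposes accelerator resources by
# inspecting its status.allocatable map, instead of pinning the test to a
# specific instance type such as g4dn.xlarge.

# Assumed list of extended-resource names for common accelerators; extend as needed.
ACCELERATOR_RESOURCES = ("nvidia.com/gpu", "amd.com/gpu", "habana.ai/gaudi")

def node_has_accelerator(node_json: dict) -> bool:
    """Return True if the node advertises at least one accelerator resource."""
    allocatable = node_json.get("status", {}).get("allocatable", {})
    for resource in ACCELERATOR_RESOURCES:
        # GPU counts are plain integer strings like "1"; any non-zero count
        # means the device plugin has registered an accelerator on this node.
        if int(allocatable.get(resource, "0")) > 0:
            return True
    return False

# Example node payloads trimmed to the relevant field.
gpu_node = {"status": {"allocatable": {"cpu": "15890m", "nvidia.com/gpu": "1"}}}
cpu_node = {"status": {"allocatable": {"cpu": "15890m"}}}
print(node_has_accelerator(gpu_node))  # True
print(node_has_accelerator(cpu_node))  # False
```

This keeps the check independent of the cloud instance type, at the cost of maintaining the resource-name list per accelerator vendor.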
Validated again with rhods-ci-pr-test/2514; the TC fails, but it's tagged with its own product bug ID.
${node}=    Get Node Pod Is Running On    namespace=${PRJ_TITLE_GPU}
...    label=serving.kserve.io/inferenceservice=${MODEL_NAME_GPU}
${type}=    Get Instance Type Of Node    ${node}
Should Be Equal As Strings    ${type}    "g4dn.xlarge"
If we use any other GPU, this test will fail, correct?
Yes. As I said in another comment above, the only other option I can think of is looking at the labels, but that depends on the NFD operator / NVIDIA label, which wouldn't work for other accelerator types.