
[fix] return prompt from get_prompt_inputs utility method #2635

Merged
merged 1 commit into deepjavalibrary:master from lmi-vllm-fix on Dec 12, 2024

Conversation

@siddvenk siddvenk (Contributor) commented Dec 12, 2024

Description

There was a missing return statement in the get_prompt_inputs utility method that formats the request for vllm/lmi-dist.
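
A minimal sketch of the kind of bug being fixed, using hypothetical names (the real get_prompt_inputs in djl_python differs in signature and body):

    # Hypothetical illustration, not the actual djl_python code.
    def get_prompt_inputs(request):
        prompt_inputs = {"prompt": request.input_text}
        # Before this fix, the function ended here without a return
        # and so implicitly returned None; the fix adds the explicit return.
        return prompt_inputs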

Type of change

Please delete options that are not relevant.

  • [x] Bug fix (non-breaking change which fixes an issue)
  • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • [ ] New feature (non-breaking change which adds functionality)
  • [ ] This change requires a documentation update

Checklist:

  • [x] Have you manually built the docker image and verified the change?
  • [x] Have you run related tests? Check how to set up the test environment here; one example would be pytest tests.py -k "TestCorrectnessLmiDist" -m "lmi_dist"
  • [ ] Have you added tests that prove your fix is effective or that this feature works?
  • [ ] Has code been commented, particularly in hard-to-understand areas?
  • [ ] Have you made corresponding changes to the documentation?

Feature/Issue validation/testing

Please describe the unit or integration tests you ran to verify your changes, along with a summary of the relevant results. Provide instructions so the results can be reproduced.
Please also list any relevant details of your test configuration.

Before this change, an inference request would result in:

INFO  PyProcess W-1674-08a20b291e447c9-stdout: INFO::[RequestId=d43e0d40-5a8f-4bb8-b588-de2f08976c93] parsed and scheduled for inference
INFO  PyProcess W-1674-08a20b291e447c9-stdout: ERROR::Rolling batch inference error. There are 1 requests impacted. Dumping the impacted requestIds
INFO  PyProcess W-1674-08a20b291e447c9-stdout: Traceback (most recent call last):
INFO  PyProcess W-1674-08a20b291e447c9-stdout:   File "/tmp/.djl.ai/python/0.32.0-SNAPSHOT/djl_python/rolling_batch/rolling_batch.py", line 47, in try_catch_handling
INFO  PyProcess W-1674-08a20b291e447c9-stdout:     return func(self, *args, **kwargs)
INFO  PyProcess W-1674-08a20b291e447c9-stdout:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
INFO  PyProcess W-1674-08a20b291e447c9-stdout:   File "/tmp/.djl.ai/python/0.32.0-SNAPSHOT/djl_python/rolling_batch/vllm_rolling_batch.py", line 137, in inference
INFO  PyProcess W-1674-08a20b291e447c9-stdout:     self.engine.add_request(request_id=request_id,
INFO  PyProcess W-1674-08a20b291e447c9-stdout:   File "/usr/local/lib/python3.11/dist-packages/vllm/utils.py", line 1063, in inner
INFO  PyProcess W-1674-08a20b291e447c9-stdout:     return fn(*args, **kwargs)
INFO  PyProcess W-1674-08a20b291e447c9-stdout:            ^^^^^^^^^^^^^^^^^^^
INFO  PyProcess W-1674-08a20b291e447c9-stdout:   File "/usr/local/lib/python3.11/dist-packages/vllm/engine/llm_engine.py", line 792, in add_request
INFO  PyProcess W-1674-08a20b291e447c9-stdout:     assert prompt is not None and params is not None
INFO  PyProcess W-1674-08a20b291e447c9-stdout:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INFO  PyProcess W-1674-08a20b291e447c9-stdout: AssertionError
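
The assertion fires because the missing return made get_prompt_inputs evaluate to None, so the prompt handed to vLLM's engine was None. A simplified, hypothetical sketch of the failing call site (the real call in vllm_rolling_batch.py passes additional arguments):

    # Simplified sketch of the call site, illustrative only.
    prompt = get_prompt_inputs(request)      # was None before the fix
    engine.add_request(request_id=request_id,
                       prompt=prompt,        # prompt=None trips the assert
                       params=sampling_params)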

After this change, we get:

INFO  PyProcess W-2153-08a20b291e447c9-stdout: INFO::[RequestId=032b931d-17ee-4758-aa01-0aee01f926ec] parsed and scheduled for inference

@siddvenk siddvenk requested review from zachgk and a team as code owners December 12, 2024 19:09
@siddvenk siddvenk merged commit f8caad0 into deepjavalibrary:master Dec 12, 2024
9 checks passed
@siddvenk siddvenk deleted the lmi-vllm-fix branch December 12, 2024 19:22
siddvenk added a commit to siddvenk/djl-serving that referenced this pull request Jan 9, 2025