
Fixed an issue where non-ASCII characters were changed to Unicode characters within the prompt #1490

Merged
merged 1 commit into explodinggradients:main on Oct 14, 2024

Conversation

Youngrok123
Contributor

  • There is an issue where non-ASCII characters are escaped as Unicode sequences (e.g. \uXXXX) within the prompt.
  • Because of this, users who write in non-English characters receive abnormal responses when communicating with the LLM.
  • This can be resolved by passing the ensure_ascii=False option to the json.dumps function.
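A minimal sketch of the behavior described above, using a hypothetical Korean prompt value (the dictionary and key names here are illustrative, not taken from the ragas codebase):

```python
import json

data = {"question": "안녕하세요?"}  # non-ASCII (Korean) prompt text

# Default behavior: json.dumps escapes non-ASCII characters as \uXXXX
# sequences, so the LLM receives escape codes instead of readable text.
escaped = json.dumps(data)
# → '{"question": "\uc548\ub155\ud558\uc138\uc694?"}'

# With ensure_ascii=False the original characters are kept verbatim,
# so the prompt stays readable for the model.
readable = json.dumps(data, ensure_ascii=False)
# → '{"question": "안녕하세요?"}'
```

Note that both outputs are valid JSON; the difference is only in how non-ASCII characters are serialized, which matters because the LLM consumes the serialized string directly.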

@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Oct 14, 2024
@jjmachan jjmachan changed the base branch from main to v0.1 October 14, 2024 12:33
@dosubot dosubot bot added size:XXL This PR changes 1000+ lines, ignoring generated files. and removed size:XS This PR changes 0-9 lines, ignoring generated files. labels Oct 14, 2024
@jjmachan jjmachan changed the base branch from v0.1 to main October 14, 2024 12:34
@dosubot dosubot bot added size:XS This PR changes 0-9 lines, ignoring generated files. and removed size:XXL This PR changes 1000+ lines, ignoring generated files. labels Oct 14, 2024
@jjmachan
Member

hey @Youngrok123 thanks a lot for this fix - I will merge this in and release it with v0.1 as well

but with v0.2 the prompts live in https://docs.ragas.io/en/latest/references/prompt/#ragas.prompt.PydanticPrompt so I'll make the same change there too

@jjmachan jjmachan merged commit d74be96 into explodinggradients:main Oct 14, 2024
18 checks passed
jjmachan pushed a commit to jjmachan/ragas that referenced this pull request Oct 14, 2024
…racters within the prompt (explodinggradients#1490)

@jjmachan
Member

@Youngrok123 also created something for v0.2 prompts #1494

do let me know if there are any mistakes :)
