
Smoothen event source parser by feeding in smaller chunks #151

Merged: 4 commits, Feb 11, 2025
Conversation

@ankrgyl (Contributor) commented Feb 11, 2025

It seems like several things are made worse by streams with enormous chunks, including the event parser (straight-up CPU performance) and Node's ability to interleave stream segments when running many concurrent streams.

This change splits the stream on newlines, which seems to produce much better results.
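The approach described above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual implementation: `splitChunkOnNewlines` is a hypothetical helper that breaks one large chunk into line-sized pieces so a downstream event-source parser only ever sees small inputs.

```javascript
// Hypothetical sketch: split a large incoming chunk on newlines so the
// downstream SSE parser is fed small, line-sized pieces instead of one
// enormous chunk.
function* splitChunkOnNewlines(chunk) {
  let start = 0;
  let idx;
  while ((idx = chunk.indexOf("\n", start)) !== -1) {
    // Emit each line together with its trailing newline delimiter.
    yield chunk.slice(start, idx + 1);
    start = idx + 1;
  }
  if (start < chunk.length) {
    // Trailing partial line (no newline seen yet); the parser can
    // buffer it until the next chunk arrives.
    yield chunk.slice(start);
  }
}

// Usage (with a stand-in for the real parser's feed method):
// for (const piece of splitChunkOnNewlines(bigChunk)) parser.feed(piece);
```

Feeding smaller pieces also gives the event loop more opportunities to interleave work between concurrent streams, since each feed call does less work at once.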

vercel bot commented Feb 11, 2025

The latest updates on your projects:

Name: ai-proxy | Status: ✅ Ready | Updated (UTC): Feb 11, 2025 7:37am

@ankrgyl ankrgyl merged commit bc16819 into main Feb 11, 2025
4 checks passed
sachinpad added a commit that referenced this pull request Feb 11, 2025
ankrgyl added a commit that referenced this pull request Feb 12, 2025
Unreverts #151 and
fixes it. The issue is that when replaying the cache, we need to include a
newline for the first `0...n-1` chunks.
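That fix can be sketched as follows. This is a hedged illustration under one assumption not stated in the thread: that cached chunks were stored with their newline delimiters stripped, so every chunk except the last must have its newline restored on replay. `replayCachedChunks` is a hypothetical name.

```javascript
// Hypothetical sketch of the replay fix: re-append the newline delimiter
// to chunks 0...n-1 so the reassembled stream matches the original bytes.
// The final chunk is left as-is, since it may be a partial line.
function replayCachedChunks(chunks) {
  return chunks.map((chunk, i) =>
    i < chunks.length - 1 ? chunk + "\n" : chunk
  );
}
```

Without this, replaying the cache would concatenate the chunks with no delimiters, corrupting the event stream.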