Download attempts should be retried #18
I've had the situation recently where Cerner likes to give many, many tiny files as the bulk export result. Each file has no more than 200 lines and no more than one patient. (So some files are very short, like one line, if the patient doesn't have many resources.) In one recent example, the server wanted me to download over 20k files.

In this situation, the bulk-data-client starts tearing through those files as fast as it can (and by default that means 5 files at once). But that means the server quickly gets mad at it, giving 429 errors and the like.

To avoid the 429 errors, I've had to reduce the parallelDownloads setting to 1. But even then, I get the occasional HTTP hiccup (20k requests, after all), like a 502 gateway error. So my feature request is to retry download attempts in general, with some backoff for 429 results specifically.
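A minimal sketch of the requested behavior, assuming a Node/TypeScript environment with a global fetch; fetchWithRetry, maxRetries, and baseDelayMs are illustrative names, not part of the bulk-data-client's actual API:

```ts
// Sketch only: a generic retry-with-backoff wrapper, not the client's code.

function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Fetch a URL, retrying 429s, 5xx responses, and network errors with
// exponential backoff. A 429's Retry-After header (numeric seconds),
// when present, overrides the computed delay.
async function fetchWithRetry(
  url: string,
  maxRetries = 5,
  baseDelayMs = 1000
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    try {
      const res = await fetch(url);
      // Success, or a non-retryable client error: hand it back as-is.
      if (res.ok || (res.status !== 429 && res.status < 500)) return res;
      if (attempt >= maxRetries) return res;

      // Exponential backoff: 1s, 2s, 4s, ...
      let delayMs = baseDelayMs * 2 ** attempt;
      const retryAfter = res.headers.get("retry-after");
      if (res.status === 429 && retryAfter && !Number.isNaN(Number(retryAfter))) {
        delayMs = Number(retryAfter) * 1000;
      }
      await sleep(delayMs);
    } catch (err) {
      // Network-level hiccups (resets, DNS blips) are retried too.
      if (attempt >= maxRetries) throw err;
      await sleep(baseDelayMs * 2 ** attempt);
    }
  }
}
```

Honoring Retry-After on 429 keeps the client aligned with whatever pacing the server asks for, while the exponential backoff covers transient 5xx responses and network errors.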
Comments

I know this is heinous code, but this is what I've been using to get around this. In …

Downloads for DocRef attachments should also be retried in the same way, while we're here.
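If a wrapper like the one sketched above landed, the DocumentReference attachment path could presumably share it; a hypothetical call site (downloadAttachment and its signature are made up for illustration):

```ts
// Hypothetical reuse for DocumentReference attachment downloads;
// the function name and signature are illustrative, not the client's API.
async function downloadAttachment(url: string): Promise<ArrayBuffer> {
  const res = await fetchWithRetry(url);
  if (!res.ok) {
    throw new Error(`Attachment download failed with HTTP ${res.status}`);
  }
  return res.arrayBuffer();
}
```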