Rate limits and slow download speed #4

Open
RisingOrange opened this issue Jul 27, 2022 · 10 comments
@RisingOrange
Collaborator

RisingOrange commented Jul 27, 2022

When downloading from Google Drive, after some files had already been downloaded, a response with this message was returned:
"We're sorry but your computer or network may be sending automated queries. To protect our users, we can't process your request right now."

The download is also pretty slow (20 MB took about 3 minutes).

Resources:
https://developers.google.com/drive/api/guides/performance
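For the "automated queries" response specifically, the mitigation Google's performance guide recommends is retrying with exponential backoff. A minimal sketch of that pattern (not from the add-on; `fetch` is a stand-in for one download attempt, and `RuntimeError` stands in for whatever rate-limit error the real code would raise):

```python
import random
import time

def backoff_delays(max_retries=5, base=1.0, cap=64.0):
    """Yield exponentially growing delays with random jitter added."""
    for attempt in range(max_retries):
        yield min(cap, base * (2 ** attempt)) + random.uniform(0, base)

def download_with_backoff(fetch, max_retries=5, base=1.0):
    """Call `fetch` until it succeeds, sleeping between failed attempts."""
    last_error = None
    for delay in backoff_delays(max_retries, base=base):
        try:
            return fetch()
        except RuntimeError as err:  # stand-in for a rate-limit error
            last_error = err
            time.sleep(delay)
    raise last_error
```

Backoff spreads retries out over time, so a burst of per-file requests looks less like "automated queries", but it doesn't raise the underlying quota.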

@andrewsanchez
Collaborator

Maybe the "official" recommendation should be to just download the files locally and then pass the local directory to the importer, until the code is more robust? @AnKingMed

@AnKingMed
Collaborator

I'm fine with that for now

@BlueGreenMagick
Contributor

BlueGreenMagick commented Jul 27, 2022

If I understand this correctly, I think we need to use OAuth2 to download files. The rate limit should be enough for our purposes at 20,000 requests per 100 seconds (200 req/s shared across all users).

If using OAuth2 doesn't help, I'm not really sure what we can do to improve performance though. All three solutions (gzip, partial responses, batch requests) listed on the linked Google page can't be applied to downloading image files.

Apparently gzip doesn't do much for images, as they are already compressed.

Batch requests can't be used either:

> Note: Currently, Google Drive does not support batch operations for media, either for upload or download.
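For reference, an OAuth2-authorized media download against the Drive v3 API is just the `files.get` endpoint with `alt=media` plus a bearer token. A sketch of the request shape only — the access token would come from a completed OAuth2 flow, which the add-on doesn't do yet, and the function name is made up:

```python
def build_media_download_request(file_id: str, access_token: str):
    """Build the URL, query params, and headers for an authorized
    Drive v3 file-content download (alt=media)."""
    url = f"https://www.googleapis.com/drive/v3/files/{file_id}"
    params = {"alt": "media"}  # ask for the file's contents, not metadata
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, params, headers
```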

@RisingOrange
Collaborator Author

Maybe we could have a compressed folder of images and download that?
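The compressed-folder idea amounts to one archive download plus a local extraction step, instead of one Drive request per image. A sketch of the extraction side, assuming a plain zip archive — function and path names are illustrative, not the add-on's API:

```python
import zipfile
from pathlib import Path

def extract_media_archive(archive_path, media_dir):
    """Extract files from a zip archive flat into the media folder."""
    dest = Path(media_dir)
    dest.mkdir(parents=True, exist_ok=True)
    extracted = []
    with zipfile.ZipFile(archive_path) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue
            # flatten paths: Anki's media folder has no subdirectories
            target = dest / Path(info.filename).name
            target.write_bytes(zf.read(info))
            extracted.append(target.name)
    return sorted(extracted)
```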

@BlueGreenMagick
Contributor

I think one use case we have to consider is when new images are added to the collection. This could happen when new cards are added, or images replaced with better ones.

Right now the add-on is pretty efficient at downloading, say, 10 new images out of 1000 existing ones.

We could just prompt the user to download the folder via web browser if it detects more than 100 new images. That'd achieve the same thing with a couple more clicks, especially if the local folder import is made to handle zip files.
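The proposed fallback is a simple threshold check: per-file downloads for small updates, a manual bulk download past the cutoff. A sketch with made-up names — the threshold of 100 is the number floated above, not a tested value:

```python
BULK_DOWNLOAD_THRESHOLD = 100  # assumption: the "more than 100 images" cutoff

def choose_download_strategy(num_new_images):
    """Pick per-file downloads for small updates, prompt for a
    manual browser download of the whole folder otherwise."""
    if num_new_images > BULK_DOWNLOAD_THRESHOLD:
        return "prompt_browser_download"
    return "download_individually"
```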

@AnKingMed
Collaborator

That's a good idea for now

@andrewsanchez
Collaborator

Can the add-on just handle the download via requests or wget instead of having users do it manually? Sorry, I haven't looked at the code at all, so I have no idea how it works.
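Doing the download in-process is straightforward in principle; a hedged sketch with the standard library (requests or wget would look much the same). The `uc?export=download` URL is the direct-download form Drive serves for publicly shared files — note that larger files can hit Drive's "can't scan for viruses" confirmation page, which this sketch does not handle:

```python
import shutil
import urllib.parse
import urllib.request

def build_drive_download_url(file_id):
    """Direct-download URL form for a publicly shared Drive file."""
    query = urllib.parse.urlencode({"export": "download", "id": file_id})
    return f"https://drive.google.com/uc?{query}"

def download_drive_file(file_id, dest_path):
    """Stream a publicly shared Drive file to disk."""
    with urllib.request.urlopen(build_drive_download_url(file_id)) as resp, \
            open(dest_path, "wb") as fh:
        shutil.copyfileobj(resp, fh)  # chunked copy, not all in memory
    return dest_path
```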

@AnKingMed
Collaborator

In the future that'd be nice, but this isn't necessary for launch

@RisingOrange
Collaborator Author

> If I understand this correctly, I think we need to use OAuth2 to download files.

Does the current key have OAuth2 activated?
@BlueGreenMagick

@BlueGreenMagick
Contributor

No
