Out of memory error when using S3 as staging storage for Snowflake #5277
Comments
OS Version / Instance: Linux EC2 m5.2xlarge (doubled up)
@sherifnada any chance there's capacity for this one?
Hi @rparrapy. Many thanks for reporting this issue and for your interest in the product. Could you please provide a little more info about the environment where you migrated the data: how many streams/rows/records? How much RAM and heap memory were allocated to the container when it failed? Did you try allocating more memory? Many thanks in advance!
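On the heap question above: recent JVMs (JDK 10 and later) size the default maximum heap as a fraction of the memory visible to the process via `MaxRAMPercentage`, which defaults to 25%. So, assuming a t2.xlarge's 16 GiB and no explicit `-Xmx`, an un-tuned container would get roughly a 4 GiB heap. A minimal sketch of that arithmetic:

```python
def default_max_heap_gib(available_mem_gib: float, max_ram_percentage: float = 25.0) -> float:
    """Approximate the JVM's default max heap: MaxRAMPercentage
    (default 25%) of the memory visible to the process."""
    return available_mem_gib * max_ram_percentage / 100.0

# Assuming a t2.xlarge exposes 16 GiB to the container:
print(default_max_heap_gib(16.0))  # prints 4.0
```

This is only the default sizing rule; the actual limit depends on whatever `-Xmx` or container memory settings Airbyte applied.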
Just a hint for those who will work on it: as an option, you may try adding a LoadTest to the DestinationAcceptanceTest.
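As a rough illustration of such a load test (a sketch in Python rather than the connector's Java; `synthetic_records`, `run_load_test`, and `destination_write` are hypothetical names, not Airbyte APIs): generate the record stream lazily so that the test driver itself stays memory-bounded while pushing millions of rows at the destination.

```python
import json

def synthetic_records(n, payload_bytes=1_000):
    """Yield n JSON-serialized records lazily, so the driver itself
    never holds the full dataset in memory."""
    filler = "x" * payload_bytes
    for i in range(n):
        yield json.dumps({"id": i, "name": f"user_{i}", "payload": filler})

def run_load_test(destination_write, n=1_000_000):
    """Feed records one at a time into a destination's write callable
    and return the total number of bytes pushed."""
    total = 0
    for record in synthetic_records(n):
        destination_write(record)
        total += len(record)
    return total
```

A run like `run_load_test(my_write_fn, n=60_000_000)` would approximate the 60M-record table mentioned later in this thread.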
Hi @etsybaev! I believe this issue is related to #3439. As for RAM and heap memory, all I know is that I used a t2.xlarge with the default Airbyte settings. I'm not sure what the defaults for RAM and heap memory are.
Another Slack conversation with the same problem.
@marcosmarxm, I published a new version of the Snowflake connector (0.3.16) in which this bug was fixed. Please try running your connector again and make sure the bug is not reproduced.
🚀 🚀 🚀 🚀 @andriikorotkov looking good so far. my 60M records table, which previously would hang, has successfully loaded using CDC + S3 staging. |
Update: I have now tried it with multiple tables and am unfortunately back at the issue where the sync hangs. Reset the entire connector, ~250M rows 👎
@danieldiamond could you possibly provide the latest connector log?
It looks to me like the initial root cause of the OOM was fixed in the scope of this ticket, but @danieldiamond hit another issue, which is described in #6003.
SGTM, the OOM sounds addressed; let's follow up on the CPU issue in the other thread. But let's make sure to move Daniel's comments to that thread as well to retain the context.
Environment
Current Behavior
Enabling AWS S3 staging for the Snowflake destination results in out-of-memory errors. Standard inserts work, but a 12 GB sync takes 3+ hours.
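A common cause of this failure mode (an assumption here, not confirmed by the eventual fix) is that the staging path accumulates an entire sync's worth of records in memory before uploading them to S3. A minimal sketch of the bounded-memory alternative, where `upload_part` is a hypothetical stand-in for an S3 multipart-upload call:

```python
import io

CHUNK_LIMIT = 5 * 1024 * 1024  # flush roughly every 5 MiB instead of buffering the whole sync

def stage_records(records, upload_part):
    """Buffer records only up to CHUNK_LIMIT bytes, then hand the chunk
    to upload_part (a stand-in for an S3 multipart-upload call) and
    reset the buffer, so peak memory stays bounded."""
    buf = io.BytesIO()
    parts = 0
    for record in records:
        buf.write(record.encode("utf-8"))
        buf.write(b"\n")
        if buf.tell() >= CHUNK_LIMIT:
            upload_part(buf.getvalue())
            parts += 1
            buf = io.BytesIO()
    if buf.tell():  # flush any trailing partial chunk
        upload_part(buf.getvalue())
        parts += 1
    return parts
```

With a fixed chunk limit, peak memory stays near CHUNK_LIMIT regardless of sync size, so a 12 GB sync does not need 12 GB of heap.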
Expected Behavior
MySQL to Snowflake sync with S3 as staging storage works and is faster than standard inserts.
Logs
logs-sugar-s3.txt
Steps to Reproduce
Are you willing to submit a PR?
No