🐛 AWS S3 Staging COPY is writing records from different table in the same raw table #6949
Conversation
…s3-copy-fix
Conflicts:
- docs/integrations/destinations/redshift.md
- docs/integrations/destinations/snowflake.md
/test connector=connectors/destination-snowflake
/test connector=connectors/destination-redshift
Looks good.
I have two minor comments:
- You can extract the suffix length as a constant at the beginning of the class for easy tracking.
- Would it be more straightforward to use an incrementing integer as the file suffix instead of random characters? That seems easier to debug.
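The two suggestions above can be sketched side by side. This is a hypothetical illustration, not the connector's actual code: the class and method names (StagingFileSuffix, randomSuffix, nextSuffix) and the constant value are made up for the example.

```java
import java.util.UUID;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of the two suffix strategies discussed in the review.
public class StagingFileSuffix {

    // First suggestion: extract the suffix length as a constant at the top
    // of the class so it is easy to track.
    public static final int SUFFIX_LENGTH = 5;

    private static final AtomicInteger COUNTER = new AtomicInteger(0);

    // Random-character variant: collision probability shrinks as the
    // suffix gets longer, but the names are opaque when debugging.
    public static String randomSuffix() {
        return UUID.randomUUID().toString().replace("-", "").substring(0, SUFFIX_LENGTH);
    }

    // Second suggestion: an incrementing integer, zero-padded to the same
    // length. Deterministic and easier to trace in logs.
    public static String nextSuffix() {
        return String.format("%0" + SUFFIX_LENGTH + "d", COUNTER.getAndIncrement());
    }
}
```

Note the trade-off: the incrementing counter only prevents collisions within one process, while random suffixes also guard against two independent writers choosing the same name.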
…s3-copy-fix
Conflicts:
- airbyte-integrations/connectors/destination-jdbc/src/main/java/io/airbyte/integrations/destination/jdbc/copy/s3/S3StreamCopier.java
- airbyte-integrations/connectors/destination-redshift/src/main/java/io/airbyte/integrations/destination/redshift/RedshiftStreamCopier.java
/publish connector=connectors/destination-snowflake
/publish connector=connectors/destination-redshift
/publish connector=connectors/destination-redshift
/test connector=connectors/destination-redshift
/publish connector=connectors/destination-redshift
…same raw table (airbytehq#6949)
- updated jdbc destination
- updated snowflake and redshift destination version
- updated documentation
- updated prefix length for snowflake and redshift streams
- fixed remarks
- updated new redshift version
What
AWS S3 Staging COPY is writing records from different table in the same raw table
How
The latest changes made for destination-jdbc work well, but the destination-snowflake and destination-redshift versions had not been updated accordingly. I have increased the prefix length for S3 and GCS staging files to avoid name collisions when a large number of files is written, and bumped the destination-snowflake and destination-redshift connector versions.
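The fix described above hinges on longer random prefixes making staging-file name collisions between streams vanishingly unlikely. The PR's actual S3StreamCopier change is not reproduced here; the following is a rough sketch under that assumption, with made-up names (StagingFileName, randomPrefix, stagingFileName):

```java
import java.util.Random;

// Hypothetical sketch: each staging file gets a random alphanumeric
// prefix, and the prefix length controls the collision probability.
public class StagingFileName {

    private static final String ALPHANUMERIC = "abcdefghijklmnopqrstuvwxyz0123456789";

    // A longer prefix shrinks the chance that two streams pick the same
    // name: with k files and prefix length n, the birthday bound gives
    // roughly k^2 / (2 * 36^n) probability of any collision.
    public static String randomPrefix(int length, Random rng) {
        StringBuilder sb = new StringBuilder(length);
        for (int i = 0; i < length; i++) {
            sb.append(ALPHANUMERIC.charAt(rng.nextInt(ALPHANUMERIC.length())));
        }
        return sb.toString();
    }

    // Embedding the stream name as well keeps files from different tables
    // distinguishable even if a prefix were ever reused.
    public static String stagingFileName(String streamName, int prefixLength, Random rng) {
        return randomPrefix(prefixLength, rng) + "_" + streamName + ".csv";
    }
}
```

For example, at prefix length 3 a sync writing a few thousand files has a realistic chance of a repeat (36³ ≈ 46k names), while a longer prefix pushes that probability to effectively zero.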
Pre-merge Checklist
Expand the relevant checklist and delete the others.

New Connector

Community member or Airbyter
- Secrets in the connector's spec are annotated with airbyte_secret
- Integration tests pass via ./gradlew :airbyte-integrations:connectors:<name>:integrationTest
- Documentation updated:
  - README.md
  - bootstrap.md. See description and examples
  - docs/SUMMARY.md
  - docs/integrations/<source or destination>/<name>.md, including changelog. See changelog example
  - docs/integrations/README.md
  - airbyte-integrations/builds.md

Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- The /test connector=connectors/<name> command is passing.
- The /publish command described here has been run.
Updating a connector

Community member or Airbyter
- Secrets in the connector's spec are annotated with airbyte_secret
- Integration tests pass via ./gradlew :airbyte-integrations:connectors:<name>:integrationTest
- Documentation updated:
  - README.md
  - bootstrap.md. See description and examples
  - docs/integrations/<source or destination>/<name>.md, including changelog. See changelog example

Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- The /test connector=connectors/<name> command is passing.
- The /publish command described here has been run.

Connector Generator
- The generator test modules (all connectors with -scaffold in their name) have been updated with the latest scaffold by running ./gradlew :airbyte-integrations:connector-templates:generator:testScaffoldTemplates, then checking in your changes.