🎉 Destination S3: parquet output (#3908)
* Add skeleton code for parquet writer

* Refactor s3 destination code

* Add parquet to spec

* Complete parquet writer

* Change testing data from int to double

* Add acceptance test for parquet writer

* Handle special schema field names

* Format code

* Add parquet config

* Add documentation

* Add unit tests

* Fix typo

* Update document

* Bump version

* Fix date format

* Fix credential filename

* Update doc

* Update test and publish commands

* Refactor s3 format config

* Append compression codec file extension

* Update doc

* Remove compression codec file extension

* Add comments

* Add README, CHANGELOG, and sample configs

* Move changelog

* Use switch statement

* Move filename helper method to base writer

* Rename converter

* Separate test cases

* Drop union type length restriction

* Support array with multiple types

* Move comments to connector doc

* Share config between acceptance tests

* Add doc about additional properties

* Move shared code out of if branch

* Add doc about adding a new format

* Format code

* Bump version to 0.1.4

* Fix default max padding size
tuliren authored Jun 14, 2021
1 parent a721429 commit 87552b2
Showing 41 changed files with 2,385 additions and 436 deletions.
@@ -2,6 +2,6 @@
"destinationDefinitionId": "4816b78f-1489-44c1-9060-4b19d5fa9362",
"name": "S3",
"dockerRepository": "airbyte/destination-s3",
"dockerImageTag": "0.1.3",
"dockerImageTag": "0.1.5",
"documentationUrl": "https://docs.airbyte.io/integrations/destinations/s3"
}
@@ -27,7 +27,7 @@
- destinationDefinitionId: 4816b78f-1489-44c1-9060-4b19d5fa9362
name: S3
dockerRepository: airbyte/destination-s3
- dockerImageTag: 0.1.3
+ dockerImageTag: 0.1.5
documentationUrl: https://docs.airbyte.io/integrations/destinations/s3
- destinationDefinitionId: f7a7d195-377f-cf5b-70a5-be6b819019dc
name: Redshift
@@ -370,8 +370,8 @@ public void testSecondSync() throws Exception {
.put("id", 1)
.put("currency", "USD")
.put("date", "2020-03-31T00:00:00Z")
.put("HKD", 10)
.put("NZD", 700)
.put("HKD", 10.0)
.put("NZD", 700.0)
.build()))),
new AirbyteMessage()
.withType(Type.STATE)
@@ -403,8 +403,8 @@ public void testLineBreakCharacters() throws Exception {
.put("id", 1)
.put("currency", "USD\u2028")
.put("date", "2020-03-\n31T00:00:00Z\r")
.put("HKD", 10)
.put("NZD", 700)
.put("HKD", 10.0)
.put("NZD", 700.0)
.build()))),
new AirbyteMessage()
.withType(Type.STATE)
@@ -459,8 +459,8 @@ public void testIncrementalSync() throws Exception {
.put("id", 1)
.put("currency", "USD")
.put("date", "2020-03-31T00:00:00Z")
.put("HKD", 10)
.put("NZD", 700)
.put("HKD", 10.0)
.put("NZD", 700.0)
.build()))),
new AirbyteMessage()
.withType(Type.STATE)
2 changes: 1 addition & 1 deletion airbyte-integrations/connectors/destination-s3/Dockerfile
@@ -7,5 +7,5 @@ COPY build/distributions/${APPLICATION}*.tar ${APPLICATION}.tar

RUN tar xf ${APPLICATION}.tar --strip-components=1

- LABEL io.airbyte.version=0.1.3
+ LABEL io.airbyte.version=0.1.5
LABEL io.airbyte.name=airbyte/destination-s3
27 changes: 27 additions & 0 deletions airbyte-integrations/connectors/destination-s3/README.md
@@ -0,0 +1,27 @@
# S3 Test Configuration

In order to test the S3 destination, you need an AWS account (or an alternative S3-compatible account).

## Community Contributor

As a community contributor, you will need access to AWS to run the integration tests.

- Create an S3 bucket for testing.
- Get an `access_key_id` and `secret_access_key` that can read from and write to the above bucket.
- Paste the bucket and key information into the config files under [`./sample_secrets`](./sample_secrets).
- Rename the directory from `sample_secrets` to `secrets`.
- Feel free to test different format settings by modifying `getFormatConfig` in the acceptance test files (e.g. `S3CsvDestinationAcceptanceTest.java`), as long as they follow the schema defined in [spec.json](src/main/resources/spec.json).

## Airbyte Employee

- Access the `destination s3 * creds` secrets on LastPass, where `*` stands for the output file format.
- Replace the `config.json` under `sample_secrets`.
- Rename the directory from `sample_secrets` to `secrets`.

## Add New Output Format
- Add a new enum value in `S3Format`.
- Modify `spec.json` to specify the configuration of this new format.
- Update `S3FormatConfigs` to be able to construct a config for this new format.
- Create a new package under `io.airbyte.integrations.destination.s3`.
- Implement a new `S3Writer`. The implementation can extend `BaseS3Writer` (see the sketch after this list).
- Write an acceptance test for the new output format. The test can extend `S3DestinationAcceptanceTest`.
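A minimal sketch of such a writer, for illustration only. It assumes the `S3Writer` lifecycle visible in `S3Consumer` below (`initialize()`, `write(UUID, AirbyteRecordMessage)`, `close(boolean)`); the package, class name, and in-memory buffering are hypothetical and not part of this commit — a real implementation would extend `BaseS3Writer` and stream parts to S3.

package io.airbyte.integrations.destination.s3.jsonl;

import io.airbyte.commons.json.Jsons;
import io.airbyte.integrations.destination.s3.writer.S3Writer;
import io.airbyte.protocol.models.AirbyteRecordMessage;
import java.util.UUID;

// Hypothetical JSON Lines writer used only to illustrate the S3Writer lifecycle.
public class S3JsonlWriter implements S3Writer {

  private final StringBuilder buffer = new StringBuilder();

  @Override
  public void initialize() {
    // Open the upload here (e.g. start a multipart upload via a StreamTransferManager).
  }

  @Override
  public void write(UUID id, AirbyteRecordMessage recordMessage) {
    // One record per line; a real writer would flush parts to S3 instead of buffering in memory.
    buffer.append(Jsons.serialize(recordMessage.getData())).append('\n');
  }

  @Override
  public void close(boolean hasFailed) {
    // Complete the upload, or abort it when hasFailed is true.
  }

}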
8 changes: 8 additions & 0 deletions airbyte-integrations/connectors/destination-s3/build.gradle
@@ -15,10 +15,18 @@ dependencies {
implementation project(':airbyte-integrations:connectors:destination-jdbc')
implementation files(project(':airbyte-integrations:bases:base-java').airbyteDocker.outputs)

// csv
implementation 'com.amazonaws:aws-java-sdk-s3:1.11.978'
implementation 'org.apache.commons:commons-csv:1.4'
implementation 'com.github.alexmojaki:s3-stream-upload:2.2.2'

// parquet
implementation group: 'org.apache.hadoop', name: 'hadoop-common', version: '3.3.0'
implementation group: 'org.apache.hadoop', name: 'hadoop-aws', version: '3.3.0'
implementation group: 'org.apache.hadoop', name: 'hadoop-mapreduce-client-core', version: '3.3.0'
implementation group: 'org.apache.parquet', name: 'parquet-avro', version: '1.12.0'
implementation group: 'tech.allegro.schema.json2avro', name: 'converter', version: '0.2.10'

testImplementation 'org.apache.commons:commons-lang3:3.11'

integrationTestJavaImplementation project(':airbyte-integrations:bases:standard-destination-test')
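The new dependencies cover the Parquet write path: json2avro converts each JSON record into an Avro `GenericData.Record`, parquet-avro writes those records as Parquet, and hadoop-common/hadoop-aws supply the filesystem layer (including `s3a://` URIs). A rough sketch of how the libraries fit together — the schema, record, output path, and codec below are placeholders for illustration, not the connector's actual wiring:

import java.nio.charset.StandardCharsets;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;
import tech.allegro.schema.json2avro.converter.JsonAvroConverter;

public class ParquetWritePathSketch {

  public static void main(String[] args) throws Exception {
    // Placeholder schema and record; the connector derives these from the stream's JSON schema.
    Schema schema = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"example\",\"fields\":[{\"name\":\"id\",\"type\":\"int\"}]}");
    GenericData.Record record = new JsonAvroConverter()
        .convertToGenericDataRecord("{\"id\": 1}".getBytes(StandardCharsets.UTF_8), schema);

    // A local path keeps the sketch self-contained; an "s3a://bucket/key" path would also work
    // once AWS credentials are set on the Hadoop Configuration.
    try (ParquetWriter<GenericData.Record> writer = AvroParquetWriter
        .<GenericData.Record>builder(new Path("/tmp/example.parquet"))
        .withSchema(schema)
        .withConf(new Configuration())
        .withCompressionCodec(CompressionCodecName.UNCOMPRESSED)
        .build()) {
      writer.write(record);
    }
  }

}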
@@ -0,0 +1,7 @@
{
"s3_bucket_name": "paste-bucket-name-here",
"s3_bucket_path": "integration-test",
"s3_bucket_region": "paste-bucket-region-here",
"access_key_id": "paste-access-key-id-here",
"secret_access_key": "paste-secret-access-key-here"
}
@@ -34,6 +34,8 @@
import io.airbyte.commons.json.Jsons;
import io.airbyte.integrations.base.AirbyteStreamNameNamespacePair;
import io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer;
import io.airbyte.integrations.destination.s3.writer.S3Writer;
import io.airbyte.integrations.destination.s3.writer.S3WriterFactory;
import io.airbyte.protocol.models.AirbyteMessage;
import io.airbyte.protocol.models.AirbyteMessage.Type;
import io.airbyte.protocol.models.AirbyteRecordMessage;
@@ -50,21 +52,21 @@ public class S3Consumer extends FailureTrackingAirbyteMessageConsumer {

private final S3DestinationConfig s3DestinationConfig;
private final ConfiguredAirbyteCatalog configuredCatalog;
- private final S3OutputFormatterFactory formatterFactory;
+ private final S3WriterFactory writerFactory;
private final Consumer<AirbyteMessage> outputRecordCollector;
- private final Map<AirbyteStreamNameNamespacePair, S3OutputFormatter> streamNameAndNamespaceToFormatters;
+ private final Map<AirbyteStreamNameNamespacePair, S3Writer> streamNameAndNamespaceToWriters;

private AirbyteMessage lastStateMessage = null;

public S3Consumer(S3DestinationConfig s3DestinationConfig,
ConfiguredAirbyteCatalog configuredCatalog,
- S3OutputFormatterFactory formatterFactory,
+ S3WriterFactory writerFactory,
Consumer<AirbyteMessage> outputRecordCollector) {
this.s3DestinationConfig = s3DestinationConfig;
this.configuredCatalog = configuredCatalog;
- this.formatterFactory = formatterFactory;
+ this.writerFactory = writerFactory;
this.outputRecordCollector = outputRecordCollector;
- this.streamNameAndNamespaceToFormatters = new HashMap<>(configuredCatalog.getStreams().size());
+ this.streamNameAndNamespaceToWriters = new HashMap<>(configuredCatalog.getStreams().size());
}

@Override
@@ -97,14 +99,14 @@ protected void startTracked() throws Exception {
Timestamp uploadTimestamp = new Timestamp(System.currentTimeMillis());

for (ConfiguredAirbyteStream configuredStream : configuredCatalog.getStreams()) {
- S3OutputFormatter formatter = formatterFactory
+ S3Writer writer = writerFactory
.create(s3DestinationConfig, s3Client, configuredStream, uploadTimestamp);
- formatter.initialize();
+ writer.initialize();

AirbyteStream stream = configuredStream.getStream();
AirbyteStreamNameNamespacePair streamNamePair = AirbyteStreamNameNamespacePair
.fromAirbyteSteam(stream);
- streamNameAndNamespaceToFormatters.put(streamNamePair, formatter);
+ streamNameAndNamespaceToWriters.put(streamNamePair, writer);
}
}

@@ -121,20 +123,20 @@ protected void acceptTracked(AirbyteMessage airbyteMessage) throws Exception {
AirbyteStreamNameNamespacePair pair = AirbyteStreamNameNamespacePair
.fromRecordMessage(recordMessage);

- if (!streamNameAndNamespaceToFormatters.containsKey(pair)) {
+ if (!streamNameAndNamespaceToWriters.containsKey(pair)) {
throw new IllegalArgumentException(
String.format(
"Message contained record from a stream that was not in the catalog. \ncatalog: %s , \nmessage: %s",
Jsons.serialize(configuredCatalog), Jsons.serialize(recordMessage)));
}

UUID id = UUID.randomUUID();
- streamNameAndNamespaceToFormatters.get(pair).write(id, recordMessage);
+ streamNameAndNamespaceToWriters.get(pair).write(id, recordMessage);
}

@Override
protected void close(boolean hasFailed) throws Exception {
- for (S3OutputFormatter handler : streamNameAndNamespaceToFormatters.values()) {
+ for (S3Writer handler : streamNameAndNamespaceToWriters.values()) {
handler.close(hasFailed);
}
// S3 stream uploader is all or nothing if a failure happens in the destination.
@@ -31,6 +31,8 @@
import io.airbyte.integrations.base.IntegrationRunner;
import io.airbyte.integrations.destination.jdbc.copy.s3.S3Config;
import io.airbyte.integrations.destination.jdbc.copy.s3.S3StreamCopier;
import io.airbyte.integrations.destination.s3.writer.ProductionWriterFactory;
import io.airbyte.integrations.destination.s3.writer.S3WriterFactory;
import io.airbyte.protocol.models.AirbyteConnectionStatus;
import io.airbyte.protocol.models.AirbyteConnectionStatus.Status;
import io.airbyte.protocol.models.AirbyteMessage;
@@ -65,7 +67,7 @@ public AirbyteConnectionStatus check(JsonNode config) {
public AirbyteMessageConsumer getConsumer(JsonNode config,
ConfiguredAirbyteCatalog configuredCatalog,
Consumer<AirbyteMessage> outputRecordCollector) {
- S3OutputFormatterFactory formatterFactory = new S3OutputFormatterProductionFactory();
+ S3WriterFactory formatterFactory = new ProductionWriterFactory();
return new S3Consumer(S3DestinationConfig.getS3DestinationConfig(config), configuredCatalog, formatterFactory, outputRecordCollector);
}

@@ -24,18 +24,12 @@

package io.airbyte.integrations.destination.s3;

import io.airbyte.integrations.destination.ExtendedNameTransformer;

public final class S3DestinationConstants {

- // These parameters are used by {@link StreamTransferManager}.
- // See this doc about how they affect memory usage:
- // https://alexmojaki.github.io/s3-stream-upload/javadoc/apidocs/alex/mojaki/s3upload/StreamTransferManager.html
- // Total memory = (numUploadThreads + queueCapacity) * partSize + numStreams * (partSize + 6MB)
- // = 31 MB at current configurations
- public static final int DEFAULT_UPLOAD_THREADS = 2;
- public static final int DEFAULT_QUEUE_CAPACITY = 2;
- public static final int DEFAULT_PART_SIZE_MD = 5;
- public static final int DEFAULT_NUM_STREAMS = 1;
public static final String YYYY_MM_DD_FORMAT_STRING = "yyyy_MM_dd";
public static final ExtendedNameTransformer NAME_TRANSFORMER = new ExtendedNameTransformer();

private S3DestinationConstants() {}

@@ -25,5 +25,18 @@
package io.airbyte.integrations.destination.s3;

public enum S3Format {
- CSV

CSV("csv"),
PARQUET("parquet");

private final String fileExtension;

S3Format(String fileExtension) {
this.fileExtension = fileExtension;
}

public String getFileExtension() {
return fileExtension;
}

}
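The new `getFileExtension()` accessor lets writers derive object names from the format instead of hard-coding suffixes. A hypothetical illustration of how a key might be assembled from the shared date-format constant and the extension — the actual naming scheme lives in the base writer and is not shown in this diff:

import io.airbyte.integrations.destination.s3.S3Format;
import java.sql.Timestamp;
import java.text.SimpleDateFormat;

public class FileExtensionExample {

  public static void main(String[] args) {
    // "yyyy_MM_dd" mirrors S3DestinationConstants.YYYY_MM_DD_FORMAT_STRING.
    Timestamp uploadTimestamp = new Timestamp(System.currentTimeMillis());
    String datePart = new SimpleDateFormat("yyyy_MM_dd").format(uploadTimestamp);

    // Hypothetical key layout; only the extension lookup comes from this commit.
    String key = "integration-test/exchange_rate_" + datePart + "." + S3Format.PARQUET.getFileExtension();
    System.out.println(key); // e.g. integration-test/exchange_rate_2021_06_14.parquet
  }

}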
@@ -24,8 +24,34 @@

package io.airbyte.integrations.destination.s3;

import com.fasterxml.jackson.databind.JsonNode;

public interface S3FormatConfig {

S3Format getFormat();

static String withDefault(JsonNode config, String property, String defaultValue) {
JsonNode value = config.get(property);
if (value == null || value.isNull()) {
return defaultValue;
}
return value.asText();
}

static int withDefault(JsonNode config, String property, int defaultValue) {
JsonNode value = config.get(property);
if (value == null || value.isNull()) {
return defaultValue;
}
return value.asInt();
}

static boolean withDefault(JsonNode config, String property, boolean defaultValue) {
JsonNode value = config.get(property);
if (value == null || value.isNull()) {
return defaultValue;
}
return value.asBoolean();
}

}
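The static `withDefault` overloads give format configs a uniform way to read optional spec properties with fallbacks. A short usage sketch — the property names and default values here are invented for illustration and are not the actual Parquet options:

import com.fasterxml.jackson.databind.JsonNode;
import io.airbyte.commons.json.Jsons;
import io.airbyte.integrations.destination.s3.S3FormatConfig;

public class WithDefaultExample {

  public static void main(String[] args) {
    // Hypothetical format section; only "format_type" appears in this diff.
    JsonNode formatConfig = Jsons.deserialize(
        "{\"format_type\": \"Parquet\", \"compression_codec\": \"GZIP\"}");

    // Present property: the configured value wins.
    String codec = S3FormatConfig.withDefault(formatConfig, "compression_codec", "UNCOMPRESSED");
    // Absent properties: fall back to the supplied defaults.
    int blockSizeMb = S3FormatConfig.withDefault(formatConfig, "block_size_mb", 128);
    boolean dictionaryEncoding = S3FormatConfig.withDefault(formatConfig, "dictionary_encoding", true);

    System.out.println(codec + ", " + blockSizeMb + ", " + dictionaryEncoding);
  }

}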
@@ -27,20 +27,30 @@
import com.fasterxml.jackson.databind.JsonNode;
import io.airbyte.commons.json.Jsons;
import io.airbyte.integrations.destination.s3.csv.S3CsvFormatConfig;
- import io.airbyte.integrations.destination.s3.csv.S3CsvFormatConfig.Flattening;
import io.airbyte.integrations.destination.s3.parquet.S3ParquetFormatConfig;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class S3FormatConfigs {

protected static final Logger LOGGER = LoggerFactory.getLogger(S3FormatConfigs.class);

public static S3FormatConfig getS3FormatConfig(JsonNode config) {
JsonNode formatConfig = config.get("format");
S3Format formatType = S3Format.valueOf(formatConfig.get("format_type").asText());
LOGGER.info("S3 format config: {}", formatConfig.toString());
S3Format formatType = S3Format.valueOf(formatConfig.get("format_type").asText().toUpperCase());

- if (formatType == S3Format.CSV) {
- Flattening flattening = Flattening.fromValue(formatConfig.get("flattening").asText());
- return new S3CsvFormatConfig(flattening);
switch (formatType) {
case CSV -> {
return new S3CsvFormatConfig(formatConfig);
}
case PARQUET -> {
return new S3ParquetFormatConfig(formatConfig);
}
default -> {
throw new RuntimeException("Unexpected output format: " + Jsons.serialize(config));
}
}

- throw new RuntimeException("Unexpected output format: " + Jsons.serialize(config));
}

}
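Since the parser upper-cases `format_type` before calling `S3Format.valueOf`, the spec value is effectively case-insensitive. A small sketch of the config shape it expects, assuming the Parquet format config can fall back to defaults for every other property:

import com.fasterxml.jackson.databind.JsonNode;
import io.airbyte.commons.json.Jsons;
import io.airbyte.integrations.destination.s3.S3FormatConfig;
import io.airbyte.integrations.destination.s3.S3FormatConfigs;

public class GetFormatConfigExample {

  public static void main(String[] args) {
    // "parquet", "Parquet", and "PARQUET" all resolve to S3Format.PARQUET.
    JsonNode config = Jsons.deserialize("{\"format\": {\"format_type\": \"parquet\"}}");
    S3FormatConfig formatConfig = S3FormatConfigs.getS3FormatConfig(config);
    System.out.println(formatConfig.getFormat()); // PARQUET
  }

}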
@@ -0,0 +1,39 @@
/*
* MIT License
*
* Copyright (c) 2020 Airbyte
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/

package io.airbyte.integrations.destination.s3.csv;

public class S3CsvConstants {

// These parameters are used by {@link StreamTransferManager}.
// See this doc about how they affect memory usage:
// https://alexmojaki.github.io/s3-stream-upload/javadoc/apidocs/alex/mojaki/s3upload/StreamTransferManager.html
// Total memory = (numUploadThreads + queueCapacity) * partSize + numStreams * (partSize + 6MB)
// = 31 MB at current configurations
public static final int DEFAULT_UPLOAD_THREADS = 2;
public static final int DEFAULT_QUEUE_CAPACITY = 2;
public static final int DEFAULT_PART_SIZE_MB = 5;
public static final int DEFAULT_NUM_STREAMS = 1;

}
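Plugging the defaults into the formula in the comment confirms the 31 MB figure: (2 upload threads + 2 queue slots) × 5 MB + 1 stream × (5 MB + 6 MB) = 20 MB + 11 MB = 31 MB.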
