✨ Source Braze : Migrate to Manifest-only (#47329)
Co-authored-by: Octavia Squidington III <[email protected]>
Co-authored-by: Danylo Jablonski <[email protected]>
Co-authored-by: Natik Gadzhi <[email protected]>
Co-authored-by: ChristoGrab <[email protected]>
5 people authored Jan 29, 2025
1 parent f9de659 commit ae23d5a
Showing 44 changed files with 3,584 additions and 1,233 deletions.
6 changes: 0 additions & 6 deletions airbyte-integrations/connectors/source-braze/.dockerignore

This file was deleted.

38 changes: 0 additions & 38 deletions airbyte-integrations/connectors/source-braze/Dockerfile

This file was deleted.

73 changes: 30 additions & 43 deletions airbyte-integrations/connectors/source-braze/README.md
@@ -1,78 +1,65 @@
# Braze Source
# Braze source connector

This is the repository for the Braze configuration based source connector.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/braze).
This directory contains the manifest-only connector for `source-braze`.
This _manifest-only_ connector is not a Python package on its own, as it runs inside of the base `source-declarative-manifest` image.

## Local development
For information about how to configure and use this connector within Airbyte, see [the connector's full documentation](https://docs.airbyte.com/integrations/sources/braze).

#### Create credentials
## Local development

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/braze)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_braze/spec.yaml` file.
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.
We recommend using the Connector Builder to edit this connector.
Using either Airbyte Cloud or your local Airbyte OSS instance, navigate to the **Builder** tab and select **Import a YAML**.
Then select the connector's `manifest.yaml` file to load the connector into the Builder. You're now ready to make changes to the connector!

**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source braze test creds`
and place them into `secrets/config.json`.
If you prefer to develop locally, you can follow the instructions below.

### Locally running the connector docker image
### Building the docker image

#### Build
You can build any manifest-only connector with `airbyte-ci`:

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**
1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md)
2. Run the following command to build the docker image:

```bash
airbyte-ci connectors --name=source-braze build
```

An image will be built with the tag `airbyte/source-braze:dev`.
An image will be available on your host with the tag `airbyte/source-braze:dev`.

**Via `docker build`:**
### Creating credentials

```bash
docker build -t airbyte/source-braze:dev .
```
**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/braze)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `spec` object in the connector's `manifest.yaml` file.
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
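For illustration, a `secrets/config.json` might look like the sketch below. The field names here are assumptions for the purposes of the example; the authoritative field list is the `spec` object in the connector's `manifest.yaml`.

```json
{
  "url": "https://rest.iad-01.braze.com",
  "api_key": "<your-braze-rest-api-key>",
  "start_date": "2023-01-01"
}
```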

#### Run
### Running as a docker container

Then run any of the connector commands as follows:
Then run any of the standard source connector commands:

```
```bash
docker run --rm airbyte/source-braze:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-braze:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-braze:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-braze:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing
### Running the CI test suite

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):

```bash
airbyte-ci connectors --name=source-braze test
```

### Customizing acceptance Tests

Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:

- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
- required for the testing need to go to `TEST_REQUIREMENTS` list

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
## Publishing a new version of the connector

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-braze test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
4. Make the connector documentation and its changelog is up to date (`docs/integrations/sources/braze.md`).
If you want to contribute changes to `source-braze`, here's how you can do that:
1. Make your changes locally, or load the connector's manifest into Connector Builder and make changes there.
2. Make sure your changes are passing our test suite with `airbyte-ci connectors --name=source-braze test`
3. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):
   - bump the `dockerImageTag` value in `metadata.yaml`
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/braze.md`).
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
3 changes: 0 additions & 3 deletions airbyte-integrations/connectors/source-braze/__init__.py

This file was deleted.

@@ -4,7 +4,7 @@ connector_image: airbyte/source-braze:dev
acceptance_tests:
spec:
tests:
- spec_path: "source_braze/spec.yaml"
- spec_path: "manifest.yaml"
connection:
tests:
- config_path: "secrets/config.json"
111 changes: 111 additions & 0 deletions airbyte-integrations/connectors/source-braze/components.py
@@ -0,0 +1,111 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#

import datetime
import operator
from dataclasses import dataclass, field
from typing import Any, Iterable, Mapping, MutableMapping, Optional

import dpath
import requests

from airbyte_cdk.sources.declarative.extractors.dpath_extractor import DpathExtractor
from airbyte_cdk.sources.declarative.incremental import DatetimeBasedCursor
from airbyte_cdk.sources.declarative.interpolation import InterpolatedString
from airbyte_cdk.sources.declarative.requesters import RequestOption
from airbyte_cdk.sources.declarative.requesters.request_option import RequestOptionType
from airbyte_cdk.sources.declarative.transformations import AddFields
from airbyte_cdk.sources.types import Config, Record, StreamSlice, StreamState


@dataclass
class TransformToRecordComponent(AddFields):
    def transform(
        self,
        record: Record,
        config: Optional[Config] = None,
        stream_state: Optional[StreamState] = None,
        stream_slice: Optional[StreamSlice] = None,
    ) -> Record:
        """
        Transforms an incoming string into a dictionary record.
        """
        _record = {}
        kwargs = {"record": record, "stream_state": stream_state, "stream_slice": stream_slice}
        for parsed_field in self._parsed_fields:
            value = parsed_field.value.eval(config, **kwargs)
            dpath.new(_record, parsed_field.path, value)
        return _record


@dataclass
class DatetimeIncrementalSyncComponent(DatetimeBasedCursor):
    """
    Extends DatetimeBasedCursor for Braze's API requirements where, instead of using explicit
    start_time/end_time parameters, the API expects:

    - An end_time (ending_at)
    - A length parameter indicating how many days before end_time to fetch

    The length parameter represents the number of days in the time window, counting both
    start and end dates inclusively. For example, a window from 2023-01-01 to 2023-01-03
    has a length of 3 days (counting Jan 1, 2, and 3). Length must be between 1-100 days
    as per Braze's API requirements.

    Example API request:
        GET /campaigns/data_series?campaign_id=xxx&ending_at=2023-01-03&length=3

    This would fetch data from 2023-01-01 to 2023-01-03 inclusive.

    Args:
        step_option: Configuration for injecting the length parameter into requests
    """

    step_option: Optional[RequestOption] = field(default=None)

    def __post_init__(self, parameters: Mapping[str, Any]):
        super().__post_init__(parameters=parameters)
        if self.step_option is None:
            raise ValueError("step_option is required for DatetimeIncrementalSyncComponent")

    def _get_request_options(self, option_type: RequestOptionType, stream_slice: Optional[StreamSlice] = None) -> Mapping[str, Any]:
        options: dict[str, Any] = {}
        if stream_slice is not None and self.step_option is not None:
            base_options = super()._get_request_options(option_type, stream_slice)
            options.update(base_options)

            if self.step_option.inject_into == option_type:
                # Get start and end times from the stream slice
                start_field = self._partition_field_start.eval(self.config)
                end_field = self._partition_field_end.eval(self.config)

                start_str = stream_slice.get(start_field)
                end_str = stream_slice.get(end_field)

                if isinstance(start_str, str) and isinstance(end_str, str):
                    start_time = self._parser.parse(start_str, self.datetime_format)
                    end_time = self._parser.parse(end_str, self.datetime_format)

                    # Add 1 to include both start and end dates in the count
                    # e.g., 2023-01-01 to 2023-01-03 = 3 days (Jan 1, 2, and 3)
                    length_days = min(100, max(1, (end_time - start_time).days + 1))

                    field_name = (
                        self.step_option.field_name.eval(config=self.config)
                        if isinstance(self.step_option.field_name, InterpolatedString)
                        else self.step_option.field_name
                    )

                    options[field_name] = length_days

        return options


@dataclass
class EventsRecordExtractor(DpathExtractor):
    def extract_records(self, response: requests.Response) -> Iterable[MutableMapping[Any, Any]]:
        response_body = next(self.decoder.decode(response))
        events = response_body.get("events")
        if events:
            return [{"event_name": value} for value in events]
        else:
            return []
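The two nontrivial pieces of logic in `components.py` above — the inclusive window-length computation in `_get_request_options` and the events extraction in `EventsRecordExtractor` — can be sketched and exercised in isolation, without the Airbyte CDK. The function names below are illustrative, not part of the connector:

```python
from datetime import datetime


def window_length_days(start: str, end: str, fmt: str = "%Y-%m-%d") -> int:
    """Inclusive day count between start and end, clamped to Braze's 1-100 range."""
    start_time = datetime.strptime(start, fmt)
    end_time = datetime.strptime(end, fmt)
    # +1 counts both endpoints: 2023-01-01 to 2023-01-03 is 3 days
    return min(100, max(1, (end_time - start_time).days + 1))


def extract_event_records(response_body: dict) -> list:
    """Mirrors EventsRecordExtractor: wrap each event name in its own record."""
    events = response_body.get("events")
    return [{"event_name": value} for value in events] if events else []


print(window_length_days("2023-01-01", "2023-01-03"))   # → 3
print(extract_event_records({"events": ["purchase"]}))  # → [{'event_name': 'purchase'}]
```

The clamp to 1-100 matches the constraint stated in the component's docstring, so a slice longer than 100 days would be truncated rather than rejected by the Braze API.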
@@ -18,15 +18,6 @@
"sync_mode": "incremental",
"destination_sync_mode": "append"
},
{
"stream": {
"name": "events_analytics",
"json_schema": {},
"supported_sync_modes": ["full_refresh"]
},
"sync_mode": "full_refresh",
"destination_sync_mode": "append"
},
{
"stream": {
"name": "cards_analytics",
9 changes: 0 additions & 9 deletions airbyte-integrations/connectors/source-braze/main.py

This file was deleted.
