forked from elastic/kibana
[Automatic import] readme input types templates (elastic#194308)
1 parent 3515a0f · commit aff9217
Showing 17 changed files with 398 additions and 0 deletions.
5 changes: 5 additions & 0 deletions — .../plugins/shared/integration_assistant/server/templates/readme/setup/aws-cloudwatch.md.njk

@@ -0,0 +1,5 @@
### Collecting logs from AWS CloudWatch

When collecting logs from CloudWatch is enabled, users can retrieve logs from all log streams in a specific log group. The `filterLogEvents` AWS API is used to list log events from the specified log group. Amazon CloudWatch Logs can be used to store log files from Amazon Elastic Compute Cloud (EC2), AWS CloudTrail, Route 53, and other sources.
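The retrieval described above can be sketched with a boto3-style paginator (a minimal sketch, not the integration's actual code; the function name and log group are hypothetical, and a real client would come from `boto3.client("logs")`):

```python
def collect_log_events(logs_client, log_group_name):
    """Yield every log event from all streams in one CloudWatch log group.

    `logs_client` is expected to behave like boto3.client("logs"); the
    filter_log_events paginator follows the API's nextToken continuation.
    """
    paginator = logs_client.get_paginator("filter_log_events")
    for page in paginator.paginate(logGroupName=log_group_name):
        # Each page carries an "events" list spanning all log streams.
        yield from page.get("events", [])
```

A real invocation would look like `collect_log_events(boto3.client("logs"), "my-log-group")`; the integration's own client handles credentials and region.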
{% include "ssl-tls.md.njk" %}
26 changes: 26 additions & 0 deletions — ...platform/plugins/shared/integration_assistant/server/templates/readme/setup/aws-s3.md.njk

@@ -0,0 +1,26 @@
### Collecting logs from Amazon S3 bucket

When S3 bucket log collection is enabled, users can retrieve logs from S3 objects that are pointed to by S3 notification events read from an SQS queue, or by directly polling the list of S3 objects in an S3 bucket.

The use of SQS notification is preferred: polling the list of S3 objects is expensive in terms of performance and cost, and should be used only when no SQS notification can be attached to the S3 buckets. This input integration also supports S3 notification from SNS to SQS.

The SQS notification method is enabled by setting the `queue_url` configuration value. The S3 bucket list polling method is enabled by setting the `bucket_arn` and `number_of_workers` configuration values. Exactly one of the `queue_url` and `bucket_arn` configuration options must be set.
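The exactly-one constraint can be illustrated with a small sketch (the helper and the config dict are hypothetical illustrations, not part of the integration):

```python
def select_s3_collection_mode(config):
    """Return the collection mode implied by an input configuration dict.

    Mirrors the rule above: exactly one of `queue_url` and `bucket_arn`
    must be set; anything else is a configuration error.
    """
    has_queue = bool(config.get("queue_url"))
    has_bucket = bool(config.get("bucket_arn"))
    if has_queue == has_bucket:
        raise ValueError("exactly one of queue_url and bucket_arn must be set")
    return "sqs-notification" if has_queue else "s3-polling"
```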

#### To collect data from AWS SQS, follow these steps:
1. If data forwarding to an AWS S3 bucket hasn't been configured, first set up an AWS S3 bucket as described in the documentation above.
2. Follow the steps below for each data stream that has been enabled:
   1. Create an SQS queue:
      - To set up an SQS queue, follow "Step 1: Create an Amazon SQS queue" in the [Amazon documentation](https://docs.aws.amazon.com/AmazonS3/latest/userguide/ways-to-add-notification-config-to-bucket.html).
      - While creating the SQS queue, provide the same bucket ARN that was generated when the AWS S3 bucket was created.
   2. Set up event notification from the S3 bucket using the instructions [here](https://docs.aws.amazon.com/AmazonS3/latest/userguide/enable-event-notifications.html). Use the following settings:
      - Event type: `All object create events` (`s3:ObjectCreated:*`)
      - Destination: SQS queue
      - Prefix (filter): enter the prefix for this data stream, e.g. `alert_logs/`
      - Select the SQS queue that has been created for this data stream
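The notification settings from step 2 can be sketched as the payload that boto3's `put_bucket_notification_configuration` expects (a sketch under assumptions: the helper name, queue ARN, and prefix below are hypothetical placeholders):

```python
def build_s3_notification_config(queue_arn, prefix):
    """Build an S3 event-notification payload for one data stream:
    all object-create events under `prefix` go to the given SQS queue."""
    return {
        "QueueConfigurations": [
            {
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],  # "All object create events"
                "Filter": {
                    "Key": {"FilterRules": [{"Name": "prefix", "Value": prefix}]}
                },
            }
        ]
    }
```

A real call would then be `boto3.client("s3").put_bucket_notification_configuration(Bucket="my-bucket", NotificationConfiguration=build_s3_notification_config(queue_arn, "alert_logs/"))`.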

**Note**:
- A separate SQS queue and S3 bucket notification are required for each enabled data stream.
- Permissions for the above AWS S3 bucket and SQS queues should be configured according to the [Filebeat S3 input documentation](https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-aws-s3.html#_aws_permissions_2).
- Data collection via AWS S3 bucket and AWS SQS are mutually exclusive in this case.

{% include "ssl-tls.md.njk" %}
29 changes: 29 additions & 0 deletions — ...gins/shared/integration_assistant/server/templates/readme/setup/azure-blob-storage.md.njk

@@ -0,0 +1,29 @@
### Collecting logs from Azure Blob Storage

#### Create a Storage account container

To create the storage account:

1. Sign in to the [Azure Portal](https://portal.azure.com/) and create your storage account.
2. While configuring your project details, make sure you select the following recommended default settings:
   - Hierarchical namespace: disabled
   - Minimum TLS version: Version 1.2
   - Access tier: Hot
   - Enable soft delete for blobs: disabled
   - Enable soft delete for containers: disabled
3. When the new storage account is ready, take note of the storage account name and the storage account access keys, as you will use them later to authenticate your Elastic application's requests to this storage account.
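The account name and access key noted in step 3 are conventionally combined into a standard Azure Storage connection string; a minimal sketch (the helper name and values are hypothetical):

```python
def azure_storage_connection_string(account_name, account_key):
    """Assemble the standard connection string for a storage account,
    from the name and access key shown on the portal's Access keys page."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.windows.net"
    )
```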

##### How many Storage account containers?

The Elastic Agent can use one Storage account container for all integrations.

#### Running the integration behind a firewall

When you run the Elastic Agent behind a firewall, you need to allow traffic on port `443` for the Storage Account container to ensure proper communication with the necessary components.

##### Storage Account Container

Port `443` is used for secure communication with the Storage Account container. This port is commonly used for HTTPS traffic. By allowing traffic on port 443, the Elastic Agent can securely access and interact with the Storage Account container, which is essential for storing and retrieving checkpoint data for each event hub partition.

{% include "ssl-tls.md.njk" %}