More generic docs for use in APM Server #6184
Merged
Changes from 23 commits

Commits
16833cd
Use specific beat name's instead of 'the Beat'
54cc9f4
Dont show Logstash info unless it's supported.
0539799
Dont show Logstash info if not supported.
74f2e49
Make the index pattern decoupled from the beat name
0b16e74
Make it possible to skip the pipeline docs in output.elasticsearch
4396b35
Use the specific beat name.
5745abf
Fix typo as suggested by @dedemorton.
97e3648
More beat_default_index_prefix.
3d41272
Use name of beat in shared-kibana-config.asciidoc
dd8aef8
Add .\ for PS instruction and make it possible to remove logstash men…
fe3fb16
Introduce 'has_ml_jobs'
81b40cf
Added `html_docs` to .gitignore
29e1fec
Better outputconfig for Elasticsearch only beats.
0219cff
Less blamy wording and ..
e3f055d
Fix list in output config
5b25d58
Only talk about Filebeat for filebeat docs
06ff123
It's 'warn', not 'warning'. Found by @simitt
5319f24
Update the rest of the max_retries sections to only talk about Filebe…
5880999
Special case for apm-server as it was only introduced in 6.0
98dd57a
expand beat_default_index_prefix use (#2)
graphaelli 5344b7b
Revert 'warn' -> 'warning' change.
6ab153b
Add missing quotes.
9232834
Fix conditionals
5a4de2b
Fixed according to @dedemorton's comments.
.gitignore
@@ -7,6 +7,7 @@
/*/logs
/*/fields.yml
/*/*.template*.json
**/html_docs

# Files
.DS_Store
Output configuration docs:
@@ -13,6 +13,12 @@
[[configuring-output]]
== Configure the output

ifdef::only-elasticsearch[]
You configure {beatname_uc} to write to Elasticsearch by setting options in
the `output.elasticsearch` of the +{beatname_lc}.yml+ config file.
Review comment on this line: I'd say either "in the …
endif::[]

ifndef::only-elasticsearch[]
You configure {beatname_uc} to write to a specific output by setting options
in the `output` section of the +{beatname_lc}.yml+ config file. Only a single
output may be defined.
@@ -26,14 +32,16 @@ The following topics describe how to configure each supported output:
* <<file-output>>
* <<console-output>>

endif::[]

[[elasticsearch-output]]
=== Configure the Elasticsearch output

++++
<titleabbrev>Elasticsearch</titleabbrev>
++++

When you specify Elasticsearch for the output, the Beat sends the transactions directly to Elasticsearch by using the Elasticsearch HTTP API.
When you specify Elasticsearch for the output, {beatname_uc} sends the transactions directly to Elasticsearch by using the Elasticsearch HTTP API.

Example configuration:
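The example listing itself is unchanged context and not shown in this hunk. For orientation, a minimal `output.elasticsearch` block of the kind the docs describe looks roughly like this (host and credentials are placeholder values, not taken from the diff):

[source,yaml]
------------------------------------------------------------------------------
output.elasticsearch:
  hosts: ["https://localhost:9200"]
  username: "elastic"
  password: "changeme"
------------------------------------------------------------------------------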
@@ -222,6 +230,7 @@ output.elasticsearch:
message: "ERR"
------------------------------------------------------------------------------

ifndef::no-pipeline[]
===== `pipeline`

A format string value that specifies the ingest node pipeline to write events to.
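The rest of the `pipeline` description falls outside this hunk. As a rough sketch, the setting accepts either a literal pipeline name or a format string resolved per event; the pipeline name and field below are made up for illustration:

[source,yaml]
------------------------------------------------------------------------------
output.elasticsearch:
  hosts: ["https://localhost:9200"]
  # Choose the ingest node pipeline per event from a custom field
  pipeline: "%{[fields.log_type]}_pipeline"
------------------------------------------------------------------------------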
@@ -266,25 +275,29 @@ output.elasticsearch:
when.equals:
fields.type: "normal"
------------------------------------------------------------------------------
endif::[]

===== `max_retries`

The number of times to retry publishing an event after a publishing failure.
After the specified number of retries, the events are typically dropped.
Some Beats, such as Filebeat, ignore the `max_retries` setting and retry until all
events are published.

ifeval::["{beatname_lc}" == "filebeat"]
Filebeat will ignore the `max_retries` setting and retry until all
events are published.
endif::[]
ifeval::["{beatname_lc}" != "filebeat"]
Set `max_retries` to a value less than 0 to retry until all events are published.
endif::[]

The default is 3.

===== `bulk_max_size`

The maximum number of events to bulk in a single Elasticsearch bulk API index request. The default is 50.

If the Beat sends single events, the events are collected into batches. If the Beat publishes
a large batch of events (larger than the value specified by `bulk_max_size`), the batch is
split.
Events can be collected into batches. {beatname_uc} will split batches larger than `bulk_max_size`
into multiple batches.

Specifying a larger batch size can improve performance by lowering the overhead of sending events.
However big batch sizes can also increase processing times, which might result in
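To make the two settings concrete, a sketch of how they appear together in the config (the values are illustrative, not recommendations):

[source,yaml]
------------------------------------------------------------------------------
output.elasticsearch:
  hosts: ["https://localhost:9200"]
  # Retry a failed publish up to 5 times, then drop the events
  # (Filebeat ignores this setting and retries until everything is published)
  max_retries: 5
  # Cap each bulk API request at 100 events
  bulk_max_size: 100
------------------------------------------------------------------------------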
@@ -307,6 +320,7 @@ Elasticsearch.

See <<configuration-ssl>> for more information.

ifndef::only-elasticsearch[]

[[logstash-output]]
=== Configure the Logstash output
@@ -515,10 +529,14 @@ The number of seconds to wait for responses from the Logstash server before timi

The number of times to retry publishing an event after a publishing failure.
After the specified number of retries, the events are typically dropped.
Some Beats, such as Filebeat, ignore the `max_retries` setting and retry until all
events are published.

ifeval::["{beatname_lc}" == "filebeat"]
Filebeat will ignore the `max_retries` setting and retry until all
events are published.
endif::[]
ifeval::["{beatname_lc}" != "filebeat"]
Set `max_retries` to a value less than 0 to retry until all events are published.
endif::[]

The default is 3.
@@ -706,10 +724,15 @@ brokers, topics, partition, and active leaders to use for publishing.

The number of times to retry publishing an event after a publishing failure.
After the specified number of retries, the events are typically dropped.
Some Beats, such as Filebeat, ignore the `max_retries` setting and retry until all

ifeval::["{beatname_lc}" == "filebeat"]
Filebeat will ignore the `max_retries` setting and retry until all
events are published.
endif::[]

ifeval::["{beatname_lc}" != "filebeat"]
Set `max_retries` to a value less than 0 to retry until all events are published.
endif::[]

The default is 3.
@@ -928,13 +951,18 @@ The Redis connection timeout in seconds. The default is 5 seconds.

The number of times to retry publishing an event after a publishing failure.
After the specified number of retries, the events are typically dropped.
Some Beats, such as Filebeat, ignore the `max_retries` setting and retry until all
events are published.

ifeval::["{beatname_lc}" == "filebeat"]
Filebeat will ignore the `max_retries` setting and retry until all
events are published.
endif::[]
ifeval::["{beatname_lc}" != "filebeat"]
Set `max_retries` to a value less than 0 to retry until all events are published.
endif::[]

The default is 3.

===== `bulk_max_size`

The maximum number of events to bulk in a single Redis request or pipeline. The default is 2048.
@@ -1162,3 +1190,5 @@ When specified, the `cloud.auth` overwrites the `output.elasticsearch.username`
`output.elasticsearch.password` settings. Because the Kibana settings inherit
the username and password from the Elasticsearch output, this can also be used
to set the `setup.kibana.username` and `setup.kibana.password` options.

endif::[]
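For context (not part of this diff), `cloud.auth` is normally used together with `cloud.id`; the values below are placeholders:

[source,yaml]
------------------------------------------------------------------------------
# cloud.id sets the Elasticsearch and Kibana hosts for an Elastic Cloud deployment
cloud.id: "my-deployment:SOME_BASE64_CLOUD_ID"
# cloud.auth fills in output.elasticsearch.username/password, which the Kibana
# settings then inherit
cloud.auth: "elastic:YOUR_PASSWORD"
------------------------------------------------------------------------------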
Review comment:
I don't think asciidoc supports this style of conditional coding. When I ran with has_ml_jobs: no, the description was blank.
You might need to wrap the text in its own statement:
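The suggested snippet itself did not survive in this capture. A minimal sketch of that kind of wrapping, assuming the `has_ml_jobs` attribute is defined only for Beats that ship machine learning jobs (note that `ifdef` checks whether the attribute is defined, not its value, so `has_ml_jobs: no` still counts as set):

ifdef::has_ml_jobs[]
(text that should appear only for Beats that ship machine learning jobs)
endif::[]

ifndef::has_ml_jobs[]
(text for Beats without machine learning jobs)
endif::[]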