[filebeat] Wrong "gc_overhead" parsed by Elasticsearch module #9513
This was created in filebeat version 6.4.3.

Pinging @elastic/infrastructure

Pinging @elastic/stack-monitoring

@tsouza Thanks for reporting this. I'm working on a fix now. When looking at the parsed document for …
ycombinator added a commit that referenced this issue on Dec 27, 2018:
Resolves #9513. This PR:

* removes the incorrectly-parsed `gc_overhead` field. It turns out what we were parsing was actually an insignificant sequential number, not GC overhead,
* parses out a new `gc.collection_duration` field, e.g. `1.2s`, which is the time spent performing GC, and
* parses out a new `gc.observation_duration` field, e.g. `1.8s`, which is the overall time over which GC collection was performed.

It also splits up the long grok expression in the ingest pipeline into smaller patterns and references those patterns, hopefully for easier readability.
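As an illustration of the pattern-splitting approach the commit message describes, a grok processor in an Elasticsearch ingest pipeline can factor sub-patterns out via its `pattern_definitions` option. This is only a sketch: the pattern names and field paths below are made up for the example and are not necessarily the ones the PR actually uses.

```json
{
  "grok": {
    "field": "message",
    "pattern_definitions": {
      "GC_COLLECTION": "spent \\[%{DATA:elasticsearch.server.gc.collection_duration.time}\\] collecting",
      "GC_OBSERVATION": "in the last \\[%{DATA:elasticsearch.server.gc.observation_duration.time}\\]"
    },
    "patterns": [
      "\\[gc\\]\\[%{NUMBER}\\] overhead, %{GC_COLLECTION} %{GC_OBSERVATION}"
    ]
  }
}
```

Note that `%{NUMBER}` in the main pattern is deliberately left without a capture name, so the sequence number is matched but not indexed.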
Version: 6.5.3
The Elasticsearch Filebeat module will parse the following GC log message:
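(The original log sample did not survive here. The line below is reconstructed from the fragments quoted later in this report; the timestamp, log level, and node name are placeholders.)

```
[2018-12-11T10:42:27,730][WARN ][o.e.m.j.JvmGcMonitorService] [node-1] [gc][238436] overhead, spent [650ms] collecting in the last [1s]
```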
Into a document that has:
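(Likewise reconstructed, and abridged to the field under discussion; the surrounding document contains the usual Filebeat metadata.)

```json
{
  "elasticsearch": {
    "server": {
      "gc_overhead": 238436
    }
  }
}
```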
There are a couple of issues here:

The value of `elasticsearch.server.gc_overhead` is coming from `[gc][238436] overhead`. According to https://github.com/elastic/elasticsearch/blob/master/server/src/main/java/org/elasticsearch/monitor/jvm/JvmGcMonitorService.java#L309, the number `238436` is just a sequential number that holds no significant meaning for GC timings. It is not the GC overhead; the actual overhead information is the `spent [650ms] collecting in the last [1s]` part of the message.
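For context, the overhead log template in `JvmGcMonitorService` looks roughly like the following (paraphrased from the linked source; the exact constant name may differ). The first placeholder receives a running sequence number, and only the last two carry the overhead data:

```java
// Paraphrased from org.elasticsearch.monitor.jvm.JvmGcMonitorService.
// The first {} is a running sequence number (the "238436" in "[gc][238436]"),
// not an overhead figure; the real GC overhead is in the last two placeholders.
private static final String OVERHEAD_LOG_MESSAGE =
    "[gc][{}] overhead, spent [{}] collecting in the last [{}]";
```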