When I set logging.json=true, the logs written to STDOUT are not 100% JSON: there is an additional prefix before the JSON part starts. I am using Heartbeat 6.1.1 in a Docker container, and the logs look like this:
2018/01/12 05:40:44.257192 manager.go:110: {"level":"INFO","message":"Select (active) monitor http","timestamp":"2018-01-12T05:40:44Z"}
When I run Logstash in Docker with its log output set to JSON, the output is fully valid JSON, which I was able to feed back into Logstash to slice and dice.

While I can add extra Logstash logic to parse this (see the sketch below), it would be great if there were a standard we could expect from Elastic products when we specify that logs should be in JSON format.
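For reference, a minimal sketch of the kind of extra Logstash logic I mean, assuming every line keeps the `<date> <time> <file.go:line>:` prefix shown in the sample above (the field names are illustrative, not anything Heartbeat emits):

```
filter {
  dissect {
    mapping => {
      # e.g. 2018/01/12 05:40:44.257192 manager.go:110: {"level":"INFO",...}
      "message" => "%{log_date} %{log_time} %{go_source}: %{json_payload}"
    }
  }
  json {
    # Parse the remaining JSON payload into top-level event fields
    source => "json_payload"
  }
}
```

This works around the prefix, but every consumer of these logs would have to carry the same workaround, which is why a consistent JSON-only output would be preferable.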