Importing CSV in Elasticsearch and display it in Kibana. #1310
1) Use a grok filter instead of the csv filter, and check for lines that fail by looking for "_grokparsefailure" in the tags. The ones tagged with a grok parse failure are the lines that are failing.
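As a sketch of that grok approach (the pattern below is an assumption about the line layout, using the Id/Stamp/Comments columns mentioned later in this thread; adjust it to the real data):

```
filter {
  grok {
    # Assumed layout: numeric id, ISO8601 timestamp, free-text comment,
    # comma-separated. Lines that do not match get "_grokparsefailure"
    # added to their tags automatically.
    match => [ "message", "%{NUMBER:Id},%{TIMESTAMP_ISO8601:Stamp},%{GREEDYDATA:Comments}" ]
  }
  if "_grokparsefailure" in [tags] {
    # These are the failing lines; tag or route them somewhere visible.
    mutate { add_field => { "parse_status" => "failed" } }
  }
}
```

The conditional syntax above assumes Logstash 1.3+ (conditionals were not available in older releases).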
Hi thejaspm,
By the way, my Elasticsearch has 4 indices, and I am unable to display the index I want in the Kibana dashboard. Please help me. :)
You can configure the index in the "Configure Dashboard" option; look for a gear icon at the top of the page. To set the pagination limit, go to the "All events" section, open the configure menu, and change the limit. "500" is just the default value.
Hi thejaspm, thanks a lot! Actually I had tried all of that; I was just one step behind. What I forgot was to REFRESH Kibana. :) lol. Those links are really helpful. Maybe it's off topic for this issue, but it will be helpful for me. Many thanks! :)
Closing this since it is not a bug.
Hi Rashid,
How can I upload a CSV file to Elasticsearch without a Logstash config file?
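One commonly used route that skips Logstash entirely is the Elasticsearch bulk API: build a newline-delimited (NDJSON) request body from the CSV rows and POST it to `_bulk`. A minimal sketch (the index and type names are assumptions, not anything from this thread):

```python
import csv
import io
import json

def csv_to_bulk(csv_text, index, doc_type):
    """Build an Elasticsearch _bulk request body (NDJSON) from CSV text.

    Each row becomes two lines: an "index" action and the document itself.
    """
    rows = csv.DictReader(io.StringIO(csv_text))
    lines = []
    for row in rows:
        lines.append(json.dumps({"index": {"_index": index, "_type": doc_type}}))
        lines.append(json.dumps(row))
    # The bulk API requires the body to end with a newline.
    return "\n".join(lines) + "\n"

# Hypothetical sample matching the columns discussed in this thread:
body = csv_to_bulk("Id,Stamp,Comments\n1,2014-06-01,ok\n", "serviceconnect", "row")
```

You would then POST `body` to `http://localhost:9200/_bulk` (for example with curl or `urllib`); the `_type` line applies to the 1.x-era Elasticsearch discussed here and is not needed on modern versions.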
Hi,
I imported a CSV file with 5000 lines of data into Elasticsearch using Logstash, with the logstash.conf file below (logstash.conf --> ES --> Kibana). Only the latest CSV is displayed; how do I recover the data from the first CSV? And is there any data limit in Kibana?
I am a newbie to log centralization and not very aware of the workarounds, so please help me. My questions may not be well structured because I have a lot of doubts about the ELK stack. :)
My logstash.conf file:

```
input {
  stdin {
    type => "stdin-type"
  }
  file {
    path => ["C:/tomcat/webapps/logstash-1.4.1/bin/var/ServiceConnect.csv"]
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Id", "Stamp", "Comments"]
    separator => ","
  }
}

output {
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}
```
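One likely reason the first CSV seems to disappear: the file input remembers how far it has read each file in a "sincedb" file, and `start_position => "beginning"` only applies to files Logstash has never seen before. A sketch of a file input that never persists that position, so the file is re-read from the start on every run (the `"NUL"` value is the Windows discard device, matching the Windows paths in this config; on Linux the equivalent would be `/dev/null`):

```
input {
  file {
    path => ["C:/tomcat/webapps/logstash-1.4.1/bin/var/ServiceConnect.csv"]
    start_position => "beginning"
    # Discard the read-position bookkeeping so the whole file is
    # re-imported each time Logstash starts. Useful for testing;
    # in production this causes duplicate documents.
    sincedb_path => "NUL"
  }
}
```

Note that the earlier documents may still be in Elasticsearch; if Kibana only shows recent events, widening the dashboard's time filter may reveal them without re-importing anything.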