Terms Panel is stripping Field contents with space #611
Comments
Here we go again: #364.
Sorry, but the mapping of that field is String. It's correct, because it is a string. To what type do I need to change the field so that the search ignores spaces?
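The field type stays string; what changes is that the field must not be analyzed, so the whole value is kept as a single term. A minimal sketch of declaring such a field when creating a fresh index (the index name is hypothetical, and CategoryString is used only as an example field):

```
# Minimal sketch: declare the string field as not_analyzed when creating a new index.
# The index name "logstash-new" is illustrative, not taken from this thread.
curl -XPUT 'http://localhost:9200/logstash-new' -d '{
  "mappings": {
    "logs": {
      "properties": {
        "CategoryString": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}'
```

With "index": "not_analyzed" the value is indexed as one token, so a terms facet returns "Credential Validation" instead of the separate words "credential" and "validation".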
notice
Ok. I want to change my old indexes to this new mapping, but I've had no success:

```
curl -XPUT 'http://localhost:9200/logstash-2013.10.25/logs/_mapping' -d '{"logs":{"properties":{"@timestamp":{"type":"date","format":"dateOptionalTime"},"@Version":{"type":"string"},"Category":{"type":"long"},"CategoryString":{"type":"string","index":"not_analyzed"},"ComputerName":{"type":"string"},"Data":{"type":"string"},"EventCode":{"type":"long"},"EventIdentifier":{"type":"long"},"EventType":{"type":"long"},"InsertionStrings":{"type":"string"},"Logfile":{"type":"string"},"RecordNumber":{"type":"long"},"SourceName":{"type":"string"},"TimeGenerated":{"type":"string"},"TimeWritten":{"type":"string"},"Type":{"type":"string"},"User":{"type":"string"},"host":{"type":"string"},"message":{"type":"string"},"path":{"type":"string"},"type":{"type":"string"}}}}'
```
You cannot change the analyzer of an existing field; you must reindex.
Hmm, ok. I'll try this procedure: http://www.elasticsearch.org/blog/changing-mapping-with-zero-downtime/
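Roughly, the procedure in that post amounts to creating a new index with the corrected mapping, copying the documents over, and then switching an alias so clients never notice. A sketch, where the alias and index names are assumptions for illustration:

```
# Sketch of the alias-based switch from the linked post; names are examples only.
# 1. Create logstash-2013.10.25_v2 with the corrected (not_analyzed) mapping.
# 2. Copy the documents into it (scroll + bulk, a script, or Logstash itself).
# 3. Atomically repoint the alias that queries go through:
curl -XPOST 'http://localhost:9200/_aliases' -d '{
  "actions": [
    { "remove": { "index": "logstash-2013.10.25",    "alias": "logstash-current" } },
    { "add":    { "index": "logstash-2013.10.25_v2", "alias": "logstash-current" } }
  ]
}'
```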
This does not seem to be working for fields that have an array/list type. How should I specify the analyzer for such fields? Please advise.
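Elasticsearch has no separate mapping type for arrays: an array field uses the mapping of its element type, so the same not_analyzed string mapping applies to every value in the list. A hedged sketch, reusing InsertionStrings (typically array-valued) from the mapping above as the example field:

```
# Arrays have no dedicated type; the element mapping applies to each value in the list.
# InsertionStrings from the mapping above is used as the example field; as noted
# earlier in the thread, this still has to go into a new index.
curl -XPUT 'http://localhost:9200/logstash-2013.10.25_v2' -d '{
  "mappings": {
    "logs": {
      "properties": {
        "InsertionStrings": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}'
```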
If you use a Field whose content has a space between two words (like Credential Validation), the Terms Panel will split it into two words, displaying wrong information in the graph.
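The underlying cause is that an analyzed string field is tokenized (by default with the standard analyzer), and a terms facet counts each token separately. A common workaround, sketched below with the pre-1.0 multi_field syntax, is to keep an analyzed version for full-text search and point the Terms panel at a not_analyzed sub-field; the index and field names are illustrative and, as noted above, this has to be applied to a new index followed by a reindex:

```
# Sketch: multi_field keeps full-text search on the main field while the Terms panel
# can use the not_analyzed "raw" sub-field (query it as CategoryString.raw).
# Index/field names are illustrative; apply to a new index, then reindex into it.
curl -XPUT 'http://localhost:9200/logstash-new/logs/_mapping' -d '{
  "logs": {
    "properties": {
      "CategoryString": {
        "type": "multi_field",
        "fields": {
          "CategoryString": { "type": "string" },
          "raw":            { "type": "string", "index": "not_analyzed" }
        }
      }
    }
  }
}'
```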