Allow setup adhoc queries in ConnectionSettings #330
Comments
@asafcombo thanks a lot for the suggestions; it looks like this would resolve |
if |
We could also limit the values to the time interval of the dashboard itself? |
@dsztykman I think if we use $from and $to, we also need to know which field is defined as the "timestamp" field on the specified {database}.{table}, but currently the ad-hoc interface doesn't have the ability to specify that. We could try to parse system.columns before making |
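As a sketch of that idea, candidate timestamp columns could be discovered from system.columns before building the ad-hoc query. The database and table names below are placeholders, not anything from this thread:

```sql
-- Find DateTime-typed columns that could serve as the "timestamp" field.
-- 'default' and 'my_table' are hypothetical placeholders.
SELECT name, type
FROM system.columns
WHERE database = 'default'
  AND table = 'my_table'
  AND type LIKE 'DateTime%'
```

Any column returned here could then be offered as the field to which $from and $to are applied.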
@dsztykman feel free to make a pull request if you can figure out the plugin source |
I'm going to try replacing with: |
@dsztykman any progress with this approach? |
@Slach I actually have some issues testing, which aren't related:
|
@dsztykman could you run:
git remote add altinity git@github.com:Altinity/clickhouse-grafana.git
git pull altinity master
I added |
Unfortunately, what I'm doing doesn't work... I'm trying to create a variable called but I really can't make it work, sorry |
There are two performance issues with how the ad-hoc query gets its values.
https://github.com/Vertamedia/clickhouse-grafana/blob/d92ff14d529f5e02e835860adf4fa4d58192ed4b/src/adhoc.ts#L3
Issue 1
This happens when the cardinality of the values is lower than 300. In that case, the engine will scan the entire table trying to collect 300 values.
I suggest modifying it to:
The query above limits the total amount of data to scan.
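The suggested query itself did not survive this copy of the thread. One way to cap the scanned data, as a sketch only (table and column names are hypothetical placeholders, and the inner row limit is illustrative, not a value from the original suggestion):

```sql
-- Pre-limit the rows read in a subquery before deduplicating,
-- so a low-cardinality column cannot trigger a full-table scan.
-- 'default.my_table' and 'field' are placeholder names.
SELECT DISTINCT field
FROM (
    SELECT field
    FROM default.my_table
    LIMIT 100000
)
LIMIT 300
```

The outer LIMIT 300 matches the value count the plugin asks for, while the inner LIMIT bounds the total rows scanned regardless of cardinality.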
Issue 2
The second issue happens with tiered-storage setups: the scan is not limited to the fast tier. Usually the latest data is located on fast disks, so I propose to do
Although the above case is very specific to the timestamp field name.
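The proposed query was also lost in this copy. A sketch of the idea, assuming a hypothetical `event_time` column, restricts the scan to recent data, which on tiered storage typically lives on the fast disks:

```sql
-- Only scan recent data; on tiered storage this usually stays on fast disks.
-- 'default.my_table', 'field', and 'event_time' are placeholder names,
-- and the one-day window is illustrative.
SELECT DISTINCT field
FROM default.my_table
WHERE event_time >= now() - INTERVAL 1 DAY
LIMIT 300
```

This depends on knowing the timestamp column for the table, which is exactly the caveat raised above.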