feat: scrape timeout #558
base: master
Conversation
Force-pushed cc3f6c3 to 2b6b78e
Prometheus sends the scrape timeout in an HTTP header; we should reuse it.
Signed-off-by: Johannes Würbach <[email protected]>
Force-pushed 2b6b78e to aee4ffd
While I've seen that, I wasn't sure how well this plays with caching of slow queries 🤔 Could there be queries where the cache was previously populated only after 2-3 fetches, and which would now always fail with a small timeout?
Looking at https://github.com/prometheus/client_golang — this exporter is using the handler provided by the Go client at the moment.
I do not think the context is propagated to any of the collectors at this time. While the http.Request may be cancelled by the header timeout, the downstream functions in the collector have no way to access it. I think this implementation is reasonable given the interfaces we currently have to work with.
Are there any further steps I could take to drive this forward, @sysadmind @roidelapluie @SuperQ?
Ping :-)
Yes, in order to do this the "usual way", you need to implement it the way the mysqld_exporter does. I think we should focus on getting #618 finished so we have the new multi-target handler functionality.
Allow configuring a scrape timeout to prevent queries from hanging forever.