Feature: Datastore integration with Collector API
#1109
Comments
Thx.
Hello @johnaohara, I have drafted a rough CollectorAPI datastore that gets a JSON array (as an
Furthermore, looking at what the new DataStore is doing, it's essentially getting the data from the Collector and putting it in Horreum, i.e. it duplicates the data, whereas I thought it would use it on demand. @Karm is that what you had in mind? In that case, what's the benefit of keeping the collector around instead of hosting a Horreum instance and uploading directly to it?
Hey @zakkak, that error is not expected, and this code path is well tested :(
Horreum allows users to dynamically create Labels (i.e. derived data/metrics) retrospectively, after data has been "indexed". E.g. if a user adds a new change detection variable, or the calculation for a Label changes, Horreum can retrospectively calculate values from all retrieved documents. The calculation of Labels occurs within the database, and therefore we rely on the data being in the backend store before it is processed. In order to handle Label calculation for downstream datastores, we cache the original document. This allows us to process the JSON documents, and also saves us from having to retrieve all historic documents if the user defines a new Label or updates an existing one.
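A minimal sketch of that caching idea, with entirely hypothetical names (this is not Horreum's actual implementation): raw documents are kept locally, so defining or redefining a Label can be evaluated against all historic data without querying the downstream datastore again.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Hypothetical illustration of retrospective Label calculation:
// original JSON documents are cached, so a new or changed Label
// can be recalculated over all historic data without asking the
// downstream datastore again.
class LabelCache {
    // Cached copies of the original JSON documents, keyed by run id.
    private final Map<Integer, String> rawDocuments = new HashMap<>();
    // Label name -> calculation function applied to a raw document.
    private final Map<String, Function<String, Object>> labelDefinitions = new HashMap<>();

    void cacheDocument(int runId, String rawJson) {
        rawDocuments.put(runId, rawJson);
    }

    // Defining (or redefining) a Label triggers recalculation over
    // every cached document, not just documents uploaded afterwards.
    List<Object> defineLabel(String name, Function<String, Object> calculation) {
        labelDefinitions.put(name, calculation);
        List<Object> values = new ArrayList<>();
        for (String doc : rawDocuments.values()) {
            values.add(calculation.apply(doc));
        }
        return values;
    }
}
```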
Not that I saw; I will try to reproduce and have a better look.
Hi @zakkak, I am coming back to this; the error is strange and I have not seen it happen before. All the code paths that lead to that method call
Hi @johnaohara, sorry for the late reply, I was on vacation. No, I didn't have the time to reproduce yet. I am aware that you are going to give it a try yourself; let me know if you need anything more.
Hi @johnaohara, I am trying this out again and am now getting a different error when trying to import some runs (this time on a Linux machine). Logs:
What I did:
to import the runs of the
Any hints on what I might be doing wrong?
Hi @zakkak, you are not doing anything wrong; it is just that all the documents are being processed by a single blocking call within a single transaction, and that transaction is timing out. For now, you can increase the tx timeout (see the example below): https://quarkus.io/guides/transaction#configuring-the-transaction-timeout This is a short-term workaround and will not scale for a large number of runs. I had started working on a change that offloads the processing to a message queue; I will open a PR for that so we can get the changes in at the same time.
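For reference, the workaround from the linked Quarkus guide is a single property in `application.properties`; the 600-second value below is only an illustrative guess for a long initial import, not a recommendation:

```properties
# Raise the default JTA transaction timeout (Quarkus default is 60s).
# 600s is just an example value for a long-running bulk import.
quarkus.transaction-manager.default-transaction-timeout=600s
```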
Thank you @johnaohara, I will do that. Note that it shouldn't be necessary after the initial load of the data, as I expect us to push the new builds on a daily basis after that, in which case the timeout could be enough.
An external Collector API tool (https://github.com/Karm/collector) allows performance metrics to be pushed and queried as JSON documents via a REST API.

In order to integrate existing datasets into Horreum for analysis, a new Collector API datastore needs to be created that allows Horreum users to query an existing instance. The Collector API datastore should be defined as a new Datastore Type (https://github.com/Hyperfoil/Horreum/blob/master/horreum-backend/src/main/java/io/hyperfoil/tools/horreum/datastore/Datastore.java), similar to how the Elasticsearch datastore is integrated (https://github.com/Hyperfoil/Horreum/blob/master/horreum-backend/src/main/java/io/hyperfoil/tools/horreum/datastore/ElasticsearchDatastore.java). Documentation on how the Elasticsearch document store is used can be found here: https://horreum.hyperfoil.io/docs/tutorials/query-elasticserach/
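A rough sketch of what such a datastore could look like. This is not the actual implementation: the real contract is the Datastore interface linked above, and the endpoint path, query parameter, and bearer-token auth here are assumptions for illustration, not Karm/collector's actual API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch of a Collector API datastore: query a running
// Collector instance over REST and return the JSON documents that
// Horreum would then process as run payloads.
class CollectorApiDatastore {
    private final HttpClient client = HttpClient.newHttpClient();
    private final String baseUrl;   // e.g. "https://collector.example.com" (assumed)
    private final String apiToken;  // assumed bearer-token auth

    CollectorApiDatastore(String baseUrl, String apiToken) {
        this.baseUrl = baseUrl;
        this.apiToken = apiToken;
    }

    // Fetch a JSON array of runs matching `tag` from the collector;
    // the endpoint path and parameter name are placeholders.
    String fetchRuns(String tag) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/v1/data?tag=" + tag))
                .header("Authorization", "Bearer " + apiToken)
                .GET()
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        if (response.statusCode() != 200) {
            throw new IllegalStateException(
                    "Collector query failed: HTTP " + response.statusCode());
        }
        return response.body(); // JSON array of performance metric documents
    }
}
```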